In a small, humid apartment in Lagos, a woman named Amara (a composite of several developers I’ve interviewed) stares at a screen that refuses to see her. She is trying to build a healthcare app that identifies skin conditions in tropical climates. But every time she runs her data through a pre-trained image recognition model, the results come back skewed. The system is flawless at identifying a rash on pale skin in a London winter. On a dark-skinned child in the Nigerian sun, the software stutters. It labels life-saving diagnostic data as "noise."
This is not a glitch. It is a mirror.
Artificial Intelligence is often described as a god-like entity descending from the cloud, but it is actually more like a precocious, impressionable toddler. It learns from what we show it. It mimics our tone. It adopts our blind spots. Right now, we are raising this toddler in a house where the windows are boarded up, and only one type of person is allowed to speak.
We are building the future of human intelligence without the people who have spent centuries navigating the complexities of survival, care, and social cohesion. We are building it without women.
The Architect’s Blind Spot
History is a heavy ghost. When the first seatbelts were designed, they were tested on "standard" crash test dummies modeled after the average male body. For decades, women were 47% more likely than men to be seriously injured in a car crash because the safety world didn't account for their height, weight, or bone density. It wasn't malice. It was a lack of presence.
We are repeating this exact mistake with code.
When an algorithm determines who gets a bank loan, who is shortlisted for a high-stakes job, or who receives a shorter prison sentence, it isn't "neutral." It is an aggregation of historical data. If the history of the world has been one of systemic exclusion, the AI will naturally view exclusion as the optimal path forward.
Consider the "Echo Effect." If an AI is trained on a massive library of internet text where "doctor" is predominantly associated with men and "nurse" with women, the AI doesn't just learn the association—it amplifies it. It decides that a woman being a doctor is a statistical anomaly to be corrected. In a world increasingly governed by automated decisions, being a statistical anomaly is a dangerous place to live.
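To see how that amplification works mechanically, here is a minimal sketch in plain Python (the 70/30 split and the word pairings are hypothetical, chosen only for illustration). A model that always emits the most probable completion turns a 70/30 skew in its training text into a 100/0 skew in its output:

```python
from collections import Counter

# Hypothetical training corpus: gendered contexts for the word "doctor",
# skewed 70/30. The numbers are illustrative, not from any real dataset.
training_pairs = [("doctor", "he")] * 70 + [("doctor", "she")] * 30

# "Training" here is just counting co-occurrences, which is the statistical
# core of what a language model learns about word associations.
counts = Counter(pronoun for _, pronoun in training_pairs)

def complete(occupation: str) -> str:
    # Greedy (argmax) decoding: always pick the single most probable option.
    return counts.most_common(1)[0][0]

# Every one of 100 generations picks "he": a 70/30 bias in the data
# becomes a 100/0 bias in the output. The model "corrects" the anomaly.
print(Counter(complete("doctor") for _ in range(100)))  # Counter({'he': 100})
```

Real language models are vastly more complicated, but the dynamic is the same: any decision rule that favors the statistically dominant pattern will reproduce it more often than the data itself does.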
The Invisible Stakes of the Home
Think about the "Smart Home." It sounds like a convenience—a luxury of lights that dim and ovens that preheat. But for a woman in an abusive relationship, a smart home can become a digital cage. Abusers have already begun using connected devices to change door codes, blast music at 3:00 AM, or track movements through smart thermostats.
The engineers who built these features were likely thinking about the joy of automation. They weren’t thinking about the terror of a thermostat controlled by a vengeful ex-partner three states away.
Why? Because they haven't lived that reality.
Safety isn't a feature you can patch in later with a software update. It has to be baked into the blueprint. To make AI safe, we have to stop treating "women’s issues" as a niche sub-category of ethics. They are the core of the human experience. If a technology isn't safe for a teenage girl in a rural village or a single mother in a high-rise, it isn't safe. Period.
The Labor of Care
There is a specific kind of intelligence that comes from the labor of care—the ability to read non-verbal cues, to manage conflicting needs, and to prioritize long-term stability over short-term gain. This is the "invisible labor" that women have performed for millennia.
In the tech industry, we file this under "edge cases."
In reality, these aren't edges. They are the fabric of society. When we exclude women and girls from the leadership of AI development, we lose the perspective that understands how a community actually functions. We get "disruption" without "responsibility." We get tools that move fast and break things, only to realize that the things being broken are people’s lives, reputations, and safety.
The data supports the urgency. Reports from organizations like UN Women and the World Economic Forum consistently show that women are underrepresented in AI research and development. Only about 22% of AI professionals globally are women. This isn't just a diversity metric for a corporate slide deck. It is a systemic failure of design.
A room full of people with identical life experiences will always produce a product with the same set of vulnerabilities. They will miss the ways a chatbot might give dangerous advice to a girl struggling with an eating disorder. They will miss the ways a recruitment tool might penalize a resume for having a "gap" for maternity leave.
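To make that second failure concrete, here is a toy screening function (hypothetical weights, not any real vendor's model) in which gender never appears as an input, yet an employment-gap penalty discriminates by proxy:

```python
# Hypothetical resume score. No gender feature exists anywhere in the model,
# but the gap penalty falls hardest on anyone who took parental leave.
def screening_score(years_experience: float, gap_months: int) -> float:
    # Illustrative weights, the kind a model fits from biased hiring history.
    return 2.0 * years_experience - 0.5 * gap_months

# Two equally qualified candidates; one took 12 months of maternity leave.
print(screening_score(years_experience=8, gap_months=0))   # 16.0
print(screening_score(years_experience=8, gap_months=12))  # 10.0, penalized
```

No one wrote "penalize women" into that function. The bias arrives through a variable that looks neutral, which is exactly why it takes lived experience to spot it.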
Redefining the Heart of the Machine
So, how do we move beyond the dry checklists of "ethical AI"?
It starts by relocating the "Heart" of the technology. Usually the heart is taken to be processing power, the raw speed of the GPU. But the true heart of any tool is its purpose.
If the purpose of AI is to maximize "engagement" (a polite word for addiction), it will continue to exploit the insecurities of young girls to keep them scrolling. If the purpose of AI is to maximize "efficiency," it will continue to steamroll the nuances of human empathy.
But imagine a different trajectory.
Imagine an AI designed by the Amaras of the world. Imagine a system where the primary goal is not just output, but protection. This requires more than just hiring a few female consultants. It requires a fundamental shift in who holds the steering wheel. It means investing in STEM education for girls in the Global South with the same ferocity we invest in Silicon Valley startups. It means creating "Safety by Design" frameworks that treat domestic violence and gender-based harassment as primary threats, not secondary concerns.
The Cost of Silence
The stakes are higher than we want to admit. We are currently in the middle of a land grab for the human mind. The companies that win this race will dictate how we learn, how we work, and how we see ourselves.
If women and girls are not at the center of this transition, the digital world will become a refined version of the old world’s worst habits. We will have built a gleaming, automated patriarchy where the biases are harder to see because they are hidden behind the "objective" mask of mathematics.
Mathematics is never objective when the variables are human.
I remember talking to a researcher who was trying to teach an AI to understand "consent." The machine struggled. It wanted a binary—a yes or a no. It couldn't grasp the hesitation, the power dynamics, or the silence that isn't an agreement. The researcher, a woman who had spent years working in crisis centers, knew that the silence was the most important part of the data.
She understood the "invisible stakes" because she had felt them.
The machine needs that intuition. It needs the lived experience of those who have had to be more observant, more careful, and more resilient just to navigate the physical world.
We are at a crossroads where we can either automate our prejudices or use this moment to finally outgrow them. The difference between those two futures isn't found in a faster chip or a larger dataset. It’s found in the person sitting at the keyboard.
If we keep the doors to the lab closed, we aren't just failing women. We are building a world that is fundamentally broken for everyone. A house built with only half the blueprints will always collapse in the storm.
Amara is still there, in the heat of Lagos, staring at her screen. She isn't waiting for a seat at the table anymore. She is building her own. The only question is whether the systems we are creating will be brave enough to recognize her when she arrives.