

Gender and Artificial Intelligence
Opinion | Bias | Gender
Posted by Ambar Kleinbort, February 10, 2020

Society’s view of gender is at a critical point. The same is true for AI. So, how are these issues intersecting and what challenges are we facing in the realm of gender and artificial intelligence?
1. AI is Overwhelmingly Female
In the US, 94.6% of secretaries and administrative assistants are women. Unsurprisingly, so is Siri. Almost no major AI assistant project avoids this pitfall, and there isn't much push for improvement. While a few AI teams have focused on representation, others have introduced new controversies.
These teams are turning to the development of "genderless" voices to avoid criticism, but what does it mean, in reality, for a voice to be "genderless"? Is it a voice we cannot recognize as human, or one that belongs to the non-binary/queer community? Are we unwittingly pushing LGBTQ+ people into a second-class-citizen category? We must grapple with these questions in order to develop socially responsible AI.
On the other hand, there are also visual AIs, and the problem here is not just that most of them are women, but that they are hyper-sexualized. Unsurprisingly, this sexualization is geared towards white, heterosexual men. Fictional AIs like Ava in Ex Machina have paved the way for "real-world" celebrities like Lil Miquela, an AI-generated Instagram celebrity and pop star.
While she may be a manager's dream, women in this business have described this "woman with no will" as their nightmare. It is the plot of Miley Cyrus's Black Mirror episode, in which her character is turned into a robot so her manager can have total control over her. Where are the male AI celebrities? When will women escape the narrative in which the innocent, submissive servant becomes a femme fatale? AI doesn't seem to be helping.
2. Bias
Fill in the blank:
Man is to King as Woman is to Queen.
Father is to Doctor as Mother is to _________.
I say Doctor; the AI usually says Nurse.
This issue is caused by something known as bias. The problem originates in a poor dataset, one that underrepresents or misrepresents a certain group. When we train our models on this data, the models become biased as well. Here, we have a fill-in-the-blank model that associates women with stereotyped roles because the text it was trained on portrayed women this way. The problem is common not just for women, but also for non-binary people and racial minorities.
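To see this kind of bias first-hand, you can probe a pre-trained language model directly. Here is a minimal sketch using Hugging Face's fill-mask pipeline with bert-base-uncased; the prompts are illustrative, and the exact completions and their probabilities will vary with the model and the wording.

```python
# A minimal sketch of probing a masked-language model for gendered
# associations via Hugging Face's fill-mask pipeline. Completions
# depend on the chosen model and the prompt wording.
from transformers import pipeline

fill = pipeline("fill-mask", model="bert-base-uncased")

for prompt in [
    "The father worked as a [MASK].",
    "The mother worked as a [MASK].",
]:
    print(prompt)
    for candidate in fill(prompt, top_k=5):
        # Each candidate carries the predicted token and its probability.
        print(f"  {candidate['token_str']:<12} {candidate['score']:.3f}")
```

Comparing the two ranked lists side by side is often enough to expose the stereotyped occupations the model has absorbed from its training text.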
Perhaps the most popular example of bias is facial recognition, where error rates are higher for women's faces, and higher still for people with darker skin tones and for Asian women. The answer to these issues seems to be collecting better, more representative data. Furthermore, we should continue to evaluate error rates for different demographics separately so we can improve our AI models.
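As a sketch of what that disaggregated evaluation could look like, the snippet below computes an error rate separately for each group. The DataFrame, group labels, and predictions are hypothetical placeholders standing in for real evaluation output.

```python
# A minimal sketch of disaggregated evaluation: a model's error rate
# computed per demographic group rather than as a single average.
import pandas as pd

# Toy evaluation results; in practice these come from your test set.
results = pd.DataFrame({
    "group":  ["A", "A", "B", "B", "B", "C"],
    "y_true": [1, 0, 1, 1, 0, 1],
    "y_pred": [1, 0, 0, 1, 1, 1],
})

results["error"] = (results["y_true"] != results["y_pred"]).astype(int)

# A large gap between groups signals biased performance even when
# the overall average error looks acceptable.
print(results.groupby("group")["error"].mean())
```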
However, if we eliminated this sort of bias, the question becomes: should we use gender to inform AI decisions at all? A lot of recommender systems guess your gender from your online behavior and then make suggestions based on that guess. Is that okay? Does it reinforce gendered stereotypes? Do we need to understand gender better so we can classify male and female bodies and use that to inform medicine? Can we ask the same questions about race?
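To make the pattern concrete, here is a deliberately simplified, hypothetical sketch of such a system: a classifier guesses gender from behavioral features, and the recommendations branch on that guess. Every name and number here is invented for illustration; the point is how easily the branching step bakes a stereotype into the product.

```python
# A hypothetical sketch of a recommender that infers gender from
# behavior and then conditions suggestions on that guess. All data
# and category names are illustrative, not from any real system.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy behavioral features (e.g., time spent in two content categories)
# with self-reported gender labels for a handful of training users.
X_train = np.array([[5.0, 1.0], [4.5, 0.5], [0.5, 6.0], [1.0, 5.5]])
y_train = np.array(["m", "m", "f", "f"])

gender_guesser = LogisticRegression().fit(X_train, y_train)

def recommend(user_features):
    guessed = gender_guesser.predict([user_features])[0]
    # The ethically loaded step: branching the catalog on a guess.
    catalog = {"m": ["gadgets", "sports"], "f": ["fashion", "beauty"]}
    return catalog[guessed]

print(recommend([4.0, 1.2]))
```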
We are only beginning to understand the importance of these questions of gender and artificial intelligence, and the ways we can answer them. Responsible AI teams tend to focus on data collection to eliminate bias, but they often fall short of tackling these broader implications; we must bring them further into the spotlight.
3. Gendered Issues in the Workplace
The gender pay gap in AI is not particularly bad in data science roles, but it is in data analytics roles (usually lower-paying jobs). Moreover, since the field is new, there is a great disparity in the psychology of the job market. Some people say that everyone in data science has imposter syndrome, but it hits women especially hard. Furthermore, men are likely to apply to a job where they meet 50% of the requirements or more, while women usually only apply if they meet 90% or more.
Other common workplace issues also seep into the AI world, like harassment and awkward, nonsensical power dynamics. While these aren't unique to AI, they haunt the tech field with inequality and all its consequences.
There is also a disparity in the number of patents women file, as well as inadequate maternity-leave policies. If we look at the numbers, AI workplaces are still clearly extremely male-dominated.
Even more shocking is venture capital:
Only 1–2% of the startups that receive VC funding are led by female founders, even though female-led companies make 200% returns on investment. This is the most gender-biased segment in the tech industry. — Pascale Fung
Yes, you read that right: of VC-funded startups (Facebook being one famous example), only ~2% are led by women.
The numbers speak for themselves.
4. Commodifying Real Life Women with AI
You may have heard of DeepNude, a particular instance of deepfakes that takes pictures of a person you choose and returns images of them apparently naked. This is a perverse use of neural networks, one that jarringly deepens Anne Hathaway's take on what the market has done with sexuality: "We live in a culture that commodifies the sexuality of unwilling participants."
Would this have happened if the disparities in AI workplaces were not so extreme? Does AI-based harassment or abuse have a place in the #MeToo movement? I believe it does, and feminism, like every other area of life, still has some catching up to do with technology in order to keep it ethical.