National Review
24 May 2024
Haley Strack


The Corner: AI Is Sexist, UN Women Claims

Artificial intelligence (AI) has a gender bias. So says UN Women, which this week published a report on AI and gender equality criticizing the technology’s tendency to choose “gender stereotypical roles for the characters” and to associate “certain qualities and skills with male or female characters.”

When asked to write a story about a doctor and a nurse, AI made the doctor male and the nurse female. In the United States, female nurses outnumber male nurses about 9.5 to 1, and male doctors outnumber female doctors about 2 to 1. On those base rates alone, a purely statistical guesser would cast the nurse as a woman roughly 90 percent of the time and the doctor as a man roughly two-thirds of the time. A fair generalization, then, for software that generates its answers from data. UN Women explains:

AI explained it was because of the data it had been trained on and specifically, “word embedding” – which means the way certain words are encoded in machine learning to reflect their meaning and association with other words – it’s how machines learn and work with human language. If the AI is trained on data that associates women and men with different and specific skills or interests, it will generate content reflecting that bias.

“Artificial intelligence mirrors the biases that are present in our society and that manifest in AI training data,” said [Beyza Doğuç, an artist from Ankara, Turkey], in a recent interview with UN Women.
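To make the “word embedding” point concrete, here is a minimal toy sketch (mine, not UN Women’s, with invented numbers): a handful of hand-written word vectors in which “doctor” points the same way as “man” and “nurse” the same way as “woman,” the kind of geometry a model can absorb from skewed training text and then reproduce when it casts characters.

```python
# Toy illustration of gendered associations in word embeddings.
# All vectors here are invented for this sketch; real embeddings
# (word2vec, GloVe, etc.) have hundreds of learned dimensions.
import math

embeddings = {
    "man":    [0.9, 0.1, 0.3],
    "woman":  [0.1, 0.9, 0.3],
    "doctor": [0.8, 0.3, 0.6],  # tilted toward "man"
    "nurse":  [0.2, 0.8, 0.6],  # tilted toward "woman"
}

def cosine(a, b):
    """Cosine similarity: closer to 1.0 means more strongly associated."""
    dot = sum(x * y for x, y in zip(a, b))
    norms = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norms

for role in ("doctor", "nurse"):
    for gender in ("man", "woman"):
        print(f"{role} ~ {gender}: {cosine(embeddings[role], embeddings[gender]):.2f}")
```

Run on these made-up vectors, “doctor” scores far higher against “man” (about 0.93) than against “woman” (about 0.53), and “nurse” the reverse; a generator leaning on such associations will tend to write the male doctor and the female nurse, which is exactly the pattern the report flags.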

UN Women then turns to gender disparities in the field of AI itself — a field that is 30 percent women — and to cases in which algorithms understand men better than they understand women. That kind of disparity, which could actually disadvantage women, appears to be a separate issue from the gender stereotyping described above:

Natacha Sangwa is a student from Rwanda who participated in the first coding camp organized under the African Girls Can Code Initiative last year. “I have noticed that [AI] is mostly developed by men and trained on datasets that are primarily based on men,” said Sangwa, who saw first-hand how that impacts women’s experience with the technology. “When women use some AI-powered systems to diagnose illnesses, they often receive inaccurate answers, because the AI is not aware of symptoms that may present differently in women.”

Technological “gender bias” that reflects certain realities (e.g., that there are more female nurses than male ones) might upset those who prefer, say, the term “pregnant person” to “pregnant woman,” but the focus on that definition of AI gender bias seems to distract from what might be a real algorithmic deficiency, especially in health care. Which would gender activists prefer AI developers tackle first: making datasets accurate, or ensuring that data “prioritizes gender equality as a goal” and is equitable and diverse?