Siri, Sophia, Grace, Alexa, Cortana – the voices of Artificial Intelligence platforms are overwhelmingly female-sounding.
Is this an example of how to build Trusted AI?
Big Tech companies tell us that research (sometimes their own) shows “people” prefer female-sounding voices.
You can read more about this research in the links below (along with a responsible-use link policy). This is not an exhaustive list, but a representative selection of current, available research:
But here is a different perspective.
Of the top 20 narrators of Audible audiobooks, 13 are men. Are these the same “people” who defined a preference for female-sounding voices?
This is the insidious nature of bias and prejudice.
On the surface, it seems so innocent, but even a slight scratch of that surface shows how clever marketing and underlying bias come together to perpetuate stereotypes and reinforce pathways of gender inequality, especially in the fields of Information and Emerging Technologies.
If most retail ice cream stores only offered vanilla ice cream, with one or two other options, would we all really prefer vanilla ice cream, or is the choice so limited that we don’t entertain the idea that there are other options?
It’s critical that we scrutinize our data sources in artificial intelligence. When we hear about “algorithms gone wrong,” there needs to be a full stop. The algorithm isn’t wrong, and the algorithm doesn’t have any prejudice; we are programming our own prejudice and bias into it. We previously published a how-to for Ethical Leadership and Data Policies.
AIGB (Artificial Intelligence Gone Bad) can have devastating effects:
- It supports the portrayal of women in subservient roles of assistants, caretakers, and other “helpful” occupations. According to UNESCO’s 2019 report, I’d Blush if I Could, gender biases risk further stigmatizing and marginalizing women on a global scale.
- Artificial Intelligence in sex robots can increase the potential for gender-based violence.
- Lack of diversity in the field(s) of Artificial Intelligence (80% of professors are male) limits the influence that women and under-represented groups can have on the direction Artificial Intelligence takes.
- In crash testing, women are 47% more likely to be seriously injured. Why? Because crash tests are based on men’s physiques and seating positions.
The solution for building Trusted AI is easier than we might think:
- Question the conclusions you read about and the stories you hear. Do we really prefer female voices for our Artificial Intelligence platforms? Or is that just our only option?
- Join us in the discussion. As a member of #WomeninTrustedAI, I’m excited to hear perspectives, talk about problems and build powerful, inclusive solutions.
- Buy products from vendors who do more than talk about gender and racial equality. Do the research: read Big Tech companies’ diversity reports – actually look at them!
- Demand and support research by, and participation from, under-represented groups in the areas of emerging technologies.
Humanity has a long-standing history of turning a blind eye when emerging technologies are waylaid by private industry and government for nefarious purposes.
There is an opportunity, and a demand, to make sure we don’t allow that to happen with artificial intelligence platforms. You can read more about Solutions for the Ethical Use of Technology.