


The story that OpenAI may have “borrowed” the voice of Scarlett Johansson is not an edifying one.
Rachel Metz, writing in Bloomberg:
[Johansson] received an offer from OpenAI Chief Executive Officer Sam Altman in September about voicing an audio feature for ChatGPT. She said she decided not to participate in the project, but was “shocked, angered and in disbelief” after hearing demos OpenAI released last week featuring a voice named Sky — a voice she thought sounded “eerily similar” to hers.
Creepy, if intentional. What’s more, as Metz explains, there are also good business reasons why Johansson might be upset:
For any actor, their voice is essential: It’s a key part of what makes them able to play characters so different from their natural selves. For some actors, such as Johansson, it is also iconic — evocative and instantly recognizable — and that makes it valuable.
To me, Sky sounds more like one of the voices of a character on the “My Little Pony: Friendship Is Magic” cartoon. But this is a matter that will now be left to a small army of highly paid lawyers . . .
Altman has denied the allegations, but there’s another issue (apparently) that this story brings up.
Metz:
Sky follows a long history of tech companies casting female-sounding voices as digital assistants, especially when it comes to voice assistants (see Amazon’s Alexa, Apple’s Siri, and Microsoft’s Cortana for just a few recent examples). Early on, female voices were thought to sound calming and non-threatening while delivering emergency messages for flight systems, according to Mar Hicks, a technology historian and associate professor of data science at the University of Virginia. Over time, Hicks said, an association developed between women’s voices being seen as calm and non-threatening, and their use as our digital servants.
And this idea of seeing women’s voices as a source of comfort in technological settings apparently held true for the overture OpenAI made to Johansson: In Altman’s offer, “He told me that he felt that by my voicing the system, I could bridge the gap between tech companies and creatives and help consumers to feel comfortable with the seismic shift concerning humans and AI,” Johansson said in her statement.
And:
The saga also brings up new questions about how and why women in particular are highlighted in hopes of making people comfortable with technology — questions tech companies must grapple with if they continue to push customers to use ever-more-capable AI assistants.
They must, must they?
Customers can choose male or female voices for both Siri and Alexa, but that, it seems, is not enough.
A few years back, the U.N.’s UNESCO was on the case.
CNN (from 2019, a time when Siri, but not Alexa, offered a male voice option):
Siri, Alexa and other female-voiced AI assistants are perpetuating gender stereotypes and encouraging sexist and abusive language from users, a UN report has said.
The report by UNESCO warns of the negative consequences of the personal assistants, claiming they perpetuate the idea that “women are obliging, docile and eager-to-please helpers, available at the touch of a button or with a blunt voice command.” . . .
“What emerges is an illusion that Siri — an unfeeling, unknowing, and non-human string of computer code — is a heterosexual female, tolerant and occasionally inviting of male sexual advances and even harassment. It projects a digitally encrypted ‘boys will be boys’ attitude.” . . .
Saniye Gülser Corat, UNESCO’s Director for Gender Equality, said much greater attention should be paid “to how, when and whether AI technologies are gendered and, crucially, who is gendering them.”
Not really.
And attention should be paid by whom?
This issue — one that has been dreamt up only by would-be thought-controllers or the rent-seekers that feed off them — is a matter for AI companies and their customers. And only for them.