Startup Launches 'Genderless' AI To Fight Gender Bias In Technology

The creative team behind Vice has launched what it's calling a "genderless" artificial intelligence voice in order, they say, to combat the gender bias that pervades modern technology.

AdWeek reports that Vice's creative team unveiled "Q," the "genderless AI," at the South by Southwest Interactive festival this week in Austin, Texas. The voice itself, which they hope will take the place of female-identified smart assistants like Apple's Siri and Microsoft's Cortana, "is a composite of dozens of voices of people who identify as non-binary."

The team behind the voice recorded around two dozen people, developed four separate voices, and then tested them on 4,500 Europeans to determine which voice was the most "neutral."

The result is a voice that can't readily be identified as fully male or female, that sits between the two accepted ranges of "gendered" voices, and that eliminates speech quirks typically associated with either "male" or "female" voices.

The company contends that a genderless voice was needed because technology companies are "pushing the boundaries of technology, but are doing so with outdated definitions of gender."

“They’ve got such a huge influence on the world because it’s in everyone’s pockets and in everyone’s homes,” Ryan Sherman, the senior creative in charge of the product, told AdWeek. “So our goal is to give people more options within those assistants rather than to create a new one.”

Feminists and social justice warriors in the field of technology have long been concerned about the "sexism" that assistants like Siri and Cortana engender in their users, according to a report in WIRED from last year. Female voices are more appealing to users, and both Apple and Google's marketing teams reportedly found that users are more content to ask a "female" AI for assistance, even if Siri and Cortana both "identify" as genderless.

But those preferences, experts say, are the result of "cultural biases": people generally prefer female assistants (even artificial ones) because they have been inculcated with sexist ideas. That sexism then plays out in the world of technology; companies looking to incorporate AI into their products are more likely to include female voices to appeal to the masses, thereby perpetuating a cycle of misogyny.

In some cases, feminists contend, Alexa and Siri, who don't always respond negatively to overtly sexist questions, train those who use the AI assistants to treat human assistants with the same lack of respect.

"Giving digital assistants human personalities has a long tradition, with things like Clippy the paperclip,” one professor told Huffington Post last year. “They’re meant to help the user form a better relationship with the machine so they can be guided through the system. I think giving them humanlike traits is not necessarily a bad thing, but the gender aspect is particularly insidious.”

To break that habit, Vice's creative team says, you need "nonbinary" options.

“It is because Q is likely to play with our minds that it is important,” one expert told WIRED. “It plays with our urge to put people into boxes and therefore has the potential to push people’s boundaries and broaden their horizons.”
