Ask your phone or computer a question, or call your bank and talk to the automated menu. Whatever you end up asking, a synthesized version of a woman is likely to answer, polite and pleasant no matter the tone or topic.
That is because Siri, Alexa, Cortana and their foremothers have been doing this work for years, ready to answer serious inquiries and deflect ridiculous ones. Though they lack bodies, they embody what we think of when we picture a personal assistant: a competent, efficient and reliable woman.
One might think that using an emotionless AI as a personal assistant would erase concerns about outdated gender stereotypes. Yet companies have repeatedly launched these products with female voices and, in some cases, female names. When we can only see a woman, even an artificial one, in that position, we reinforce a harmful culture.
Consumers still expect a friendly, helpful woman in this scenario, and that is what companies give them. "We tested many voices with our internal beta program and customers before launching and this voice tested best," an Amazon spokesperson told an online source.
Apple's Siri and the Google Assistant do offer the option to switch to a male voice, but Alexa and Cortana do not have male counterparts. According to the psychologist James W. Pennebaker, women use more pronouns and more tentative language, and pronoun use, particularly of the word "I," is indicative of lower social status.
AI assistants are especially prone to using the word "I," particularly when taking responsibility for mistakes. Ask Siri a question she can't process and she says, "I'm not sure I understand."
It is critical that we challenge these stereotypical gender roles in our personal assistants. Our interactions with AI teach and train it, but we are also shaped by these experiences. This is why parents worry about unintentionally raising rude children when Alexa does not require a "please" or "thank you" to carry out a task.
Our relationships with technology are entering a new stage of intimacy, and it is worrying to think of what will happen when some people's primary sexual experience is a sexually acquiescent robot. Humans aim for linguistic style matching in their social interactions, meaning they try to match the language patterns of the human, and now the AI, with which they are speaking.
But as we enter the AI realm, there are serious personal and social consequences for treating it in a degrading manner. The companies behind AI are cashing in on the bias, which is not the way to a utopia, tech or otherwise.