
Of course Siri is sexist – submissive, compliant, a PA and housekeeper rolled into one

This week a UN report confirmed what many of us have long suspected – that virtual assistants being presented as female promotes problematic sexist stereotypes. Specifically, the study looked at Amazon’s Alexa and Apple’s Siri, both of which are marketed and sold as “female”.

Let’s pause here for a minute to state the obvious – machines of course do not have an inherent gender. Siri isn’t any more female than a laptop or a car is. However, the point of artificial intelligence (AI) is to recreate a human mind, to code in the ability to learn as it goes, just like a human would. These companies want you to feel attached to these products, to treat them as though they were closer to human than any other piece of technology. And so they coded in gender.

We can only speculate as to why the three biggest tech companies on the planet (Google Assistant was not part of the study but is also designed to have a female voice) decided to make these AI info-bots sound like women, but I suspect it wasn’t just on a whim. These products exist to be at your beck and call, to provide you with whatever you need, from mood lighting to holiday bookings, 24 hours a day. Their role falls somewhere between a personal assistant, a secretary and a housekeeper, all traditionally associated with women. Also – it’s worth noting – roles which traditionally require a lot of emotional labour and offer low wages in return.

When asked about Cortana, the Windows virtual assistant (which was not part of the study), a spokesperson for Microsoft told PC Mag that they specifically chose a female voice to create a “helpful, supportive, trustworthy assistant”.

If you’ve ever used one of these products, you’ll know they’re not as helpful as their creators would have you believe. Bark a command at Alexa that isn’t in her repertoire and you’ll get some variation on “I’m sorry, I’m not sure I understand”; ask Siri a question that isn’t perfectly formulated and you’ll be amused and frustrated in equal measure when she responds with a bizarre non-sequitur. Compare this with IBM’s Watson, which has been put to work winning Jeopardy and helping to treat cancer – and which speaks with a male voice. This may not be intentional, but neither is it a coincidence.

Women already have to work twice as hard as men in order to be considered half as competent. This is a subconscious bias which affects us all and is to blame for many of the symptoms of gender inequality we see in the western world, not least the lack of women in positions of leadership. We are all affected by this kind of messaging, and unless we are actively recognising and fighting against our biases, we are inevitably responsible for their continuation. This seems to have been completely lost on those designing these virtual assistants, who – unsurprisingly – work in what the researchers describe as “overwhelmingly male engineering teams”.

But the problem isn’t just the normalisation of gender stereotypes in these kinds of subservient roles; it’s also the disturbing responses these assistants give when abused. Tell Siri she’s a “slut” and she’ll coquettishly reply: “I’d blush if I could” – a phrase so problematic as a response to abusive language that the UN used it as the title of its report. Alexa doesn’t fare much better, replying: “Well, thanks for the feedback.”

As the researchers stated, these machines are being programmed to respond in a submissive tone, and to act contrite and compliant in the face of abuse. It’s bad enough that we see this kind of representation of women everywhere from pop culture and advertising to the front pages of newspapers and magazines – now it’s speaking to us directly in our homes.

When this report was published, the response on social media seemed overwhelmingly exasperated. People couldn’t see why this was a problem, and suggested it was a pointless exaggeration of a non-issue. I understand the temptation – after all, these aren’t real women we’re talking about. But real women will be hurt by the sexist norms society continues to endorse – norms that only seem to be growing – particularly when they come from a product we perceive as progressive and impartial. Unfortunately for the more passive among us, when it comes to ingrained misogyny, you’re either actively fighting it or you’re tacitly condoning it. If you don’t see the problem with your virtual assistant flirting with you when you call her a slut, you’re part of the problem.

Like much of the “unintended sexism” we see from huge corporations which should know better – from offensive marketing campaigns to shocking gender pay gaps – this can clearly be traced back to the fact that the people in positions of power are overwhelmingly male.

They are also primarily white, which is why software developed to predict criminal behaviour disproportionately rated black people as high risk. In 2017, research found that when AI teaches itself English, it becomes racist and sexist. Simple tests show that Google Translate will assume a doctor is male, even when no pronoun has been used. Last year Amazon scrapped an AI recruitment tool after it became apparent that it was favouring male candidates.

What’s particularly interesting is that I’m sure the vast majority of the straight white men sitting in San Francisco coding these programs would be horrified to think of themselves as sexist or racist. I’m sure they’re lovely, woke millennials who voted for Hillary and buy tampons for their girlfriends. And that’s the point – subconscious bias exists everywhere and in everyone. Until we start recognising that and building truly diverse teams that account for it, we will never end the cycle of white male supremacy which we seem to be stuck in.