Are AI-based digital assistants reinforcing harmful gender stereotypes?

A new study published by the United Nations Educational, Scientific and Cultural Organization (UNESCO) claims that AI-powered voice assistants are reinforcing harmful gender stereotypes. The report argues that most voice assistants default to a female voice, and that their names, such as Alexa, Siri, and Cortana, are feminine as well.

The paper, titled “I’d blush if I could” after a response Siri gives to certain sexually explicit comments, explores bias in AI development. According to the UN, tech companies fall short when it comes to building proper safeguards against hostile, abusive, and gendered language. In fact, most assistants deflect such ‘inappropriate commands’ with sly jokes of their own.

For example, ask Siri to make you a sandwich and it will reply, “I can’t! I don’t have any condiments.”

The report states that the voices of most digital assistants are female by default, and that their tolerance of verbal flirtation casts women as ‘obedient’ and ‘eager to please’ helpers. It also notes that an assistant has no agency beyond the user’s commands and honors every query regardless of its tone or hostility.

Tech companies are, of course, building AI into their platforms to enhance functionality, and in the future we can expect voice assistants to handle even the smallest everyday tasks.

However, as these machines take on a more substantial role in our daily lives, will the way we talk to them affect how we communicate with other human beings?


According to a Business Insider report published last September, Amazon chose a female-sounding voice for its assistant because its research indicated users would find it more sympathetic. Microsoft, meanwhile, named its assistant Cortana to bank on the existing recognition of the female-identifying AI character in its Halo video game franchise; as a result, Cortana’s voice cannot be changed to a male one, and Microsoft has not said when it might allow that. Siri, for its part, is a popular Scandinavian female name said to mean ‘beautiful victory’ in Old Norse.

Simply put, the decision to give these assistants female voices was made deliberately, and after what appears to have been extensive user research.

Nevertheless, some tech companies are making an effort to move away from their early designs and break the stereotypes. Take Google, for instance: its Assistant now offers both male and female voices in a range of accents. Its eight voice options are identified by color rather than gender, and the company has also launched a feature that rewards good manners such as saying ‘please’ and ‘thank you’.

Amazon introduced a similar politeness feature for Alexa last year.

Yet the UN contends that these gendered defaults reflect an imbalance baked into the AI and tech industries themselves. Its research found that 80 percent of AI academics are men, while women make up only 15 percent of AI researchers at Facebook and 10 percent at Google.

UNESCO suggests that the solution is to develop assistants with voices that are as gender-neutral as possible. Since tech companies have been unable to condition users to treat AI politely, the report argues, it is important that they remake voice assistants as purposefully non-human entities.

