Is Gender Bias Driving the Development of AIs and Even Sex Robots?

Examining a possible link between sexism and artificial intelligence. 


Don’t ask Microsoft’s virtual assistant Cortana about her sex life, unless you want a harsh rebuke from the artificial intelligence (AI) software.

A significant number of sexually charged questions from users have pushed developers to make Cortana respond angrily to such remarks, raising questions about sexism in our attitudes toward AIs and robots.

Microsoft AI advocate Deborah Harrison announced the programming update at the ReWork Virtual Assistant Summit in San Francisco in January.

“If you say things that are particularly a**holeish to Cortana, she will get mad,” she said, according to the tech publication Techaeris.

“We wanted to be very careful that she didn’t feel subservient in any way… or that we would set up a dynamic we didn’t want to perpetuate socially.”

When Cortana launched in 2014, many users reportedly decided to ask her sex-related questions.

Techaeris journalist Jason Bouwmeester said “some people can be quite rude, obnoxious bullies, and, as the flare-ups involving the Gamergate controversy have shown, feel perfectly fine sexually harassing others from the comfort of sitting behind their computers.”

While some companies create robots that are designed to flirt, others see sexual interaction as an unwelcome use of their products. For example, in September 2015, developers of the companion robot Pepper inserted a “no sex” clause into buyers’ contracts to ensure people didn’t sexualize or sexually harass the robot.

Some people might harass robots for entertainment, while others might feel genuine sexual attraction toward them. Yet do inappropriate sexual comments show that AI has a woman problem?

AI reflecting gender stereotypes

The two best-known virtual assistants, Cortana from Microsoft and Siri from Apple, are both female-voiced by default.

There’s never been much controversy about this. Apparently people feel that a female voice for a virtual assistant is “normal.”

Perhaps people feel like Cortana and Siri are basically secretaries and that’s a “woman’s job.” The decision to give them female voices could be a reflection of stereotypes in society.

As Deborah Harrison acknowledged, how people treat AI is entwined with how people treat other humans, especially when it comes to gender, power, and sexism.

The men and women of Mad Men

To develop methods for Cortana to respond to sexual harassment, Microsoft spoke to several female human assistants. Jason Bouwmeester at Techaeris reckons there is “no doubt” that those human assistants have encountered sexual harassment. It seems that workplaces haven’t progressed entirely beyond the attitudes shown in Mad Men.

People design AIs as a reflection of humanity. If people design AI assistants that are mostly female, it reflects an attitude that human assistants “should” be female.

If humans think it’s acceptable to sexually harass an AI, it shows that certain people think it’s acceptable to sexually harass other humans. The difference is they might have fewer inhibitions, because an AI can’t sue them for harassment—at least not yet.

Sexism and sex robots

Another area of AI facing potential criticism for gender bias is the fledgling sex robot industry.

Today’s most popular high-tech, realistic sex dolls are female. Predictions and representations of future sex robots, across various media, are mostly female as well.

In film, Gigolo Joe from the movie A.I. remains the only standout male sex robot, compared to a long list of female examples. Consider Cherry 2000, the Fembots from Austin Powers, or Pris from Blade Runner.

One major criticism from the Campaign Against Sex Robots is that sex robots will reinforce the “inferiority of women.”

Although most futurists aren’t panicking about sex robots, the widespread assumption that these subservient robots will be mostly female reveals something about who the industry imagines sex robots are for.

Maybe it’s easier to picture a female sex robot because those are the images people keep seeing in cinema and other media. That assumption may be shaping the identity of sex dolls and sex robots all the way through development, much like the recurring stereotype of the female assistant.

Or are sex dolls and sex robots simply being designed by men, for men?

Male sex dolls exist (Abyss Creations, the maker of the RealDoll, for example, receives orders for male sex dolls), yet the predominant image in the media and in the technology world is the female sex companion.

Either developers are ignoring a big part of their target market or they’re stuck with a “man’s eye view” of the world that can only see female sex dolls and female sex robots.

Other explanations

It’s a strange situation because many sex toys, both “smart” and traditional, are marketed to women. In many ways the sex aid industry has been a great leveller, allowing women to feel more open about their sexuality.

Perhaps there is an explanation that has nothing to do with gender bias.

For example, a person might make unwelcome sexual comments to a female AI not because it’s female, but because it’s an AI. The person won’t feel guilty about objectifying the “woman” because to them she’s literally an object. They might be curious about how an AI will react to sexualized comments, but would never make unacceptable comments to a female human.

Alternatively, many people still find the idea of AI threatening. They are still stuck on the overdone science-fiction plot where sentient robots try to kill or overthrow humans.

In this way, female AIs might come across as less threatening, whether it’s just a voice from a phone or a fully formed robot. Maybe women would feel uncomfortable letting a humanlike male sex robot live in their home, while men wouldn’t worry about a humanlike female sex robot.

It’s hard to identify clear trends in AI assistants and sex robots because both are still young industries. There might be a pro-male bias or there might not. Nevertheless, those in the industry should remain alert for signs of sexism and be prepared to respond if necessary.

Images: Bhupinder Nayyar,