Most AI assistants are female – and that fuels dangerous stereotypes and abuse

By Ramona Vijeyarasa
Publication Date: 2026-01-26 18:54:00

In 2024, there were more than 8 billion artificial intelligence (AI) voice assistants in use worldwide, more than one for every person on the planet. These assistants are helpful, polite and almost always female.

Their names also carry gender connotations. Apple’s Siri, for example – a Scandinavian female name – means “beautiful woman who leads you to victory.”

When IBM’s Watson for Oncology launched in 2015 to help doctors process medical data, it was given a male voice. The message is clear: women serve and men teach.

This isn’t harmless branding – it’s a design choice that reinforces existing stereotypes about the roles of women and men in society.

Nor is this merely symbolic. These decisions have real consequences: they normalize gender subordination and create conditions for abuse.

The dark side of “friendly” AI

Current research shows the extent of harmful interactions with feminized AI.

A 2025 study found that up to 50% of human-machine…