Siri and friends perpetuate damaging gender stereotypes: hostess, maid, secretary, muse, consort.
Obedient and obliging, the default female AI voice assistant has become a tech industry meme. From Apple’s Siri to Microsoft’s Cortana and Amazon’s Alexa, data science has provided us with the new virtual slave: she is a woman. Identified by name and voice, and sometimes even marketed with a sensuous form, these assistants’ functions and behaviours perpetuate damaging gender stereotypes: hostess, maid, secretary, muse, consort.
Unopposed submission cultivates disrespect. In the face of verbal abuse, AI assistants lack assertion; passive tolerance of misogyny moves the needle no closer to mutual respect, leaving only a thin silicon line between user and abuser. Research conducted by my own Social Policy Group in February 2023 shows that we change how we phrase our prompts depending on whether a male or female voice is selected for the AI assistant.