UNESCO voiced concerns about female assistant voices
Voice assistants with a female voice, such as Siri or Alexa, reinforce gender bias, according to a United Nations study, The Guardian reports.
The research by UN organization Unesco describes how the assistants' submissive and flirtatious answers reinforce the prejudice that women are subservient.
“Because the voices of assistants are often female, it sends a signal that women are compliant and docile, available at the push of a button or with a voice command like ‘hey’ or ‘OK’,” writes Unesco.
The study also calls it worrying that the female voices respond apologetically to verbal sexual harassment.
Unesco calls for digital assistants not to have a female voice by default and for the assistants to push back against sexist remarks. The organization also advises technology companies to develop gender-neutral artificial intelligence.
Voice assistants are built into, among other things, phones and smart speakers. Users can, for example, ask for music or the weather forecast.