AI voice assistants reinforce harmful gender stereotypes, new UN report says


Artificial intelligence-powered voice assistants, many of which default to female-sounding voices, are reinforcing harmful gender stereotypes, according to a new study published by the United Nations.

Titled “I’d blush if I could,” after a response Siri utters when it receives certain sexually explicit commands, the paper examines bias in AI research and product development. It warns of the potential long-term harm of conditioning society, and children in particular, to treat these digital voice assistants as unquestioning helpers that exist solely to serve their owners. The paper was authored by the United Nations Educational, Scientific and Cultural Organization, better known as UNESCO.

The paper argues that by…