AI Programs Are Learning to Exclude Some African-American Voices

Some artificial intelligence (AI) systems are learning to be prejudiced against some dialects. And as language-based AI systems become ever more common, some minorities may automatically be discriminated against by machines, warn researchers studying the issue.

Anyone with a strong or unusual accent may know what it’s like to have trouble being understood by Siri or Alexa. Voice-recognition systems use natural-language technology to parse the contents of speech, and that technology often relies on algorithms trained on example data. If the training data contains too few examples of a particular accent or vernacular, these systems may simply fail to understand you.

The problem may be more widespread and pernicious than most people realize. Natural-language technology now powers automated interactions with customers, through automated phone systems and chatbots. It is used to mine public opinion on the Web and social networks, and to comb through written documents for useful information. This means that services and products built on top of language systems may already be unfairly discriminating against certain groups.
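The failure mode described above can be sketched in a few lines. This is a deliberately simplified illustration, not any real product's pipeline: a toy model whose "vocabulary" is just the set of words seen in a hypothetical training corpus. Text that uses wording absent from that corpus gets low coverage, which is the kind of gap that leads real systems to mis-parse underrepresented dialects.

```python
def build_vocab(training_sentences):
    """Collect the set of lowercase tokens seen during training."""
    vocab = set()
    for sentence in training_sentences:
        vocab.update(sentence.lower().split())
    return vocab

def coverage(sentence, vocab):
    """Fraction of a sentence's tokens the model has seen before."""
    tokens = sentence.lower().split()
    if not tokens:
        return 0.0
    return sum(1 for t in tokens if t in vocab) / len(tokens)

# Hypothetical training corpus drawn from one variety of English only.
training = [
    "please confirm my flight reservation",
    "i would like to check my account balance",
    "the package has not arrived yet",
]
vocab = build_vocab(training)

# Wording that matches the corpus is fully covered ...
print(coverage("please check my account", vocab))
# ... while vernacular wording the corpus never saw is only partly
# covered, so the system is far more likely to misunderstand it.
print(coverage("finna check my balance rn", vocab))
```

Real systems use far richer models than a word set, but the underlying dependence on what the training data happens to contain is the same.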

