What The Kids’ Game “Telephone” Taught Microsoft About Biased AI

By Suzanne LaBarre

Microsoft illustrates unintentional bias using familiar childhood scenarios. The result? A guide to recognizing, and hopefully reducing, exclusion in AI.

Can artificial intelligence be racist? Let's say you're an African-American student at a school that uses facial recognition software. Students use it to enter the building and to access online homework assignments. But the software has a problem: its makers trained its algorithms using only light-skinned test subjects. Your skin is darker, and the software has trouble recognizing you. Sometimes you're late to class, or can't get your assignments on time. Your grades suffer. The result is discrimination based solely on skin color.

Source: fastcodesign