TribLIVE.com | U.S./World

Researchers say Amazon face-detection technology shows bias

The Associated Press
FILE - This Oct. 23, 2018, file photo shows an Amazon logo atop the Amazon Treasure Truck The Park DTLA office complex in downtown Los Angeles.

NEW YORK — Facial-detection technology that Amazon is marketing to law enforcement often misidentifies women, particularly those with darker skin, according to researchers from MIT and the University of Toronto.

Privacy and civil rights advocates have called on Amazon to stop marketing its Rekognition service because of worries about discrimination against minorities. Some Amazon investors have asked the company to stop out of fear that it makes Amazon vulnerable to lawsuits.

The researchers said that in their tests, Amazon’s technology labeled darker-skinned women as men 31 percent of the time. Lighter-skinned women were misidentified 7 percent of the time. Darker-skinned men had a 1 percent error rate, while lighter-skinned men had none.

Artificial intelligence can mimic the biases of its human creators as it makes its way into everyday life. The new study, released late Thursday, warns of the potential for abuse and threats to privacy and civil liberties from facial-detection technology.

Matt Wood, general manager of artificial intelligence with Amazon’s cloud-computing unit, said Amazon has updated its technology since the study and done its own analysis with “zero false positive matches.”