- Amazon’s “Rekognition” tech can search databases of tens of millions of faces
- Error rates of 0.8% for light-skinned men and 34.7% for dark-skinned women
Amazon (AMZN +0.44%) is selling its facial recognition technology to police departments across the United States, raising concerns about whether the e-commerce giant is exacerbating racial profiling and bias.
Why This Matters: Amazon’s “Rekognition” system belongs to a class of technology that, in effect, embodies the idea that “all black people look the same.” Research shows commercial artificial intelligence systems tend to have higher error rates for women and black people: some facial recognition systems misidentify light-skinned men only 0.8% of the time while posting an error rate of 34.7% for dark-skinned women. That disparity could lead to more racial profiling of minorities and immigrants.
Amazon claims the product can quickly scan the information it collects against databases of tens of millions of faces and can track individuals in real time. Rekognition can also detect up to 100 faces in a single crowded image.
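For readers curious what that looks like in practice, the sketch below uses Amazon’s publicly documented Rekognition API through the boto3 Python SDK to detect faces in a crowd photo and search a pre-built face collection; the bucket name, image key, and collection ID are hypothetical placeholders, not anything tied to a real deployment.

```python
# Minimal sketch of calling the public Rekognition API via boto3.
# The S3 bucket, image key, and collection ID are hypothetical placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# Detect faces in a single crowd photo stored in S3
# (Rekognition returns up to 100 detected faces per image).
crowd_image = {"S3Object": {"Bucket": "example-bucket", "Name": "crowd-photo.jpg"}}
detected = rekognition.detect_faces(Image=crowd_image, Attributes=["DEFAULT"])
print(f"Faces detected: {len(detected['FaceDetails'])}")

# Search a pre-built face collection (the kind of database that could hold
# tens of millions of indexed faces) for matches to the largest face in the photo.
matches = rekognition.search_faces_by_image(
    CollectionId="example-collection",
    Image=crowd_image,
    FaceMatchThreshold=80,  # minimum similarity score to count as a match
    MaxFaces=5,             # return at most 5 candidate matches
)
for match in matches["FaceMatches"]:
    face = match["Face"]
    print(f"Matched face {face['FaceId']} with similarity {match['Similarity']:.1f}%")
```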
Police departments across America still struggle to improve relationships with the black community. African Americans are already incarcerated at roughly five times the rate of other races, and black women are incarcerated at twice the rate of their white counterparts. When driving, black people are more likely to be stopped by police officers and face a roughly 20% higher chance of getting a ticket after a stop. Given these vulnerabilities, Rekognition would compound an already existing systemic double-whammy.
What’s Next: The ACLU has called out Amazon, stating the tool poses a threat to civil liberties. Meanwhile, Democratic lawmakers are calling for safeguards so police departments using Rekognition do not abuse their power.
CBx Vibe: “The Games We Play” Pusha T