Algorithmic Injustice? Racial Bias and Facial Recognition
Research shows that facial recognition software often misidentifies people of color at a much higher rate than white individuals.
Facial recognition technology can help police officers identify—and ultimately charge—suspects caught on camera. But critics argue that the technology is discriminatory. Now, Detroit, Michigan, is facing lawsuits over the false arrests of two Black men, both misidentified by facial recognition software.
This video was produced in collaboration with NOVA and is made possible in part by the Corporation for Public Broadcasting, a private corporation funded by the American people.