Inequitable Algorithms: Facial Recognition’s Alarming Pattern of Misidentifying Black Individuals Sparks Calls for Reform

By Stacy M. Brown, NNPA Website

Recent research published by Scientific American online supported fears that facial recognition technology (FRT) can worsen racial inequities in policing. The research found that law enforcement agencies that use automated facial recognition disproportionately arrest Black people.

The report’s authors stated that they believe these results stem from factors that include the “lack of Black faces in the algorithms’ training data sets, a belief that these programs are infallible and a tendency of officers’ own biases to magnify these issues.”

FRT was again cast in a negative light with the arrest of a 61-year-old grandfather who is now suing Sunglass Hut’s parent company after the store’s facial recognition technology mistakenly identified him as a robber. Harvey Eugene Murphy Jr. was subsequently held in jail, where, according to his lawsuit, he was sexually assaulted.

The robbery occurred at a Sunglass Hut store in Houston, Texas, when two gun-wielding bandits stole thousands of dollars in cash and merchandise. Houston police identified Murphy as a suspect even though he lived in California. They arrested him when he returned to Texas to renew his driver’s license. His lawsuit claims that, while in jail, he was sexually assaulted by three men in a bathroom, leaving him with lifelong injuries.

The Harris County District Attorney’s office later determined that Murphy was not involved in the robbery, but by then the damage had already been done, his lawyers said in a news release.

“This is precisely the kind of situation we’ve been warning about for years; that these systems, whatever their theoretical reliability, are in practice so finicky, and so consequential, that they cannot be fixed,” Os Keyes, an Ada Lovelace Fellow and Ph.D. candidate at the University of Washington, told Vice News.

“The only thing I’d push back on is Murphy’s lawyer’s claim that it could happen to anyone; these systems are attractive precisely because they promise to automate and speed up ‘business as usual,’ which includes laundering existing police biases against people who are already in the system, minority groups, and anyone else who doesn’t fit. This outcome is as inevitable as it is horrifying and should be taken as a sign to restrict and reconfigure policing in general as well as FRT in particular.”

Scientific American researchers noted that the algorithms used by law enforcement “are typically developed by companies like Amazon, Clearview AI and Microsoft, which build their systems for different environments.” They argued that, despite massive improvements in deep-learning techniques, federal testing shows that most facial recognition algorithms perform poorly at identifying people other than white men.

In 2023, the Federal Trade Commission prohibited Rite Aid from using FRT after the company wrongly accused individuals of shoplifting. CBS News noted that, in one incident, an 11-year-old girl was stopped and searched by a Rite Aid employee based on a false match. Also last year, the Detroit Police Department was sued by a woman its technology misidentified as a carjacking suspect. Porcha Woodruff, who was eight months pregnant at the time, was jailed after police incorrectly identified her using FRT.

The FTC acknowledged that people of color are often misidentified by FRT. “Disproportionate representation of white males in training images produces skewed algorithms because Black people are overrepresented in mugshot databases and other image repositories commonly used by law enforcement,” Scientific American researchers determined. “Consequently, AI is more likely to mark Black faces as criminal, leading to the targeting and arresting of innocent Black people.

“We believe that the companies that make these products need to take staff and image diversity into account. However, this does not remove law enforcement’s responsibility. Police forces must critically examine their methods if we want to keep this technology from worsening racial disparities and leading to rights violations.”