

Calls to abandon facial recognition use as wrongly identified Black man seeks Detroit police apology

AFP
Published : Jun 24, 2020, 6:44 pm IST
Updated : Jun 24, 2020, 6:44 pm IST

Detroit police have been challenged over the flaws and bias of facial recognition technology, which is especially prone to errors when identifying people of colour.

A Black man who says he was unjustly arrested because facial recognition technology mistakenly identified him as a suspected shoplifter is calling for a public apology from Detroit police, and for the department to abandon its use of the controversial technology. Several cities, led by San Francisco last year, have banned the use of facial recognition. (Photo | bigbrotherwatch.org.uk)

A Black man who says he was unjustly arrested because facial recognition technology mistakenly identified him as a suspected shoplifter is calling for a public apology from Detroit police, and for the department to abandon its use of the controversial technology.

The complaint by Robert Williams is a rare challenge from someone who not only experienced an erroneous face recognition hit, but was able to discover that it was responsible for his subsequent legal troubles.

The Wednesday complaint filed on Williams’ behalf alleges that his Michigan driver’s license photo, kept in a statewide image repository, was incorrectly flagged as a likely match to a shoplifting suspect. Investigators had scanned grainy surveillance camera footage of an alleged 2018 theft inside a Shinola watch store in midtown Detroit, police records show.

That led to what Williams describes as a humiliating January arrest in front of his wife and young daughters on their front lawn in the Detroit suburb of Farmington Hills.

“I can’t really even put it into words,” Williams said in a video announcement describing the daytime arrest that left his daughters weeping. “It was one of the most shocking things that I ever had happen to me.”

The 42-year-old automotive worker, backed by the American Civil Liberties Union, is demanding a public apology, final dismissal of his case and an end to Detroit police’s use of facial recognition technology. Several studies have shown that current face-recognition systems are more likely to err when identifying people with darker skin.

The ACLU complaint said Detroit police “unthinkingly relied on flawed and racist facial recognition technology without taking reasonable measures to verify the information being provided.” It called the resulting investigation “shoddy and incomplete,” the officers involved “rude and threatening,” and said the department has dragged its feet responding to public-information requests for relevant records.

Detroit police and Wayne County prosecutors didn’t immediately return emailed requests for comment Wednesday.

DataWorks Plus, a South Carolina company that provides facial recognition technology to Detroit and the Michigan State Police, also couldn’t immediately be reached for comment.

Police records show the case began in October 2018 when five expensive watches went missing from the flagship store of Detroit-based luxury watchmaker Shinola. A loss-prevention worker later reviewed the video footage showing the suspect to be a Black man wearing a St. Louis Cardinals baseball cap.

“Video and stills were sent to Crime Intel for facial recognition,” says a brief police report. “Facial Recognition came back with a hit” for Williams.

At the top of the facial recognition report, produced by Michigan State Police, was a warning in bold, capitalized letters that the computer’s finding should be treated as an investigative lead, not as probable cause for arrest.

But Detroit detectives then showed a 6-photo lineup that included Williams to the loss-prevention worker, who positively identified Williams, according to the report. It took months for police to issue an arrest warrant and several more before they called Williams at work and asked him to come to the police department. It’s not clear why.

Williams said he thought it was a prank call. But they showed up soon after at his house, took him away in handcuffs and detained him overnight. It was during his interrogation the next day that it became clear to him that he was improperly identified by facial recognition software.

“The investigating officer looked confused, told Mr. Williams that the computer said it was him but then acknowledged that ‘the computer must have gotten it wrong,’” the ACLU complaint says.

Prosecutors later dismissed the case, but without prejudice, meaning they could potentially pursue it again.

The case is likely to fuel the movement in Detroit and around the U.S. protesting police brutality and racial injustice following the death of George Floyd at the hands of police in Minneapolis. Detroit activists have presented reforms to the city’s mayor and police chief that include defunding the police department and ending its use of facial recognition.

Providers of police facial recognition systems often point to research showing they can be accurate when used properly under ideal conditions. A review of the industry’s leading facial recognition algorithms by the National Institute of Standards and Technology found they were more than 99% accurate when matching high-quality head shots to a database of other frontal poses.

But trying to identify a face from a video feed, especially using the ceiling-mounted cameras commonly found in stores, can cause accuracy rates to plunge. Studies have also shown that face recognition systems don’t perform equally across race, gender and age, working best on white men and with potentially harmful consequences for others.

Concerns about bias and growing scrutiny of policing practices following Floyd’s death led tech giants IBM, Amazon and Microsoft to announce earlier this month they would stop selling face recognition software to police, at least until Congress can establish guidelines for its use. Several cities, led by San Francisco last year, have banned use of facial recognition by municipal agencies.

Tags: facial recognition, ibm, microsoft, amazon, us police