Fri. Jan 31st, 2025

Michigan resident Robert Williams was arrested for a crime he didn’t commit because a facial recognition system incorrectly suggested that he was the suspect seen in security camera footage. (Courtesy of the ACLU)

In January 2020, Farmington Hills, Michigan, resident Robert Williams spent 30 hours in police custody after an algorithm listed him as a potential match for a suspect in a robbery committed a year and a half earlier.

The city’s police department had sent images from security footage of the robbery at a Detroit watch store to Michigan State Police to run through its facial recognition technology. The technology flagged an expired driver’s license photo of Williams in the state police database as a possible match.

But Williams wasn’t anywhere near the store on the day of the robbery.

Williams’ case, the subject of a since-settled lawsuit filed in 2021 by the American Civil Liberties Union and Michigan Law School’s Civil Rights Litigation Initiative, was the first publicly known case of wrongful arrest caused by misuse of facial recognition technology (FRT) in policing.

But the case does not stand alone. Several more documented cases of false arrests due to FRT have come out of Detroit in the years following Williams’ arrest, and across the country, at least seven people have been falsely arrested after police found a potential match in the depths of FRT databases.

Williams’ lawsuit became the catalyst for changes to the way the Detroit Police Department may use the technology, and other wrongful-arrest suits are being cited in proposed legislation governing it. Though it can be hard to legislate technology that gains popularity quickly, privacy advocates say unfettered use is a danger to everyone.

“When police rely on it, rely on them, people’s lives can be turned upside down,” said Nate Wessler, one of the deputy directors of the Speech, Privacy, and Technology Project at the national ACLU.

How are police using FRT?

Facial recognition technology has become pervasive in Americans’ lives, and can be used for small, personal tasks like unlocking a phone, or in larger endeavors, like moving thousands of people through airport security checks.

The technology is built to assess a photo, often called a probe image, against a database of public photos. It uses biometric data like eye scans, facial geometry, or distance between features to assess potential matches. FRT software converts the data into a unique string of numbers, called a faceprint, and will present a set of ranked potential matches from its database of images.
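In rough terms, the matching step amounts to comparing numerical vectors. The sketch below is a simplified illustration, not any vendor’s actual system: it assumes faceprints have already been extracted as fixed-length vectors (made-up 128-dimensional arrays stand in here) and simply ranks database entries by cosine similarity to a probe faceprint.

```python
import numpy as np

def rank_matches(probe, database, top_k=5):
    """Rank stored faceprints by cosine similarity to a probe faceprint."""
    def cosine(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    scores = [(name, cosine(probe, vec)) for name, vec in database.items()]
    scores.sort(key=lambda pair: pair[1], reverse=True)
    return scores[:top_k]

# Hypothetical data: random vectors stand in for the faceprints a real system
# would extract from photos with a trained neural network.
rng = np.random.default_rng(seed=0)
database = {f"license_photo_{i}": rng.normal(size=128) for i in range(1_000)}
probe = rng.normal(size=128)
print(rank_matches(probe, database, top_k=3))
```

A real deployment extracts the vectors with a trained model and searches billions of entries with approximate indexes, but the output is the same kind of ranked candidate list described above: similarity scores, not identifications.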

When police use these systems, they are often uploading images from a security camera or body-worn camera. Clearview AI, a prominent facial recognition company that often contracts with police and has developed a version of its product specifically for investigations, says it hosts more than 50 billion facial images drawn from public websites, including social media, mugshots, and driver’s license photos.

Katie Kinsey, chief of staff and tech policy counsel for the Policing Project, an organization focused on police accountability, said that she’s almost certain that if you’re an adult in the U.S., your photo is included in Clearview’s database, and is scanned when police are looking for FRT matches.

“You’d have to have no presence on the internet to not be in that database,” she said.

Federal law enforcement agencies have used FRT for as long as the technology has existed, more than two decades, Kinsey said, but local police departments only began adopting it in the last 10 years.

Usually, police are using it in the aftermath of a crime, but civil liberties and privacy concerns come from the idea that the technology could be used to scan faces in real time, with geolocation data attached, she said. Kinsey, who often meets with law enforcement officers to develop best practices and legislative suggestions, said she believes police forces are wary of real-time uses.

Boston Police attempted to use it while searching for the suspects in the 2013 Boston Marathon bombing, for example, but grainy imaging hindered the technology in identifying the culprits, Kinsey said.

Wrongful arrests

FRT’s role in wrongful arrest cases usually comes from instances where police have no leads on a crime other than an image captured by security cameras, said Margaret Kovera, a professor of psychology at the John Jay College of Criminal Justice and an eyewitness identification expert.

Before the technology was available, police needed investigative leads to pin down suspects: physical evidence like a fingerprint, perhaps, or an eyewitness statement. But with access to security cameras and facial recognition technology, police can quickly generate several possible suspects that the software scores as likely matches.

With millions of faces in a database, the pool of potential suspects feels endless. Because the technology finds matches that look so similar to the photo provided, someone choosing a suspect in a photo array can easily make a wrong identification, Kovera said. Without further investigation and traditional police work to connect the match chosen by the technology to a crime scene, the match is useless.

“You’re going to up the number of innocent people who are appearing as suspects and you’re going to decrease the number of guilty people,” Kovera said. “And just that act alone is going to mess up the ratio of positive identifications in terms of how many of them are correct and how many of them are mistaken.”
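The arithmetic behind that concern is straightforward. The back-of-the-envelope sketch below uses entirely assumed numbers (a 10-million-face database and a 0.01% false-match rate are illustrative figures, not drawn from any study cited here) to show how a search over millions of faces can surface far more innocent look-alikes than guilty people.

```python
# Back-of-the-envelope illustration with assumed, not sourced, numbers.
database_size = 10_000_000    # faceprints searched (assumed)
false_match_rate = 0.0001     # share of non-matching faces flagged as hits (assumed)
true_matches = 1              # at most one actual culprit can be in the database

expected_false_matches = (database_size - true_matches) * false_match_rate
print(f"Expected innocent candidates: {expected_false_matches:.0f}")
# Roughly 1,000 innocent look-alikes compete with at most one guilty person,
# so a high-ranked match says little by itself without corroborating evidence.
```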

In the seven known cases of wrongful arrest following FRT matches, police failed to conduct sufficient follow-up investigation that could have prevented the arrests. One man in Louisiana spent a week in jail despite being 40 pounds lighter than the thief allegedly seen in surveillance footage. A woman in Detroit who was eight months pregnant was held in custody for 11 hours after being wrongfully arrested for carjacking, despite no mention of the carjacker appearing pregnant.

When Williams was arrested in January 2020, he was the ninth-best match for the person in the security footage, Michael King, a research scientist with the Florida Institute of Technology’s Harris Institute for Assured Information, testified in the ACLU’s lawsuit. Detectives did not investigate his whereabouts before making the arrest.

Detroit police used the expired license photo in a photo array presented to a loss-prevention contractor who had not been present at the scene of the crime. The contractor picked Williams as the best match to the security footage. Without further investigation of Williams’ whereabouts at the time of the October 2018 robbery, Detroit police arrested him and held him in custody for 30 hours.

The lawsuit says Williams was informed only after several rounds of questioning that he had been arrested because of a facial recognition match. As part of the settlement, which Williams reached in the summer of 2024, the Detroit Police Department had to change the way it uses facial recognition technology. The city now operates under some of the strictest rules on police use of the technology in the country, which is regulated on a state-by-state basis.

Police can no longer go straight from facial recognition technology results to a witness identification procedure, and they cannot apply for an arrest warrant based solely on the results of a facial recognition technology database, Wessler said. Because there can be errors or biases in the technology, and by its users, guardrails are important to protect against false arrests, he said.

Emerging laws

At the start of 2025, 15 states – Washington, Oregon, Montana, Utah, Colorado, Minnesota, Illinois, Alabama, Virginia, Maryland, New Jersey, Massachusetts, New Hampshire, Vermont, and Maine – had some legislation around facial recognition in policing. Some states, like Montana and Utah, require a warrant for police to use facial recognition, while others, like New Jersey, say that defendants must be notified of its use in investigations.

At least seven more states are considering laws to clarify how and when the technology can be used – lawmakers in Georgia, Hawaii, Kentucky, Massachusetts, Minnesota, New Hampshire, and West Virginia have introduced legislation.

Like all AI technologies, facial recognition can carry baked-in bias or produce flawed responses. FRT has historically performed worse on Black faces than on white faces, and has shown gender disparities as well. AI models are trained to improve over time, but people seem to think that simply by involving humans in the process, all the problems will be caught, Wessler said.

But humans actually tend to have something called “automation bias,” Wessler said – “this hardwired tendency of people to believe a computer output’s right as many times as you tell somebody the algorithm might get it wrong.”

So when police are relying on facial recognition technology as their primary investigative tool, instead of following older law enforcement practices, it’s “particularly insidious” when it goes wrong, Wessler said.

“I often say that this is a technology that is both dangerous when it works and dangerous when it doesn’t work,” Wessler said.

Kinsey said that in her work with the Policing Project, she has found bipartisan support for placing guardrails on police use of the technology. Over multiple meetings with privacy advocates, police forces, lawmakers and academics, the Policing Project developed a legislative checklist.

It outlines how police departments could use the technology with transparency, testing and standards requirements, officer training, procedural limits and disclosure to those accused of crimes. It also says legislation should require vendors to disclose documentation about their FRT systems, and should provide mechanisms for addressing violations.

The Policing Project also makes similar recommendations for congressional consideration, and while Kinsey said she believes federal guidelines are important, federal legislation is unlikely to pass any time soon. In the meantime, states will likely continue to influence one another; recent laws in Maryland and Virginia are examples of a broad approach to regulating FRT across different areas.

Kinsey said that in her meetings with police, they assert that these technologies are essential to solving crime. She said she believes there is a place for FRT, and for other police technologies like license plate readers and security cameras, but that unfettered use can do a lot of harm.

“We think some of them can absolutely provide benefits for solving crime, protecting victims,” Kinsey said. “But using those tools, using them according to rules that are public, transparent, and have accountability, are not mutually exclusive goals. They can actually happen in concert.”