How Flawed AI Facial Recognition Led to Wrongful Arrest
One day in April, Trevis Williams was unexpectedly stopped by subway police in Brooklyn and taken into custody, completely unaware of the reason for his arrest.
He would spend the next two days in jail, accused of exposing himself to a woman in a Manhattan building—a location 19 kilometers from where he actually was. The physical discrepancies were stark: Williams stands at 1.88 meters and weighs 104 kilograms, while the victim described the suspect as being around 1.68 meters tall and weighing only 73 kilograms.
The single piece of evidence connecting Williams to the crime was a faulty AI facial recognition match derived from a grainy surveillance video.
Credit: Natalie Keyssa/New York Times
The Genesis of a Mistake: A Faulty AI Match
The New York Police Department (NYPD) has used facial recognition technology since 2011 and has spent billions of dollars on surveillance tools. In this case, investigators followed a common but problematic procedure: they fed a blurry still from the CCTV footage into their system, which generated six potential matches—all of them Black men with dreadlocks and facial hair.
Because Williams had a prior misdemeanor arrest, his mug shot was in the system. An examiner flagged his photo as a “possible match,” even though the report explicitly warned this was “not probable cause to arrest.” Despite this, detectives included his photo in a lineup, and the victim identified him. A detective noted she was “confident it is him.”
This identification was enough for police to make an arrest, yet detectives conducted no basic due diligence, such as checking his phone records or verifying his alibi with his employer. When Williams insisted the person in the photo wasn't him, a detective dismissively replied, “Of course you’re going to say that wasn’t you.”
A Case of Mistaken Identity
The incident occurred on February 10, when a delivery worker exposed himself to a woman in her Manhattan apartment building. However, at that exact time, cell phone tower records confirmed Trevis Williams was in Marine Park, Brooklyn, driving home from his job in Connecticut where he works with autistic adults.
Despite his confirmed alibi, he was jailed for over two days. Prosecutors eventually dropped the charges in July, but the ordeal had already taken its toll. “In the blink of an eye, your whole life could change,” Williams reflected.
A Troubling Pattern of Wrongful Arrests
Trevis Williams's experience is not an isolated incident. Across the United States, at least 10 individuals, predominantly people of color, have been wrongfully arrested based on flawed facial recognition matches. In Detroit, three Black men were falsely identified, with one man spending over a month in jail for an attempted murder he didn't commit.
Civil rights organizations like the ACLU have repeatedly warned about these dangers. Nathan Wessler of the ACLU told the New York Times that this pattern is a primary danger of the technology. A 2023 study from the National Institute of Standards and Technology (NIST) found that while AI can be highly accurate with clear, controlled photos, its error rate soars when dealing with blurry, poorly lit, or angled images common in real-world surveillance.
The Perils of Unchecked Technology
Unlike other cities that require corroborating evidence before using an AI match in a lineup, the NYPD has no such safeguard. The department also fails to track how often its facial recognition tool leads to wrongful arrests. An NYPD spokesperson claimed that arrests are never made “solely using facial recognition technology,” but Williams's lawyer, Diane Akerman, argues that traditional police work could have easily exonerated her client.
The Legal Aid Society has since called for an investigation into the NYPD's practices, suggesting these known cases are just the “tip of the iceberg.” The group also alleges the NYPD bypasses its own policies by having other agencies, like the Fire Department, run scans using controversial software like Clearview AI.
The Human Cost of a Digital Error
Before his arrest, Williams was in the process of becoming a correctional officer at Rikers Island, a process that has now stalled. The arrest has left him with lasting anxiety and panic attacks. “I hope people don’t have to sit in jail or prison for things that they didn’t do,” he shared with ABC7.
The case against Williams has been closed, with no other suspects charged. His story is a stark reminder that when powerful technology is used without proper oversight and basic investigative guardrails, it can create new victims instead of catching criminals.