Angela Lipps spent six months in jail because a piece of software decided she looked like a bank fraud suspect who lived 1,200 miles away. Not a hunch from a detective. Not a witness ID. An algorithm. Six months of her life, gone, because the tech was wrong and nobody stopped to ask whether it could be.
I cover consumer tech, so I know what it feels like when a product ships before it is ready. You get a buggy app, a glitchy speaker, a phone that drops calls. Annoying. You leave a 2-star review and move on. But when the buggy product is a facial recognition system and the "user error" is a wrongful arrest, the stakes are a little different from my Bluetooth headphones cutting out on the subway.
The Error Rate Is Not a Footnote
MIT's Gender Shades study found commercial facial analysis systems erred on light-skinned men at a rate of 0.8%. For darker-skinned women, that number was 34.7%. A 2019 NIST evaluation found false positive rates for African American and Asian faces were 10 to 100 times higher than for white faces. These are not edge cases. They are the product working as designed, on the data it was trained on.
Robert Williams reached a wrongful arrest settlement with Detroit in June 2024, the first of its kind tied to facial recognition misidentification. LaDonna Crutchfield was wrongfully arrested for attempted murder in January 2024. Nearly every documented wrongful arrest from this technology involves a Black person. That is not a coincidence. Over-policed communities generate more data, which feeds higher risk scores, which generates more policing. The feedback loop is the feature.
Retired Riverside Police Chief Tom Weitzel calls facial recognition "one of the most important investigative tools to come along in policing in 50 years." I get it. Chicago PD used it to help identify a suspect in the March 2026 killing of Loyola freshman Sheridan Gorman. Real crimes, real victims, real pressure on departments to solve them fast. That is a fair point. But a tool that is nearly flawless on some faces and catastrophically unreliable on others is not a good tool. It is a liability with a marketing budget.
Illinois Is Doing the Thing Nobody Else Will
Illinois House Bill 5521, the Biometric Surveillance Act, would ban police use of facial recognition and other biometric systems, with narrow exceptions for arrest fingerprints and crime scene forensics. Democratic Reps. Kelly Cassidy, Lilian Jiménez, and Kevin Olickal introduced it this month. Republican Rep. Patrick Sheehan says it would send law enforcement "back to the Stone Age." The Stone Age, apparently, is a place where innocent people are not jailed for six months because an algorithm had a bad day.
The ACLU of Illinois put it plainly: Illinois already regulates companies collecting biometric data, but law enforcement operates in the shadows with no equivalent rules. That gap is not an oversight. It is a choice.
Meanwhile, Kodex's latest threat intelligence report flags criminals using AI to generate fake police credentials for fraudulent data requests, sold on dark web marketplaces. So the same technology that wrongfully jails innocent people is also being weaponized against the agencies using it. Both sides of that equation are bad.
Fifteen states restrict law enforcement use of facial recognition. The EU's AI Act bans real-time biometric surveillance by law enforcement in public spaces, with only narrow exceptions. The U.S. has no federal framework. Congress should pass one, and it should look a lot more like Illinois HB 5521 than the current nothing. Angela Lipps already paid the price for our inaction. She should be the last one.