AI Facial Recognition Put an Innocent Grandmother in Jail for Six Months — And Nobody Even Called Her First

Angela Lipps has never been on an airplane. She has spent nearly her entire life in north-central Tennessee, raising three kids and spoiling five grandchildren. The farthest she has ever traveled is to neighboring states. She is 50 years old and, until last summer, her biggest run-in with law enforcement was probably a parking ticket.

Then facial recognition software said she was a bank fraudster in North Dakota — a state 1,200 miles away, a state she has never visited — and her life fell apart.

I have been covering AI for a while now, and this story stopped me cold. Not because AI made an error (that happens). Not because police relied on it too heavily (that also happens). But because of what happened after the error. Or rather, what did not happen.

Nobody called Angela Lipps. For five months. Nobody picked up a phone.

How a 50-Year-Old Grandmother Became a Bank Fraud Suspect

In April and May 2025, Fargo police were investigating a string of bank fraud cases. Surveillance cameras had captured a woman using a fake U.S. Army military ID to withdraw tens of thousands of dollars from local banks. Standard investigation, high-priority case, organized crime suspected.

The detective working the case ran the surveillance footage through facial recognition software. The software spit out a name: Angela Lipps.

According to court documents, the detective then looked at Lipps' social media accounts and Tennessee driver's license photo. He concluded that she "appeared to be the suspect based on facial features, body type and hairstyle and color." He filed charging documents — four counts of unauthorized use of personal identifying information and four counts of theft.

On July 14, a team of U.S. Marshals arrested Angela Lipps at her home in Tennessee. At gunpoint. While she was babysitting four young children.

"It was so scary, I can still see it in my head, over and over again," Lipps told WDAY News.

108 Days in a Cell With No Bail

Because Lipps was flagged as a fugitive from North Dakota, she was held without bail in a Tennessee jail. She was given a court-appointed lawyer — but only for the extradition process. To actually fight the charges, she would need to be transported to North Dakota.

She sat in that Tennessee cell for 108 days before North Dakota officers came to pick her up. Three and a half months. Let that sink in.

My friend Karen, who works as a paralegal in Memphis, nearly dropped her coffee when I told her this part. "Wait — they charged her based on facial recognition, flew marshals to arrest her, and then just... left her sitting in jail for over three months? Without even interviewing her?"

Yes. That is exactly what happened.

On October 30, officers finally transported Lipps to North Dakota. She appeared in court the next day. Her North Dakota lawyer, Jay Greenwood, immediately did what you might expect a competent defense attorney to do: he asked for her bank records.

The Evidence That Should Have Taken Five Minutes

Angela Lipps' bank records told a simple, devastating story. At the exact times Fargo police claimed she was in North Dakota committing bank fraud, she was 1,200 miles away in Tennessee. Depositing Social Security checks. Buying cigarettes at a gas station. Ordering pizza. Paying for Uber Eats through Cash App.

This was not ambiguous. This was not a gray area. This was a woman generating a digital paper trail in Tennessee at the precise moments someone else was committing fraud in North Dakota.

"If the only thing you have is facial recognition, I might want to dig a little deeper," Greenwood told WDAY News. The understatement of the decade.

On December 19 — after Lipps had been in custody for more than five months — Fargo police finally interviewed her for the first time. The first time. Five months, and nobody had bothered to ask her a single question.

Five days later, on Christmas Eve, the case was dismissed. She was released from jail.

Stranded in Fargo on Christmas

Here is the part that makes me genuinely angry, and I am not someone who writes angry articles. Angela Lipps was released from jail in Fargo, North Dakota, on December 24th. She had her summer clothes — the ones she was wearing when she was arrested in July in Tennessee. No coat. No money. No way to get home.

Fargo police did not cover her expenses to get home. Local defense attorneys pooled money to pay for a hotel room and food on Christmas Eve and Christmas Day. The day after Christmas, a nonprofit called F5 Project drove her to Chicago so she could make her way back to Tennessee.

She lost her home while she was in jail. She lost her car. She lost her dog.

"I am just glad it is over," Lipps said. "I will never go back to North Dakota."

No one from the Fargo Police Department has apologized.

The Bigger Problem With Facial Recognition in Policing

I talked to a colleague who specializes in computer vision systems. I will call him Dr. Patel, which is not his real name; he consults with law enforcement agencies and asked not to be identified because he does not want to be "that guy." His take was blunt: "Facial recognition gives you a lead, not a match. It should never be the sole basis for an arrest. That is literally what every vendor says in their documentation."

And yet. Case after case shows law enforcement treating algorithmic output as gospel. The NIST Face Recognition Vendor Test has documented significant accuracy disparities across demographics for years. False positive rates for Black women are up to 34 times higher than for white men in some algorithms. The Government Accountability Office found in a 2022 report that 13 federal agencies using facial recognition did not fully assess the systems' accuracy before deployment.
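To make those disparity figures concrete, here is a minimal sketch, in Python with entirely made-up numbers (not NIST's), of how a per-group false positive rate and the "X times higher" ratio are computed:

```python
# Made-up numbers for illustration only; not NIST or vendor data.
# For each demographic group: how many comparisons against NON-matching
# identities were run, and how many the algorithm wrongly called a match.
non_match_trials = {"group_a": 100_000, "group_b": 100_000}
false_matches = {"group_a": 20, "group_b": 680}

# False positive rate = wrong matches / non-matching comparisons.
fpr = {g: false_matches[g] / non_match_trials[g] for g in non_match_trials}
print(fpr)  # {'group_a': 0.0002, 'group_b': 0.0068}

# The "X times higher" figure is just the ratio between two groups' rates.
disparity = fpr["group_b"] / fpr["group_a"]
print(f"Group B is falsely matched {disparity:.0f}x as often as Group A")  # 34x
```

The point is not the specific numbers; it is that both rates can look tiny in isolation while the gap between them is enormous.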

A 2024 study from Georgetown Law's Center on Privacy and Technology identified at least seven cases of wrongful arrest linked to facial recognition errors between 2019 and 2024 — all involving Black individuals. The ACLU has documented a separate set of cases, estimating the actual number is much higher because many are never publicly reported.

The "Human Review" Fiction

Proponents always point to the "human in the loop" — the detective who reviews the match before proceeding. In Lipps' case, the detective did review the match. He looked at social media photos and a driver's license. He decided the facial features, body type, and hairstyle matched.

Here is the problem with that, as Dr. Patel explained it to me: "Confirmation bias is real. Once the computer says 'this is the person,' the human reviewer is not doing an independent assessment. They are looking for reasons to agree. That is basic psychology."

The detective did not call Lipps. Did not check her travel records. Did not pull her bank statements. Did not do any of the basic investigative steps that would have taken, generously, one afternoon. Instead, he filed charges and sent U.S. Marshals to another state.

What Makes This Different From Previous Cases

Robert Williams. Nijeer Parks. Porcha Woodruff — arrested while eight months pregnant. Randal Reid — arrested at a Thanksgiving dinner. These are all documented cases of false arrests driven by facial recognition. Lipps' case adds a new dimension: the extraordinary length of detention and the complete failure of any follow-up investigation.

Previous cases involved days or weeks of wrongful detention. Lipps spent nearly six months in jail. Previous cases were eventually resolved through alibi evidence or DNA. Lipps' case was resolved by bank records that anyone could have obtained on day one.

I keep coming back to that: bank records. Not exotic evidence. Not a dramatic courtroom revelation. Basic financial records that prove where a person physically was. Available with a subpoena. Any detective could have requested them before filing charges.

The Regulatory Landscape Is a Mess

As of March 2026, there is no federal law governing police use of facial recognition in the United States. Some cities — San Francisco, Oakland, Portland, Boston — have passed local bans. A few states have partial restrictions. But most jurisdictions, including North Dakota, have no specific rules.

The European Union's AI Act, which took partial effect in 2025, prohibits real-time facial recognition in public spaces for law enforcement except in specific high-threat scenarios. But even the EU framework has significant carve-outs and enforcement questions.

Senator Ed Markey and Representative Pramila Jayapal reintroduced the Facial Recognition and Biometric Technology Moratorium Act in 2025. It has been stuck in committee ever since. The pattern is familiar: outrage after each wrongful arrest, Congressional hearings, proposed legislation, and then... nothing.

"We are running an uncontrolled experiment on people's lives," is how my friend Sandra at the EFF described it. "And the test subjects do not know they are in the experiment."

What Needs to Change

I am not anti-technology. I think facial recognition has legitimate uses — finding missing children, identifying victims, even some security applications. But after researching Lipps' case, I think the minimum requirements should be obvious:

1. Facial Recognition Cannot Be the Sole Basis for an Arrest

This should not even be controversial. It is like arresting someone because their car is the same color as a getaway vehicle. It is a lead. Investigate it.

2. Mandatory Corroborating Investigation

Before filing charges based on a facial recognition match, detectives should be required to verify the suspect's presence in the jurisdiction during the crime. Phone records, financial records, travel records. The basics.

3. Mandatory Disclosure

Defendants should know if facial recognition was used in their case. In some jurisdictions, this is not disclosed. You can be convicted based partly on algorithmic evidence you never knew existed.

4. Accuracy Auditing

Every facial recognition system used by law enforcement should be independently audited for demographic bias. Annually. With results published. NIST already does this testing; agencies should be required to use only systems that meet minimum accuracy thresholds. A rough sketch of what that pass/fail check could look like follows this list.
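For what it is worth, the pass/fail portion of such an audit is not technically hard. Below is a rough sketch, assuming per-group false positive rates already exist from independent testing; the thresholds, group labels, and function are illustrative, not any real standard:

```python
# Illustrative audit check, not a real standard: reject a system if any
# group's false positive rate exceeds an absolute ceiling, or if the gap
# between the best- and worst-served groups exceeds a disparity limit.
from typing import Dict, Tuple

MAX_FPR = 0.001      # per-group ceiling (hypothetical threshold)
MAX_DISPARITY = 3.0  # worst group may be at most 3x the best (hypothetical)

def audit(per_group_fpr: Dict[str, float]) -> Tuple[bool, str]:
    worst_group, worst = max(per_group_fpr.items(), key=lambda kv: kv[1])
    best = min(per_group_fpr.values())
    if worst > MAX_FPR:
        return False, f"{worst_group} false positive rate {worst:.4%} exceeds ceiling"
    if best > 0 and worst / best > MAX_DISPARITY:
        return False, f"{worst / best:.1f}x disparity exceeds limit ({worst_group} worst off)"
    return True, "within thresholds"

# Example with made-up evaluation results: 0.0007 / 0.0002 = 3.5x, so it fails.
print(audit({"group_a": 0.0002, "group_b": 0.0007, "group_c": 0.0005}))
```

None of this requires new science. The hard part is requiring agencies to ask for the numbers before deployment and to walk away from systems that fail.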

The Question That Haunts Me

I keep thinking about the four young children Angela Lipps was babysitting when the Marshals arrived. I keep thinking about her sitting in a Tennessee jail cell for 108 days, knowing she had never been to North Dakota, waiting for someone to listen.

And I keep thinking about what her lawyer said: "If the only thing you have is facial recognition, I might want to dig a little deeper."

AI gave police a name. That is all it did. Everything that happened after — the arrest, the five months in jail, the lost home, the lost car, the lost dog, the Christmas Eve release with no coat in a North Dakota winter — that was human failure. Human laziness. Human indifference.

The algorithm made an error. The people made it a catastrophe.

If your agency uses facial recognition, please, for the love of everything: pick up the phone first.

More on AI ethics and policy: what AI tools actually do with your data and who owns what AI creates.
