Facial recognition jails Tennessee grandmother for 6 months as AI misidentification cases mount


Angela Lipps spent nearly six months behind bars after facial recognition software wrongly identified her in a North Dakota bank fraud case. She was arrested at gunpoint while babysitting, held without bail, and later cleared when bank records proved she was 1,200 miles away. Hers is at least the ninth documented wrongful arrest driven by AI misidentification, revealing a troubling pattern in which law enforcement skips basic verification steps despite explicit warnings from AI vendors.

Facial Recognition Software Leads to Six-Month Wrongful Incarceration

Angela Lipps, a 50-year-old Tennessee grandmother, endured nearly six months behind bars after facial recognition technology incorrectly linked her to a bank fraud investigation in Fargo, North Dakota [1]. U.S. Marshals arrested her at gunpoint on July 14 while she was babysitting four children, charging her with four counts of unauthorized use of personal identifying information and four counts of theft [2]. Her case represents at least the ninth documented instance of wrongful arrest stemming from AI misidentification, exposing critical failures in how law enforcement deploys these systems.

Source: TechSpot

The Fargo Police Department was investigating a series of incidents from April and May in which a woman used a fake U.S. Army ID to withdraw tens of thousands of dollars from banks [1]. Detectives ran surveillance footage through facial recognition software, which returned a match to Lipps. A detective then compared her Tennessee driver's license photo and social media images to the suspect, concluding she was the perpetrator based on facial features, body type, and hair [1]. Critically, no one from the department contacted Lipps before her arrest.

Bank Records Prove Innocence After Months of Wrongful Incarceration

Lipps spent 108 days in a Tennessee county jail without bail before North Dakota officers transported her for the bank fraud investigation [1]. Her attorney, Jay Greenwood, immediately requested bank records that would prove decisive. When Fargo police finally met with Greenwood and Lipps on December 19, five months after her arrest, the records showed she had been buying cigarettes and depositing Social Security checks in Tennessee at the time police placed her in Fargo, more than 1,200 miles away [1][2].

Source: Tom's Hardware

The case was dismissed on Christmas Eve, but the damage proved catastrophic. Lipps was released in Fargo with no money, no coat, and no way home [1]. Local defense attorneys provided funds for a hotel room and food on Christmas Eve and Christmas Day, while the F5 Project, a local non-profit, helped her return to Tennessee [2]. Unable to pay bills from jail, she lost her house, her car, and even her dog. The Fargo Police Department has yet to issue an apology.

Pattern of Police Use of Facial Recognition Without Proper Verification

A January 2025 Washington Post investigation documented at least eight instances of Americans being wrongfully arrested after police found a possible facial recognition match. In every case, investigators skipped fundamental steps, such as checking alibis and comparing physical descriptions, that would have cleared the suspects before arrest. The pattern reveals a systemic problem: law enforcement treats AI results as definitive identifications rather than as investigative leads requiring corroborating evidence.

Vendors such as Clearview AI attach explicit caveats to their systems, requiring agencies to acknowledge that results "are indicative and not definitive" and that officers must conduct further research before acting. According to an April 2024 ACLU submission to the U.S. Commission on Civil Rights, in at least five of seven wrongful arrest cases, police had received explicit warnings that facial recognition results do not constitute probable cause but made arrests anyway.

Limited Legal Protections and the Robert Williams Precedent

Robert Williams, whose 2020 wrongful arrest in Detroit was the first publicly reported case of a facial recognition false positive, reached a landmark settlement with the city in June 2024. Williams was arrested after his driver's license photo was flagged as a likely match to a man who stole designer watches in 2018, and during questioning he was told the arrest was based solely on facial recognition results [2]. Detroit later agreed to pay him $300,000, and the city now requires independent corroborating evidence before any facial recognition match can be used to seek an arrest warrant.

However, only 15 states had enacted any facial recognition laws covering law enforcement at the start of 2025, and North Dakota is not among them. This legislative gap leaves millions of Americans vulnerable to similar ordeals. As facial recognition technology becomes more widespread in law enforcement operations, the absence of mandatory verification protocols and accountability measures raises urgent questions about civil liberties and due process. Lipps, wrongfully jailed for six months, still awaits an apology, and her case underscores the human cost of deploying AI systems without adequate safeguards or investigative rigor.

TheOutpost.ai

© 2026 Triveous Technologies Private Limited