AI Mistake Throws Innocent Grandmother in Jail for Nearly Six Months

Sometimes the future looks alarmingly like a step backward into injustice, especially when nascent technologies are wielded without due diligence. An AI system's error, coupled with a police department's staggering incompetence, landed an innocent grandmother in jail for nearly half a year, shattering her life and exposing critical flaws in the modern justice system.

Harrowing reporting by North Dakota radio station WDAY details the shocking ordeal of 50-year-old Angela Lipps. She spent nearly six months incarcerated after Fargo police, using an AI facial recognition tool, mistakenly identified her as a suspect in a North Dakota bank fraud case. This wasn't a minor inconvenience; it was a devastating loss of freedom, dignity, and nearly everything she owned, all stemming from a technological misstep compounded by egregious human error.

The Arrest: A Nightmare Begins

Angela Lipps, a devoted mother of three and grandmother of five, had lived her entire life in north-central Tennessee. This tranquil existence was shattered last July when US Marshals, acting on a warrant derived from the AI’s faulty match, descended upon her doorstep. She was in the midst of babysitting four of her grandchildren when the heavily armed officers arrived, their presence a terrifying intrusion into her peaceful home. Arrested at gunpoint, Lipps was immediately plunged into a bewildering nightmare, accused of crimes committed a thousand miles away in a state she had never once visited.

The shock and confusion were overwhelming. Imagine being suddenly apprehended, handcuffed, and taken away from your grandchildren, all while vehemently protesting your innocence to officers who seem disinclined to listen. Lipps’ immediate thoughts were of her family, her life in Tennessee, and the utter impossibility of the accusations. How could she, a woman with no criminal record and no ties to North Dakota, be implicated in a bank fraud scheme in Fargo?

Months of Misery: Held Without Bail

Initially, Lipps was booked in a Tennessee county jail, not as a suspect in a local crime, but as a “fugitive from justice” from North Dakota. This designation had dire consequences: she was held without bail, deemed a flight risk despite her deep roots in Tennessee and complete lack of knowledge about the alleged crime. For nearly four agonizing months, Angela Lipps languished in that jail, a prisoner of a system that had failed to verify even the most basic facts.

During this period, she was assigned a court-appointed lawyer for the extradition process. The legal advice she received was stark: if she wanted to fight the charges, she would have to travel to North Dakota. This presented an impossible dilemma for Lipps, who had no financial resources to make such a journey, let alone the legal means to mount a defense in a distant state. “I’ve never been to North Dakota, I don’t know anyone from North Dakota,” Lipps repeatedly told her lawyer and anyone else who would listen, her pleas of innocence seemingly falling on deaf ears within the rigid confines of the legal system.

Her isolation grew with each passing day. Separated from her family, her home, and her familiar surroundings, she faced the crushing despair of being unjustly incarcerated, her pleas of innocence dismissed by a bureaucratic machine set in motion by a faulty algorithm. The psychological toll of this prolonged detention, knowing she was innocent but powerless to prove it from her cell, was immense.

The Police Blunder: Blind Trust in AI

According to Fargo Police Department files obtained by WDAY, the colossal error originated from surveillance footage. Detectives were investigating bank fraud cases in April and May 2025, where a woman was seen using a fake US Army military ID to withdraw tens of thousands of dollars. To generate leads, the detectives turned to an AI facial recognition software, which promptly identified Angela Lipps as the person in the video. This single, unverified algorithmic match became the bedrock of their entire investigation.

What followed was a stunning display of investigative negligence. The police seemingly did little to verify the AI’s lead. Court documents indicate that a detective merely “agreed that the suspect’s facial features, body type, and hair were a match to Lipps.” This superficial comparison, made without any further corroborating evidence or even a simple phone call, was deemed sufficient to justify an interstate arrest warrant. Lipps herself confirmed that no one from the Fargo police department ever contacted her to question her, to seek an alibi, or to conduct any form of due diligence that might have prevented this catastrophic mistake.

Adding insult to injury, the Fargo police department took an additional 108 days to pick up Lipps from her Tennessee jail after she had already spent months in detention. This inexplicable delay further extended her unjust incarceration. Once finally transported to North Dakota, she made a court appearance, but it wasn’t until December – after more than five months behind bars – that she was finally interviewed by Fargo police. This timeline illustrates a profound breakdown in basic investigative procedures and a shocking disregard for an individual’s liberty.

Exoneration and Abandonment: A Bitter Release

It was Jay Greenwood, a lawyer representing Lipps in North Dakota, who finally brought common sense and proper investigative work to the case. “If the only thing you have is facial recognition, I might want to dig a little deeper,” Greenwood astutely observed to WDAY. He didn’t need advanced technology or complex forensics. Instead, he simply produced bank records proving that Angela Lipps was more than 1,200 miles away in Tennessee at the exact time the bank fraud was perpetrated in North Dakota. This irrefutable alibi, which any diligent investigation could have uncovered months earlier, exposed the Fargo police’s profound negligence.

With Greenwood having essentially done their job for them, the police were left with no choice but to release Lipps from jail on Christmas Eve, dropping all charges against her. The moment of her release, however, was far from the joyous reunion one might expect. Lipps was now free, but utterly stranded. The police, having ripped her life apart based on a faulty AI match and their own incompetence, offered no assistance for her return home. With no money to her name, she was left in Fargo, North Dakota, in the dead of winter, dressed only in her summer clothes.

“I had my summer clothes on, no coat, it was so cold outside, snow on the ground, scared, I wanted out but I didn’t know what I was going to do, how I was going to get home,” Lipps recounted, describing her terrifying predicament. It was not the police, but a network of compassionate individuals and organizations that stepped in. Sympathetic local defense attorneys pooled together money to pay for a hotel room, and a local nonprofit called the F5 Project arranged her trip back to Tennessee. This act of community kindness stands in stark contrast to the cold indifference shown by the very authorities who had imprisoned her.

A Life Shattered, No Apology Offered

The consequences of this six-month ordeal are catastrophic for Angela Lipps. She lost her home, her car, and even her beloved dog as a direct result of her unjust incarceration. These are not mere inconveniences; they represent the complete dismantling of her life, her financial stability, and her emotional well-being. The psychological trauma of being falsely accused, unjustly imprisoned, and then abandoned, is immeasurable.

Perhaps most galling of all, Lipps says that no one from the Fargo police department has offered a single apology for the disastrous mix-up. This lack of accountability only deepens the wound, reinforcing the perception that the system, once it makes a mistake, is unwilling to acknowledge or rectify its wrongs beyond the bare minimum of releasing the innocent.

A Systemic Problem: AI’s Flaws in Law Enforcement

Angela Lipps' case is not an isolated incident; it serves as a chilling testament to the inherent dangers of unchecked AI adoption in law enforcement, particularly facial recognition technology. Hers is far from the only criminal case of mistaken identity caused by AI tools, a pattern that points to a systemic problem demanding urgent attention.

In April of last year, for instance, the New York Police Department arrested a man named Trevis Williams based on a facial recognition match from grainy CCTV footage. This occurred despite glaring physical discrepancies, including Williams being over half a foot taller than the suspect captured in the video. The reliance on the AI’s output seemingly overshadowed basic human observation and critical thinking.

Similarly, in February of the same year, a woman in Detroit sued the city’s police department, alleging that she was arrested after a facial recognition tool identified her as a murder suspect. Again, there were similarly blatant discrepancies in her physical appearance compared to the actual perpetrator. These cases underscore a disturbing pattern: AI, while powerful, is not infallible, and its errors can have devastating real-world consequences, especially when human oversight is minimal or biased by automation.

Experts and civil liberties advocates have long warned about the inherent flaws and biases within facial recognition technology. Studies have shown that these systems often exhibit lower accuracy rates for individuals with darker skin tones and for women, leading to a disproportionate impact on marginalized communities. The promise of efficiency and crime reduction offered by AI must be weighed against the fundamental rights to due process, privacy, and freedom from unjust detention.

The “guilty until proven innocent” dynamic that AI can inadvertently create is a direct threat to the bedrock principles of justice. When a machine’s output is taken as gospel, without rigorous verification, it erodes trust in the justice system and jeopardizes the lives of innocent individuals. There is a growing call for stricter regulations, independent audits, and even moratoriums on the use of facial recognition technology by law enforcement until its accuracy, fairness, and accountability mechanisms can be guaranteed.

Conclusion: The Urgent Need for Reform

Angela Lipps’ harrowing six-month ordeal in jail due to an AI mistake and police incompetence is a stark warning. Her story is a powerful reminder that while technology offers incredible potential, its deployment in sensitive areas like law enforcement demands meticulous scrutiny, robust human oversight, and an unwavering commitment to civil liberties. The human cost of unchecked automation and negligent investigation is simply too high. Urgent reforms are needed to ensure that no one else suffers the profound injustice Angela Lipps endured, and that the future of justice is built on principles of fairness and accuracy, not flawed algorithms and unquestioning trust.