The Computer Picked the Culprit, But It Wasn’t Me. The Story That Changed the Law ⚖️
How an algorithmic error led to the arrest of an innocent father in front of his children—and why, as AI enthusiasts, we must learn humility.

As a huge AI enthusiast, I spend my mornings tracking new language models and my evenings testing image generators. I am fascinated by how AI pushes the boundaries of what is possible 🚀. However, for our love of technology to be mature, we must look it straight in the eye—even when its "vision" proves to be flawed and unjust.
The story of Robert Williams from Detroit is not just a common technical glitch. It is a sobering wake-up call for anyone who believes that an algorithm is always more objective than a human 🌧️.
A Driveway That Became a Jail Cell 🚔
It was January 2020. Robert Williams, a husband and father of two young daughters, was returning home from work to Farmington Hills, a suburb of Detroit. What unfolded moments later resembled a scene from an action movie, except nothing about it made sense. As soon as he parked his car, a patrol car blocked his path.
Police officers approached Robert, handcuffed him, and informed him he was under arrest. In front of his wife and crying daughters (aged 2 and 5), who watched everything from the window, he was put into the back of the police car. Robert had no idea what was happening. He had no criminal record, hadn’t been in any fights, and hadn’t so much as stolen a candy bar.
It wasn’t until he reached the precinct that he learned the charges: the theft of five luxury Shinola watches from a store, which had occurred... more than a year earlier ⌚.
When the Algorithm "Commits" a Crime 🤖
How did the police land on Robert more than a year after the crime? The answer lies in facial-recognition software.
In October 2018, a man in a baseball cap walked out of a store with nearly $4,000 worth of watches. A surveillance camera captured his face, but the image was grainy and blurred. The Michigan State Police used an algorithm to compare that footage against a database of driver's license photos.
The system "spat out" a name: Robert Williams.
For the algorithm, this was merely a statistical probability. However, for the detectives, the computer’s result acted like a final verdict. Instead of looking for physical evidence to confirm his identity—such as fingerprints or an analysis of height and build—the officers assumed the technology couldn’t be wrong. They created a photo lineup, showed it to the store's security guard, and the guard—influenced by the AI's prior selection—pointed to Robert 📸.
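To see why a "match" is only a statistical candidate, here is a deliberately toy sketch of how one-to-many face search works in principle: faces are reduced to numeric vectors ("embeddings"), and the system returns whichever database entry is most similar to the probe image above some threshold. Everything here is made up for illustration (the names, the tiny 4-dimensional vectors, the 0.8 threshold); real systems use deep networks producing vectors with hundreds of dimensions, but the core logic, and the core risk, is the same: someone always scores highest.

```python
import math
import random

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def best_match(probe, gallery, threshold=0.8):
    """Return (name, score) for the most similar gallery face, or None if
    nothing clears the threshold. A hit is a statistical candidate,
    NOT a confirmed identity."""
    name, score = max(
        ((n, cosine_similarity(probe, emb)) for n, emb in gallery.items()),
        key=lambda pair: pair[1],
    )
    return (name, score) if score >= threshold else None

# Toy gallery of 1,000 random "license photos" (hypothetical data).
random.seed(42)
gallery = {f"license_{i}": [random.gauss(0, 1) for _ in range(4)]
           for i in range(1000)}

# A probe face that belongs to NO ONE in the gallery.
probe = [random.gauss(0, 1) for _ in range(4)]

# With a large enough gallery, some stranger tends to clear the
# threshold anyway: a false positive.
match = best_match(probe, gallery)
```

The point of the sketch: the system cannot say "this person is not in the database" with any force; it can only rank whoever is closest. That ranking is what the detectives treated as a verdict.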
"I Guess the Computer Got It Wrong" 💬
The climax of this story came in the interrogation room. A detective placed three pieces of paper in front of Williams: two stills from the surveillance footage showing the thief, and Robert's own driver's license photo.
Robert looked at the picture and immediately knew he was looking at someone else. He held the photograph up to his face and asked the detectives directly: "Do you really think this is me? Or do you think all Black men look alike?"
Then, something extraordinary happened. One of the detectives, looking closely at the photo, muttered under his breath: "I guess the computer got it wrong." Despite this, the machinery of the law kept moving. Robert spent 30 hours in detention, sleeping on a concrete bench. The trauma his family experienced stayed with them for years.
Why Did the AI Fail? 🧠🔍
As fans of technology, we must understand that the Williams case wasn't just bad luck. It was a systemic issue known as "data bias."
Many facial-recognition algorithms were trained on photo databases dominated by light-skinned faces. The result? These systems are strikingly accurate at identifying white men, but their error rates climb sharply for other demographic groups; a 2019 NIST audit of commercial algorithms found false-positive rates up to 100 times higher for some of them.
For the algorithm, Robert was simply "similar enough" within a statistical margin of error. The problem is that the police succumbed to "automation bias"—the tendency to uncritically trust machines, even when they contradict common sense 🛑.
The 2024 Breakthrough 🏛️
Robert Williams' story finally reached a landmark epilogue. In June 2024, after a years-long legal battle fought alongside the ACLU, Robert reached a settlement with the City of Detroit, reportedly worth $300,000.
This is a historic victory. Under the settlement, Detroit police may no longer arrest anyone, or even build a photo lineup, based solely on a facial-recognition match; every use of the technology must now be supported by independent, tangible evidence. It is the first outcome of its kind in the US, where a "victim of the algorithm" forced binding limits on how a police department uses this technology ✨.
Lessons for Us, the Enthusiasts 💡
I love AI for how it helps us write code or diagnose diseases. But the Robert Williams case is a lesson in humility.
Technology is a reflection of ourselves—our data and our prejudices. If we allow AI to operate without human oversight, we will create systems that harm rather than help. Let’s build AI, let’s be amazed by it, but let’s never forget that there is always a human being on the other side of the algorithm. And sometimes, that person just wants to get home safely for dinner with their family 🏠👨👩👧👧.
About the Creator
Piotr Nowak
Pole in Italy ✈️ | AI | Crypto | Online Earning | Book writer | Every read supports my work on Vocal

