Detroit PD Now Linked To Two Bogus Arrests Stemming From Facial Recognition False Positives

from the no-problem,-it's-just-rights-and-freedom dept

Late last month, the first known false arrest linked to facial recognition software was reported. But that apparent first in AI police work now appears to be the work of a repeat offender. There have been two bogus arrests linked to facial recognition false positives, and both were made by the same law enforcement agency: the Detroit Police Department. Elisha Anderson of the Detroit Free Press has the details on the first blown call by the PD's software.

The high-profile case of a Black man wrongly arrested earlier this year wasn't the first misidentification linked to controversial facial recognition technology used by Detroit Police, the Free Press has learned.

Last year, a 25-year-old Detroit man was wrongly accused of a felony for supposedly reaching into a teacher’s vehicle, grabbing a cellphone and throwing it, cracking the screen and breaking the case.

Detroit Police used facial recognition technology in that investigation, too.

This man, Michael Oliver, was charged with larceny for the May 2019 incident he didn't actually participate in. The report by the Free Press contains photos of both the person caught on the phone's camera and Michael Oliver. They highlight one major problem with facial recognition software: even if one could be persuaded the two faces are a close match (and they don't appear to be), the recording used by investigators to search for a match showed the suspect's bare arms. The person committing the crime had no tattoos. Michael Oliver's arms are covered with tattoos, running from the top of his hands all the way up to his shoulders.

The facial recognition software delivered its mistake to investigators, who included this mismatch in the photos they presented to the person whose phone had been grabbed.

During the investigation, police captured an image from the cellphone video, sent it for facial recognition and the photo came back to Oliver, the police report said.

After Oliver was singled out, a picture of his face was included in a photo lineup of possible suspects that was presented to the teacher.

A second person, a student, was also captured in the video with the suspect. The officer in charge of the case testified he didn’t interview that person though he'd been given that student’s name.

Once again, the Detroit PD and local prosecutors are promising the thing that has already happened twice won't happen again. There are new processes in place, although it's unclear when those policies went into effect. Oliver was arrested late last year. The other bogus arrest occurred earlier this year. In both cases, reporters were assured by law enforcement spokespeople that things have changed.

Here's what officials say is now in place, even though it's too little, too late for the two Black men arrested and charged with crimes they didn't commit. There are "stricter rules" in effect. Matches returned by the system are not probable cause for anything; they can only be used as "investigative leads." Supposedly, the software will now only be used to identify people wanted for violent felonies.

Prosecutors are doing things a bit differently, too. But it's a reaction, rather than a proactive effort. It's only now -- after two false arrests -- that the prosecutor's office is mandating review of all facial recognition evidence by the city's top prosecutor, Kym Worthy. Investigators must also produce corroborating evidence before seeking to arrest or charge someone based on a facial recognition match.

This seems unlikely to change anything. Outside of the limitation to violent crimes, both of these cases could have gone through the review process -- along with the limited corroborating evidence (in both cases, crime victims picked the AI's mismatches out of a lineup) -- and still resulted in the arrest of these two men. In both cases, investigators ended their investigations after this step, even though they were given the opportunity to interview other witnesses. If corroboration is nothing more than discovering humans are just as bad at identifying people as the PD's software is, the mistakes the PD claims will never happen again will keep on happening.



Filed Under: detroit, detroit police department, facial recognition, false arrest, michael oliver
