Detroit Police Chief Says Facial Recognition Software Involved In Bogus Arrest Is Wrong '96 Percent Of The Time'

from the good-to-have-that-all-out-in-the-open dept

The law enforcement agency involved in the first reported false arrest linked to facial recognition software is now talking about its software. The Detroit Police Department — acting on a facial recognition “match” handed to it by State Police investigators — arrested resident Robert Williams for allegedly shoplifting watches from an upscale boutique.

Williams did not commit this robbery. He even had an alibi. But the investigators weren’t interested in his answers or his questions. They had a lo-res screen grab from the store’s CCTV camera — one that software provided by DataWorks Plus said matched Williams’ driver’s license photo. Thirty hours later, Williams was cut loose by investigators, one of whom said “I guess the computer screwed up” after rewatching the camera footage with Williams present.

The officers ignored the bold letters at the top of the “match” delivered by the software. The writing said “This document is not a positive identification.” It also said the possible match was not “probable cause for arrest.” Unfortunately for the misidentified Michigan resident, the cops who arrested him treated the printout as both: positive identification and probable cause.

The policies governing law enforcement’s use of this tech have changed since Williams’ arrest in January. Under the current policy, lo-res images like the one that led to this arrest are no longer allowed to be submitted to the facial recognition system. That fixes a very small part of the problem. The larger problem is that the tech is mostly good at being bad. This isn’t a complaint from critics. This comes directly from the top of the DPD.

In a public meeting Monday, Detroit Police Chief James Craig admitted that the technology, developed by a company called DataWorks Plus, almost never brings back a direct match and almost always misidentifies people.

“If we would use the software only [to identify subjects], we would not solve the case 95-97 percent of the time,” Craig said. “That’s if we relied totally on the software, which would be against our current policy … If we were just to use the technology by itself, to identify someone, I would say 96 percent of the time it would misidentify.”

So, it’s a bad idea to rely on the software and nothing else when attempting to identify criminal suspects. What happened to Robert Williams supposedly should never happen again… according to policy. But there’s the question of why the PD still uses the software if it’s not adding much value to the investigative process. It seems officers don’t care much for the tech and yet it’s still in use. The police chief thinks it’s still worth keeping around, even if it has yet to show any positive results.

Craig and his colleague, Captain Ariq Tosqui, said they want to continue using facial recognition because it can be a tool to assist investigators even if it doesn’t often lead to arrest. But even when someone isn’t falsely arrested, their misidentification through facial recognition can lead to an investigator questioning them, which is an inconvenience at best and a potentially deadly situation at worst. According to Tosqui, the technology has been used in a total of 185 cases over the years. “The majority of the cases the detective reported back that [the software] was not useful.”

What it can do, apparently, is add another layer of surveillance on top of what’s already in place. And it will do what most law enforcement surveillance creep does: put more eyes on Black residents. Part of the mild reforms Detroit enacted — rather than implement a facial recognition ban — mandates the publication [PDF] of facial recognition data by the department. Since the beginning of this year, the PD has used the tech 70 times. All but two of those instances involved a Black resident’s photo being submitted.

One of the main sources for uploaded photos is the Project Green Light (PGL) network of cameras [PDF]. These are installed in or around businesses that participate in the project. Participation comes at a cost: $1-6,000 for the initial investment and $1,600 a year in video storage fees. The 550 participants’ cameras are supposedly monitored 24/7 at the PD’s “real time crime center.” Incoming footage can be subjected to other surveillance tech, like facial recognition software and automatic license plate readers.

Here’s the panopticon the city wanted:

The City of Detroit put forth a Request for Proposals for a contractor to work closely with the city, DPD, and Motorola (Company that help set up the RTCC) to set up a “turn-key” facial recognition system that would work with the already existing infrastructure of the RTCC. They specifically asked that the facial recognition work on at least 100 concurrent real-time video feeds, be integrated into the PGL system, and can be used by officers with a mobile app.

It got the software it needed from DataWorks: a blend of two algorithms known as “Face Plus.” It’s scary stuff.

Face Plus is capable of automatically searching all faces that enter camera frames against photos in the entity’s database, alerting authorities to any algorithmic matches. Additionally, there is a “watchlist” option where persons of interest can be monitored and alerted for…

According to DataWorks Plus, in 2017, this repository contained 8 million criminal pictures and 32 million “DMV” pictures. As the Free Press reported in March 2019, almost every Michigan resident has a photo of them in this system.

This is the system the police chief calls 96% inaccurate and DPD investigators call useless. Perhaps this false arrest, which has made national news, will put the brakes on the planned expansion of PGL and Face Plus to all public transit stops and vehicles. Or maybe it will convince the city it’s paying too much for something that barely works and take facial recognition away from the PD until something better — and less biased — comes along.

Companies: dataworks plus


Comments on “Detroit Police Chief Says Facial Recognition Software Involved In Bogus Arrest Is Wrong '96 Percent Of The Time'”

13 Comments
That One Guy (profile) says:

'Can't blame us, it's the software's fault.'

If the goal is more efficient use of time and resources then software throwing out almost nothing but false positives is a terrible idea, so either the department is run by idiots or the software is being used for something else.

Given that an almost 100% failure rate will ensure a lot of wasted time, the only reason I can think of to keep using it is as a scapegoat: a way to excuse any bogus investigations and/or arrests by claiming that since the software noted a match they were just following up, ‘just to be sure’.

Upstream (profile) says:

Facial recognition software is garbage!

Facial recognition software is so full of nonsense as to be farcical. The inherent, often racial, biases built into the systems are widely known. And while Detroit Police Chief James Craig admits the software is junk, most do not, particularly the companies selling the software (I’m especially looking at you, Clearview and Hoan Ton-That). One of the many holes in their bogus claims of facial recognition software’s accuracy is a logical fallacy called base rate neglect. It is counter-intuitive, and involves some math, but you can find some good explanations of it here and here. Bottom line/tl;dr: While the apparently high numbers claimed for facial recognition software’s accuracy may be technically true, they mask just how horrible the stuff really is. It’s one of those "lying with statistics" things.
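The base rate neglect point above can be made concrete with a quick back-of-the-envelope calculation. The numbers below are purely hypothetical (not DataWorks Plus specs): even a system that sounds impressive on paper produces mostly false matches when actual suspects are rare in the pool being scanned.

```python
# Illustrative base-rate-neglect arithmetic. All numbers are hypothetical.
# Suppose a face-matching system is "99% accurate" in both directions:
# it flags a true suspect 99% of the time (sensitivity) and correctly
# clears an innocent person 99% of the time (specificity).

def match_precision(prevalence, sensitivity, specificity):
    """P(actually the suspect | system reports a match), via Bayes' rule."""
    true_pos = prevalence * sensitivity          # suspects correctly flagged
    false_pos = (1 - prevalence) * (1 - specificity)  # innocents wrongly flagged
    return true_pos / (true_pos + false_pos)

# If only 1 face in 10,000 scanned actually belongs to a suspect...
p = match_precision(prevalence=1 / 10_000, sensitivity=0.99, specificity=0.99)
print(f"{p:.2%}")  # roughly 1% — about 99 of every 100 "matches" are wrong
```

So a vendor's "99% accurate" claim and a chief's "wrong 96% of the time" complaint can both be true at once: the advertised figure describes per-comparison accuracy, while the lived result depends on how rare real suspects are in the scanned population.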

Anonymous Coward says:

Or maybe it will convince the city it’s paying too much for something that barely works and take facial recognition away from the PD until something better — and less biased — comes along.

I’m sorry? Where did the "biased" come from?

Perhaps:

Or maybe it will convince the city it’s paying too much for something that barely works and take … away … the PD until something better — and less biased — comes along.

Is that what you meant?

TheResidentSkeptic (profile) says:

TV shows are supposed to be fantasy...

This RFQ "…Request for Proposals for a contractor to work closely with the city, DPD, and Motorola (Company that help set up the RTCC) to set up a “turn-key” facial recognition system that would work with the already existing infrastructure of the RTCC. They specifically asked that the facial recognition work on at least 100 concurrent real-time video feeds, be integrated into the PGL system, and can be used by officers with a mobile app …"

Seems to come right out of Person of Interest. Someone took it a little too seriously and really wants their own "The Machine."

crogs says:

Facial recognition in China can now recognize foreigners with sunglasses on and a mask on (said foreigners’) chins…

Be careful what you wish for, Chicken Little.

“Craig and his colleague, Captain Ariq Tosqui, said that they want to continue using facial recognition because they say it can be a tool to assist investigators even if it doesn’t often lead to arrest.”

Gotta love how the ADL and its spawn have managed to “product place ” every foreign sounding name into US headlines though….

That sure didnt work out well for that Somali cop in Minneapolis, who shot the white privilege -banging -on -squad -car woman.

I bet Offcr Craig will be our next Derek Chauvin.

Any takers?

The Bronfman, etc. klan/syndicate rolls like that.

Coyne Tibbets (profile) says:

Police are the bad guys

An inaccuracy of 94%? I could do better with a Ouija Board. I could do better throwing darts at a Michigan map blindfolded.

Let’s say that what he means is that 94% of matches generated prove to be false. Now I’m sure that’s not what the sales pitch said, but who trusts those? If I had a piece of software that failed 94% of the time, I would ashcan it and a lawsuit against the vendor would follow. Not the police department, no siree. It must be meeting some need of theirs.

What could that be? In this case, it probably produced seventeen "wild geese." Which brings us to the problem of picking out the Robert Williams, the goose. I wasn’t there, but I’m betting it went something like this, as they looked through the seventeen candidates: "Too rich. Too sympathetic. Too professional. Too connected. Wait… here’s one that’s poor… don’t you think he looks right?"

Which brings us to the eyewitness. Of course, the police probably helped that along as well… "Which one of these six men is the shoplifter? Not sure? Which one looks most like him? Still not sure? Well did you look at #4?" [wink, wink, nudge, nudge]

So now they have a candidate, and credit for the collar… and who cares about the goose’s — Robert Williams’ — rights? They probably figured he did something even if he was innocent. They certainly didn’t expect him to bond out, and probably expected him to plead guilty to avoid a trial.

Who wants to waste time finding real criminals when convicting an innocent is so easy?

The point of all of this is… the bad guys here are the police. Yes, the software is crap, but it probably wouldn’t exist if police everywhere weren’t so gung ho about it. It wouldn’t still be installed in Detroit and being used if police there didn’t regard it as good enough. It comes down to the same old story: police violating civil rights wholesale, and looking to expand the franchise — with just a pinch of facial recognition companies to help them out.

Uriel-238 (profile) says:

This reminds me of Chicago police dogs

Detection dogs in Chicago will false-positive signal when sniffing Latinos at a rate in the 90%+ range.

And yet they’re still a legally viable gateway to probable cause.

Given that a facial recognition false positive led to an arrest, it sounds like the same thing.

Next thing they’re going to start using Michael Shermer’s dowsing rods.

Anonymous Coward says:

It’s awful, but we want to continue using it, I mean, paying large amounts of cash to some company. Hopefully one day we will make facerec scary accurate and track everyone 24/7, generally for non-criminal purposes. Unless we want to create a criminal out of someone.

Then there’s this: $1,600 a year in video storage fees. Lolwut? GTAFO.
