Documents Show Hundreds Of Cops Have Run Clearview Searches, Often Without Their Employers' Knowledge Or Permission

from the doing-stuff-just-because-no-one-said-not-to dept

An impressive trove of public records obtained by BuzzFeed shows just how pervasive facial recognition tech is. Law enforcement agencies are embracing the tech, often with a minimum of accountability or oversight. That’s how toxic tech purveyors like Clearview — whose software relies on a multi-billion image database scraped from the web — get their foot in the door to secure government contracts.

Despite being used for years, facial recognition tech has yet to prove it's capable of recognizing the right faces more often than the wrong ones. The accuracy gets even worse when it's deployed to recognize the faces of women and minorities, and given law enforcement's history of disproportionate enforcement, it will be minorities harmed by the inaccurate tech more often than not.

What BuzzFeed has done with these Clearview records is compile a searchable database that allows readers to see if their local agencies have tried out the tech. Clearview’s tech has yet to be subjected to outside review and its method of obtaining images — scraping them from public posts on the web — leaves a lot to be desired in terms of accuracy. (Unfortunately, as the EFF’s Dave Maass points out, this doesn’t mean BuzzFeed has made the dataset public — only its interpretation of the data. But we’ll take what we can get.)

The upshot? Lots and lots of experimentation. The downside? Very little oversight or explicit permission. According to the information BuzzFeed obtained, more than 335 US law enforcement agencies have at least tried out Clearview’s facial recognition AI, and many of those searches had nothing to do with investigations.

Several of the responding agencies appear to be paying little attention to the actions of their employees:

Officials at 34 of those organizations said they were unaware that their employees had signed up for free trials until our questions prompted them to look.

Meanwhile, others pretended they had no responsive documents until asked twice:

Officials at another 69 entities at first denied their employees had used Clearview but later determined that some of them had.

While a tally of 335 law enforcement agencies may seem minute in comparison to the total number of law enforcement agencies in the United States, it would be wise to remember this dataset is far from complete. Nearly 100 agencies refused to answer definitively whether or not their employees had explored Clearview’s offerings. Nearly 1,200 agencies have yet to turn over requested documents.

Meanwhile, Clearview has been out there touting law enforcement successes few law enforcement agencies will acknowledge. It also encourages the perception it is currently partnering with hundreds of cop shops while trying to divert attention away from its willingness to grant access to anyone interested in its unproven tech, including government agencies in countries known for their human rights abuses.

This data shows there’s more interest in Clearview than law enforcement (as a whole) is willing to admit publicly. It also shows cops are playing with unproven tech — quite possibly using images of people who aren’t suspected of anything — without the knowledge of their supervisors or government officials charged with overseeing their actions. What’s left unsaid — or unresponded to — is at least as concerning as what has been admitted publicly. There’s a rogue AI on the loose that links cops to a database filled with billions of images scraped from websites without their — or their end users’ — approval.

Clearview is entirely problematic. But these agencies’ willingness to examine and exploit this tech is even more so, considering the damage to rights and civil liberties that the careless use of unproven facial recognition tech can cause.

Companies: clearview


Comments on “Documents Show Hundreds Of Cops Have Run Clearview Searches, Often Without Their Employers' Knowledge Or Permission”

That Anonymous Coward (profile) says:

Re: Re:

Sadly no.

Clearview only exists to take a photo of an unknown person & tell you who they are by doing some voodoo with the billions of scraped images.

Given the history of police using the information databases they already have access to in order to stalk exes, hit on cuties they pulled over, & commit other really creepy rule-violating actions (that end up with no real punishments), it’s more likely they were using Clearview to try and get laid.

Sadly, no one knows how good Clearview is. On the ‘upside,’ their training dataset includes multiple skin tones & facial features, but it’s still a black box that shouldn’t exist in legal proceedings.
Their sketchy history & actions, combined with the founder being besties with notorious floor pooper Chuck, aren’t helping.

Given how police like to omit facts from cases once they’ve decided on a target, one wonders if one of them was dumb enough to use this tech to secure a conviction. And given the departments that claimed “oh, we never did that” & then admitted “oh, I guess we did,” one has to wonder: if they’ll lie about verifiable facts, can they be trusted in court?

For all of these advancements in tech that they keep saying we need… can anyone actually produce cases where it worked?
I mean, we’ve got people railroaded into jails because they relied on unproven tech with a racial bias a few times now, but no real success stories. I wonder if anyone’s run it on the Jan 6 protesters the way they’ve used it in other cases… all of those white faces are perfect for the racist tech to ID.
