Congress Members Demand Answers From, Investigation Of Federal Facial Rec Tech Users

from the software-terrible-with-names,-worse-with-faces dept

The ACLU’s test of Amazon’s facial recognition software went off without a hitch. On default settings, the software declared 28 members of Congress to be criminals after being “matched” with publicly-available mugshots. This number seemed suspiciously low to cynics critical of all things government. It was also alarmingly high: an incredible number of false positives for such a small data set (the members of the House and Senate).

Amazon argued the test run by the ACLU using the company’s “Rekognition” software was unfair because it used the default settings — 80% “confidence.” The ACLU argued the test was fair because it used the default settings — 80% confidence. Amazon noted it recommended law enforcement bump that up to 95% before performing searches, but nothing in the software prompts users to select a higher setting for more accurate results.
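The dispute boils down to a single tunable cutoff. A minimal sketch of how a confidence threshold changes the number of reported matches — plain Python with invented similarity scores, not Amazon's actual API:

```python
# Hypothetical similarity scores (0-100) between probe photos and mugshots
# in a database. The values here are made up purely for illustration.
scores = [81, 83, 96, 79, 88, 97, 85, 82, 90, 78]

def matches(scores, threshold):
    """Return the scores the software would report as 'matches' --
    anything at or above the confidence threshold."""
    return [s for s in scores if s >= threshold]

print(len(matches(scores, 80)))  # default setting: 8 matches
print(len(matches(scores, 95)))  # Amazon's recommended setting: 2 matches
```

Nothing in the tool itself distinguishes the two settings; the same scores are simply filtered more or less aggressively, which is why the default matters so much in practice.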

This upset members of Congress who weren’t used to being called criminals… at least not by a piece of software. More disturbing than the false positives was the software’s tendency to falsely match African-American Congressional reps to criminal mugshots, suggesting the act of governing while black might be a criminal activity.

Congressional members sent a letter to Amazon the same day the ACLU released its report, demanding answers from the company for this abysmal performance. Ron Wyden has already stepped up to demand answers from the other beneficiaries of this tech: federal law enforcement agencies. His letter [PDF] reads like an expansive FOIA request, only one less likely to arrive with redactions and/or demands that the scope of the request be narrowed.

Wyden is asking lots of questions that need answers. Law enforcement has rushed to embrace this technology even as multiple pilot programs have generated thousands of bogus matches while returning a very small number of legitimate hits. Wyden wants to know what fed agencies are using the software, what they’re using it for, and what they hope to achieve by using it. He also wants to know who’s supplying the software, what policies are governing its use, and where it’s being deployed. Perhaps most importantly, Wyden asks if agencies using facial recognition tech are performing regular audits to quantify the software’s accuracy.

That isn’t the only facial recognition letter-writing Wyden has signed his name to. The Hill reports Congressional reps have also sent one to the Government Accountability Office, asking it to open an investigation into facial recognition software use by federal agencies.

“Given the recent advances in commercial facial recognition technology – and its expanded use by state, local, and federal law enforcement, particularly the FBI and Immigration and Customs Enforcement – we ask that you investigate and evaluate the facial recognition industry and its government use,” the lawmakers wrote.

The letter, signed by Rep. Jerrold Nadler and Sens. Ron Wyden, Cory Booker, Christopher Coons (D-Del.) and Ed Markey (D-Mass.), asks the GAO to examine “whether commercial entities selling facial recognition adequately audit use of their technology to ensure that use is not unlawful, inconsistent with terms of service, or otherwise raise privacy, civil rights, and civil liberties concerns.”

The public has a right to know what public surveillance methods are being deployed against it and how accurate or useful these tools are in achieving agencies’ stated goals. Privacy expectations all but vanish when the public goes out in public, but that doesn’t mean their daily movements can automatically be considered grist for a government surveillance mill. Whatever privacy implications there are have likely not been addressed pre-deployment, if recent surveillance tech history is any indication. Before the government wholeheartedly embraces tech with an unproven track record, federal agencies need to spend some quality time with the people they serve and the overseers that act as a proxy for direct supervision.

Companies: amazon


Comments on “Congress Members Demand Answers From, Investigation Of Federal Facial Rec Tech Users”

37 Comments
David says:

I'd expect law enforcement to turn down confidence to 20%

After all, you can’t prosecute someone for looking like a criminal. The purpose is to have a reasonable suspicion pretense with more scienciness than a drug dog. The latter are getting old. Flagging more blacks is a feature, not a bug. As you turn down confidence levels, you match more general features than individual ones, spending your limited time and favor mostly on those anatomically similar to the current prison populace, cementing the status quo.

Racial profiling is not viewed dimly by the courts because it fails to deliver results, but because it has nothing to do with individual justice and personal responsibility, and because the results it delivers are partly a self-fulfilling prophecy.

Now, maximizing short-term law enforcement results when faced with one populace with a 2% crime rate and another with a 1% crime rate does not mean focusing 2/3 of your attempts on group 1 and 1/3 of your attempts on group 2, but rather focusing 100% of your attempts on group 1.
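The arithmetic behind that claim checks out: with a fixed budget of stops and a goal of maximizing expected short-term “hits,” the optimum is to spend every stop on the higher-rate group. A sketch with invented numbers, illustrating the commenter’s point (not endorsing the policy):

```python
def expected_hits(stops_group1, stops_group2, rate1=0.02, rate2=0.01):
    # Expected number of "hits" when a fixed stop budget is split between
    # two groups with the stated (hypothetical) crime rates.
    return stops_group1 * rate1 + stops_group2 * rate2

# Budget of 300 stops, 2/3 vs 1/3 split:
print(expected_hits(200, 100))  # 200*0.02 + 100*0.01 = 5.0
# Same budget, all on group 1:
print(expected_hits(300, 0))    # 300*0.02 = 6.0
```

Since the objective is linear in the allocation, any stop moved from the lower-rate group to the higher-rate group raises the expected total, so the maximum always sits at the all-on-one-group extreme — exactly the feedback loop the comment describes.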

That’s troubling. It’s also one of the reasons prejudice is not just stupid but actually effective, and still we cannot afford to entertain it systematically in a society based on individual responsibility and justice.

At any rate, the problem of being misidentified that is worrying the given congress members is that such misidentification comes with physical danger: you may get beaten up to the point of death or shot dead. And that’s not actually something that should happen to even someone correctly identified as a criminal.

Because the job of the police is to deliver the suspects to justice, not to deliver justice. If the consequences of misidentification weren’t considerably more fatal in the U.S. than in civilized countries, this would be easier to shrug off.

Uriel-238 (profile) says:

Re: "regular audits to quantify the software's accuracy"

If we should be having frequent audits for reasonable-suspicion mechanisms that are sciencier than a drug dog, we should have them for drug dogs. And cheap field tests.

That we don’t have them is indicative that the court system doesn’t want its methods scrutinized too closely, yet it’s been thoroughly established that those methods cannot be trusted without oversight.

So where’s our oversight of the DoJ and the courts?

PaulT (profile) says:

“Amazon noted it recommended law enforcement bump that up to 95% before performing searches but nothing in the software prompts users to select a higher setting for more accurate results.”

But… can Amazon demand that law enforcement turn it up, or at least not turn it down? Or are they given free rein with a contract that just says Amazon isn’t responsible for false positives if it’s set too low?

That’s going to be one of several problems going forward with this tech. Like tasers, speed cameras, body cameras and so forth, law enforcement in the US do seem to have a tendency to misuse the tools. If this becomes popular, I’d expect numerous court battles where lawyers for one side try to prove that the setting was too low, while Amazon have to decide whether to protect their existing clients or their reputation.

Then, of course, the big one – when somebody’s recognised, how do the cops react? In a decent world, they should simply be using this as a tool to help them do things they couldn’t normally do easily, like identifying suspects in large crowds. But, we all know it’ll be another case of “the computer says X” replacing common sense. It should be like a GPS being able to mark out a route more quickly and accurately than a person with a map, but it may end up being more like people driving into rivers because the GPS told them to.

The racial aspect is the final one – the tech itself will improve as more people are scanned, and the discrepancy is more likely to be one of the data collected than a deliberate bias. But, those using it could certainly wish to be biased, and when you combine being able to set “we don’t really care about accuracy” with “we’ll react as if it’s 100% accurate every time”, you do have some dangerous and deadly situations coming up.

Rekrul says:

Re: Re:

But.. can they demand that they turn it up… or presumably don’t turn it down? Or, are they given free range with a contract that just says they’re not responsible for false positives if it’s set too low?

I can tell you exactly what’s going to happen;

Cop: [Has blurry image of suspect] Run this guy through facial rec.

Tech: Sorry, no hits.

Cop: Well try turning down the confidence.

Tech: It’s at 80% and there’s still no hits.

Cop: So go lower!

Tech: If I go any lower, it’s even more likely to match some random person.

Cop: Just do your damn job and get me a match!

Tech: OK, it says this is your suspect. 45 year old father of two with no record.

Cop: Hey guys, we got a name for the asshole who took a shot at Jimmy last night. Let’s go fuck him up!

Uriel-238 (profile) says:

Re: Re: Wearing a mask is actually illegal

How about wearing a hijab with a face covering such as a bushiyya? After SCOTUS has issued rulings that freedom of religion is more important than protecting employees or public accommodations¹, are we going to say that the police’s need to scan your face is more important than religion?

Then there’s the matter of facepaint, given Juggalo clown paint defeats facial recognition. Another intersection between first amendment rights and [the color of] national security.

¹ SCOTUS’ ruling opinions emphatically insisted this is not what they were saying, but its rulings have triggered new lawsuits based on the implications nonetheless, ones that have not been dismissed straight away based on the SCOTUS disclaimers.

Anonymous Anonymous Coward (profile) says:

Default Settings

What is the advantage to Amazon of making the default setting 80% confidence? Why not start at 100%, with instructions on how and when to turn it down? Do they want more ‘hits’ in order to make sales? Seems like that would backfire in the long run. Actually, it looks like it is backfiring now.

The other question that pops up: if Amazon recommends that law enforcement use a 95% confidence rating, and law enforcement is their primary sales target, then why isn’t the default setting 95%?

PaulT (profile) says:

Re: Default Settings

“Why not start with 100%”

Because the software will only be as good as the input it gets, so 100% likely means that any photo it receives with any kind of artefact or blur will never return a match. That would make it effectively worthless, since anything taken outside of a studio would likely have things that only make it a 99.99% match at best.

Think of when you search on Google and misspell a word – the software can detect that to a degree and return what it thinks you’re looking for. Now imagine if it returned zero results every time you did that instead.

“The other question that pops up: if Amazon recommends that law enforcement use a 95% confidence rating, and law enforcement is their primary sales target, then why isn’t the default setting 95%?”

I’d imagine it’s because the software performs best at the lower setting, but for law enforcement purposes fewer false positives are preferred. Setting things that high by default might make the software seem not reliable enough for customers who have less stringent requirements.

That One Guy (profile) says:

Re: Re: Default Settings

I’d imagine because the software performs best at the lower setting, but for law enforcement purposes less false positives are preferred.

Ideally, yes, however more false positives means more excuses to engage in a search and/or ‘have a chat’ with someone. Much like ‘probable cause on four legs’ I suspect that a low accuracy would not be seen as a negative by a good number of those making use of the tech.

PaulT (profile) says:

Re: Re: Re: Default Settings

Indeed, which is why I imagine that Amazon’s response here has been to state that they recommend higher settings for law enforcement. That will presumably get them out of a bunch of “it’s not our fault, it’s Amazon’s software” defences/lawsuits when the cops start doing that.

The problem is that I had to say “when” and not “if” in the above sentence, and sadly that’s not something this software can solve one way or another, unless they make it useless. Which could also be a good thing depending on your point of view, but they are not going to do that.

Anonymous Coward says:

Re: Re:

It is a common character trait with Senators (and narcissists) – they don’t care until it directly affects them in some way. That is why I think the ACLU’s test was so smart. These people literally do not care at all until they see it hurt them or their personal fortunes.

I had a boss who was notorious for this. They did not care at all about something until it directly affected them.

Childcare support? Flex hours to take care of personal needs? They were an abuse of company resources and signs of greedy employees. Employees should pay out of their own pocket for those things since the company did not benefit from the employee using them. Yah this Boss was a selfish asshole.

I’ll give you one guess what happened when that boss finally had a kid of their own.

Did you guess that suddenly those perks were one of the best things to ever happen to the company? why yes you would be right! They very loudly patted themselves on the back for coming up with those company perks.

I’ll give you one more guess what happened to those perks once the boss no longer needed to pay for childcare when their kids went to public school.

David says:

Re: Re:

Congress Members Demand Answers From Federal Facial Rec Tech Users, and Congress Members Demand Investigation Of Federal Facial Rec Tech Users.

That comma stands for “as well as an”, and it’s not like the distraction of the overdone headline capitalization (I’d downcase “From” and “Of”) helps with picking the grammar apart.

So formally this is grammatical, and the grammar nazis will just rough up the headline writer and leave him in the gutter for mocking them rather than actually charging him with an actionable crime.
