UK Says South Wales Police's Facial Recognition Program Is Unlawful

from the too-many-faces,-too-few-legal-protections dept

The South Wales Police has been deploying a pretty awful facial recognition program for a few years now. Back in 2018, documents obtained by Wired showed its test deployments at multiple events attended by thousands were mostly a failure. The system did ring up 173 hits, but it also delivered nearly 2,300 false positives. In other words, it was wrong about 92% of the time.
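A quick sketch of that arithmetic, using the rounded figures above (2,300 here stands in for "nearly 2,300," so the rate comes out a shade above the article's 92%):

```python
# Back-of-the-envelope false positive rate for the 2017-2018 trial figures.
true_hits = 173
false_positives = 2300  # rounded from "nearly 2,300"

total_alerts = true_hits + false_positives
fp_rate = false_positives / total_alerts

print(f"{fp_rate:.1%} of alerts were wrong")  # prints "93.0% of alerts were wrong"
```

With the exact reported counts rather than rounded ones, the figure lands at the "about 92%" the article cites.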

Civil liberties activist Ed Bridges sued the South Wales Police after his image was captured by its camera system, which is capable of capturing up to 50 faces per second. Bridges lost at the first level: the UK High Court rejected his case, ruling that capturing 50 faces per second was "necessary and proportionate" to achieve the SWP's law enforcement ends.

Fortunately, Bridges has prevailed at the next level. The Court of Appeal has ruled in favor of Bridges and against the SWP's mini-panopticon.

The decision [PDF] opens with a discussion of the automated facial recognition technology (AFR) used by the SWP, which runs on software developed by NEC called "NeoFace Watch." Watchlists are compiled and faces that pass SWP's many cameras are captured and compared to this list. On the list are criminal suspects, those wanted on warrants (or who have escaped from custody), missing persons, persons of interest for "intelligence purposes," vulnerable persons, and whatever this thing is: "individuals whose presence at a particular event causes particular concern."

Here's how it works:

The CCTV camera records footage for the duration of any AFR Locate deployment. AFR Locate is capable of scanning 50 faces per second (although that does not necessarily mean 50 different people). Beyond those technical limitations, there is no limit on the number of persons who may have their facial biometrics captured during any given deployment. It is SWP’s intention during each deployment to allow AFR Locate to process as many individuals as possible. It is clear that the numbers of persons processed are very large. Over the 50 deployments that were undertaken in 2017 and 2018, around 500,000 faces may have been scanned. The overwhelming majority of persons whose biometrics are captured and processed by SWP using AFR Locate are not suspected of any wrongdoing and are not otherwise of interest to the police.

People in the area where the cameras operate are notified via placards and handouts by police officers. Obviously, there's no way of notifying everyone in the area their face might be captured and run against a watchlist that is only limited by tech, rather than desire. (It currently maxes out at 2,000 individuals.)
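To put that scale in context, here's a rough base-rate sketch. The scan count comes from the court's own 2017-2018 figures quoted above; the per-face false-match rate is a purely hypothetical assumption for illustration, not a figure from the case:

```python
# Even a very accurate system produces a lot of false alerts at this scale.
faces_scanned = 500_000       # from the court's figures for 2017-2018
false_match_rate = 0.001      # hypothetical: 1 wrong match per 1,000 faces

expected_false_alerts = faces_scanned * false_match_rate
print(f"Expected false alerts: {expected_false_alerts:.0f}")  # prints "Expected false alerts: 500"
```

Hundreds of innocent people flagged even at 99.9% per-face accuracy, which is the base-rate problem lurking behind any dragnet-scale deployment.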

Is this all lawful? Well, some of it is, but most of it isn't. The court appears to believe the deployment is outpacing what the law actually allows: there's not much on the books that specifically authorizes this kind of deployment and use.

The fundamental deficiencies, as we see it, in the legal framework currently in place relate to two areas of concern. The first is what was called the “who question” at the hearing before us. The second is the “where question”. In relation to both of those questions too much discretion is currently left to individual police officers. It is not clear who can be placed on the watchlist nor is it clear that there are any criteria for determining where AFR can be deployed.

The court says it's troubling there's no specific internal guidance either, which allows officers to determine where cameras are placed. They rarely seem to be placed in areas where suspects are most likely to be present. Most often, they appear to be deployed where officers feel they might be able to capture the largest number of faces.

The court also says the system violates UK privacy rights as defined in Article 8. It appears the South Wales Police made no effort to mitigate this rights violation. Its data protection impact assessment made only the slightest reference to these rights before deciding it was probably okay because the ends of catching criminals justify pretty much any means the SWP chooses to deploy.

First, the DPIA contained no assessment of the impact of the deployment of AFR on the protection of the personal data of members of the public who might be affected by the measures. Secondly, it contained no assessment of the risks to their rights and freedoms, so that, for example, there was little or no engagement with the fact that SWP’s use of AFR involved the collection of data on a blanket and indiscriminate basis, and it placed too little weight on the interference posed by the initial collection itself and the plainly ambitious scale of the collection, particularly bearing in mind that it involved the sensitive processing of biometric data. Thirdly, the assessment did not adequately address the risk that a false positive would result in innocent members of the public having their biometric data retained for longer periods and place them at risk of being subjected to more intrusive interventions by police. Fourthly, the assessment of the right to privacy under Article 8 of the Convention, privacy risks, and possible mitigation of those risks, was negligible at best, being more concerned with the technical operation of AFR.

The court then points out the system used by the SWP may be prone to bias. Or it may not be. There's no way to know because the company that made the software refused to discuss it in court.

The fact remains, however, that SWP have never sought to satisfy themselves, either directly or by way of independent verification, that the software program in this case does not have an unacceptable bias on grounds of race or sex. There is evidence, in particular from Dr Jain, that programs for AFR can sometimes have such a bias. Dr Jain cannot comment on this particular software but that is because, for reasons of commercial confidentiality, the manufacturer is not prepared to divulge the details so that it could be tested. That may be understandable but, in our view, it does not enable a public authority to discharge its own, non-delegable, duty under section 149.

Since there is the potential for bias to seep into the software and result in unjustified targeting of races or sexes, the program may violate equality protections set down in the Public Sector Equality Duty (PSED), which requires government agencies to eliminate discriminatory processes and procedures. Since the company has not been forthcoming -- and the SWP doesn't appear to have anything in place to address possible bias -- the AFR program violates the law.

This is a big ruling. London is infested with CCTV cameras and the government has warmly embraced domestic surveillance. If this particular iteration of facial recognition is unlawful, there's a good chance the programs deployed elsewhere are as well. As long as facial recognition systems continue to rack up more false positives than legal wins, innocent UK citizens are at risk.


Filed Under: cctv, ed bridges, facial recognition, neoface watch, privacy, south wales, uk


Reader Comments



  • Annonymouse, 20 Aug 2020 @ 4:51am

    So?

    It took going up the ladder before one judge pointed out the obvious.
    This raises so many questions.
    Were the other courts biased, incompetent, corrupt, or all three?
    What are the punishments for breaking either law, if any?
    Will the police administrators and officers be held personally accountable for knowingly breaking the law?
    Will the company be punished in any way for aiding and abetting criminal acts?


  • Anonymous Coward, 20 Aug 2020 @ 6:52am

    Reading this, I'm wondering if the tech is really so bad that it can only find one face per frame. I'm pretty sure algorithms exist that can find more than one face per frame of video in less than 1/50th of a second.

    Not only should the tech not be used, but they bought the cheap stuff!


  • Tanner Andrews (profile), 20 Aug 2020 @ 7:30am

    What are the punishments for breaking either law if any?

    None. This appears to be a declaratory judgment action, and the court gave a declaration saying that the South Wales deployment of automatic facial recognition did not comply with surveillance law or with anti-discrimination law. This is a very rough summary; the ultimate results are at the end of the opinion, before the appendix.

    Will the police administrators and officers be held personally accountable for knowingly breaking the law?

    Pull the other one, it's got bells on.

    In this case, however, that may not be as unfair as it seems. It is arguable, at least from the lower court views, that the question of legality was not well settled.


  • Whoever, 20 Aug 2020 @ 12:14pm

    Out of the EU, now those rights can go away.

    Don't worry, without the protection of the EU, Cummings' government will soon start stripping away individual rights.


