September 20, 2020

Volume X, Number 264

UK Court of Appeal Finds Automated Facial Recognition Technology Unlawful in Bridges v South Wales Police

On August 11, 2020, the Court of Appeal of England and Wales overturned the High Court’s dismissal of a challenge to South Wales Police’s use of Automated Facial Recognition technology (“AFR”), finding that its use was unlawful and violated human rights.

In September 2019, the UK’s High Court had dismissed the challenge to the use of AFR, determining that its use was necessary and proportionate to achieve South Wales Police’s statutory obligations. Mr. Bridges, the civil liberties campaigner who originally brought judicial review proceedings after South Wales Police launched a project involving the use of AFR (“AFR Locate”), appealed the High Court’s dismissal. Under AFR Locate, South Wales Police deployed AFR technology at certain events and in certain public locations where crime was considered likely to occur, capturing images of up to 50 faces per second. The police then matched the captured images against “watchlists” of wanted persons in police databases using biometric data analysis. Where no match was made with any of these watchlists, the images were immediately and automatically deleted.

Mr. Bridges challenged AFR Locate on the basis that it was unlawfully intrusive, including under Article 8 of the European Convention on Human Rights (“ECHR”) (right to respect for private and family life) and data protection law in the UK. His appeal was based on the following five grounds:

  1. The High Court had erred in its conclusion that South Wales Police’s use of AFR and interference with Mr. Bridges’ rights was in accordance with the law under Article 8(2) of the ECHR.

  2. The High Court had incorrectly concluded that the use of AFR and interference with Mr. Bridges’ rights was proportionate under Article 8(2) of the ECHR.

  3. The High Court was wrong to consider the Data Protection Impact Assessment (“DPIA”) carried out in relation to the processing sufficient for the purposes of Section 64 of the Data Protection Act 2018 (“DPA 2018”).

  4. The High Court should not have declined to reach a conclusion as to whether South Wales Police had an “appropriate policy document” in place for carrying out sensitive data processing in connection with AFR Locate, within the meaning of Section 42 of the DPA 2018.

  5. The High Court was wrong to hold that South Wales Police had complied with the Public Sector Equality Duty (“PSED”) under Section 149 of the Equality Act 2010, on the grounds that the Equality Impact Assessment carried out was “obviously inadequate” and failed to recognize the risk of indirect discrimination on the basis of sex or race.

The Court of Appeal granted the appeal on grounds 1, 3 and 5, but rejected grounds 2 and 4.

Ground 1

On the first ground, the Court of Appeal overturned the High Court’s determination, finding “fundamental deficiencies” in the legal framework around the use of AFR, specifically the policies that governed its use. The Court found that South Wales Police’s policies gave too much discretion to individual police officers to determine which individuals were placed on watchlists and where AFR Locate could be deployed. The Court commented that “the current policies do not sufficiently set out the terms on which discretionary powers can be exercised by the police and for that reason do not have the necessary quality of law.” The Court further described the discretion as “impermissibly wide”, for example because the deployment of the technology was not limited to areas in which it could be thought on reasonable grounds that individuals on a watchlist might be present. The Court implied that this should be a significant factor in determining where AFR Locate should be deployed, stating, “it will often, perhaps always, be the case that the location will be determined by whether the police have reason to believe that people on the watchlist are going to be at that location.”

Ground 2

Since the Court decided that AFR Locate’s use was not lawful, it was not strictly necessary for the Court to decide the second ground of appeal on proportionality. Nevertheless, the Court chose to address the question and rejected the ground. Mr. Bridges argued that the balancing test between the rights of the individual and the interests of the community, which forms part of the proportionality analysis, should consider not only the impact on Mr. Bridges, but also the impact on all other individuals whose biometric data was processed by the technology on the relevant occasions. The Court of Appeal disagreed, commenting that Mr. Bridges had only detailed the impact on himself, not the wider public, in his original complaint, and that the impact on each of the other relevant individuals was as negligible as the impact on Mr. Bridges and should not be considered cumulatively. The Court stated, “An impact that has very little weight cannot become weightier simply because other people were also affected. It is not a question of simple multiplication. The balancing exercise which the principle of proportionality requires is not a mathematical one; it is an exercise which calls for judgement.”

Ground 3

On the third ground of appeal relating to South Wales Police’s failure to carry out a sufficient DPIA, Mr. Bridges argued that the DPIA was defective in three specific ways. First, it failed to recognize that the personal data of individuals not present on a watchlist (whose data was therefore immediately and automatically deleted) was nonetheless “processed” within the meaning of data protection law. Second, the DPIA also did not acknowledge that the rights of individuals under Article 8 of the ECHR were engaged by the processing, and third, it was silent as to other risks that may have been raised by AFR Locate’s use, such as the right to freedom of expression or freedom of assembly.

The UK Information Commissioner’s Office (“ICO”), an intervener in the case, also criticized the DPIA undertaken by South Wales Police on the basis that it did not contain an assessment of “privacy, personal data and safeguards,” failed to acknowledge that AFR involves the collection of personal data on a “blanket and indiscriminate basis” and that the risk of false-positive results may in fact result in longer retention periods rather than data being immediately deleted. In addition, the DPIA failed to address potential gender and racial bias that could arise from AFR Locate’s use. As such, the ICO stated that the DPIA failed to appropriately assess the risks and mitigation of them as required under Section 64 of the DPA 2018.

The Court of Appeal did not accept all of these arguments. For example, it highlighted that the DPIA had specifically referred to the relevance of Article 8 of the ECHR. However, based on its conclusion that the deployment of the technology was not lawful, the Court found that South Wales Police was wrong to conclude in its DPIA that Article 8 of the ECHR was not infringed. The Court of Appeal stated, “The inevitable consequence of those deficiencies is that, notwithstanding the attempt of the DPIA to grapple with the Article 8 issues, the DPIA failed properly to assess the risks to the rights and freedoms of data subjects and failed to address the measures envisaged to address the risks arising from the deficiencies we have found, as required by section 64(3)(b) and (c) of the DPA 2018.”

Ground 4

With regards to the requirement to have an “appropriate policy document” in place under Section 42 of the DPA 2018, Mr. Bridges argued that the assessment of the document’s sufficiency should not have been referred back to South Wales Police for consideration in light of guidance from the ICO, but instead, the High Court should have found it to be insufficient. The Court of Appeal rejected this argument on the basis that, at the time of AFR Locate’s deployment, the DPA 2018 was not yet in force, and therefore, there could not have been a failure to comply with the law. In relation to AFR Locate’s future use and the requirement for an appropriate policy document, the Court of Appeal commented that, “[A] section 42 document is an evolving document, which, in accordance with section 42(3), must be kept under review and updated from time to time.” Since ICO guidance had not been issued on the drafting of this type of document at the time of the High Court hearing, and given that South Wales Police had updated the document in light of the ICO’s subsequently published guidance, the Court of Appeal found that the High Court’s approach in this respect had been appropriate. It also referred to the fact that the ICO had repeatedly expressed the view that the original version of the document met Section 42 requirements, though it would ideally contain more detail.

Ground 5

On the final ground of appeal concerning the PSED under Section 149 of the Equality Act 2010, the Court found that South Wales Police had not gathered sufficient evidence to establish whether or not AFR Locate was inherently biased prior to its use, for two reasons: (1) because the data of individuals whose images did not match those on the watchlists were automatically deleted (and therefore could not be analyzed for the purpose of assessing bias), and (2) because South Wales Police was not aware of the dataset on which AFR Locate had been trained and could not establish whether there had been a demographic imbalance in the relevant training data. Although it was not alleged that AFR Locate produced biased results, the Court determined that South Wales Police “never sought to satisfy themselves, either directly or by way of independent verification, that the software program in this case does not have an unacceptable bias on grounds of race or sex.” The Court added, “We would hope that, as AFR is a novel and controversial technology, all police forces that intend to use it in the future would wish to satisfy themselves that everything reasonable which could be done had been done in order to make sure that the software used does not have a racial or gender bias.”

South Wales Police has stated that it will not appeal the decision. The Court of Appeal’s full judgment may be viewed here.

Copyright © 2020, Hunton Andrews Kurth LLP. All Rights Reserved. National Law Review, Volume X, Number 225
