Police use of facial recognition

The Court of Appeal recently handed down its decision in R (Bridges) v Chief Constable of South Wales Police [2020] EWCA Civ 1058, the first case of its kind in the world dealing with law enforcement use of live facial recognition.

Live automated facial recognition (AFR) is a technology applied to camera footage in real time. It captures the biometric features of a face and seeks to confirm the identity of an individual by matching the captured image against a base set of images (a watchlist).
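By way of illustration only, the matching step can be thought of as comparing a numerical template (an "embedding") extracted from a face in the live feed against the template of each person on the watchlist, and reporting a possible match where the similarity exceeds a threshold. The short sketch below is a simplified, hypothetical example: the embedding function, the threshold and the watchlist structure are assumptions made for the purpose of explanation, not a description of the system used by the SWP or any particular vendor.

    # Hypothetical illustration of the matching step in live facial recognition.
    # The embedding model is a placeholder; real systems use proprietary models
    # and operational thresholds. Nothing here reflects any actual police system.
    import numpy as np

    def embed_face(face_image) -> np.ndarray:
        """Placeholder for a model that turns a detected face into a
        fixed-length numeric template (embedding)."""
        raise NotImplementedError("substitute a real face-embedding model")

    def best_watchlist_match(live_template, watchlist, threshold=0.6):
        """Compare a live face template against every watchlist template using
        cosine similarity; return the best match above the threshold, if any."""
        best_name, best_score = None, threshold
        for name, reference in watchlist.items():
            score = float(np.dot(live_template, reference) /
                          (np.linalg.norm(live_template) * np.linalg.norm(reference)))
            if score > best_score:
                best_name, best_score = name, score
        return best_name, best_score   # best_name is None if nothing exceeds the threshold

In a deployed system, any reported match would then be reviewed by a human operator before action is taken (the "human failsafe" discussed below).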

The use of AFR by law enforcement globally is increasingly common and has attracted controversy, primarily because of the technology's unprecedented intrusiveness, the lack of regulatory controls and the possible inherent bias in its results.

The Court of Appeal's decision went to the heart of that debate: the main takeaway from the judgment was the lack of regulatory control over police use.

R (Bridges) at first instance

In 2017 and 2018, the South Wales Police (SWP) trialled the overt use of AFR in a number of public places. Mr Bridges, who was captured by the system on two particular occasions, sought judicial review of the SWP's use of the system, citing breaches of the European Convention on Human Rights (ECHR) and of the UK's privacy framework governing law enforcement.

The decision at first instance, heard by the Divisional Court of the Queen’s Bench Division, dismissed the claim for judicial review.

For more information about the judgment at first instance, see my article on Facial recognition in public spaces in the September 2019 issue of the Newsletter.

Essentially, the Court of Appeal found that the legal framework governing police use of AFR, comprising the Data Protection Act 2018 (DPA 2018), the Surveillance Camera Code of Practice (the Code) and the SWP's own policies, was insufficient and did not have the requisite quality of law to justify interference with privacy rights under the ECHR.

In the Court of Appeal

Mr Bridges appealed against aspects of the Divisional Court's findings, including its findings that:

  • The use of AFR was "in accordance with the law" for the purposes of Article 8(2) of the ECHR and s 6 of the Human Rights Act 1998 (Ground 1);

  • The use of AFR by the SWP struck a fair balance between the rights of the individual and the interests of the community and was proportionate (Ground 2);

  • The Data Protection Impact Assessment (DPIA) undertaken by SWP complied with s 64 of the DPA 2018 (Ground 3); and

  • The SWP had complied with the public sector equality duty (PSED) required by the Equality Act 2010 (Ground 5).

Mr Bridges also alleged that the Divisional Court erred in not reaching a conclusion as to whether the SWP had in place an appropriate policy document, as required by s 35(5) of the DPA 2018 and compliant with s 42 of the DPA 2018 (Ground 4).

USE NOT IN ACCORDANCE WITH THE LAW (GROUND 1)

The most important outcome of the appeal concerned Ground 1. The Court concluded that the cumulative effect of the DPA 2018, the Code and the SWP's own policies was insufficient to render the use of AFR "in accordance with the law".

The Court recited the relevant principles, set out in Re Gallagher [2019] 2 WLR 509 and R (Catt) v Association of Chief Police Officers [2015] UKSC 9, for an interference with privacy to be justified under Article 8(2). At paragraph 55 of the judgment the Court distilled these into the requirements that the use must:

  • have some basis in domestic law;

  • be compatible with the rule of law – this requires that the law must be accessible to the person concerned (ie be comprehensible);

  • be foreseeable in that it must be possible for a person to foresee its consequences and should not “confer a discretion so broad that its scope is in practice dependent on the will of those who apply it, rather than on the law itself”;

  • afford adequate legal protection against arbitrariness; and

  • where a discretion is conferred, the regime need not be over-rigid but must provide "safeguards in order to guard against overbroad discretion resulting in arbitrary and disproportionate interference with Convention rights".

In respect of this last point, the appellant had argued that in order for the interference to be in accordance with law there must be safeguards which have the effect of enabling the proportionality of the interference to be adequately examined.

The Court declined to consider hypothetical scenarios, stating that what must be examined is the particular interference which has arisen in the present case and, in particular, whether that interference is in accordance with the law (para 60). The Court said it was not concerned with the possible future use of AFR on a national basis.

The Court found that the legal framework did contain safeguards which enabled proportionality to be examined, but concluded, on further analysis, that the framework was insufficient (para 90).

The core complaint of the Court was that the framework did not address the “who” question and the “where” question.

The “who” question related to the decision as to who would be included in the police watch list. The “where” question related to the decision as to the location of the AFR deployment.

The Court determined that "the current policies" did not sufficiently set out the terms on which the discretionary power in respect of such questions could be exercised.

The Court commented that it was never suggested that the DPA 2018 alone could satisfy the requirement and so turned to the Code and the SWP policies.

The Court took the view that, although the Code did specifically deal with facial recognition, its failure to mandate what local policies should contain as to who can be put on a watch list and where AFR should be deployed amounted to the "two critical defects in the current legal framework" (para 121).

Interestingly, whilst the Court noted that it was not a matter for it to determine whether the Code or a local policy should contain such guidance, it observed that it might be prudent for the Code to be amended to ensure consistency in the content of local policies (para 118).

In respect of the SWP policy, the Court held that "too broad a discretion" was given and that the "documents leave the question of the location simply to the discretion of individual police officers, even if they would have to be of a certain rank" (paras 124 and 130).

GROUNDS 2 TO 5

Given that the Court had already held that the use of AFR was not lawful, it was not required to consider the remaining grounds of appeal, but chose to address them.

The Court dismissed Ground 2, finding that the Divisional Court did not err when:

  • it took into account not only the actual results of an AFR operation but also the anticipated benefits; and

  • it considered that the impact on the appellant and others affected by AFR was minimal, and that the number of those impacted did not increase the severity of the impact.

The Court upheld Ground 3 simply because the DPIA proceeded on the basis that Article 8 of the ECHR was not infringed.

The Court rejected Ground 4 given the uncertainty surrounding what a policy under s 42 of the DPA 2018 was required to contain. At the time of the first instance decision, the ICO had not published any guidance on what was required but had taken the view, as intervener, that the SWP policy was not deficient. For those reasons, it was appropriate for the Divisional Court not to rule on this point.

The Court upheld Ground 5, holding that the SWP had not complied with the PSED. The PSED required the SWP to have due regard to the need to eliminate discrimination. It required that "reasonable steps be taken to make enquiries about the potential impact of a proposed decision or policy on people with relevant characteristics" (para 181), the whole purpose of the PSED being to ensure that a public authority does not inadvertently overlook information which it should take into account.

The fact that there was a "human failsafe" component (ie a positive match made by AFR never by itself leads to intervention without confirmation by a human operator) was not sufficient: it did not address the process that should have been followed, namely making enquiries about the possible bias in the software used (para 185).

The Court helpfully summarised the essence of the PSED in this case: the SWP should have sought to satisfy themselves, directly or by way of independent verification, that the software program did not have an unacceptable bias on grounds of race or sex (para 199).

As a general comment, the Court counselled that it "hoped that as AFR is a novel and controversial technology, all police forces that intend to use it in the future would satisfy themselves that everything reasonable which could be done had been done in order to make sure that the software used does not have a racial or gender bias."
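To make the point concrete, the kind of verification the Court had in mind could, for example, involve measuring whether the software's false-match rate differs materially between demographic groups on a labelled test set. The sketch below is purely illustrative: the records, group labels and any notion of an "acceptable" disparity are assumptions made for explanation, not anything drawn from the judgment or from the SWP's software.

    # Purely illustrative: measuring whether false-match rates differ by
    # demographic group on a labelled evaluation set. The records and group
    # labels are invented for the example.
    from collections import defaultdict

    # Each record: (demographic_group, system_reported_match, true_match)
    evaluation_results = [
        ("group_a", True, False),    # a false match
        ("group_a", False, False),
        ("group_b", True, True),
        ("group_b", True, False),    # a false match
        # ... in practice, many thousands of labelled trials per group
    ]

    false_matches = defaultdict(int)
    non_mated_trials = defaultdict(int)

    for group, reported, true_match in evaluation_results:
        if not true_match:                  # only count trials where no true match exists
            non_mated_trials[group] += 1
            if reported:
                false_matches[group] += 1

    for group, trials in non_mated_trials.items():
        rate = false_matches[group] / trials
        print(f"{group}: false-match rate = {rate:.3f}")

A material disparity between groups is the sort of information the Court said the force should have sought out, directly or through independent verification, before deployment.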

Impact of decision

The fallout from the decision has not yet fully settled for the various stakeholders, although both the ICO and the Surveillance Camera Commissioner have issued statements supportive of the decision.

The Surveillance Camera Commissioner, in a statement issued on 11 August 2020, committed to "consider how I can amend my guidance to ensure police forces are aware of the potential bias in systems and also consider what more can be done with manufacturers of the technology to eliminate it", and called on the Home Office to update the Code in light of the decision.

Given that the Court has left it open for police to develop their own policies to ensure that their use of AFR is “in accordance with law”, we may now see local police swiftly amending their policies to address the “who” question and the “where” question.

Certainly, those police forces proactively trialling live facial recognition, such as the Metropolitan Police, will need to consider postponing their programmes to take legal advice.

On its website, the Met has stated that it will carefully consider the judgment and act on any relevant points.

Wider context

This decision comes as police use of live facial recognition is increasingly being scrutinised. The Scottish Parliament has floated the idea of suspending use by its police force until further regulations are adopted, with some critics calling for an all-out ban.

A total ban on use by police and public authorities is not unprecedented; it has already occurred in some US cities such as San Francisco and recently Portland.

The concern appears to remain the same; without clear and specific laws governing the way police are permitted to employ the technology, the risk of misuse is heightened.

A wider concern is that law enforcement appears to be outsourcing its AFR programs to the private sector. According to the New York Times, more than 600 law enforcement agencies in the US have engaged the private data firm Clearview AI to deploy its facial recognition system on their behalf. Clearview AI evidently uses its own database of millions of images "scraped" from social media platforms such as Facebook, Twitter and YouTube.

In July 2020 the ICO announced it had launched a joint investigation into Clearview AI’s practices with the Australian Privacy Commissioner.

Similarly, in June 2020, the European Data Protection Board (EDPB) shared its concerns about Clearview AI in its plenary session. With echoes of the Bridges Court of Appeal decision, the EDPB said it has "doubts as to whether any Member State law provides a legal basis such that use by EU law enforcement could be justified under Article 8."

As the Court of Appeal in Bridges noted, a relativist approach may be necessary to guard against the risks inherent in police use of such a novel and intrusive technology: "the more intrusive the act complained of, the more precise and specific must be the law said to justify it" (para 82).

As AFR technology becomes more widely used and available, the law will need to keep pace to ensure that the identified risks of bias and misuse are addressed.

This article, written by Chrysilla de Vere, Partner at Clarkslegal and adviser at forburyTECH, was originally published in the Internet Newsletter for Lawyers, September 2020, at infolaw.
