
Privacy October 18, 2019
Rise of the Big Brother state? High Court opines on police use of automatic facial recognition


On 4 September 2019, the High Court sitting in Cardiff handed down the world’s first judgment on a challenge to police use of automatic facial recognition (“AFR”) technology.  Although the Court found that the use of AFR interfered with the claimant’s rights under ECHR Article 8(1) and that South Wales Police had in fact processed the claimant’s personal data, it ruled South Wales Police’s use of AFR lawful, concluding that “the current legal regime is adequate to ensure the appropriate and non-arbitrary use of AFR Locate”.

This decision comes amidst growing concern about the increasing use of AFR technology. For example, the public backlash against the developer of a King’s Cross site, which revealed it had been using AFR technology between May 2016 and March 2018 with data supplied by the Metropolitan Police and British Transport Police, propelled the issue of AFR technology into the spotlight and raised questions about whether a sufficient legal framework is in place to regulate such technology.  Further, in a recent survey of public opinion on the use of AFR technology by the Ada Lovelace Institute, 46% of the 4,109 adults surveyed wanted the right to opt out of its use and 55% wanted the government to impose restrictions on the use of such technology.

Background

The judicial review concerned the trial of the “AFR Locate” software by South Wales Police (“SWP”).

Since mid-2017, SWP has been the national lead on the testing and trialling of the AFR software. The software processes live CCTV feeds and extracts facial biometric information in real time. The faces are then compared to data held on a watchlist. The data is immediately deleted if there is no match with the watchlist data.

The claim was brought by Edward Bridges, a civil liberties campaigner, on the grounds that the software infringed his Article 8 ECHR rights, infringed his data protection rights and was in breach of the Equality Act 2010.  The judicial review challenged the lawfulness of SWP’s use of the software. The complaints related to two occasions on which the software had been deployed by SWP in Cardiff.  On both occasions, the police had used a marked van equipped with cameras to trial the software.  Mr Bridges claimed that he had been caught on camera on both occasions.

ECHR Article 8 Claim

The claimant argued that the use of such technology violated his rights under ECHR Article 8(1) – the right to respect for one’s private and family life – and that the interference arising from SWP’s use of this software was neither “in accordance with the law” nor “proportionate” as required by Article 8(2).

The Court held that the use of AFR Locate did interfere with Article 8(1) rights given the collection of personal biometric data, but that such interference was justified under the powers granted to public authorities, being both in accordance with the law and necessary. The Court also determined that the legal framework governing the use of AFR technology was clear and sufficient.

The Court’s key findings in relation to the data protection claims were as follows:

Data had been processed fairly and lawfully by SWP: The claimant contended that SWP had not processed the personal data collected fairly and in a lawful manner. SWP argued that it had not processed Mr Bridges’ personal data at all, as his photos were deleted immediately once they failed to match any in the watchlist database. The Court disagreed with SWP, holding that it had processed his personal data because the information recorded by AFR Locate individuated him from all others. The Court nevertheless rejected the ground that the processing of such personal data had not been carried out in a fair and lawful manner, for the reasons it had given in the context of the Article 8 claim: the personal data had been processed fairly and lawfully because such processing was necessary for SWP’s common law obligation to prevent and detect crime.

Data processed by SWP considered “sensitive processing”: The claimant contended that SWP’s use of AFR Locate entailed “sensitive processing” as described under s35(8) DPA 2018. In response, SWP claimed that not all processing of the biometric data amounted to “sensitive processing”, because the purpose of AFR Locate was to identify only those individuals whose biometric data matched data held on the watchlist; on that view, “sensitive processing” of biometric information occurred only when an individual’s data captured by AFR Locate matched data on the watchlist. The Court held that the capture of all data by the AFR Locate software constituted collection of biometric data and therefore all of it amounted to “sensitive processing”. The Court noted that although data captured by the system which did not match the watchlist database was deleted immediately, the processing was still “sensitive processing”, because for the software to work it was necessary to uniquely identify each person captured before any comparison against the watchlist database could be made.

“Sensitive processing” of data had been strictly necessary for law enforcement purposes: Mr Bridges argued that SWP’s “sensitive processing” of the biometric data did not meet the requirements of s35(5) DPA 2018 as it was not “strictly necessary” for law enforcement purposes. The Court rejected this claim on the basis that the use of AFR Locate was strictly necessary for the law enforcement purposes of ensuring public safety and detecting crime, again referring to the grounds on which it had decided the Article 8 claim.

Policy document used by SWP found to be generic: Mr Bridges argued that there was no appropriate policy document in place meeting the requirements of s42 DPA 2018. The Court noted that the policy document used by SWP was generic, and that it would be desirable for the Information Commissioner to issue specific guidance, but declined to interfere at the present juncture.

SWP had relevant safeguards in place for processing of data: Under s64 DPA 2018, a data controller must conduct a proper impact assessment before processing data that may result in a high risk to the rights and freedoms of individuals. Mr Bridges claimed this impact assessment had not been done properly. The Court rejected this claim on the basis that SWP had identified all the relevant safeguards applicable for the period during which the data was to be retained and processed.

Comment

Although this judgment concerns the use of AFR technology by a public authority for law enforcement purposes, the findings are relevant to its wider use. Businesses considering the use of AFR technology should look closely at what constitutes “sensitive processing”, at the use of DPIAs, policy documents and other safeguards needed to meet legal requirements, and at how the data may be processed fairly and lawfully in accordance with data protection principles.

Businesses can hopefully expect more guidance on this area in future. The UK’s Information Commissioner’s Office (“ICO”) has expressed concern over the increasing prevalence of AFR technology, stating “facial recognition technology is a priority area for the ICO and when necessary, we will not hesitate to use our investigative and enforcement powers to protect people’s legal rights.”

The ICO is also currently working with a small group of participants in the ICO Sandbox, including Heathrow Airport’s Automation of the Passenger Journey programme, which aims to use AFR technology to streamline the passenger journey. By supporting this programme as it develops and ensuring that appropriate protections and safeguards are in place, the ICO should be well placed to draw on this experience when developing any future guidance on the use of AFR technology.

If you have questions about AFR or privacy generally, do feel free to reach out to the authors of this piece, Husna Grimes and Victoria Clement.

All the thoughts and commentary that HLaw publishes on this website, including those set out above, are subject to the terms and conditions of use of this website.  None of the above constitutes legal advice.  None of the above should be relied upon.  Always seek your own independent professional advice.

Humphreys Law
