News & Insight
Big Brother no more…?
Back in September 2019, HLaw took a deep dive into the High Court judgment that ruled the use of automatic facial recognition (“AFR”) by South Wales Police (“SWP”) to be lawful.
This case was the first involving the use of AFR technology to be heard in the British Courts.
In August 2020, the Court of Appeal (“CoA”) reversed the High Court’s decision and declared the use of AFR by the SWP to be unlawful on the basis that it breached fundamental human rights under the ECHR and violated the Data Protection Acts 1998 and 2018.
This is probably something of a setback for AFR technology companies. One of the key questions is – what does such a judgment really mean commercially for companies or organisations that are either developing or using AFR technologies? In this piece, we briefly explore some of the legal issues that arise, and touch on practical steps companies should be aware of for keeping on top of a developing compliance regime.
We also share insight from industry which suggests the judgment could perhaps also be seen as a catalyst for AFR technology companies to come together and work in collaboration with the Government to help shape the regulatory agenda.
The SWP began the testing and trialling of AFR software as part of its facial recognition pilot programme in mid-2017. The software processes live CCTV feeds and extracts facial biometric information in real time. The facial images are then compared to data held on a watchlist. The data processed is then immediately deleted if there are no matches with the watchlist data.
The claim was brought by Edward Bridges, a civil liberties campaigner, on the grounds that the way in which the software (and his personal data) were being used infringed his Article 8 ECHR rights and his data protection rights, and was in breach of the Equality Act 2010.
The complaints related to two occasions on which the software had been deployed by the SWP in Cardiff. On both occasions, the police had used a marked van equipped with cameras to trial the software.
The High Court held that the use of AFR technology by the SWP was lawful and proportionate. Although it found that such use infringed ECHR Article 8(1) (right to respect for private and family life, etc), it concluded that the infringement was justified under Article 8(2) in the interests of national security, public safety and the economic well-being of the country.
On appeal to the CoA, it was ruled that the use of AFR software by the SWP was unlawful. The case succeeded on the following grounds:
- The CoA found that the High Court had erred in concluding that the infringement of the claimant’s ECHR Article 8(1) right was justified under Article 8(2). The SWP’s use of AFR software left too broad a discretion, without apparent limits, as to who could be placed on a watchlist and where the technology could be deployed.
- The CoA also held that the High Court had been wrong to find that the SWP had complied with its public sector equality duty, as its equality impact assessment was inadequate. The CoA highlighted the SWP’s ongoing public duty to ensure that the processes it used were not discriminatory and that the software did not carry a racial or gender bias.
- The CoA also found that the SWP’s Data Protection Impact Assessment (“DPIA”) did not comply with the requirements of the DPA 2018, because the SWP was using AFR software to scan and capture facial images of individuals who were not on any police watchlist used for AFR purposes.
The ICO has welcomed the CoA’s judgment.
The ICO has previously expressed concerns regarding the SWP’s use of AFR technology, stating that the police should slow down and justify its use before the technology is rolled out in society. The ICO shares the view that the current legal framework is not sufficient for managing the risks that the technology presents and has committed to working with the Home Office, the Investigatory Powers Commissioner, the Biometrics Commissioner and other policing bodies to build upon a statutory code of practice for the use of AFR technology.
The judgment has highlighted the pressing need to further regulate, and to introduce legal frameworks and guidance governing, the use of AFR technology, both in the public sector and for commercial businesses. With the ICO pushing the agenda, it is an area to be watched as the regulatory developments unfold.
Businesses that use AFR software or similar technologies in their day-to-day operations today should ensure the highest practicable level of compliance with data protection laws.
DPIAs should be reviewed and, where required, updated to ensure that the processing of any biometric data complies with the relevant legislation. Businesses should also look to put in place an operations policy that documents how they are using such technology, both for internal reference and for any external audit.
Businesses should also consider whether their use of such biometric data could breach human rights or discrimination laws, and factor in such considerations when reviewing their DPIAs.
Thoughts from an industry pioneer
We spoke with Richard Hicks, co-founder and COO of C-Screens, the UK’s largest out-of-home (“OOH”) TV network.
C-Screens uses AFR technology to track out-of-home TV viewers who engage with the company’s screens located in busy, popular pedestrian areas and consumer entertainment environments. The screens capture audience data, including the approximate number of views, age and gender, all tracked in real time.
Critical for C-Screens is that all data collected are binary and anonymous. It is in this anonymous form that data is fed back to C-Screens’ client brands, who are interested in delivering compelling and engaging adverts that can reach “the right audience, with the right message, with relevant premium programming, in the right environment, at the best time”. Hicks’ business has also been working with the government to provide advertising in public spaces to assist in the government’s fight against COVID-19.
The C-Screens business has not been affected by the recent judgment. Hicks and his team have worked hard to ensure that any AFR technology they use is deployed in compliance with the law at all times. The frustration voiced is more with the lack of guidance the government has given on the use of AFR technology. Hicks accepts that more may be forthcoming from the ICO, but feels that it is the businesses using AFR technology that are leading the way, setting standards of best practice while doing their utmost to ensure lawful compliance.
Developments in AFR technology continue at pace, and Hicks’ concern is that the government has not caught up with these developments and lacks the expertise or knowledge to understand how businesses are using AFR capabilities commercially. Hicks would like to see the government and relevant regulatory bodies reach out to those who, like him and the team at C-Screens, are well versed in the commercial use of AFR technologies, and for government and industry to work together to develop a way forward for regulating its use.
For now, Hicks and others in the AFR technology business rely upon Outsmart, the marketing body for the OOH industry, to bridge the communication gap between the government and commercial businesses that use AFR technology. It is through Outsmart that AFR businesses are able to feed back issues they have with the government’s policies or lack of guidelines for the commercial use of AFR technology.
If your business is involved in the development or use of AFR technology and you are interested in understanding how to navigate the legal challenges that lie ahead, the team at Humphreys Law would be happy to help.
This piece was prepared by Victoria Clement and Justin Barrow with input from Husna Grimes.
All the thoughts and commentary that HLaw publishes on this website, including those set out above, are subject to the terms and conditions of use of this website. None of the above constitutes legal advice. Much of the above will no doubt fall out of date and conflict with future law and practice one day. None of the above should be relied upon. Always seek your own independent professional advice.