The use of facial recognition technology
October 09, 2019
R -v- The Chief Constable of South Wales Police and others [2019] EWHC 2341 (Admin), Case No: CO/4085/2018
The use by South Wales Police of AFR Locate, an application which utilises facial recognition technology, has been found to be lawful in a ruling delivered in the High Court.
AFR Locate takes digital images of members of the public from live CCTV feeds. Those images are then processed to generate facial biometric information, which is compared against the biometric information of individuals on police ‘watch-lists’.
Judicial review proceedings were brought alleging that, by using AFR Locate, South Wales Police were in contravention of a wide range of legislation, including the Human Rights Act 1998, the Data Protection Act 1998 and its successor, the Data Protection Act 2018, and the Equality Act 2010.
The Court ultimately decided to refuse the judicial review application on all grounds, holding that the current legal framework is sufficient to safeguard against arbitrary and inappropriate use of AFR Locate.
Human Rights claims
The Court concluded that, although the use of AFR Locate did engage the Article 8 ECHR right to respect for private life of the citizens whose images were taken and processed, the obtaining and processing of the images was subject to significant legal control and was legally justified. In particular, the Court held that AFR Locate:
- Was subject to a sufficient framework of legal controls, comprising primary legislation, statutory codes of practice, data protection legislation and the policies of the South Wales Police.
- Was used “for a legitimate aim … to justify interfering with the Claimant’s rights under Article 8”.
- “Struck a fair balance and was proportionate”.
- Was used “in an open and transparent way, with significant public engagement”.
- Was used for a “specific and limited purpose”.
- Was used in such a way that, if an image of a member of the public was captured and did not match a person on a watch-list, all data and personal data corresponding to that image would be “immediately and automatically deleted”.
Data Protection claims
The Court also concluded that, although the personal data of members of the public was collected through the use and processing of the images, this processing of personal data was lawful and met the conditions set out in the Data Protection Act 2018. In particular, the Court agreed that the processing was strictly necessary for a law enforcement purpose, namely the police’s “common law duty to prevent and detect crime”.
The Court additionally confirmed that, in deciding to implement and use AFR Locate, the South Wales Police had complied with the public sector equality duty contained in the Equality Act 2010. The Court found that, when the use of AFR Locate began, there was “no specific reason” for believing that the accuracy of the software’s results was affected by factors such as gender or race.
What impact could this have?
This outcome could have implications for the data protection environment. The use of intrusive technology such as AFR Locate has been widely debated amongst privacy professionals. On its facts, the judgment sanctions the use of facial recognition only by law enforcement, and only where adequate controls are in place.
Privacy advocates will welcome the Court’s observation that, because the facial recognition technology in question involves the processing of sensitive personal data of members of the public, it must comply with the Data Protection Act 2018. Accordingly, the case illustrates that obtaining and processing data in this way is subject to significant legal and technical controls. As the use of facial recognition technology (and other biometric technology) becomes more widespread, it is likely to prove difficult to reconcile the current legal framework with these new technologies, especially in the private sector.