
Appeals court hands down the UK's first judgment on automated facial recognition

Published on 24th Aug 2020

The ruling that the use of automated facial recognition by South Wales Police was unlawful has broader significance for businesses and organisations looking to adopt the technology


The Court of Appeal has ruled in R (Bridges) v Chief Constable of South Wales Police & Ors that the use of automated facial recognition technology (AFR) by South Wales Police breaches privacy rights. This is an important case – the first on AFR in the UK – and clarifies what commercial and public organisations will need to consider when taking advantage of this controversial technology.

What was the case about?

AFR uses complex algorithms to reduce an image of a face to a set of measurements (a biometric), which can then be automatically compared with the facial biometrics of other people. The case concerns its use by South Wales Police, which deployed CCTV cameras to scan members of the public in public places such as town centres, automatically capturing their faces from the live footage. The technology compared the captured images with those of people on the police's watch list, which included convicted criminals, crime suspects and missing persons.
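As a rough, purely illustrative sketch of the comparison step described above, the Python snippet below scores a face that has been reduced to a vector of measurements against a small watch list of biometric vectors. The names, vectors, similarity measure and threshold are all invented for the example and are not drawn from the judgment or from the system South Wales Police actually used.

```python
# Toy illustration of AFR-style matching: a face "biometric" (a vector of
# measurements) is compared against watch-list biometrics. All values here
# are invented; a real system derives these vectors from a trained
# face-embedding model and uses carefully calibrated thresholds.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two biometric vectors (1.0 means identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical watch-list entries.
watch_list = {
    "person_A": np.array([0.11, 0.52, 0.33, 0.80]),
    "person_B": np.array([0.72, 0.10, 0.65, 0.21]),
}

def check_against_watch_list(probe: np.ndarray, threshold: float = 0.99):
    """Return (best_match, score) if the best score clears the threshold, else (None, score)."""
    scores = {name: cosine_similarity(probe, ref) for name, ref in watch_list.items()}
    best = max(scores, key=scores.get)
    return (best, scores[best]) if scores[best] >= threshold else (None, scores[best])

# A face captured from live footage, reduced to the same kind of vector.
probe = np.array([0.10, 0.50, 0.35, 0.79])
print(check_against_watch_list(probe))
```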

Liberty, the privacy and civil rights group, supported an action brought against South Wales Police by civil liberties campaigner Ed Bridges, who was present on two occasions when AFR was used, claiming that these particular deployments of AFR were unlawful on a variety of grounds.

Why this use of AFR was a privacy breach

Although some of the grounds of appeal failed, the Court of Appeal ruled that the use of AFR by South Wales Police was unlawful as it breached the right to privacy under Article 8 of the European Convention on Human Rights (ECHR). It was also in breach of the UK's Data Protection Acts of 1998 and 2018, and failed to comply with the police's obligations under equality legislation.

Under Article 8 of the ECHR, the use of AFR was an interference with Mr Bridges' right to a private life. The court held that this interference was unlawful because the UK's legal framework covering the use of AFR did not set out clear guidance as to who could be placed on the watch list or where the technology could be deployed, and left too much to the individual discretion of the police officers involved. However, the court rejected Mr Bridges' argument that the use of AFR was disproportionate.

The Data Protection Act 2018 required the police to conduct a Data Protection Impact Assessment when deploying technologies that are high risk from a privacy point of view, and it was accepted that this includes biometric technologies such as AFR. Given the finding on the ECHR, the Data Protection Act 2018 was inevitably breached, because the force's Data Protection Impact Assessment failed to take account of the fact that the ECHR might be infringed. It also failed to adequately address the processing of the biometric personal data of members of the public who were not on police watch lists.

The judgment also discusses how the force failed to properly investigate whether the algorithms used in the AFR software might lead to indirect discrimination through racial or gender bias. The court stressed that, while there was no evidence that the actual AFR system used showed this bias, the force should have taken steps to satisfy itself as to whether, in practice, there would be any bias.
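To illustrate the kind of check the court had in mind, the sketch below shows one very simple form it might take: comparing false-match rates across demographic groups on trial data. The groups, figures and metric are hypothetical and are not taken from the judgment or from South Wales Police's own testing; a genuine assessment would require representative datasets and far more rigorous statistical analysis.

```python
# Hypothetical bias check: compare how often the system wrongly flags people
# who are NOT on the watch list, broken down by demographic group.
# All records below are invented for illustration.
from collections import defaultdict

# Each record: (demographic_group, flagged_as_match, genuinely_on_watch_list)
trial_results = [
    ("group_1", False, False),
    ("group_1", True, False),   # false match
    ("group_1", False, False),
    ("group_2", True, False),   # false match
    ("group_2", True, True),    # true match
    ("group_2", True, False),   # false match
]

false_matches = defaultdict(int)
eligible = defaultdict(int)  # people not on the watch list, per group

for group, flagged, on_list in trial_results:
    if not on_list:  # only non-watch-list individuals can produce false matches
        eligible[group] += 1
        if flagged:
            false_matches[group] += 1

for group, total in eligible.items():
    print(f"{group}: false-match rate = {false_matches[group] / total:.0%}")
```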

The UK Information Commissioner has welcomed the ruling and stated that in order for "the public to have trust and confidence in the police and their actions there needs to be a clear legal framework. Today’s judgment is a useful step towards providing that". The Surveillance Camera Commissioner, Liberty and South Wales Police have also given their views on the judgment.

Wider relevance?

Does this have wider relevance beyond the use of AFR by the police? Although the Court of Appeal gave its opinion on the legal issues specific to law enforcement bodies, the case is also relevant to the use of AFR under data protection legislation more generally.

What now?

Increasing numbers of private organisations are reported to be looking at the deployment of AFR for uses ranging from accessing premises and unlocking mobile devices to targeting offers and adverts in stores. What do businesses need to consider if they wish to use this technology?


  • It is important for organisations to be aware that AFR requires extra careful handling under the General Data Protection Regulation (GDPR) because facial recognition data is biometric data, a form of special category data.

  • Organisations that deploy AFR ought to have in place a detailed operational policy document covering its use, as well as to have conducted and documented a rigorous and comprehensive data protection impact assessment.

  • The case made clear that organisations that adopt AFR need to be transparent about how they are using the technology and to recognise the importance of minimising the retention of biometric data.

  • Human rights considerations may need to be taken into account when balancing the rights and freedoms of individuals with the benefits that AFR could bring.

  • More widely, the case demonstrates that data ethics and the law are converging, and that AFR users need to consider the quality and quantity of the datasets on which algorithms are trained, and how they can demonstrate that the technology will not make discriminatory decisions.

We have a number of experts who can provide practical legal advice for organisations making use of biometrics, algorithms, machine learning and artificial intelligence. If you would like to discuss this further, please do get in touch with your usual Osborne Clarke contact or connect with one of our team.

 


* This article is current as of the date of its publication and does not necessarily reflect the present state of the law or relevant regulation.
