The Court of Appeal has ruled in R (Bridges) v Chief Constable of South Wales Police & Ors that the use of automated facial recognition technology (AFR) by South Wales Police breached privacy rights. This is an important case, the first on AFR in the UK, and it clarifies what commercial and public organisations will need to consider when taking advantage of this controversial technology.
What was the case about?
AFR uses complex algorithms to reduce an image of a face to a set of measurements (a biometric), which can then be automatically compared with the facial biometrics of other people. The case concerned its use by South Wales Police to scan members of the public in public places such as town centres, using CCTV cameras to register faces automatically in the live footage. The technology then compared those images with the facial biometrics of people on the police's watch list, which included convicted criminals, crime suspects and missing persons.
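To make the matching step concrete, the sketch below shows, in simplified form, how a biometric template can be compared against a watch list. It is purely illustrative: real AFR systems derive templates from trained neural networks, and the dimensionality, threshold and names used here are assumptions, not details of the system South Wales Police deployed or anything taken from the judgment.

```python
import numpy as np

# Illustrative stand-in for an AFR matching step. Real systems derive the
# biometric template from a trained neural network; here random vectors
# play that role, and the parameters below are hypothetical.
EMBEDDING_SIZE = 128   # hypothetical template dimensionality
MATCH_THRESHOLD = 0.6  # hypothetical similarity cut-off

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Score how alike two biometric templates are (1.0 = identical direction)."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def screen_face(live_template, watch_list):
    """Return the watch-list identity whose template best matches the live
    capture, or None if no score clears the threshold."""
    best_id, best_score = None, MATCH_THRESHOLD
    for identity, template in watch_list.items():
        score = cosine_similarity(live_template, template)
        if score > best_score:
            best_id, best_score = identity, score
    return best_id

# Demo with synthetic data standing in for camera captures.
rng = np.random.default_rng(0)
watch_list = {f"subject-{i}": rng.normal(size=EMBEDDING_SIZE) for i in range(3)}
passer_by = rng.normal(size=EMBEDDING_SIZE)  # member of the public, not listed
listed = watch_list["subject-1"] + 0.1 * rng.normal(size=EMBEDDING_SIZE)

print(screen_face(passer_by, watch_list))  # expected: None (no match)
print(screen_face(listed, watch_list))     # expected: subject-1
```

Note that even when no match is found, a biometric template of the passer-by has still been generated and processed, which is central to why Article 8 and data protection law are engaged.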
Liberty, the privacy and civil rights group, supported an action brought against South Wales Police by civil liberties campaigner Ed Bridges, who was present on two occasions when AFR was deployed. He claimed that these particular deployments of AFR were unlawful on a variety of grounds.
Why this use of AFR was a privacy breach
Although some of the grounds of appeal failed, the Court of Appeal ruled that South Wales Police's use of AFR was unlawful because it breached the right to privacy under Article 8 of the European Convention on Human Rights (ECHR). It was also in breach of the Data Protection Act 2018 and failed to comply with the police's public sector equality duty under the Equality Act 2010.
Under Article 8 of the ECHR, the use of AFR was an interference with Mr Bridges' right to respect for his private life. The court held that this interference was not in accordance with the law: the legal framework governing the use of AFR did not set out clear guidance on who could be placed on the watch list or where the technology could be deployed, and it left too much to the individual discretion of the police officers involved. However, the court rejected Mr Bridges' argument that the use of AFR was disproportionate.
The Data Protection Act 2018 required the police to conduct a Data Protection Impact Assessment (DPIA) before deploying technologies that are high risk from a privacy point of view, and it was accepted that this includes biometric technologies such as AFR. Given the finding on the ECHR, the Data Protection Act 2018 was inevitably breached: the force's DPIA failed to take account of the fact that the ECHR might be infringed, and it did not adequately address the processing of the biometric personal data of members of the public who were not on the watch list.
The judgment also discusses how the force failed properly to investigate whether the algorithms used in the AFR software might lead to indirect discrimination through racial or gender bias. The court stressed that, while there was no evidence that the particular AFR system used showed such bias, the force should have taken steps to satisfy itself as to whether, in practice, there would be any bias.
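One way a deploying organisation might "take steps to satisfy itself" is to measure whether the system's error rates differ across demographic groups before go-live. The sketch below illustrates the idea on purely synthetic data: the groups, parameters and the `spread` mechanism are assumptions for demonstration, not a description of any audit the judgment required or of the actual software's behaviour.

```python
import numpy as np

rng = np.random.default_rng(1)
DIM, THRESHOLD = 128, 0.6  # same hypothetical parameters as the earlier sketch

def cosine_similarity(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def false_match_rate(trials: int, spread: float) -> float:
    """Rate at which two *different* synthetic faces clear the match
    threshold. `spread` mimics how distinctly a hypothetical model embeds
    faces from a given group: a smaller spread crowds templates together,
    so unrelated faces are wrongly matched more often."""
    group_centre = rng.normal(size=DIM)  # shared group-level component
    hits = 0
    for _ in range(trials):
        a = group_centre + spread * rng.normal(size=DIM)
        b = group_centre + spread * rng.normal(size=DIM)
        if cosine_similarity(a, b) > THRESHOLD:
            hits += 1
    return hits / trials

# A disparity like the one below (synthetic here) is the kind of finding a
# pre-deployment audit should surface and investigate.
for group, spread in [("group A", 1.0), ("group B", 0.5)]:
    print(group, false_match_rate(2_000, spread))
```

A higher false-match rate for one group means its members are more likely to be wrongly stopped, which is the indirect discrimination risk the court said should have been investigated.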
The UK Information Commissioner has welcomed the ruling and stated that in order for "the public to have trust and confidence in the police and their actions there needs to be a clear legal framework. Today’s judgment is a useful step towards providing that". The Surveillance Camera Commissioner, Liberty and South Wales Police have also given their views on the judgment.
Does this have wider relevance beyond the use of AFR by the police?
Although the Court of Appeal gave its opinion on the legal issues specific to law enforcement bodies, the case is also relevant to the use of AFR under data protection legislation more generally.
South Wales Police has said that it will not appeal the judgment, so the case stands as the precedent on the use of this technology. As the first UK case on the use of AFR, it is required reading for anyone who wants to understand the legal pitfalls around its use.