What Do Organisations Thinking Of Using Facial Recognition Software Need To Do Following The Court Of Appeal Judgment In R (Bridges) v CC South Wales?

Published date: 28 August 2020
Subject Matter: Litigation, Mediation & Arbitration, Privacy, Technology, Data Protection, Privacy Protection, Trials & Appeals & Compensation, New Technology
Law Firm: Gowling WLG
Author: Ms Jocelyn S. Paulley and Rocio De La Cruz

We have identified four general principles that any organisation wanting to use facial recognition technology should consider following this judgment (the context and grounds of appeal being, to some extent, specific to the use of facial recognition technology by the police):

  • Any use of facial recognition technology must undergo a Data Protection Impact Assessment which should assume that the right to privacy under the European Convention on Human Rights is engaged and likely to be infringed by the use of the technology.
  • The benefits gained by use of facial recognition technology can outweigh an individual's right to privacy. This will clearly be context-specific, but the use by the police in this instance was proportionate. This balancing exercise must be clearly documented (in a Data Protection Impact Assessment) by an organisation to show that the effect on individuals' right to privacy under the Human Rights Act has been considered and risks mitigated where possible.
  • Facial recognition technology clearly does interfere with an individual's right to privacy, so any use must be "in accordance with the law". Whilst overall the use of the technology was proportionate, giving individuals (in this case police officers) discretion as to when it is used and whom the technology identifies is unlikely to be "in accordance with the law" unless there is very clear guidance.
  • Consider whether the facial recognition technology will result in any discrimination or bias. In this case, specific public sector legislation dealing with equality was engaged. Whilst this will not be the case for private sector organisations, the courts are clearly alive to the potential for discrimination and expect users of these systems to satisfy themselves that there will be no bias.

Background

The case concerned the use of a facial recognition system, called AFR Locate, by the South Wales Police ("SWP") to capture faces and match them to a "watchlist". The watchlist contains details of persons of interest to the police and vulnerable persons in need of protection.

AFR Locate uses a live camera feed to capture digital images of the public in real time. Software isolates individual faces and extracts unique features to create a biometric template. AFR Locate then compares that template against the images in a digital database of those on the watchlist, generating a "similarity score" (a numerical value) for each comparison. If there is no match, the software will automatically delete...
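For readers less familiar with how this kind of matching works, the sketch below illustrates the general pattern the judgment describes: capture a biometric template, score it against each watchlist entry, and discard it if no score clears a threshold. It is a simplified, hypothetical illustration in Python; the function names, the threshold value and the use of cosine similarity are our assumptions, not details of the actual AFR Locate system.

```python
import numpy as np

# Illustrative threshold only - a real system tunes this against
# measured false-match and false-non-match rates.
SIMILARITY_THRESHOLD = 0.6

def similarity_score(a: np.ndarray, b: np.ndarray) -> float:
    """Score between two biometric templates (here, cosine similarity)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_against_watchlist(template, watchlist):
    """Compare a captured face template against each watchlist template.

    Returns the identifier of the best match at or above the threshold,
    or None - in which case the caller deletes the captured template.
    """
    best_id, best_score = None, 0.0
    for person_id, reference in watchlist.items():
        score = similarity_score(template, reference)
        if score > best_score:
            best_id, best_score = person_id, score
    return best_id if best_score >= SIMILARITY_THRESHOLD else None
```

In a deployment, the choice of threshold would itself need to be documented, since it drives the false-match risks that a Data Protection Impact Assessment has to address.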
