Court Rules AI Breaks Equality Laws

Published date: 25 November 2020
Subject matter: Privacy, Data Protection, Privacy Protection
Law firm: Hill Dickinson
Authors: David Hill, Eleanor Tunnicliffe and Richard Parker

The Court of Appeal decided that South Wales Police had not done enough to make sure that the facial recognition technology it was using was not biased on grounds of race or sex. The Court said:

'The fact remains, however, that SWP has never sought to satisfy themselves, either directly or by way of independent verification, that the software program in this case does not have an unacceptable bias on grounds of race or sex. There is evidence, in particular from Dr Jain, that programs for AFR can sometimes have such a bias.'

Addressing the interference with people's privacy, the Court also said that the fact that the new technology was only being trialled was no defence to the requirement to act in accordance with the law.

So, developers and purchasers of AI in the public sector will need to find ways of demonstrating that they have investigated whether the technology may be biased against different groups of people, and ensure that measures are put in place to mitigate the risk of bias or discrimination. It is important to understand that the Court of Appeal does not go so far as to say that the technology must be completely free from bias. Rather, the organisations using it must have investigated whether there is any bias and taken this into account when deciding whether to adopt the technology. Given that the relevant data protection, privacy and equalities duties are...
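To make the point concrete, the sketch below shows one simple form such an investigation might take: comparing false-match rates across demographic groups in trial data before a system is adopted. This is a minimal illustration, not the method used by South Wales Police or any vendor; the function, data and group labels are all hypothetical assumptions introduced here for clarity.

```python
# A minimal, hypothetical sketch of the kind of disparity check a purchaser
# might run before adopting a facial recognition system. All names and data
# are illustrative; nothing here comes from the judgment or a vendor API.

from collections import defaultdict

def false_match_rates(results):
    """Compute the false-match rate per demographic group.

    `results` is a list of (group, predicted_match, actual_match) tuples.
    Only trials where the subject was NOT a true match (actual_match is
    False) contribute, so any predicted match counts as a false match.
    """
    counts = defaultdict(lambda: {"trials": 0, "false_matches": 0})
    for group, predicted_match, actual_match in results:
        if actual_match:
            continue  # true matches are irrelevant to the false-match rate
        counts[group]["trials"] += 1
        if predicted_match:
            counts[group]["false_matches"] += 1
    return {g: c["false_matches"] / c["trials"]
            for g, c in counts.items() if c["trials"]}

# Illustrative trial data: (group, system said "match", ground-truth match)
trials = [
    ("group_a", False, False), ("group_a", True, False), ("group_a", False, False),
    ("group_b", True, False), ("group_b", True, False), ("group_b", False, False),
]

for group, rate in sorted(false_match_rates(trials).items()):
    print(f"{group}: false-match rate = {rate:.2f}")
```

A marked gap between groups (here 0.33 against 0.67) would not by itself establish unlawful bias, but producing and considering such figures is precisely the kind of evidence the Court found South Wales Police had never sought: proof that the question of bias was investigated and taken into account.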

