Facial recognition technology has been trialled in the UK, both in London by the Met (Metropolitan Police Service) and in Wales by South Wales Police. But an appeals court ruled last Tuesday that police use of facial recognition technology violates several laws and has “fundamental flaws”. The ruling, which concludes a legal challenge closely watched by opponents of facial recognition, is a major victory for privacy advocates in the UK.

Human rights organization Liberty is claiming victory in Britain after a court ruled that police trials of facial recognition technology violated privacy laws. The Court of Appeal found that the use of automatic facial recognition systems unlawfully interfered with the privacy rights of the claimant, Ed Bridges. The judges added that there were issues with the way personal data was handled, and said the trials should be stopped for now.

Ed Bridges, a resident of Cardiff, filed a complaint in 2019 against the police, alleging that having his face scanned in 2017 and 2018 violated his legal rights. South Wales Police began their experimental use of automated facial recognition technology in 2017, openly deploying a system called AFR Locate at several dozen major events such as football matches. Officers compared the scans against watchlists of known individuals in order to identify people wanted by police, subject to open arrest warrants or otherwise considered persons of interest.

Despite being backed by Liberty, Bridges lost his case in 2019, but the Court of Appeal has now overturned that decision, finding that the South Wales Police facial recognition program was unlawful. An order released Tuesday said two deployments in December 2017 and March 2018, and others “on an ongoing basis … were not in compliance with the law.”

“Too much discretion is currently left to the police,” said the three judges of the Court. “It is not clear who can be placed on the watch list, nor that there are criteria to determine where the AFR can be deployed,” the judges added. They said that there had been no adequate data protection impact assessment, as required by the Data Protection Act 2018, and that there had been a breach of the public sector equality duty, which aims to protect against discrimination.

In 2018, South Wales Police released data admitting that around 2,300 of the nearly 2,500 matches produced by the software at a 2017 event, or around 92%, were false positives. In London, an independent 2019 study found the Met’s system to have an 81% error rate, Sky News reported, although the Met’s own analysis claimed the error rate was much lower.
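As a rough sanity check of the reported figure (the counts in the article are approximate, and the snippet below simply restates them), the 92% corresponds to the share of flagged matches that turned out to be false positives:

```python
# Approximate figures reported for the 2017 event: ~2,300 false positives
# out of ~2,500 total matches flagged by the facial recognition system.
false_positives = 2300
total_matches = 2500

false_positive_share = false_positives / total_matches
print(f"Share of matches that were false positives: {false_positive_share:.0%}")  # ~92%
```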

Bridges said after the judgment: “I am delighted that the court has recognized that facial recognition clearly threatens our rights. This technology is an intrusive and discriminatory mass surveillance tool … We should all be able to use our public spaces without being subjected to oppressive surveillance.”
