September 30, 2020
PrivSec speaks to Ed Bridges who, together with Liberty, took South Wales Police to court over its use of facial recognition technology in Cardiff.
Ed Bridges, a dad of two from Cardiff, was strolling down a busy shopping street one lunch time just before Christmas, when he encountered a police van using facial recognition equipment.
“It's only when I got close enough to it that I could see the facial recognition technology on it and that I realised what it was there for. And of course, by that point it would have scanned my data several times over. As a law-abiding citizen going about their business, that struck me as being an invasion of my privacy,” he recalls.
Discomfited, he nevertheless went about his day.
But something happened a few months later that made Bridges, who quips about his “rather boring life”, really start to pay attention. While attending a peaceful protest outside an arms fair – Bridges is a Council Member of the pacifist organisation the Peace Pledge Union and a former Lib Dem County Councillor – he saw South Wales Police again using facial recognition technology.
“I'm interested in issues around privacy and civil liberties, but it wasn't an issue on which I’d ever really campaigned or paid especially close attention before. But I hold perhaps a slightly old-fashioned view that the police should uphold people's right to peaceful protest, and that peaceful protest is a cornerstone of democracy. And so when I saw my local police force using technology to scan the biometric data of peaceful protesters, that struck me as not just being unacceptable but actually being fundamentally at odds with what the police should be doing,” he says.
At a recent webinar hosted by the Irish Council for Civil Liberties, he described his objections thus:
“If you point at people a camera which has the capacity to take their uniquely identifiable biometric data, even if you don’t retain that information for long, you will inevitably make people feel reluctant to use their right to peaceful protest. It’s one thing handing over data to your local café so you can have a meal out with your partner under COVID restrictions – it’s quite another to give your data over to the state when you’re protesting about something with which the state may disagree.”
But, in addition, Bridges was uncomfortable with the data protection implications of using facial recognition technology.
“I’m a Newport County fan and I have plenty of experience of being in a crowd where police have taken pictures or pointed the video camera at a crowd that I've been part of as part of their policing tactics. But under data protection legislation in the UK, I have rights about that data, so if the police filmed me in a football crowd, I can write to them and say, ‘You filmed this crowd I was in on this date, at this location, I would like to see any footage that you hold of me’, and they have to send me that. So there is a recourse, there are some safeguards, and I can understand what they found and potentially why,” he says.
“The difference with facial recognition technology is that…it isn't the same sort of thing. This is the processing of biometric data. It’s more like taking a digital fingerprint than it is taking a piece of footage or taking a still image. And there is no way of after the event saying to the police, ‘Have you scanned my image, why were you doing this, I want to know what data you hold on me.’”
Bridges took his concerns to campaigning civil liberties organisation Liberty.
“I was sort of hoping I could just sign a petition or something,” he says. But, instead, he ended up in court.
“They were looking around to see where they could potentially bring a test case about the use of this technology… And it moved from there.”
South Wales Police had been (and still is) conducting a trial called “AFR [Automated Facial Recognition] Locate” – in which surveillance cameras overtly capture digital images of members of the public and compare them with a watchlist compiled by the force for this purpose.
How does AFR Locate work?
The facial features contained in the police watchlist are extracted and expressed as numerical values. A CCTV camera takes moving digital images in real time, and the AFR software detects and isolates individual faces. It then extracts unique facial features from the image of each face, creating a unique biometric template, which is compared to the watchlist data, generating a “similarity score”. This indicates the likelihood of a match; in most AFR systems the threshold value can be set by the end user. Up to 50 faces per second can be scanned in this way.
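The matching step described above can be sketched in a few lines of Python. This is an illustrative outline only: the cosine-similarity metric, the 0.6 threshold and the function names are our assumptions for the sake of the example, not details of the actual AFR Locate software.

```python
import math

# Illustrative sketch of the AFR matching step described in the article.
# A biometric "template" is modelled as a list of numerical facial features.
SIMILARITY_THRESHOLD = 0.6  # in most AFR systems this is set by the end user

def similarity_score(a, b):
    """Similarity between two biometric templates (here: cosine similarity)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def match_against_watchlist(template, watchlist):
    """Compare one captured template against every watchlist entry.

    Returns (entry_id, score) for the best-scoring entry above the
    threshold, or (None, best_score) if nothing clears it.
    """
    best_id, best_score = None, 0.0
    for entry_id, entry_template in watchlist.items():
        score = similarity_score(template, entry_template)
        if score > best_score:
            best_id, best_score = entry_id, score
    if best_score >= SIMILARITY_THRESHOLD:
        return best_id, best_score  # flagged for review by a human operator
    return None, best_score  # no match: biometrics deleted immediately
```

Note that a score above the threshold only flags a candidate; as the article describes, a police officer then reviews the match before any action is taken.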
In the case brought by Bridges and Liberty, South Wales Police’s watchlist was compiled primarily from a database of custody photographs: persons wanted on warrants, individuals who had escaped from custody, persons suspected of having committed crimes, persons who might be in need of protection such as missing persons, persons whose presence might cause concern, persons of interest for intelligence purposes, and vulnerable persons.
Any possible match identified by the software is reviewed by the operator, a police officer, who may discard the match if they believe the image is not the subject of interest, or notify nearby officers, who may speak to the individual, or perhaps stop and search or arrest them.
If no match is made, the system immediately and automatically deletes the image and facial biometrics, though the CCTV footage is retained for 31 days. If a match is made, the data is retained within the system for up to 24 hours, and a match report including personal information of the individual concerned is retained for 31 days.
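These retention rules amount to a simple schedule, which can be sketched as follows. The record-type names here are our own labels for illustration, not terminology used by South Wales Police.

```python
from datetime import datetime, timedelta

# Sketch of the retention schedule described in the article; the record-type
# names are our own labels, not South Wales Police's terminology.
RETENTION_PERIODS = {
    "no_match_biometrics": timedelta(0),         # deleted immediately
    "match_biometrics":    timedelta(hours=24),  # held for up to 24 hours
    "match_report":        timedelta(days=31),   # held for 31 days
    "cctv_footage":        timedelta(days=31),   # held for 31 days
}

def deletion_deadline(record_type, captured_at):
    """Latest time by which a record of the given type must be deleted."""
    return captured_at + RETENTION_PERIODS[record_type]
```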
In 2019 the High Court found that the use of facial recognition technology by South Wales Police was not unlawful and that the current legal framework was adequate. It found that a combination of data protection legislation, along with the Surveillance Camera Code of Practice and South Wales Police’s policy documents, provided a sufficient legal framework.
However, in August this year, the Court of Appeal found in favour of Bridges on the grounds that the use of AFR was not in full accordance with the law for the purposes of Article 8(2) of the European Convention on Human Rights (right to respect for private and family life), that South Wales Police’s Data Protection Impact Assessment (DPIA) was not sufficient for full compliance with section 64 of the Data Protection Act 2018, and that South Wales Police had not sought to satisfy themselves that the software had no possible bias on the grounds of race or sex, though there was no evidence that bias had actually occurred.
The court found there were “fundamental deficiencies” in the legal framework, as too much discretion was allowed to individual police officers to select individuals for watchlists and where AFR Locate could be deployed.
The court also concluded that, as the use of the technology infringed Article 8(2) of the ECHR, the DPIA inadequately dealt with this aspect and so did not properly assess the risks to the rights and freedoms of data subjects, and “failed to address the measures envisaged to address the risks arising from the deficiencies we have found, as required by section 64(3)(b) and (c) of the DPA 2018.”
The court, however, rejected two further grounds on which the appeal had taken place.
The reaction of South Wales Police to the ruling has been positive, and points to further evolution in the deployment of facial recognition technology, rather than cessation of the use of any such software. In a statement responding to the Court of Appeal judgment, South Wales Police said the following:
“South Wales Police welcomes the judgment of the Court of Appeal, having never challenged Mr Bridges’ right to bring such proceedings. The force will not be appealing today’s findings.”
Chief Constable Matt Jukes said: “The test of our ground-breaking use of this technology by the courts has been a welcome and important step in its development. I am confident this is a judgment that we can work with. Our priority remains protecting the public, and that goes hand-in-hand with a commitment to ensuring they can see we are using new technology in ways that are responsible and fair.
“After our approach to these issues was initially endorsed by the Divisional Court and now that the Court of Appeal has given further scrutiny on specific points, we will give their findings serious attention. The Court of Appeal’s judgment helpfully points to a limited number of policy areas that require this attention. Our policies have already evolved since the trials in 2017 and 2018 were considered by the courts, and we are now in discussions with the Home Office and Surveillance Camera Commissioner about the further adjustments we should make and any other interventions that are required.”
The national policing lead for facial technology, Deputy Chief Constable Jeremy Vaughan, has been working with the Home Office, police forces and other law enforcement agencies on the issue of consistent deployment of facial recognition technology across policing in England and Wales.
He said: “There is nothing in the Court of Appeal judgment that fundamentally undermines the use of facial recognition to protect the public. This judgment will only strengthen the work which is already underway to ensure that the operational policies we have in place can withstand robust legal challenge and public scrutiny.”
For Bridges, the case has brought a sense of vindication of his unease with the technology used. He is not, however, desperately keen to take up the mantle of fighting other privacy causes in the near future.
“I’d like to think that I’ve done my bit, and it’s now perhaps time for others to have a turn,” he says.
To hear more from Ed Bridges and others on the ethics of facial recognition technology and AI for public privacy, register for PrivSec Global on 2nd December 2020.