“As with all mass surveillance tools, it is the general public who suffer more than criminals.”
To the outrage of privacy advocates, London’s Metropolitan Police deployed facial recognition software Monday and Tuesday to scan the faces of thousands of people as they walked, unaware, through London streets shopping for the holidays.
The use of the facial recognition technology was only a test run, but the police said its purpose was to reduce crime, particularly violent crime, by matching people’s faces against a database. Forbes reported the database allegedly contains “not only criminals but also suspects, protesters, football fans and innocent people with mental health problems.”
The recent test deployed the technology in Soho, Piccadilly Circus and Leicester Square for a period of eight hours over two days, but London’s police have tested the facial recognition program elsewhere over the last couple of years – including the Notting Hill Carnival and Remembrance Sunday.
The Met Police had claimed their use of the technology would be transparent: leaflets would be handed out and posters displayed to explain what was going on. They also claimed that anyone who declined to be scanned would not be harassed in any way, unless, of course, additional information indicated the individual was a suspect.
Police Accused of Covert Deployment of Facial Recognition Technology
But several human rights groups expressed concern about the facial scanning and said the police had not been sufficiently transparent and forthright about what they were doing.
Liberty, the activist group that protested the police as they conducted the test run, said the police only handed out leaflets when pressed for information. The group also condemned the police’s use of an unmarked van and the insufficient display of posters explaining the test run.
As Gizmodo reported, “What’s particularly enraging activists is that the van is unmarked, and the only notice given about what’s going on is via a few small posters in front of it — and by the time you’ve approached it to read what’s going on, your frowning London face has been scanned in and added to the Met’s list of potentially disruptive inquisitive people who like to know what’s going on.”
There are just three signs about to say what's going on, but they aren't exactly big enough to draw any real attention. And the police were not handing out any leaflets to members of the public until we asked where those leaflets were. #ResistFacialRec pic.twitter.com/noGBlE46uh
— Liberty (@libertyhq) December 17, 2018
The police said that when a scanned face matched one in the police database, officers would step in to verify the individual’s identity against police records.
100% False-Positive Rate for Facial Recognition Technology
Big Brother Watch, a privacy advocacy group, discovered that when the police used facial recognition technology last June, the system generated five alerts, all of which turned out to be false positives: the system had incorrectly matched innocent people to criminals in the database, and one match led to an “intervention or stop” of an innocent person.
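The “100%” figure in the heading above is simple arithmetic: every one of the five alerts was a wrong match. As a minimal illustrative sketch (the function name and numbers-as-code framing are my own, not the Met’s or Big Brother Watch’s evaluation method):

```python
def false_positive_rate(total_alerts: int, false_alerts: int) -> float:
    """Fraction of alerts that incorrectly flagged an innocent person."""
    if total_alerts == 0:
        return 0.0
    return false_alerts / total_alerts

# The June trial described above: 5 alerts, all 5 incorrect.
rate = false_positive_rate(total_alerts=5, false_alerts=5)
print(f"{rate:.0%}")  # prints "100%"
```

Note that this rate is computed only over alerts raised; it says nothing about how many faces were scanned without triggering an alert.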
“As with all mass surveillance tools, it is the general public who suffer more than criminals,” said Big Brother Watch director Silkie Carlo. “The fact that it has been utterly useless so far shows what a terrible waste of police time and public money it is. It is well overdue that police drop this dangerous and lawless technology.”
Big Brother Watch and Baroness Jenny Jones, a member of the UK’s House of Lords, together launched a legal challenge to the police’s use of the technology. As Forbes reported, Jones’ own photo was held on the Met’s ‘domestic extremism’ database while she was on a committee scrutinizing the Met and was running for Mayor of London.
Despite the criticism and the technology’s failures, the U.K. has spent around £2.6 million on the project so far and is continuing to move forward with it. Seven more trials are planned within the next five months.