Use of Facial Recognition Stirs Controversy
The Technology Sparks Privacy Concerns in the UK and Beyond
A developer's use of facial recognition technology to scan the faces of pedestrians in a London neighborhood that's home to a major train station has drawn protests from residents and the city's mayor, and it has sparked an investigation by the U.K.'s Information Commissioner's Office.
Meanwhile, the use of this technology by law enforcement and private firms is raising privacy concerns worldwide and is even becoming an issue in the U.S. presidential race.
"Scanning people's faces as they lawfully go about their daily lives, to identify them, is a potential threat to privacy that should concern us all. That is especially the case if it is done without people's knowledge or understanding," says Elizabeth Denham, the ICO's information commissioner, in a statement.
Although not mentioned in the ICO's statement, the privacy watchdog has the authority to propose fines under the European Union's General Data Protection Regulation.
Argent, a developer in the King's Cross neighborhood, has been using facial recognition technology in its CCTV system to scan pedestrians near the world-famous King's Cross railway station, according to the BBC. While some of the land there is private, most of it is open to the public, the BBC reports.
Replying to a letter sent by London Mayor Sadiq Khan, Argent said it was deploying the technology in the interest of public safety, the Guardian reports. Local police use similar technology in the same area, which has also drawn concern from residents, the BBC reports.
Mayor of London Sadiq Khan has written to the King's Cross Central development asking for reassurance its use of facial-recognition technology is legal https://t.co/gcu9KMgZXI
— BBC London (@BBCLondonNews) August 14, 2019
After Argent issued its statement, the ICO announced it plans to investigate whether any laws were broken by the use of facial recognition. As part of the investigation, the ICO plans to request detailed information from the relevant organizations about how the technology is used. The agency will also inspect the system and its operation on-site to assess whether it complies with data protection law.
"Put simply, any organizations wanting to use facial recognition technology must comply with the law - and they must do so in a fair, transparent and accountable way," the ICO's Denham says. "They must have documented how and why they believe their use of the technology is legal, proportionate and justified."
John Hollywood, a researcher at the think tank The Rand Corp., says using facial recognition for purposes such as identifying criminal suspects is risky because it can generate false positives as often as 25 percent of the time.
Another risk, he says, is that a large volume of stored facial recognition data could pose privacy risks if it's hacked or leaked.
The use of facial recognition technology is governed in the U.K. and the EU by GDPR, according to the BBC.
In the past several months, the ICO has been making headlines for using GDPR to put companies on notice when they violate Europeans' privacy rights or fail to report breaches quickly enough to authorities.
For instance, in early July, the ICO proposed a £99 million ($125 million) fine under GDPR against hotel giant Marriott for its failure to more rapidly detect and remediate a data breach that persisted for four years (see: Marriott Faces $125 Million GDPR Fine Over Mega-Breach).
Around the same time, the ICO issued a "notice of intent" that it plans to fine British Airways £184 million ($230 million) for violating GDPR, citing security deficiencies that led to a massive data breach (see: British Airways Faces Record-Setting $230 Million GDPR Fine).
A Global Controversy
Over the past several years, the use of facial recognition - along with other technologies such as machine learning, artificial intelligence and big data - has stoked global invasion of privacy fears.
In the U.S., the American Civil Liberties Union has taken aim at Amazon's Rekognition product, which uses a number of technologies to enable its users to rapidly run searches against facial databases. The ACLU's Nicole Ozer last year called for guarding against supercharged surveillance before it's used to track protesters, target immigrants and spy on entire neighborhoods (see: Amazon Rekognition Stokes Surveillance State Fears).
More recently, city officials in San Francisco and Oakland have banned police from using facial recognition technology.
The debate over facial recognition technology has also been addressed by several U.S. presidential candidates.
On Monday, Democratic hopeful Bernie Sanders became the first presidential candidate to call for a ban on the use of facial recognition by law enforcement, one plank of a larger criminal justice reform package that the Vermont senator's campaign calls "Justice and Safety for All."
Another Democratic candidate, Julian Castro, the former mayor of the Texas city of San Antonio, previously announced as part of his "People First Policing" policy that he would call for establishing guidelines for next-generation surveillance technologies, such as facial recognition.