Amazon Comes Under Fire for Facial Recognition Platform

Privacy advocates say facial recognition can be an agent of authoritarian surveillance; others say it’s an invaluable tool to combat kidnapping, locate lost children and track down criminals on the run.

Facial-recognition technology has long been touted as a useful tool for law enforcement, but the ability of systems like Amazon’s Rekognition platform to identify large numbers of people at once in a single video or still frame has raised the hackles of privacy advocates.

The American Civil Liberties Union said Tuesday that it has confirmed that the Orlando, Fla., Police Department and the Washington County Sheriff’s Office in Oregon both have contracts for Rekognition, according to documents obtained through Freedom of Information Act requests. It flagged the contracts as evidence of the potential for invasive mass surveillance.

The platform typically works by matching facial images to those housed in databases chosen by the customer. In Washington County’s case, officers can use a mobile app to submit an image to be matched against a database of more than 300,000 mugshots, according to the documents.
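Neither agency’s exact integration is public, but Rekognition’s documented face-search API illustrates the general pattern. The minimal sketch below uses the AWS SDK for Python (boto3); the collection ID, file name and match threshold are illustrative assumptions, not details taken from the ACLU documents.

```python
# Minimal sketch of a Rekognition face search against a pre-built face
# collection (e.g., one populated from booking photos). The collection ID,
# threshold and file name are hypothetical, not values from the documents.
import boto3

rekognition = boto3.client("rekognition", region_name="us-west-2")

def search_mugshot_database(photo_bytes: bytes,
                            collection_id: str = "mugshot-collection",
                            threshold: float = 80.0):
    """Compare the largest face in the submitted photo against the collection."""
    response = rekognition.search_faces_by_image(
        CollectionId=collection_id,
        Image={"Bytes": photo_bytes},   # image captured by the mobile app
        FaceMatchThreshold=threshold,   # minimum similarity to report
        MaxFaces=5,                     # return up to five candidate matches
    )
    return [
        (match["Face"].get("ExternalImageId", match["Face"]["FaceId"]),
         match["Similarity"])
        for match in response["FaceMatches"]
    ]

if __name__ == "__main__":
    # Example use: read a photo from disk and list candidate matches.
    with open("field_photo.jpg", "rb") as f:
        for external_id, similarity in search_mugshot_database(f.read()):
            print(f"{external_id}: {similarity:.1f}% similar")
```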

In Orlando, the technology is in a pilot phase, with the resort town using eight city-owned cameras to pick up on “persons of interest.” Rekognition scans the video feeds and flags anyone who matches an entry in the database that the police department is using.
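Amazon has not published the details of Orlando’s pilot, but Rekognition Video exposes this kind of live matching through “stream processors” that watch a Kinesis video stream and emit events when faces match a chosen collection. A rough sketch follows; the ARNs, names and threshold are placeholders, not the city’s actual configuration.

```python
# Rough sketch of a Rekognition Video stream processor that flags faces from a
# live camera feed when they match a "persons of interest" collection.
# All ARNs, names and the threshold are placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

processor_name = "city-camera-01-monitor"

rekognition.create_stream_processor(
    Name=processor_name,
    Input={"KinesisVideoStream": {
        "Arn": "arn:aws:kinesisvideo:us-east-1:123456789012:stream/camera-01/1"
    }},
    Output={"KinesisDataStream": {
        "Arn": "arn:aws:kinesis:us-east-1:123456789012:stream/face-match-events"
    }},
    Settings={"FaceSearch": {
        "CollectionId": "persons-of-interest",  # faces previously indexed by the agency
        "FaceMatchThreshold": 85.0,             # only report fairly confident matches
    }},
    RoleArn="arn:aws:iam::123456789012:role/RekognitionStreamRole",
)

# Start analyzing the camera feed; match records arrive on the Kinesis data stream.
rekognition.start_stream_processor(Name=processor_name)
```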

While the public-good aspects of the technology are obvious – picking up on people on the FBI’s most-wanted list, for example, or hunting for other criminals on the lam, in addition to augmenting Amber and Silver Alert efforts – the capacity for overreach has sounded alarm bells for some.

“With Rekognition, a government can now build a system to automate the identification and tracking of anyone,” the ACLU said in a statement published on Tuesday. “If police body cameras, for example, were outfitted with facial recognition, devices intended for officer transparency and accountability would further transform into surveillance machines aimed at the public. With this technology, police would be able to determine who attends protests. ICE could seek to continuously monitor immigrants as they embark on new lives. Cities might routinely track their own residents, whether they have reason to suspect criminal activity or not. As with other surveillance technologies, these systems are certain to be disproportionately aimed at minority communities.”

As Amazon describes in its materials for the product, in “Crowd Mode” customers can “detect, analyze, and index up to 100 faces (up from 15) in a single image.”
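That figure refers to Rekognition’s image APIs, which can detect and index up to 100 faces in a single photo. As a rough illustration of what indexing one crowd frame might look like (the bucket, key and collection names below are hypothetical):

```python
# Sketch of indexing every detectable face from one crowd photo into a
# collection, up to the 100-face-per-image limit. Names are hypothetical.
import boto3

rekognition = boto3.client("rekognition")

response = rekognition.index_faces(
    CollectionId="event-crowd-faces",
    Image={"S3Object": {"Bucket": "example-camera-frames", "Name": "crowd.jpg"}},
    ExternalImageId="crowd.jpg",  # ties each indexed face back to the source frame
    MaxFaces=100,                 # index up to 100 faces in this single image
    QualityFilter="AUTO",         # skip faces too blurry or small to be useful
)

print(f"Indexed {len(response['FaceRecords'])} faces from one frame")
```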

Officials from the ACLU, Electronic Frontier Foundation, Freedom of the Press Foundation, Human Rights Watch and others also wrote a joint letter to CEO Jeff Bezos, demanding “that Amazon stop powering a government surveillance infrastructure that poses a grave threat to customers and communities across the country. Amazon should not be in the business of providing surveillance systems like Rekognition to the government.”

Amazon responded in a media statement that the platform is a tool that must be used responsibly, and that it would suspend a customer’s access if its services are abused.

“Amazon requires that customers comply with the law and be responsible when they use AWS services,” the tech giant said. “As a technology, Amazon Rekognition has many useful applications in the real world…Our quality of life would be much worse today if we outlawed new technology because some people could choose to abuse the technology. Imagine if customers couldn’t buy a computer because it was possible to use that computer for illegal purposes?”

It also noted the capacity for good, explaining that various agencies have used Rekognition to find kidnapping and sex-trafficking victims; that amusement parks use Rekognition to find lost children; and that UK police used Rekognition over the weekend to identify and verify royal wedding attendees.

And to be fair, these are not the first law-enforcement agencies to consider facial recognition as a useful tool: U.S. Customs and Border Protection, for instance, has been piloting this kind of technology at various airports since last year.

Too Soon?

That said, there are real-world, full-scale examples of facial-recognition misuse (and of the technology’s shortcomings) that offer a cautionary tale.

Last week in the UK, Big Brother Watch called facial recognition a “dangerously authoritarian surveillance tool,” after examining its use by police forces during sporting events and festivals to identify suspects in real time.

The group said in a report that the system the Metropolitan Police uses in London, for example, returns false positives 98 percent of the time. Meanwhile, BBW said that the South Wales Police has stored 2,400 of those false-positive images without the citizens’ consent.

“We’re seeing ordinary people being asked to produce ID to prove their innocence as police are wrongly identifying thousands of innocent citizens as criminals,” said BBW director Silkie Carlo. “It is deeply disturbing and undemocratic that police are using a technology that is almost entirely inaccurate, that they have no legal power for, and that poses a major risk to our freedoms.”

Regardless of one’s opinion on whether the positive police work it enables outweighs the potential for 1984-style government surveillance, there are also less savory applications of facial recognition that raise data-privacy questions.

As Amazon points out in the Rekognition product guide, “you can accurately capture demographics and analyze sentiments for all faces in group photos, crowded events, and public places such as airports and department stores.”
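That capability corresponds to the per-face attributes returned by Rekognition’s DetectFaces operation, which include estimated age range, gender and emotions. A minimal sketch, with a hypothetical image location:

```python
# Minimal sketch of pulling demographic and "sentiment" attributes for every
# face Rekognition detects in a single crowd photo. Bucket and key are hypothetical.
import boto3

rekognition = boto3.client("rekognition")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-billboard-frames", "Name": "crowd.jpg"}},
    Attributes=["ALL"],  # request age range, gender, emotions and other attributes
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]                    # e.g. {"Low": 25, "High": 37}
    gender = face["Gender"]["Value"]          # predicted gender, with a confidence score
    top_emotion = max(face["Emotions"], key=lambda e: e["Confidence"])
    print(f"{gender}, {age['Low']}-{age['High']} yrs, looks {top_emotion['Type'].lower()}")
```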

This application has already been put into practice in the real world, again across the pond: A giant smart billboard in London’s Piccadilly Circus has started using facial and object-recognition technology and a tower of hidden cameras to scan pedestrians, indexing age, gender, income and even mood and language.

The goal, of course, is to serve up “relevant” messages and marketing. As Tim Bleakley, CEO at the company running ads for the board, Ocean Outdoor, explained: “Coca-Cola, for example, can log on at any given moment, see a large group of Spanish tourists and change the copy of the ad from ‘hello,’ to ‘buenos dias’.”

David Ginsburg, vice president of marketing at Cavirin, told Threatpost that the marketing applications could represent a do-not-cross line.

“Having cameras equipped with facial recognition is nothing new,” he said. “However, the scary thing that Amazon has is access to both consumer and retail databases. Currently, if the police want to tap into the personal databases, they have to do it on their own. But if Amazon can deliver it to them on a silver platter, that is a bit disconcerting. Frankly, tying the police to retail and social databases may be crossing a line and we might want to rethink what we are doing moving forward. Although Amazon is not tying this to their retail database now, it is a slippery slope.”

