ACLU Sues Clearview AI Over Faceprint Collection, Sale

Watchdog group says the company has violated the Illinois BIPA and ‘will end privacy as we know it’ without intervention.

The American Civil Liberties Union (ACLU) has sued a New York-based startup for amassing a database of biometric face-identification data on billions of people and selling it to third parties without their consent or knowledge.

The U.S. civil-liberties watchdog has filed suit in the Circuit Court of Cook County, Illinois, against Clearview AI on behalf of a number of organizations representing vulnerable communities—such as survivors of sexual assault or domestic violence and undocumented immigrants—for violating the Illinois Biometric Information Privacy Act (BIPA).

Clearview has been collecting what are called “faceprints,” or unique biometric identifiers similar to someone’s fingerprint or DNA profile, and then selling them to “private companies, police, federal agencies and wealthy individuals, allowing them to secretly track and target whomever they wished using face recognition technology,” ACLU Staff Attorney Nathan Freed Wessler wrote in a blog post published Thursday.

“The company has captured these faceprints in secret, without our knowledge, much less our consent, using everything from casual selfies to photos of birthday parties, college graduations, weddings and so much more,” he wrote, adding that Clearview “will end privacy as we know it if it isn’t stopped.”

BIPA is an Illinois law aimed at protecting people “against the surreptitious and nonconsensual capture of their biometric identifiers, including faceprints,” according to the ACLU’s complaint, which also lists its Illinois branch as well as the Chicago Alliance Against Sexual Exploitation, Sex Workers Outreach Project Chicago, Illinois State Public Interest Research Group Inc. and Mujeres Latinas En Accion as plaintiffs.

The complaint demands that the court immediately order Clearview to stop its capture of biometric identifiers of the citizens of Illinois. “Clearview’s actions clearly violate BIPA,” Wessler wrote.

Clearview was founded by Australian entrepreneur Hoan Ton-That, who has been under harsh scrutiny for his company’s business practices since late last year. He has defended his company’s practices and intentions and has welcomed the privacy debate, stating in various published reports that the technology is meant to be used by law enforcement to help solve crimes, not to violate people’s privacy.

Clearview not only sells a database of faceprints to its clients, but it also provides access to a smartphone app that allows them to upload a photo of an unknown person and instantly receive a set of matching photos.

Ton-That told CNN that more than 600 law-enforcement agencies in the United States and Canada are using Clearview’s technology, citing an example of how it was utilized by police in an investigation into a child-predator ring in New Jersey. Another published report outlined how the technology was used to quickly identify and arrest a person causing a disturbance in New York’s Fulton Street subway station.

Still, Clearview’s unauthorized scraping of biometric data from photos across social media platforms like Facebook, Twitter and YouTube has spurred those companies to send Clearview cease-and-desist letters.

Now Clearview is facing this new legal challenge from the ACLU and the organizations represented in the Illinois case, which argue in their complaint that due to the “immutability of our biometric information and the difficulty of completely hiding our faces in public,” unfettered collection of face-recognition data “poses severe risks to our security and privacy.”

“The capture and storage of faceprints leaves people vulnerable to data breaches and identity theft,” according to the complaint.

The plaintiffs also raised concerns that access to this biometric information could lead to unwanted or invasive tracking and surveillance of people in places where they should feel safe, such as houses of worship, domestic-violence shelters or Alcoholics Anonymous meetings.

The data also could be used to identify people at protests or political rallies, putting them in danger from organizations or people who don’t agree with their political views.

“For our clients — organizations that serve survivors of domestic violence and sexual assault, undocumented immigrants, and people of color — this surveillance system is dangerous and even life-threatening,” Wessler wrote. “It empowers abusive ex-partners and serial harassers, exploitative companies, and ICE agents to track and target domestic violence and sexual assault survivors, undocumented immigrants, and other vulnerable communities.”

The lawsuit is not the first legal challenge to a technology company’s faceprint-data collection based on violation of the Illinois BIPA. Last September, a lawsuit filed against Vimeo used the same law to challenge the ad-free video platform’s collection and storage of users’ facial biometrics from videos and photos without their consent or knowledge.
