Clearview AI, the maker of facial recognition software, on Monday settled a lawsuit brought by the American Civil Liberties Union, agreeing to restrict sales of its facial database in the United States largely to government agencies and to bar most US companies from accessing it.
Under the settlement filed in Illinois state court, Clearview will not sell its database of what it says are more than 20 billion face photos to most private individuals and businesses in the country. But the company can still sell that database to federal and state agencies.
The deal is the latest blow to the New York-based startup, which built its facial recognition software by pulling photos from the web and popular sites such as Facebook, LinkedIn and Instagram. Clearview then sold its software to local police departments and government agencies, including the FBI and Immigration and Customs Enforcement.
But its technology has been deemed illegal in Canada, Australia and parts of Europe for violating privacy laws. Clearview also faces a provisional fine of $22.6 million in the UK, as well as a fine of 20 million euros from Italy’s data protection authority.
“Clearview can no longer treat people’s unique biometric identifiers as an unlimited source of profit,” Nathan Freed Wessler, deputy director of the ACLU’s Speech, Privacy, and Technology Project, said in a statement about the deal. “Other companies would be wise to take note, and other states should follow Illinois’ lead in enacting strong biometric privacy laws.”
Floyd Abrams, a First Amendment expert hired by Clearview to defend the company’s right to collect publicly available information and make it searchable, said the company is “pleased to put this lawsuit behind it.”
“To avoid a lengthy, costly, and distracting legal dispute with the ACLU and others, Clearview AI has agreed to continue withholding its services from law enforcement agencies in Illinois for a period of time,” he said.
The ACLU filed the lawsuit in May 2020 on behalf of groups representing victims of domestic violence, undocumented immigrants and sex workers. The group alleged Clearview violated Illinois’ Biometric Information Privacy Act, a state law that prohibits private organizations from using residents’ biometric identifiers, including algorithmic maps of their faces, without consent.
“This is a huge victory for the most vulnerable people in Illinois,” said Linda Xóchitl Tortolero, a plaintiff in the case and the head of Mujeres Latinas en Acción, an advocacy group for survivors of sexual assault and domestic violence. “For many Latinas, many of whom are undocumented and not well versed in technology or social media, understanding how technology can be used against you is a huge challenge.”
One of Clearview’s sales methods is to offer a free trial to potential customers, including private businesses, government employees, and the police. Under the deal, the company will have a more formalized process for demo accounts, ensuring that individual police officers are authorized by their employers to use the facial recognition app.
As part of the settlement, Clearview is also prohibited from selling to any Illinois-based entity, private or public, for five years. After that, it can resume doing business with local or state law enforcement agencies within the state, Wessler said.
In one important exception, Clearview will still be able to make its database available to US banks and financial institutions under an exemption in BIPA.
The settlement does not mean that Clearview cannot sell any products to corporations. It will still be able to sell its facial recognition algorithm, without the database of 20 billion images, to companies; the algorithm matches people’s faces against whatever database the customer provides.
As part of the settlement, Clearview did not admit any liability and agreed to pay $250,000 in attorneys’ fees to the plaintiffs. The settlement must be approved by an Illinois judge.