Cobb police seek contract with controversial face recognition firm
Cobb police use the technology “to enhance public safety while continuing to act with regard to the basic privacy of the public we serve,” the statement says.
But some lawmakers and privacy activists have concerns over how Clearview AI obtains the images, how law enforcement uses them, how the software can misidentify people of color and over the lack of restrictions to curb the potential for misuse.
Use of biometric information is becoming more common in daily life — from face scanners in the airport to fingerprint unlocking features on smartphones. Face prints, fingerprints, eye scans and voice recognition are some forms of biometric data that are unique to each individual and can be used as identifiers.
Clearview AI has collected over 10 billion public images from the internet, including social media networks, in a database which law enforcement agencies can pay to access. The company’s artificial intelligence software creates unique prints for each individual based on face structure. The face prints can be matched with other images to identify individuals, sometimes in seconds.
Christopher Bruce, an attorney with the American Civil Liberties Union of Georgia, said police should not have access to that much information.
“Police officers should not just be able to have this treasure trove of your data, your whereabouts, who you are, especially without you having consented to it,” he said.
In a public discussion posted online in 2021, Cobb County Police Chief Stuart VanHoozer said the department opts out of utilizing social media images and has a database of only “book-in photos.”
“Some systems do sift social media. We elect not to do that. We don’t do that, and our system doesn’t do that,” he said.
The department’s four-page policy on face recognition technology does not address which images officers have access to, but it does specify that they need more than a face match to make an arrest.
“We don’t make cases on a match,” VanHoozer said in the presentation. “All that tells us is a starting point.”
The ACLU sued Clearview AI in 2020 under the Illinois Biometric Information Privacy Act, which prohibits companies from collecting biometric data without consent. The ACLU argued the company had violated the law by creating face prints from images without consent.
The settlement reached in May now prohibits Clearview AI from allowing private companies to access its face vector database nationwide and limits access to Clearview AI’s database to law enforcement, government agencies and, in some cases, government contractors.
Nate Freed Wessler, a deputy director of the ACLU’s project on speech, privacy and technology, worked on the lawsuit against Clearview AI. He said even if the technology was 100% accurate, it would still raise concerns.
“Face recognition technology is both dangerous when it works and dangerous when it doesn’t work,” Wessler said. “It really gives the government an incredible, unprecedented power to instantaneously identify any of us.”
Some states have passed laws regulating the use of face recognition technology and other biometric data. This year alone, 11 states have introduced 19 bills on face recognition technology, and four have been enacted, according to a database created by the National Conference of State Legislatures (NCSL), a nonpartisan association that tracks such laws.
Georgia has not passed any regulations addressing the issue.
“Facial technology has become more accessible,” said AJ Wagner, a policy analyst for the NCSL. “It’s still seeing a lot of restrictions, but they’re not outright prohibitions.”
Cobb Commission Chairwoman Lisa Cupid said in an interview that she asked the police chief to withdraw the contract request until he had spoken more with the commissioners about how the technology is used.
Cupid said the contract will not be voted on until police leadership can show “how this will work, and how it will not result in any disparate impacts to our citizens,” particularly people of color.
The technology can be particularly fallible when matching people of color, which can lead to wrongful arrests, Bruce said.
Experts also said that while false arrests are concerning, the technology would be just as dangerous even if it were perfectly accurate.
“Inaccuracy is a problem, but inaccuracy alone isn’t the reason to beware of facial recognition technology,” said Deven Desai, a professor at Georgia Tech with a focus on law, ethics and technology. “The bigger problem is the way it can be abused and becomes an avenue to surveillance without due process.”
The Cobb County Police Department’s policy says officers must undergo training on the use of the technology; conduct peer reviews of each case using face recognition software; gather additional evidence; and receive supervisor approval before proceeding with charges.
The policy also says the technology cannot be used to conduct surveillance.
The Atlanta Police Department is the only other confirmed police agency in Georgia that uses face recognition. Its policy requires that officers go through two levels of approval before searching the database to determine if the technology is the appropriate tool for the investigation and whether the procedures have been followed.
But Wessler said more safeguards are needed.
“If they’re going to use this technology at all, it needs to be highly constrained to only the most serious types of crimes, only with a search warrant issued by a judge based on probable cause,” he said.