OLIVIA — At least 42 law enforcement agencies in Minnesota, including the Renville County Sheriff’s Office, reportedly used Clearview AI facial recognition software, according to a Buzzfeed investigation.
Clearview AI is a web-based platform that allows users to submit pictures for possible matches in a database of more than 3 billion images pulled from publicly available websites, including news sites and social media, according to the company’s web page.
The company also at one point boasted a 100% accuracy rate, according to a document Buzzfeed obtained through a public records request.
However, questions about the software’s reliability and legal standing remain unresolved, according to law enforcement officials and experts in artificial intelligence and privacy.
According to the law enforcement agencies the West Central Tribune spoke to, none of the searches led to an arrest.
Most of the agencies in Minnesota that Buzzfeed and West Central Tribune spoke to decided not to go forward with a subscription to the service.
Clearview did not respond to a request for comment regarding the reliability and implementation of the software, which, according to Business Insider, can cost up to $50,000 for a two-year contract. Nor did the company reply to a request to verify Buzzfeed’s data spanning from 2018 to February 2020, which was provided by a confidential source.
Usage of facial recognition software in Minnesota
The Renville County Sheriff’s Office used the tool during a burglary investigation but the software did not generate any leads in the case, according to Sheriff Scott Hable.
Hable said an investigator used the Clearview AI software on a trial basis and submitted a photo of a suspect, resulting in two possible matches that were determined not to be the suspect.
The Renville County Sheriff’s Office does not have a subscription to the service, according to Hable.
“(The investigator) had a pretty clear picture of who it was that he was trying to identify,” Hable said. “And of course, if (the software) couldn’t do what it was purported to do, then there’s probably not a lot of value in what we were trying to use it for.”
Kandiyohi County Sheriff Eric Holien wrote in an email that the expense of maintaining a program like that — along with data retention, data requests and the potential oversight needed — could exceed what could be managed by his office.
Holien also wrote that it’s legally problematic to maintain such a program and that it would only help in a small portion of cases due to the quality of photos they typically have of suspects.
“We just have never had a use for it here due to the lack of backbone systems to maintain it or make it feasible on businesses or public ends,” Holien wrote.
Willmar Police Capt. Michael Anderson wrote in an email that his department has never looked at facial recognition software.
“However, I’m not sure that we have a need for it at this point,” Anderson wrote.
The Renville County Sheriff’s Office’s decision not to use the product after a trial run mirrors that of most Minnesota law enforcement agencies that responded to Buzzfeed’s questions about the tool, with agencies either deciding not to use the software or saying it returned no usable leads.
The Prior Lake Police Department, which logged between 1,001 and 5,000 uses, one of the highest counts in the state according to Buzzfeed’s data, also ultimately did not purchase the program.
The department’s Public Records Supervisor Jennifer Bisek told Buzzfeed and the West Central Tribune that only one officer used a trial version of the software after being contacted by a Clearview AI representative.
Bisek wrote in an email that the officer who tested the system did not recall how many results came back with his search and that their search count was between one and five, not 1,001 and 5,000.
Bisek said no search results led to any arrests or any substantial leads.
“The officer we had test it is really good at trying new things and testing out new technology and whatnot,” Bisek said. “If he finds a benefit for it, things that other officers would use, he would bring it to the admin to look at maybe getting it. So if he didn’t find it valuable, that’s one part, but then the biggest part is usually the finances.”
Capt. Matt Smith of the Burnsville Police Department, which logged between 1,001 and 5,000 uses according to Buzzfeed’s data, said he wasn’t aware of how many times his department used the system because usage was never tracked.
Smith said he was aware that officers had used the program during a trial run but the department’s chief, Tanya Schwartz, told him the department ultimately did not move forward with a subscription because of the lack of established procedures, policies and legal precedent.
“To our knowledge, it was never used in a case that ever went anywhere,” Smith said.
The Minneapolis Police Department used Clearview’s software between 101 and 500 times from 2018 to February 2020, according to Buzzfeed’s data.
The department’s director of police information, John Elder, replied to an email request for information about the department’s use of the system by sending directions for requesting public records from the department.
The public records request for data regarding the Minneapolis Police Department’s use of the Clearview AI’s facial recognition software is still pending.
The Minneapolis City Council banned the city’s police department from using facial recognition technology in February of this year.
Minneapolis Police Chief Medaria Arradondo said in a statement that the ban was “crafted and approved without any consideration or conversation, insight or feedback,” from him, according to the Star Tribune.
The Eagan, Eden Prairie, Fridley, Oakdale, Plymouth, St. Louis Park, Stillwater and the University of Minnesota police departments, along with the Minnesota Commerce Fraud Bureau and the Stearns County Sheriff’s Office all acknowledged to Buzzfeed that their agencies used the system but did not ultimately pursue a subscription.
The Fridley, Oakdale and Plymouth police departments, along with the Stearns County Sheriff’s Office, all told Buzzfeed the program did not lead to any arrests.
Questions about Clearview’s reliability
Saad Bedros has spent decades in the artificial intelligence field. He has a bachelor’s, master’s and Ph.D. in electrical engineering.
For 25 years, he did research and development for Honeywell Labs in different types of artificial intelligence and computer vision.
Bedros is currently the partnership director for robotic sensors in advanced manufacturing for the University of Minnesota. Bedros' job is to connect industry leaders with the university in order to collaborate.
When first asked about Clearview AI’s facial recognition software, Bedros said he had never heard of the company.
Bedros also wondered why Clearview had never submitted its software for testing in the National Institute of Standards and Technology’s Facial Recognition Vendor Test, which he called the premier benchmark for ranking facial recognition companies.
The test looks for bias in algorithms and other factors, such as the ability to identify subjects across varying illumination, expressions and unposed photos.
For example, early facial recognition software had an issue identifying Asian faces because most of the data used to create the algorithm were white faces, according to Bedros.
In a controlled environment, meaning a mugshot-to-mugshot comparison, facial recognition accuracy rates can be as high as 99%, according to Bedros.
The images law enforcement is most likely to run through Clearview AI’s program would come from an uncontrolled environment, like surveillance video, where even the most advanced AI would be less accurate.
Clearview AI said in March 2019 that it had scored a 100% accuracy rate.
“Why would you claim these things to be the best?” Bedros said. “If you were sure about your technology you would compete, right?”
In the spring of 2020, House Science Committee Chairwoman Eddie Bernice Johnson (D-Texas) and Subcommittee on Investigations and Oversight Chairman Bill Foster (D-Illinois) sent letters to Clearview AI CEO Hoan Ton-That questioning the company’s privacy practices, security and software accuracy after Clearview AI signed a new contract with U.S. Immigration and Customs Enforcement.
The lawmakers wrote: “The emergence of yet another contract between Clearview AI and federal law enforcement agents raises alarms about the potential for individuals being misidentified and targeted by ICE agents, particularly people of color. Clearview AI has not subjected its technology to validation via the NIST Facial Recognition Vendor Test program or any other robust, independent review process to ensure its accuracy across all demographics. The Committee will continue to explore the technology concerns associated with Clearview AI’s and other facial recognition tools.”
Munira Mohamed, a policy associate with the ACLU of Minnesota, said the organization has general concerns about the use of facial recognition software, whether it be by the government or from private companies.
The most serious consequence of law enforcement using facial recognition software is a false arrest, but the threat to people’s civil liberties and their expected anonymity in public is also worrisome, according to Mohamed.
“Having facial recognition used by (law enforcement) is essentially walking around with a driver’s license on your forehead,” Mohamed said. “There’s no reason for you to be identified in a public space without a warrant and without suspicion.”
The technology is also more likely to be used against people who are not white men, the very groups for which error rates rise dramatically, according to Mohamed.
“But what we’ve seen with Clearview AI is that they seem to have really no ethical practices,” she said, highlighting that the company falsely claimed the ACLU did an independent review of the software.
“They essentially just scrape billions of photos from social media websites and then shop them around to law enforcement,” Mohamed said.
The company has also received cease and desist letters from YouTube, Facebook, Google, Microsoft and Twitter, according to Mohamed.
“They are very much the bottom feeders of the tech world,” Mohamed said.
Following the ban on facial recognition software in Minneapolis, Mohamed said the ACLU hopes to expand the ban beyond the city because law enforcement agencies interact with one another constantly.
“There’s just a ton of loopholes,” Mohamed said, adding that while Minneapolis police cannot use facial recognition software, another law enforcement agency can and that information might be shared with the department.
“We advocate for the full ban of facial recognition technology until it becomes more reliable,” Mohamed said.
Clearview AI did not respond to a request for comment about their technology before this article was published.