Facial recognition startup fends off accuracy doubts and legal claims after NYT report
Clearview AI, an artificial intelligence firm providing facial recognition technology to US law enforcement, may be overstating how effective its services are in catching terrorist suspects and preventing attacks, according to a report from BuzzFeed News.
The company, which gained widespread recognition from a New York Times report published earlier this month, claims it was instrumental in identifying, from video footage, a suspect who had placed three rice cookers disguised as explosive devices around New York City last August, creating panic and setting off a citywide manhunt. BuzzFeed News found via a public records request that Clearview AI has been claiming in promotional material that law enforcement linked the suspect to an online profile in only five seconds using its database. But city police now say this is simply false.
“The NYPD did not use Clearview technology to identify the suspect in the August 16th rice cooker incident,” an NYPD spokesperson told BuzzFeed News. “The NYPD identified the suspect using the Department’s facial recognition practice where a still image from a surveillance video was compared to a pool of lawfully possessed arrest photos.”
The NYPD now says it has no formal relationship with Clearview, despite the company’s claims otherwise both in the promotional material it’s using to pitch its technology around the country and even publicly on its website. Clearview CEO Hoan Ton-That now says the NYPD is using its technology “on a demo basis,” BuzzFeed reports.
In a blog post published on Thursday responding to criticism, Clearview says it has rejected the idea of producing a public, consumer-facing facial recognition app that could be accessed by anyone.
“Clearview’s app is not available to the public. While many people have advised us that a public version would be more profitable, we have rejected the idea,” the post reads. “Clearview exists to help law enforcement agencies solve the toughest cases, and our technology comes with strict guidelines and safeguards to ensure investigators use it for its intended purpose only.”
Clearview has quickly risen to the forefront of the national conversation around facial recognition technology — in particular, growing concern among activists and politicians over how it may be used to violate civil rights and whether it’s being adopted too quickly based on false or misleading claims about its effectiveness. Amazon, which makes a cloud-based facial recognition product called Rekognition, has faced similar criticism for selling its technology to law enforcement despite repeated concerns from academics and activists who say it is flawed when used to try to identify darker-skinned and female individuals.
Clearview is also facing challenges from platforms in the wake of the NYT report. Twitter has sent Clearview a cease-and-desist letter demanding that the company stop scraping its platform for photos to include in its database. Twitter also demanded the company delete any existing data it may have obtained from the platform because using it to fill out a third-party database without user consent is against Twitter’s policies. Clearview has acknowledged publicly that it built out its database in part by scraping social media profiles.
Members of Congress are also expressing concerns over the product. Sen. Ed Markey (D-MA), a vocal critic of Silicon Valley privacy practices and overreach, sent a letter to Ton-That earlier this month demanding the company provide crucial information about its practices and technology. The list of questions covers which law enforcement agencies Clearview is working with, the results of any internal bias and accuracy tests, whether the company plans to market its technology to individuals or third-party companies beyond law enforcement, and its child privacy protections, among other information.
“The ways in which this technology could be weaponized are vast and disturbing. Using Clearview’s technology, a criminal could easily find out where someone walking down the street lives or works. A foreign adversary could quickly gather information about targeted individuals for blackmail purposes,” reads Markey’s letter. “Clearview’s product appears to pose particularly chilling privacy risks, and I am deeply concerned that it is capable of fundamentally dismantling Americans’ expectation that they can move, assemble, or simply appear in public without being identified.”
In one particularly dystopian twist, The New York Times reported that Clearview had identified and reached out to police officers who may have been talking with journalists by checking logs of which officers uploaded photos of those journalists into Clearview’s app. “It’s extremely troubling that this company may have monitored usage specifically to tamp down on questions from journalists about the legality of their app,” Sen. Ron Wyden (D-OR) tweeted last Sunday.
> It’s extremely troubling that this company may have monitored usage specifically to tamp down questions from journalists about the legality of their app. Everyday we witness a growing need for strong federal laws to protect Americans’ privacy.
>
> — Ron Wyden (@RonWyden) January 19, 2020