When facial recognition is used without the public's knowledge, controversial questions arise. So when news broke that facial recognition software company Clearview AI had supposedly been sharing images of the public's faces with over 2,200 organizations, you can imagine the uproar.
Clearview AI's client list comprises government agencies such as the Department of Justice, the FBI, and Interpol, as well as private companies like Macy's and Best Buy.
A leaked client list was shared online by BuzzFeed News.
3 billion scraped images
Clearview AI's database includes around three billion images that have been scraped from social media and other sites. The intentions behind Clearview AI's scraping could be viewed as commendable, as the company aims to help law enforcement find and catch persons of interest.
What's controversial, however, is that this has all been happening under wraps.
Using these scraped images, organizations can find out a person's name, where they live, and other personal information within minutes.
Some companies have voiced strong objections to Clearview AI's practice of operating behind closed doors. Twitter, YouTube, and LinkedIn have all sent cease-and-desist letters to the company, and Facebook has demanded that the firm "stop accessing or using information from Facebook or Instagram."
The scraped images are not staying within the United States, either. Clearview AI has supposedly been working with a sovereign wealth fund in the United Arab Emirates, as well as the Royal Canadian Mounted Police. The latter has used Clearview AI's software for the past four months to find online child predators, and successfully rescued two children thanks to the technology.
Clearview AI's CEO, Hoan Ton-That, has insisted that the company is entitled to use these images because they have been publicly shared online. Moreover, the company stresses that it is not a "consumer application" available to the public.
The New York-based startup is under fire at the moment, and only time will tell what the outcome will be.