
Somewhere in a private database, there is almost certainly a photo of your face. Not one you submitted. Not one you consented to. One that was scraped from the internet — from social media, from news sites, from anywhere your image has ever appeared publicly — by a company most people have never heard of.
That company is called Clearview AI. And what it has built is, by every measure, the largest facial recognition database in human history. As of 2024, Clearview's database contains over 50 billion facial images — a number so large it almost certainly includes the majority of American adults, and hundreds of millions of people worldwide, none of whom ever agreed to be in it.
The way Clearview works is straightforward and, depending on your perspective, either impressive or terrifying. Upload a photo of any face. Within seconds, the system returns every other image of that person it can find online — along with the URLs where those images appeared. Name, employer, social media profiles, home neighborhood — all of it potentially surfaced from a single photograph.
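Clearview has never published its internals, but systems with this behavior are typically built from two pieces: a model that maps a face crop to a fixed-size embedding vector, and a nearest-neighbor index over billions of such vectors, each tagged with the URL it was scraped from. The sketch below is a toy illustration of that general architecture, not Clearview's actual code — the embeddings, URLs, and threshold are all made up.

```python
import numpy as np

def cosine_similarity(query, matrix):
    """Cosine similarity between one query vector and each row of a matrix."""
    query = query / np.linalg.norm(query)
    matrix = matrix / np.linalg.norm(matrix, axis=1, keepdims=True)
    return matrix @ query

class FaceIndex:
    """Minimal in-memory index: one embedding per scraped image, plus its source URL.
    A production system would use an approximate-nearest-neighbor index instead."""

    def __init__(self):
        self.vectors = []  # face embeddings
        self.urls = []     # where each image was found

    def add(self, embedding, url):
        self.vectors.append(np.asarray(embedding, dtype=float))
        self.urls.append(url)

    def search(self, query, top_k=3, threshold=0.8):
        """Return (url, score) pairs for the closest stored faces above threshold."""
        scores = cosine_similarity(np.asarray(query, dtype=float),
                                   np.vstack(self.vectors))
        order = np.argsort(scores)[::-1][:top_k]
        return [(self.urls[i], float(scores[i]))
                for i in order if scores[i] >= threshold]

# Toy demo with made-up 4-dimensional "embeddings" (a real face model
# produces hundreds of dimensions computed from pixel data).
index = FaceIndex()
index.add([0.90, 0.10, 0.00, 0.20], "https://example.com/profile-photo")
index.add([0.88, 0.12, 0.05, 0.18], "https://example.com/news-article")
index.add([0.00, 0.90, 0.40, 0.10], "https://example.com/someone-else")

matches = index.search([0.90, 0.10, 0.02, 0.20])
```

The third, dissimilar face falls below the similarity threshold and is excluded; the two near-duplicates come back with their source URLs, which is exactly the step that turns a single photo into a name, an employer, and a neighborhood. It is also the step where a near-but-wrong match produces the kind of misidentification described below.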
For years, Clearview operated almost entirely in secret. It wasn't until 2020 that journalists exposed the company, and a leaked client list revealed the scale of what had been quietly built. Over 1,800 agencies — police departments, federal law enforcement, government bodies — had been using the technology, many of them without their own cities or oversight boards knowing.
The wrongful arrest cases started surfacing almost immediately. In 2022, a man named Randal Quran Reid was pulled over in Georgia and arrested on a warrant for thefts committed in Louisiana — a state he had never visited. Facial recognition had matched his face to the wrong person. He spent six days in jail and thousands of dollars on legal fees before the charges were dropped. He was not the only one.
Every major platform whose images were scraped — Facebook, Google, Twitter, YouTube, Venmo — sent Clearview cease-and-desist letters. Clearview ignored them, arguing that publicly posted images are fair game under the First Amendment. The EU fined the company and banned its operations in several countries. In the U.S., it kept growing.
What makes this particularly difficult to undo is the nature of the data itself. Unlike a password, you cannot change your face. Once your biometric data is in a system like Clearview's, there is no practical way to remove it, no way to opt out retroactively, and no reliable way to know how it has already been used.
Clearview's founder has said publicly that his eventual vision is a world where anyone can be identified anywhere, in real time, simply by pointing a phone at their face. That world doesn't fully exist yet. But the database that would make it possible already does — and your face is almost certainly already in it.
