AI facial recognition used in thousands of child exploitation cold cases


The Department of Homeland Security (DHS) is using facial recognition and AI to find child abusers and rescue victims in a major new operation to solve thousands of cold cases.
An unprecedented HSI operation is trying to solve thousands of historical child exploitation cases. Clearview facial recognition is at the center of the task force’s work. (Getty Images)

In early July, U.K. police contacted the DHS Homeland Security Investigations unit about a sexually explicit video involving a man and an infant that the British investigators believed was made in America.

In an effort to identify both the adult and the child, HSI ran the pair’s faces through an undisclosed facial recognition tool that scanned a mass database of images scraped from the web and social media.

It found a match: Scott Barker, a college sports coordinator in Ashland, Missouri, according to a search warrant reviewed by Forbes.

Investigators reviewed Barker’s Facebook profile and found photos that further corroborated the facial recognition match as well as what appeared to be photos of the child in the video, according to the warrant.

Two weeks later, Barker was arrested and charged with one count of sexual exploitation of a child. An indictment has yet to be filed, and as a result Barker has not yet entered a plea, his lawyer, federal defender Troy Stabenow, told Forbes.

The Barker investigation provides a rare insight into how HSI is using facial recognition tools like Clearview AI to quickly chase down new child exploitation leads.

But HSI is also using this type of technology in an unprecedented three-week operation to solve years-old crimes, one that has led to hundreds of identifications of children and abusers, according to Jim Cole, who spent more than two decades fighting crimes against minors at HSI and who pushed for the initiative before retiring earlier this year.

Cole told Forbes the previously unreported task force started operating out of the HSI Cyber Crime Center in mid-July and ended on August 4.

“Facial recognition can never be the basis for probable cause.”

Jim Cole, former HSI child exploitation investigator

“No single effort like this has resulted in that amount of identifications in such a short period of time,” Cole told Forbes. “The tech used can assimilate the data and put that puzzle together. Before, we didn’t have the pieces.”

HSI declined to confirm or comment on the operation’s existence.

Cole declined to name the tools that were used, but sources with knowledge of the operation told Forbes one of them was the controversial facial recognition technology created by Clearview AI.

The New York City–based startup claims to have amassed a database of more than 30 billion images scraped without permission from places such as Facebook, Instagram, and LinkedIn.

HSI has signed multiple contracts with Clearview worth up to $2 million, and Clearview has previously said its tech was used by HSI to investigate child exploitation.

The same sources told Forbes that Clearview and other AI tools were used to scan huge caches of child exploitation material captured by HSI as well as Interpol’s Child Sexual Exploitation database, which contains more than 4.3 million images and videos of abuse.

One source said the repositories contained information on thousands of cases that have gone unsolved in recent years.

Cole said each image containing exploitation material is cropped so that only faces are uploaded to Clearview’s servers. Each face is given a signature, which HSI staff auditing the tool’s use can later search.

Every query is logged and auditors monitor appropriate use, Cole said, noting that all data “sits behind an encrypted wall,” so Clearview employees can’t access it.
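The safeguards Cole describes can be sketched roughly as follows. This is a purely illustrative sketch, not Clearview’s or HSI’s actual system: the real signature scheme, log store, and access controls are not public, so the hash-based signature, in-memory audit log, and function names below are all assumptions made for the example.

```python
# Illustrative sketch of the described safeguards (crop -> signature -> logged query).
# All names and mechanics here are hypothetical, not the real Clearview/HSI pipeline.
import hashlib
from datetime import datetime, timezone

# In practice this would be an append-only, access-controlled store,
# not an in-memory list.
AUDIT_LOG = []


def face_signature(cropped_face_bytes: bytes) -> str:
    """Derive a stable signature for a cropped face image.

    The article says each uploaded face is given a searchable signature;
    a cryptographic hash of the cropped image is one generic way to do that.
    """
    return hashlib.sha256(cropped_face_bytes).hexdigest()


def submit_query(cropped_face_bytes: bytes, investigator_id: str, case_id: str) -> str:
    """Submit a face for matching, recording an audit entry for every query."""
    sig = face_signature(cropped_face_bytes)
    AUDIT_LOG.append({
        "signature": sig,
        "investigator": investigator_id,
        "case": case_id,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    })
    return sig


def audit_queries_for(signature: str) -> list:
    """Let auditors retrieve every logged query made with a given signature."""
    return [entry for entry in AUDIT_LOG if entry["signature"] == signature]
```

Because the signature is derived only from the cropped face and every lookup is logged against it, an auditor can later reconstruct who searched a given face and under which case, which is the kind of monitoring for appropriate use Cole describes.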

An effort of this scope had previously been infeasible, Cole explained, because the HSI Cyber Crimes Lab didn’t have the resources to ensure these types of precautions when searching such material at scale.

Cole added that facial recognition should only ever be used for intelligence leads, not as a reason to make an arrest. “Facial recognition can never be the basis for probable cause. That’s a pretty major safeguard.”

Clearview and Interpol did not respond to a request for comment.

While law enforcement agencies see great value in Clearview as a tool to identify and rescue children from abuse, privacy advocates worry about its massive scale in the absence of regulation.

“All too often, we have seen mission creep in government use of surveillance technology,” said Adam Schwartz, senior staff attorney at the Electronic Frontier Foundation. “Once the government starts using face recognition, even for sympathetic purposes, it will inevitably start using it for more problematic purposes – like identifying protesters.”

Late last week, the New York Times reported on the sixth case of facial recognition leading to a wrongful arrest in Detroit, in which a pregnant woman was falsely tied to a carjacking.

All six people wrongfully arrested have been Black, three of them in Detroit.

Meanwhile, Clearview continues to sign contracts in the U.S. and beyond. It’s been used in Ukraine to identify dead Russian soldiers and in January of this year, it signed a $120,000 deal with the FBI to assist in unspecified “crimes against children” investigations.

Cole hopes the HSI initiative will change critics’ perception of how law enforcement uses the technology. “I hope that this is used as an example of how the tech can be used ethically, appropriately, how the guidelines and safeguards over the technology work, and can be applied for incredibly positive activities.”

All figures are in USD.
