By Heather Barnhart
Child exploitation is evolving and accelerating as generative AI enables predators to create deepfake content involving minors without ever making contact with a victim. In 2024, the National Center for Missing and Exploited Children (NCMEC) received more than 20 million reports through its CyberTipline, and reports involving AI-generated content rose 1,325% over the previous year.
This surge reflects a disturbing shift: generative AI is rapidly accelerating the creation and spread of nonconsensual, exploitative deepfake content, particularly content involving minors.
In response, First Lady Melania Trump made this issue a key focus of her platform early in President Trump’s second term. On May 19, 2025, the president signed the Take It Down Act into law. The act prohibits the nonconsensual online publication of intimate visual depictions of individuals, whether authentic or AI-generated, and mandates that websites and social media platforms promptly remove such content upon notification. It also requires the removal of duplicate images and bans the online posting of sexually explicit material intended to abuse or harass a child or to satisfy the sexual desires of others.
Public websites, online services and apps that host user-generated content must now provide a clear process for individuals to report and request the removal of this material.
What the Take It Down Act means for public safety agencies
While the Take It Down Act represents a landmark step in combating online child exploitation, it also risks further straining an already overwhelmed system. Although this is a rare issue with strong bipartisan support, the protections outlined in the law, such as the prohibition on nonconsensual online publication of intimate images of minors, can only be effective if law enforcement is equipped to act on them.
The reality, however, is that investigative professionals and agencies are already stretched thin. In fact, 68% of investigators report they don’t have enough time to manage their current caseloads.
To fulfill the promise of the new law, agencies must be provided with the right tools and training to combat both existing and emerging forms of online exploitation.
How law enforcement can adapt to make good on the law’s promises
Digital evidence is at the center of every one of these cases. On average, agencies spend 69 hours reviewing devices per case, and examiners are facing backlogs of three to four weeks. AI-powered digital investigative tools can help address these challenges by automating much of the data review process and rapidly surfacing relevant connections. These solutions also enhance pattern recognition and anomaly detection, accelerating investigations that involve large volumes of evidence and improving the accuracy of investigative findings.
For many local jurisdictions, persistent staffing shortages and limited resources make these challenges even more difficult. With caseloads expected to grow under the new law, examiners can save valuable time by using AI-powered technology to offload tedious tasks and focus on the more complex aspects of their work. Because these tools maintain a built-in, auditable chain of custody, they also help ensure that digital evidence is admissible in court, enabling faster prosecutions.
Importantly, leveraging these tools can also help protect the mental health of investigators. Research shows that police officers experience higher rates of post-traumatic stress disorder than the general population. By automating evidence categorization, AI tools reduce investigators’ exposure to graphic and traumatic content, helping to preserve their emotional stamina while working on these deeply distressing cases.
The Take It Down Act is a critical first step in protecting children online. But any law is only as effective as its enforcement. With the right digital forensic technology and regular training, agencies can uphold the law’s intent, manage rising caseloads and keep children safer in an increasingly digital world.
About the author
Heather Barnhart is the Senior Director of Forensic Research at Cellebrite and a SANS Institute fellow. She advises on strategic digital intelligence operations and educates both the public and industry professionals on the latest challenges in the space and how Cellebrite helps address them. For more than 23 years, Heather has worked on high-profile cases, investigating everything from child exploitation to Osama bin Laden’s digital media. She has helped law enforcement, eDiscovery firms and the federal government extract and manually decode artifacts used to solve investigations around the world.