
The Growing Concerns of Generative AI and Child Sexual Exploitation

12-13-2024

Generative artificial intelligence (GAI) has revolutionized how we interact with technology, offering innovative tools and solutions. However, this technology also introduces significant risks, particularly to children. At the National Center for Missing & Exploited Children (NCMEC), we are deeply concerned about the potential for GAI to be used in ways that sexually exploit and harm children.

Over the past two years, NCMEC’s CyberTipline has received more than 7,000 reports related to GAI-generated child exploitation. These reports represent only the cases we are aware of; there are likely many more unreported or unidentified instances. As this technology becomes more pervasive and public awareness grows, we expect these numbers to rise.

GAI technology enables the creation of fake imagery, including synthetic media, digital forgeries, and nude images of children, through tools like “nudify” apps. These fabricated images can cause tremendous harm to children, including harassment, future exploitation, fear, shame, and emotional distress. Even when exploitative images are entirely fabricated, the harm to children and their families is very real.

The risks don’t stop at images. GAI is also being used to manipulate children in other ways, such as generating realistic conversational text that predators use to groom or exploit victims. Offenders have even leveraged GAI in sextortion cases, using explicit AI-generated imagery to coerce children into providing additional content or money. Financial sextortion is a growing threat, and GAI tools make it easier for offenders to target children.

These technologies represent a new frontier of child exploitation, one that requires proactive solutions and collective action. If you encounter any instance of GAI-created child sexual exploitation, here’s how you can get help:

  1. Report to the CyberTipline: If you’ve encountered child sexual exploitation content, including content created with GAI, report it to NCMEC’s CyberTipline. Your report helps connect victims to resources and ensures incidents are addressed by law enforcement.
  2. Use NCMEC’s Take It Down: If a sexually explicit image of you or someone you know, whether real or GAI-created, is circulating online, Take It Down can help. This tool allows individuals to anonymously request the removal of explicit images from participating platforms and social media companies.
  3. Work with Your School Personnel: For schools, addressing the risks of GAI technology requires clear codes of conduct, robust incident response plans, and comprehensive training for educators to recognize, prevent, and respond to incidents involving AI-generated exploitation or abuse. Empowering school communities with these tools is critical to creating a safe environment for children.
  4. Learn and Share Resources: Education is key to prevention. NCMEC’s NetSmartz program provides tools and resources for parents, educators, and children to understand and navigate the risks of GAI. Visit NetSmartz to access tip sheets, lesson plans, and more.

The rise of GAI child sexual exploitation is a challenge we can’t ignore. By reporting incidents, supporting victims, and educating ourselves, we can fight back against these harmful uses of technology and protect children everywhere. Visit NCMEC’s GAI issues page to learn more about what we’re seeing in this space and how we’re addressing this critical issue.

Together, we can all make sure that every child has a safe childhood.