Child Internet Safety in 2025

Understanding the Risks

The digital world offers immense opportunities for learning, play, and social connection, but it also presents unprecedented risks for children. Recent investigations, including Bloomberg’s exposé on Roblox’s pedophile problem, reveal just how vulnerable young users are on popular platforms. Research shows that 1 in 12 children globally is at risk of online sexual exploitation, and more than 300 million children were affected by the non-consensual sharing or creation of sexual content in the past year alone.

Platforms like Roblox, TikTok, Snapchat, and Discord have become prime hunting grounds for predators, with offenses ranging from grooming and exploitation to cyberbullying. In the UK, for example, nearly half (48%) of recorded online grooming cases occurred on Snapchat, and Roblox’s user base of more than 79 million daily active users, half of them under age 13, makes it a particularly high-risk environment.


Platform Responses: Progress and Gaps

Roblox

Roblox has responded with new parental controls and stricter communication limits for children under 13, but internal concerns persist about whether its AI moderation tools can reliably detect grooming. Darknet forums even circulate tips on exploiting Roblox’s systems, underscoring the urgent need for stronger safeguards.

TikTok

TikTok’s Family Pairing feature allows parents to set screen time limits, control direct messaging, and view their child’s followers. Despite this, TikTok faces lawsuits, such as one from Utah alleging that its LIVE feature facilitated the sexual exploitation of minors in exchange for virtual currency.

Discord

Discord has collaborated with non-profits like Thorn to develop AI tools for detecting child sexual abuse material (CSAM) and has made some of these tools open-source. While Discord prohibits teen dating servers and explicit content, the sheer scale of its messaging system makes comprehensive moderation a challenge.
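At a technical level, detection of known abuse material generally works by hashing uploaded media and matching those hashes against databases of previously verified content; Thorn’s tooling follows this broad pattern. The sketch below is a minimal Python illustration of the hash-matching step only. The KNOWN_HASHES set, the function names, and the use of a plain cryptographic hash are illustrative assumptions, not Discord’s or Thorn’s actual implementation; production systems rely on shared industry hash lists and perceptual hashes (such as PhotoDNA) that survive resizing and re-encoding.

    import hashlib
    from pathlib import Path

    # Hypothetical database of hashes of previously verified material.
    # Real deployments query shared industry hash lists, not a local set.
    KNOWN_HASHES: set[str] = set()

    def file_hash(path: Path) -> str:
        """Return the SHA-256 hex digest of a file, read in chunks."""
        digest = hashlib.sha256()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(8192), b""):
                digest.update(chunk)
        return digest.hexdigest()

    def scan_upload(path: Path) -> bool:
        """Flag an upload for human review if its hash matches a known entry.

        A plain SHA-256 catches only byte-identical copies; perceptual
        hashes are needed to catch re-encoded or resized variants.
        """
        return file_hash(path) in KNOWN_HASHES

Hash matching is only the first stage of a moderation pipeline: flagged items are escalated to trained human reviewers and, where legally required, reported to authorities such as the National Center for Missing & Exploited Children (NCMEC).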


Government Action: Laws and Regulations

United States

The proposed Kids Online Safety Act (KOSA) would require platforms to exercise a “duty of care” toward minor users, including risk-prevention measures, stronger parental tools, and independent audits. Updates to the Children’s Online Privacy Protection Act (COPPA) also introduce new obligations, though their future is uncertain under shifting federal priorities.

United Kingdom

The UK Online Safety Act (2023) requires platforms to address illegal and harmful content, with fines of up to £18 million or 10% of global annual revenue, whichever is greater, for noncompliance. It even empowers the regulator to require scanning of encrypted messages for child abuse content, a controversial provision with significant privacy implications.

European Union

The EU Digital Services Act (DSA) enhances protections against cyberbullying, illegal content, and opaque platform practices, aiming to create a safer online environment without compromising fundamental rights.


The Double-Edged Sword of AI

AI cuts both ways in child safety online. Offenders exploit it to generate synthetic child abuse imagery and to learn grooming tactics, while defenders use it to detect CSAM and help law enforcement identify victims faster. Staying ahead of abusers requires continuous innovation and cross-sector collaboration.


What Parents and Educators Can Do Today

  1. Use Platform Safety Features
    Activate parental controls on platforms like TikTok and Roblox. For example, TikTok’s Time Away feature can enforce device breaks during meals or school.
  2. Maintain Open Communication
    Talk regularly with children about their online experiences. Ask who they interact with and teach them to recognize red flags.
  3. Stay Informed About Risks
    Keep up with the latest safety updates and reports on the platforms your children use.
  4. Collaborate Across Roles
    Work with schools, community groups, and other parents to create a broader safety net.
  5. Report and Document Incidents
    Use in-app reporting tools and, when necessary, contact local law enforcement to address suspected exploitation.