Protecting Children Online Takes Technology, Human Oversight and Accountability

Carter: Governments, legislators, companies and parents must work together to prevent abuse and exploitation on gaming platforms and social media.

The rise of social media and online gaming has transformed how children interact, learn and play. While platforms like Roblox, Discord, TikTok and Instagram offer opportunities for connection and creativity, they also present serious risks, as online predators, traffickers and exploitative individuals increasingly use these platforms to groom, exploit and manipulate minors.

A 2023 study by the National Center for Missing & Exploited Children reported 32 million instances of child sexual abuse material being flagged across social media and gaming platforms. Interpol has also warned about the rising number of cases where predators use these platforms to groom children through manipulation, coercion and deception.

A recent lawsuit charging that Roblox and Discord “facilitated the criminal victimization of a 13-year-old child” highlights the systemic vulnerabilities that make young people easy targets. Despite Roblox’s safety and moderation policies, which cite “frequent” audits and improvements to its algorithms to detect and block behavior that violates its Terms of Use, the lawsuit says loopholes allowed inappropriate content and predator interactions to persist. Discord, known for its private, invitation-only servers, has been cited multiple times for hosting unmoderated spaces where illicit activity thrives.

In fact, the National Society for the Prevention of Cruelty to Children found that platforms like Instagram, Snapchat and TikTok were responsible for over 80% of online grooming cases in the U.K. With children spending an average of four to seven hours daily online, exposure to potential harm is greater than ever.

While no single group or entity can solve this crisis alone, social media, gaming and tech companies must prioritize user safety. Rapid advances in technology and artificial intelligence have made safeguards more streamlined and automated to implement than ever before. But technology enhancements alone are not enough; a multi-pronged approach that combines technology, human oversight and accountability is necessary, with changes such as:

  • Stronger content moderation: AI-powered filtering is helpful but flawed. More human oversight is needed to identify harmful content and shut down predator accounts.
  • Improved reporting mechanisms: Users should have an easy, one-click way to report inappropriate content, with clear follow-up actions.
  • Age verification enhancements: Current systems are easily bypassed. More stringent ID-based verification should be mandatory.
  • Proactive predator detection: Platforms should use behavioral analysis to flag predatory activity before harm occurs.
  • Increased transparency and accountability: Lawsuits like the one against Roblox and Discord prove that self-regulation isn’t enough. Social media companies must be held legally responsible for failing to protect children.

With so many stakeholders — tech companies, parents, lawmakers and law enforcement — it’s easy for the legal and ethical responsibility for protecting minors to become blurred. But a crisis this large cannot be solved by any one party alone.

Parents and guardians must start having conversations with their children about online safety — not tomorrow, but tonight. They must realize they are the first line of defense in protecting young people from online predators, so extra vigilance, open dialogue and education are critical. Minors must also be taught about red flags, processes for reporting inappropriate content and the importance of talking to a trusted adult when put in a difficult or inappropriate situation, so they can navigate online spaces safely.

But while all must share some responsibility, accountability is paramount. As governments worldwide grapple with how to balance child protection and internet freedom, legislators must push for stricter regulations and penalties for tech companies that fail to safeguard children. And tech companies must begin upholding a safety-by-design standard that invests in better detection of harmful content, grooming patterns and suspicious behavior.

Protecting children in the digital age requires constant vigilance, policy enforcement and education. It’s time to turn the tide against online exploitation and create a safer digital world where children can explore, play and learn without fear.
