Congress Must Pass Kids Online Safety Act to Make Social Media Safe for Youth

Okolo: Bill would protect kids' personal information, turn off addictive product features and require companies to minimize mental health risks.

On Jan. 31, the Senate Judiciary Committee held a hearing on the dangers young people face when navigating social media platforms. The emotionally charged event drew a diverse audience and featured heart-wrenching stories that underscored the severe impact of online bullying, harassment and, in particular, sexual exploitation on children’s mental health and safety. Senators questioned social media executives on what steps, if any, they would commit to taking to protect young users on their platforms.

At The Jed Foundation, we advocate for a safety-first approach, aligning with the surgeon general’s advisory on Social Media and Youth Mental Health. We know young people find community online. What we’re hoping is that they find safe communities — and it’s time to insist that platforms prioritize children’s safety. That’s why JED has endorsed the Kids Online Safety Act, which requires social media platforms to protect young users by enabling them to protect their personal information, turn off addictive product features and opt out of receiving recommendations made by algorithms.

The bill would also require social media companies to address and minimize mental health risks to young people by more effectively controlling content that encourages self-harm, suicide, eating disorders, substance abuse and sexual exploitation. In addition, it would ensure that academic researchers and nonprofit organizations can access essential datasets in order to further explore the links between youth social media use and mental health.  

Social media platforms, like all companies, exist to pursue profit, which is why they cannot be trusted to regulate themselves. There needs to be more transparency and accountability, especially when it comes to products used by minors. Here are some steps that JED recommends Congress take:

Establish a regulatory agency exclusively dedicated to safeguarding digital and online safety.

Social media companies made over $11 billion in revenue advertising to Americans under 18 in 2022 and remain one of the only unregulated industries marketing to minors. During the hearing, bipartisan support surfaced for creating a regulatory agency. Executives from major social media platforms, including X, Meta, Snap, Discord and TikTok, expressed openness to the idea. The agency would ensure that these companies comply with safety regulations, are held accountable for violations and prioritize the well-being of users in both practice and policy.

Support federal regulations designed to limit the harmful aspects of social media and maximize protections of users.

Social media companies should be required to leverage algorithms to surface supportive mental health content, build in time limits and digital breaks for young users, and seek out and ban content that encourages harmful behaviors, including suicide, self-injury, eating disorders and cyberbullying. It is also crucial that platforms reduce features that encourage overuse, such as video autoplay, push notifications and rewards for engaging online.

Invest in high-quality, large-scale research into interventions, protective policies, and the short- and long-term effects of social media on mental health.

While Meta CEO Mark Zuckerberg said that studies linking social media use and mental health issues in teens do not show cause and effect, there is ample evidence demonstrating risks from exposure to harmful content and cyberbullying, as well as chronic online engagement. Social media platforms also use features that encourage young people to compare themselves to others, which has been linked with body image issues and eating disorders. “Likes,” follower counts and viewership rates all push users to base their personal worth on these comparisons.

Much is known about social media and mental health, but there is also a lot yet to be discovered. The current research is insufficient because platforms are not required to share data. This is why a safety-first approach is so important. Collaboration between technology companies and independent researchers is an essential next step in understanding how social media use affects youth mental health.

Require social media companies to establish and implement data transparency policies.

Social media has been around for more than 20 years, and still there is a lack of clarity about how each platform operates. It’s essential that these companies make clear to young users what factors — such as engagement “streaks” on Snap or autoplay on Instagram — shape their online experiences; give them more autonomy in choosing the type of content that appears in their feeds; and provide parents and caregivers with greater control to tailor their children’s experiences.

Young people spend a large portion of their lives on social media. They need to be able to do so without serious risk of harm. There is bipartisan support for creating change in the industry — and unanimous support for the bill in the Senate Judiciary Committee. With that momentum, there is a clear opportunity to make youth well-being and mental health a priority. Youth mental health advocates have a responsibility to hold companies accountable, eliminate what is harmful and ensure a positive, safe social media experience for all young people.
