Social media platforms have evolved from communication tools into highly addictive environments, particularly for minors. Platforms like Instagram and TikTok use advanced algorithms to track user behavior, pushing personalized content that keeps users engaged for hours. This isn't just about entertainment; it's about fostering dependency in young, impressionable users. Rates of anxiety and depression continue to rise, and states like New York have begun to take action. New York's Stop Addictive Feeds Exploitation (SAFE) for Kids Act marks a significant step toward protecting children from these manipulative algorithms. It is time for Connecticut to follow suit.

Adolescent mental health in the U.S. is at a tipping point. In 2023, 53% of high school girls reported persistent sadness or hopelessness, up from 36% in 2011. Even more troubling, 27% of those girls seriously considered suicide in 2023, a sharp rise from 19% in 2011. These disturbing trends coincide with the rapid expansion of social media use. According to a survey by Common Sense Media, screen time for children aged 8 to 12 grew by 17% between 2019 and 2021, from an average of 4 hours and 44 minutes per day to approximately 5 hours and 32 minutes per day. Teens logged even more, rising from 7 hours and 22 minutes per day in 2019 to more than 8 hours per day in 2021.

A concerning link emerges: children who spend more than three hours per day on social media are twice as likely to report poor mental health, including anxiety and depression. With 95% of youth now using social media and more than a third using it "almost constantly," these risks are widespread. The algorithms driving platforms like Instagram and TikTok keep children endlessly scrolling, often serving content that promotes harmful behaviors and deepens emotional struggles for vulnerable users. Correlation doesn't prove causation, and social media is unlikely to be the sole cause, but the relationship between rising screen time and worsening adolescent mental health is too stark to ignore.

Federal efforts like the Kids Online Safety Act (KOSA) aim to keep children safe online, but they miss a crucial piece: the addictive nature of these algorithms. While KOSA encourages platforms to limit harmful content, it stops short of directly tackling the powerful algorithms that drive kids' social media addiction, leaving a gap that New York's SAFE Act aims to fill. Unlike federal measures, the SAFE Act empowers New York's Attorney General to enforce its provisions, holding platforms accountable if they fail to protect young users.

Connecticut's existing data privacy protections are valuable steps toward safeguarding personal information. The Connecticut Data Privacy Act (CTDPA) grants individuals rights to access, correct, and delete their personal data and includes stringent breach notification requirements. The state also mandates transparency and consent for certain data practices to ensure responsible handling of personal data.

However, these measures fail to address the real threat to youth mental health: the mechanics of algorithm-driven feeds. Current protections do little to prevent children from being drawn into endless scrolling and are insufficient to counter platforms designed to maximize screen time at the expense of young people's mental health. New York's SAFE Act directly targets these issues. By mandating a reverse-chronological feed for minors, the law limits the stream of attention-grabbing, algorithm-selected content that keeps children glued to their screens. It also restricts late-night notifications, protecting sleep, a crucial factor in mental health. Social media regulation is about more than safeguarding privacy; it's about actively protecting young people's well-being.

Some critics argue that regulating how content is delivered infringes on free speech or parental autonomy. But the SAFE Act doesn’t control what content minors can access—it regulates how that content is presented, targeting the manipulative design tactics that keep kids attached to their screens. This isn’t a free speech issue—it’s a public health one. Protecting children from addiction-driven features is necessary for their well-being.

Others claim that existing parental controls, like Instagram’s Quiet Mode and Family Center, are sufficient. But these tools are voluntary and require active engagement from parents, who may lack the time or tech knowledge to fully utilize them. The SAFE Act goes further by mandating that platforms take proactive steps to protect minors, regardless of parental involvement, because relying solely on parents is insufficient when algorithms are designed to undermine those very efforts.

The teenage mental health crisis isn't slowing down, and social media addiction is clearly a contributing factor. Connecticut can no longer afford to delay. Social media regulation is more than just policy; it's a lifeline for the mental and emotional well-being of our youth. Connecticut must make a choice: protect our kids, or let social media dictate their well-being. The time to act is now.

Anna Szesnat is a junior at Trinity College, majoring in Public Policy & Law and Religious Studies.
