In this photo illustration, social media apps are displayed on an iPad. As concerns over youth mental health and privacy grow, more states are hoping to rein in the negative effects of social media through the courts and legislatures. (Joe Raedle/Getty Images)

Ritika Shroff had the typical Gen Z experience with social media. At 13, she signed up for Instagram, then Snapchat. Later, she downloaded TikTok and worked her way through other popular platforms.

But in high school, she began to see downsides, feeling pressure when comparing her number of followers, test scores and experiences with those of her peers online.

“They’re doing X, Y and Z with their lives, and I think I got pulled into it,” Shroff said.

Today, Shroff, a 19-year-old sophomore at American University in Washington, D.C., still sees the benefits of social media, such as allowing her to stay in touch with hometown friends from Des Moines, Iowa, and family in India. While she thinks there should be more rules around social media, she doesn’t think individual state actions, such as a state suing a platform, would make much difference.

“These small things won’t make an impact in the broader landscape,” Shroff said.

More states are hoping to rein in the harm that social media can do to teens’ mental health and privacy by approving laws that require age verification or parental consent, prohibit “addictive feeds” or ban the apps for minors. They also are taking social media companies to court.

But some experts say such efforts won’t make social media any safer. Instead, they fear the moves might infringe on people’s privacy and First Amendment rights — while potentially making the platforms harder for everyone to use.

“This is global media, and trying to regulate it at the micro level … the fear for a lot of people is that we’re going to end up with different rules for different states, which is just going to undercut the whole promise and potential of internet-based media and communication,” said Kevin Goldberg of the Freedom Forum, a nonprofit aimed at protecting First Amendment rights.

Some social media disputes are playing out at the federal level. Last week, the U.S. Supreme Court upheld a bipartisan federal law banning TikTok, a popular video sharing platform, unless its China-based parent company agreed to sell the app. The ban briefly went into effect before President Donald Trump, who had tried unsuccessfully to ban TikTok by executive order in his first term, signed an executive order delaying it for another 75 days.

But absent other federal action to curb social media’s effects on young people, many states are considering new legislation. In New York, a law enacted in June prohibits social media platforms from providing to minors so-called addictive feeds without parental consent. New York Attorney General Letitia James, a Democrat, is drafting formal rules to enforce the law.

Social media feeds are designed to keep kids scrolling longer and longer to drive up ad revenue, noted Democratic state Sen. Andrew Gounardes, who sponsored the bill. Kids who are addicted to social media suffer mental health issues, and people who spend more time scrolling tend to struggle to navigate real-life relationships, he argued.

“So social media, for all the positives that might exist, has some real, deeply negative and dark downsides that we are finally seeing manifest, and we have to reconcile it,” Gounardes said.

But tech developers are concerned new state laws could weaken privacy protections for users, take away online mental health resources for marginalized communities and restrict the flow of online information, said Paul Lekas, the senior vice president and head of global public policy and government affairs at the Software & Information Industry Association, a trade association representing the digital content industry.

“The bills are all different, so it’s hard to say that all of them are good or all of them are bad,” Lekas said. “But a lot of concerns come up in a number of these bills.”

Age restrictions

Some research suggests that excessive social media use is worsening young people’s mental health. Teens who spend the most time on social media are significantly more likely to exhibit negative emotions, such as sadness and anger, according to a 2023 Gallup poll.

A Florida law that went into effect this month prohibits children under 14 from having social media accounts. Users who are 14 or 15 must get parental consent before opening an account.

Ashley Moody, Florida’s Republican attorney general at the time, agreed not to enforce the law while a lawsuit alleging it would restrict minors’ freedom of speech plays out. Moody was sworn into the U.S. Senate this week to replace Sen. Marco Rubio, the new U.S. secretary of state.

More measures are expected across the country during 2025 legislative sessions.

A new bill in Indiana would prohibit anyone under the age of 16 from creating social media accounts without verified parental permission. A similar bill was introduced in Nebraska, but with an age limit of 18. A prefiled bill in Nevada would set the age at 13.

To verify age, some apps may require all users to upload a photo of their ID. This could be of particular concern for adult users who would have their full legal identity tied to their social media account, said Ash Johnson, a senior policy manager at the Information Technology & Innovation Foundation, a think tank focused on public policy surrounding technology.

Rather than an outright ban on social media accounts for users under a certain age, increasing transparency and accountability measures for social media developers would improve the safety of the apps, Johnson said.

She pointed to California as an example. The state’s Age-Appropriate Design Code Act was partially blocked from enforcement by a federal appeals court last year. It would have required companies to ensure that online services likely to be accessed by children are designed to eliminate the risk of harm to them.

Parental controls, Johnson said, also could make it easier for parents to oversee their child’s media presence by deciding what content they can access.

Instagram’s new Teen Accounts, for example, automatically place teenage users into an account that limits who can contact them and the content they see — and anyone under the age of 16 will have to get parental permission before changing any of the safety features.

“It would give children a really customizable experience on social media depending on their individual developmental needs,” Johnson said.

Many of the laws around the country are specifically designed to keep younger people off entire social media platforms or away from certain content online, said Goldberg, of the Freedom Forum. Changing the way social media developers control who can and can't have an account could change what people see on their feeds.

“We’ve seen a lot of this, especially at the state level, which is concerning,” he said. “Many of the laws that we are seeing proposed — and even passed — raise First Amendment concerns.”

States go to court

States also are turning to lawsuits to address social media's effects on young people.

In October, attorneys general in California, Illinois, Kentucky, Louisiana, Massachusetts, Mississippi, New Jersey, New York, North Carolina, Oregon, South Carolina, Vermont, Washington and the District of Columbia sued TikTok, alleging violations of state consumer protection laws.

Led by California Democratic Attorney General Rob Bonta and James of New York, the lawsuits allege that TikTok exploits and harms young users and deceives the public about the social media platform’s dangers.

Texas Republican Attorney General Ken Paxton filed a similar suit that same month accusing TikTok of violating a state law protecting children online. The law prohibits digital service providers from sharing, disclosing or selling a minor’s personal information without permission from a parent.

TikTok has disputed the claims, calling them “inaccurate and misleading” in a statement to CNN. The company says its platform is safe for kids and offers time limits and parental controls.

States have also taken aim at Snapchat and Meta. In September, New Mexico Attorney General Raúl Torrez, a Democrat, filed a complaint against Snap Inc., Snapchat's parent company, alleging the app's developers ignored reports of sextortion, failed to implement age-verification rules and built features that connect minors with adults, among other claims.

And in 2023, more than 40 states sued Meta, claiming Instagram and Facebook worsened the youth mental health crisis.

The social media companies need to be held accountable, said Julie Scelfo, founder of Mothers Against Media Addiction.

Scelfo, a career journalist who covered youth mental health for years, said she was disturbed to learn that suicidal thoughts were becoming more common among young children as social media became more mainstream.

“Social media can connect people for positive things, but it has also been a very convenient conduit for all of the worst forces in society,” Scelfo said.

But tech companies are winning some fights — and going on the offensive.

In addition to the partial block of the Age-Appropriate Design Code Act, a federal judge has blocked, until Feb. 1, another California law designed to protect children from addictive feeds. The Protecting Our Kids from Social Media Addiction Act would prevent social media platforms from providing minors with “personalized feeds.”

Across the states, companies are challenging dozens of laws restricting social media — and in some cases, they’re winning.

“I think that shows that courts are skeptical that either there’s no proof behind the goals of the legislators or that they’re not being precise enough,” Goldberg said. “So, I’m skeptical. I don’t think this is going to help because there will always be ways for children to access content on the internet or social media — it’s almost impossible to truly enforce.”

Stateline is part of States Newsroom, a nonprofit news network supported by grants and a coalition of donors as a 501(c)(3) public charity. Stateline maintains editorial independence. Contact Editor Scott S. Greenberger for questions: info@stateline.org.
