COLUMBIA — People under the age of 18 would need parental permission to use social media under a bill legislators advanced to the House floor Tuesday.
In an effort to combat some of the same issues legislators raised as reasons to restrict social media, Meta, the company that operates Facebook and Instagram, released free videos Tuesday to teach children about internet safety.
The House Judiciary Committee voted 20-1 to advance a proposal that would require parents to give their permission for anyone under the age of 18 to download a social media app beginning March 1, 2026.
The House passed a similar bill last year 113-1, but the Senate never took it up.
A Senate bill set for subcommittee consideration Wednesday morning could present a glimmer of hope for proponents of the House bill, said Judiciary Chairman Weston Newton, a Bluffton Republican who sponsored the House version.
“This isn’t about pride of authorship. It’s about doing something in South Carolina that protects our children,” he said. “I just hope the Senate’s willing to do something this year.”
Representatives tacked on another bill that’s more similar to the proposal senators will consider Wednesday.
The “Age-Appropriate Code Design Act,” as legislators titled it, requires any websites that use children’s personal data to limit access to harmful content, provide safety tools and collect as little personal information as possible.
The goal of the proposal is to protect children without putting too many restrictions on private social media companies, said Sen. Sean Bennett, who sponsored the Senate’s version.
“I don’t want to tell you how to do your job,” the Summerville Republican said. “I just want you to make a safe product.”
Bennett, who also leads the subcommittee that will consider his bill Wednesday, said he anticipates members will need several meetings to come up with a final proposal to advance.
Even if senators decide to advance only the web design aspect of the proposal and skip the requirement of parental consent, that would be a start, Newton said.
“Action is better than inaction,” Newton said.
Social media bill
Along with requiring parent permission, the bill representatives advanced Tuesday would require social media sites to give parents a way of easily supervising their child’s activity online, block violent or graphic content, and prohibit adults from messaging minors they are not already connected with online.
Rep. Travis Moore likened the regulation to those put on other products used by children, such as cribs and car seats. Major advancements in technology require stricter rules around who can use it and how, the Roebuck Republican said.
“This is not the same as when someone’s mom kept them off AOL on a desktop in the ‘90s,” Moore said. “This stuff is in our kids’ pockets. It goes everywhere with them.”
Rep. Justin Bamberg, the sole vote against the bill last year, called the proposal “lip service legislation,” claiming its wording is so vague that it doesn’t actually do any of what it’s supposed to accomplish.
For instance, the state has no way to know that social media platforms are actually verifying the identities of children’s parents, the Democrat from Bamberg said.
“This sounds good to people in South Carolina who are asking or begging South Carolina to do something to make social media safer, but it doesn’t actually fix a single thing,” he said.
A better solution would be for parents to not give their children cellphones at all or regulate their use of social media if they’re concerned about the apps they might use, Bamberg said.
“Kids can exist without social media,” he said. “Kids can exist without unfettered access to apps on their cellphones.”
But parents often have enough to do without constantly checking in on how their children are using social media, said Rep. Kathy Landing.
“For most parents, it’s overwhelming,” the Mount Pleasant Republican said. “It’s too much.”
Meta curriculum
For parents who need extra help, Meta funded a curriculum aimed at teaching middle schoolers how to safely use the internet, the company announced Tuesday.
The curriculum, developed with nonprofit Childhelp, teaches lessons through videos and activities on recognizing grooming and sextortion scams and tells children how to get help if they encounter any sort of exploitation online, said Ravi Sinha, the company’s head of child safety policy.
That includes more general information on understanding what’s safe in a relationship, setting boundaries and asking for help when needed, Sinha said.
“It is really trying to teach (children) a skill set that they can use in the offline world and the online world, so they can start with a head start compared to where a lot of us started in terms of understanding the risks and knowing how to avoid them,” Sinha said.
The curriculum is designed to be easy to use, so teachers can use it in their classrooms, parents can show it to their children, and older mentors can teach it to younger students, Sinha said.
Protections for children are vital as the number of cases involving exploitation and bullying continues to rise, Sinha said.
Between January and October 2024, the National Center for Missing & Exploited Children received more than 456,000 reports of online enticement, which includes sextortion. That was an increase from 186,800 in all of 2023, according to the national nonprofit that Meta partnered with in creating its curriculum.
“The truth of it is that the people who are committing these terrible crimes are extremely adversarial and sophisticated,” Sinha said. “They adapt constantly. They’re highly motivated and unflinching in their cruelty.”
Meta supports legislative restrictions on its platforms for that reason, Sinha said. That includes requiring parental permission to download social media apps, making parents input their child’s age into their phone, and enabling safety features for children.
The company already uses some of those approaches through accounts designed specifically for teenagers, which use stricter privacy features and require parental permission to adjust those settings, Sinha said.
“There’s no one simple solution to prevent these types of crimes, so we have to do everything,” Sinha said.
The state needs a way to enforce that teenagers are actually using those accounts and that other social media companies are implementing similar features, Moore said.
“The fact is, there is absolutely no legal framework in place to manage this growing risk,” Moore said.