Thu. Mar 13th, 2025

State Sen. Heather Somers, R-Groton, has experienced first-hand how artificial intelligence can be used to infringe on personal privacy.

Somers said some voters in her district had used AI to create “very suggestive” photographs of her. “It’s putting me in clothing that I would never wear, and in a pose that I would never pose,” she told the Connecticut Mirror Monday. 

Somers said she can handle it — she has a “thick skin” — but she’s concerned about young girls whose photos could be distorted and sexualized with AI, then circulated on the internet by the people who create them. And she worked to develop legislation that would criminalize that behavior.

Senate Bill 1440, introduced this year in the legislature’s Judiciary Committee, would make it a crime to disseminate AI-generated sexual or “intimate” images without the knowledge or consent of the person being depicted. 

If it passes, Connecticut would join a handful of other states that have adopted similar measures to crack down on AI-generated images, specifically those considered “deepfake pornography.”

Minnesota recently proposed a bill that would ban websites and apps from allowing “nudification” technology to be used on photos and give the state’s attorney general the right to impose fines and allow people to sue the companies. California Gov. Gavin Newsom in September signed new laws prohibiting people from distributing AI-generated sexual images and requiring social media companies to create a protocol for identifying and removing non-consensual sexual images created by AI. 

The Connecticut bill would not hold social media companies responsible for images that spread across a platform.

A separate bill, introduced earlier this year by Sen. James Maroney, D-Milford, contained a similar provision prohibiting AI-generated sexual images without a person’s consent. But Maroney’s proposal, S.B. 2, is far wider-ranging, regulating the use of artificial intelligence by banks, landlords, employers and others in areas where the discriminatory nature of algorithms raises concerns about fairness and privacy. For the moment, the legislation doesn’t appear to have the support of Gov. Ned Lamont’s administration.

The narrower issue of exploitation and deepfake pornography, currently before the Judiciary Committee, could follow a different path.

“Right now, one of the most pernicious uses of AI is the deepfake nonconsensual intimate images,” Maroney said in an interview with CT Mirror Monday. Maroney said the vast majority of deepfake video is pornography and revenge porn, most of which targets women and girls. “Everyone heard about it when it happened to Taylor Swift, but the fact is, it’s happening to girls in Connecticut,” he said.

In a public hearing before the committee Monday, Somers testified about one particular instance she’d learned of from a family in her district. A high school student’s photo was altered and then circulated among other students at her school — a situation Somers called “heartbreaking” for the family, whom she described as quiet and religious.

The student’s family was ashamed, Somers said. “Horrific situation. Non-consensual. Going around a high school, destroying this young girl’s life — and, in their mind, the family’s reputation — with no recourse,” she said. 

The Office of the Public Defender pushed back against the proposal. John R. DelBarba, a lawyer with the public defender’s office, said in written testimony that the proposed legislation raised potential First Amendment issues, was too vague and had the potential to “turn an entire high school into felons within 1 school day.”

DelBarba raised a series of hypothetical questions to emphasize his point.

“Would a cartoon involving the intimate parts of a teacher now be a felony? Would a caricature that depicts the likeness of a classmate with her breasts visible be a crime? Would an X Rated Emoji now be a crime if sent to a friend?” he wrote, adding that newer smartphones have technology that allows users to create digitally altered images.

“You can simply describe what you want the Genmoji [a personalized emoji created with Apple’s AI function] to look like or create one of friends and family based on their photos and it simply appears on your screen,” DelBarba wrote. “You are only limited by your imagination.” 

Somers said during the public hearing that she disagreed with that analysis. “I don’t think you have a constitutional right to put my face on a pornographic image and put it up on the internet,” she said. “You have a right to free speech, but I don’t think that you have a right to do that.”  

DelBarba said the public defender’s office was working with the General Law Committee to address similar concerns with Maroney’s bill. 

State Sen. Gary Winfield, D-New Haven, told the CT Mirror that he also had concerns with Somers’ bill. He said it still needs work. 

“There are clearly execution issues. And there seems to be logic issues as well,” he said. “Why [would you] endeavor to do this for AI, but … if I did a hyper-realistic drawing, that’s fine? It doesn’t flow logically.”