February 27, 2025

Gov. Ned Lamont’s chief innovation officer and top economic adviser forcefully warned lawmakers Wednesday against adopting an ambitious Senate bill that would put Connecticut at the forefront of regulating artificial intelligence, a technology likened to a second industrial revolution.

Testifying via video from a trade mission to India, Dan O’Keefe said the bill was a premature and risky effort to regulate a fast-evolving technology and would chill Connecticut’s economy at a time when the state finally has distanced itself from a chronic fiscal crisis and a flat economy.

“We end up having that chilling effect. We see a suppression of economic creativity. We see a suppression of innovation,” O’Keefe said. It would send “a message that says, ‘Because we don’t understand this yet, you can’t innovate here. You can’t take risks here.’ ”

The bill’s author, Sen. James Maroney, D-Milford, who has become a national leader among state lawmakers on data privacy and AI, listened intently, resting his bearded chin in his cupped hand, occasionally interrupting with a question or a challenge.

Sen. James Maroney listens to Commissioner Dan O’Keefe’s strenuous complaints about his AI bill. Credit: Mark Pazniokas / ctmirror.org

The Lamont administration’s opposition, along with skepticism expressed by House Speaker Matt Ritter, D-Hartford, outside the hearing room, signals that Maroney’s measure, deemed a Senate Democratic majority priority as Senate Bill 2, will have trouble advancing beyond the state Senate for a second consecutive year.

Ritter declined to call the bill for a vote last year and is disinclined to do so now.

Last year’s argument that regulation of generative AI should rest with Congress and, perhaps, the Federal Trade Commission and other watchdog agencies has all but disappeared with the election of President Donald J. Trump and his reliance on tech billionaire Elon Musk to defang federal regulators.

“The outcome of that election makes [federal action] impossible for the next four years, and so I think it’s left to the states to do what they can,” said Senate President Pro Tem Martin M. Looney, D-New Haven.

O’Keefe, a tech investor who was a well-connected national fundraiser for Joe Biden, Barack Obama and other Democrats before joining the Lamont administration, did not disagree with that assessment. 

But he and Maroney, who have consulted and sparred over this issue for more than a year, are at odds over what form that regulation should take, whether there is sufficient buy-in from other states to command the attention of the tech industry and, most crucially, whether the time for action has arrived.

“We’re too early here. We’re a state that represents 1% of the U.S. population. It is super unclear to me why Connecticut should be the only state in the region, the first state in the region, only the second beyond Colorado,” to define the parameters of AI regulation, O’Keefe said.

Colorado passed a similar bill last year, answering one objection raised when Connecticut was trying to be the first. But Gov. Jared Polis of Colorado already is seeking revisions to the bill he signed last year “with reservations,” and skeptics did not yet see a critical mass.

“Colorado’s not enough. I tend to agree with the governor on that,” Ritter said.

Virginia has passed an AI bill, though a gubernatorial signature is not assured. Bills are under consideration in Rhode Island, New York and Massachusetts, Maroney said.

Maroney half-jokingly blames Senate Majority Leader Bob Duff, D-Norwalk, an early proponent of data privacy protection — and of exploring the state’s role in protecting children from social media and other online abuse — “for sending me down this rabbit hole.”

Connecticut passed a data privacy bill in 2022, and Maroney and Duff approached the issue in concert with the formal and informal networks of state lawmakers who see state legislatures as new power centers in an era of gridlock in Congress. Duff said Maroney has become a visible and respected leader in those networks.

“We were the fifth state in the country to do so,” Maroney said of the data privacy law. “And really at that time is when I became interested in automated decision making, and actually, in 2022, before ChatGPT came out, we had already launched our first task force to look at AI.”

Maroney has crafted the bill as co-chair of the General Law Committee, which historically has been focused on consumer protection and the regulation of everything from liquor prices to the retail sales of cannabis. Senate Bill 2 was one of nine bills up for a public hearing Wednesday. Others dealt with refunds for unused heating oil, real estate contracts, drug production standards and legal revisions sought by the Department of Consumer Protection.

His co-chair, Rep. Roland Lemar, D-New Haven, mildly noted to O’Keefe that some academic research argues that regulatory structures are best birthed at the beginning of industrial and technological revolutions, not after they take flight and eventually become entrenched.

O’Keefe did not respond. He testified from Bengaluru, speaking long after sunset to lawmakers who began their hearing at 10 a.m. O’Keefe was hired as the state’s chief innovation officer and has since taken on the additional post of commissioner of economic and community development.

Maroney sees urgency in ensuring that AI does not, among other things, enable banks, landlords and employers to use discriminatory algorithms in screening applicants for credit, homes or jobs. His bill would create liability for “high-risk” AI developers, not just the end users. “High risk” refers to applications in which AI is a “substantial factor in making a consequential decision.”

That alarms some users of AI, who note that how quickly and widely the technology has become embedded makes regulation a challenge.

Trinity Health of New England, which owns St. Francis Hospital and Medical Center in Hartford and five other hospitals, filed written testimony urging that health care be exempted, saying AI is being used for everything from identifying anomalies in imaging to patient scheduling.

“SB 2, by its own definitions, reaches practically every use of AI in healthcare. The intermingled and confusing language for when AI restrictions and controls might not apply in health care are insufficient, confusing, vague, and in many cases will make its application impossible and prevent the use of innovation to improve health care,” Trinity said.

Maroney’s bill also would criminalize the production and dissemination of so-called deep fakes and other synthetic images that harm a person artificially depicted. The harm includes subjecting someone “to hatred, contempt, ridicule, physical injury, financial injury, psychological harm or serious emotional distress.”

O’Keefe said elements of the bill are constructive, especially sections promoting training and economic development. But he said the potential unintended consequences in the regulatory sections are endless, particularly in a state where AI already is widely used in insurance, finance, advanced manufacturing and research.