
Sen. James Maroney, left, speaks to Connecticut House Speaker Matt Ritter. (Photo by Mark Pazniokas / The Connecticut Mirror)

Lawmakers from three other states that recently passed laws governing artificial intelligence discussed how people’s privacy can be protected in future legislation written and debated in New Mexico and across the country.

Connecticut Sen. James Maroney helps lead a working group developing definitions and lessons that could inform how AI privacy legislation is crafted. The group has members from all but three states, he told the New Mexico Legislature’s Courts, Corrections & Justice Committee on Aug. 13.

In total, about 19 states have adopted laws similar to Connecticut’s, typically addressing people’s rights and the responsibilities of data controllers and data processors, Minnesota Rep. Steve Elkins said during the committee meeting.

New Mexico enacted its own AI-related law this year, but it deals only with political advertisements, not related issues like data privacy. Another bill discussed in the 2024 regular session would have provided basic guardrails on how the state can use AI, but it never reached a vote in either chamber.

Maroney and Elkins said that while no two states’ AI bills will be exactly alike, it’s important to make them similar so businesses operating nationally can comply with them more easily.

“So if you pick up Senator Maroney’s bill, or the Maryland bill, or my bill as a starting point, you’re going to be most of the way there, because all of the provisions in these acts have been thoroughly vetted and by-and-large accepted by the business community,” Elkins told the New Mexico committee.

Maryland Sen. Sara Love said these kinds of AI laws should be able to operate state by state. “However, that doesn’t mean that I was going to accept everything that (businesses) wanted.”

In most other states, Love said, the law says the data controller must limit its collection to what’s “adequate, relevant and reasonably necessary in relation to the disclosed purposes with which the data is processed.” In other words, the company just has to disclose that it has collected the data and give a reason why.

Love said she wanted something stronger in her bill, so she required data controllers to limit their collection to what’s “reasonably necessary for providing or maintaining the product or service.”

For example, a company running a mobile phone game shouldn’t be able to collect the user’s location, because that’s not necessary to provide the service, Love said.

AI decision making

AI has the potential to make decisions better, fairer and more transparent, said Christopher Moore, a professor at the Santa Fe Institute.

On the other hand, AI algorithms are often built on historical data that carries its own biases, and they often assume past patterns will continue in the future, Moore said at another presentation to the Courts, Corrections & Justice Committee later in the day on Aug. 13.

“AI treats people as statistics in some sense,” he said. “They don’t look at the individual facts about a person the way that a human decision maker might.”

Maroney said the working group is concerned about how biased algorithms can profile people and negatively impact their applications for jobs or housing.

Landlords often use AI to run background checks on prospective tenants, Moore said. Even if a landlord doesn’t intend to discriminate, the AI could still have a discriminatory effect, he said. There’s a lot of inaccurate data floating around, including eviction records for someone with a similar name or criminal charges that were dropped, he said.

It’s important that the law require companies doing this kind of algorithmic profiling to study any possible disparate impacts, Maroney said.

“We need to make sure we’re testing these algorithms before they’re making important life decisions,” he said.

It’s also important that companies governed by the law be barred from discriminating against consumers for exercising their rights, Love said, such as by charging more for a product if a consumer refuses to sign over the rights to their data.

Many AI systems are called “black boxes,” Moore said, meaning you don’t get to see how or why they produce a particular score or recommendation.

The people affected by AI, and the decision makers advised by it, need to understand the logic behind it and the kinds of errors it can make, and they need to be able to independently assess its accuracy and fairness, he said.

“AI is a great tool, but like any other tool, we need to figure out when and where it’s the right tool for the job,” Moore said. “Or do we just have to take the vendor’s word for it, that it works great and you should pay for it?”

Data privacy

Elkins said his bill borrowed language from Oregon and Delaware requiring companies to disclose where they may have sold someone’s data, and prohibiting them from selling sensitive information, such as precise location data, or using it for targeted advertising.

He said he also inserted a provision in the Minnesota bill explicitly prohibiting the re-identification of data that was purposely collected anonymously.

The Minnesota law also requires companies to report annually on their data minimization policies. Data minimization means companies shouldn’t collect data they don’t need in order to provide goods or services, and shouldn’t collect data just to sell it to someone else.

Love said Maryland’s law allows people to access their data being collected and processed, correct inaccuracies and obtain copies. It also allows people to opt out of having their data processed for targeted advertising, sold, or used to profile them for consequential decisions, she said.

New Mexico Sen. Antoinette Sedillo Lopez (D-Albuquerque) asked what opting out means.

Elkins said many state laws say that if you don’t want companies to use your data to rate you for auto insurance or for an apartment, you can opt out of having your data used for those purposes.

Elkins said he doesn’t think that standard is adequate, because by the time you have a reason to complain about your data being profiled, “it’s probably already been done.” So he added a provision to his bill giving people additional rights once their data has already been profiled.
