As technology progresses and becomes more accessible to the public, perpetrators have been able to refine artificial intelligence to enact sexual violence without ever having met their victims, supporters of a ban on nonconsensual “deepfake” pornography told Michigan lawmakers Wednesday.
The fact that the AI-generated sexually explicit images aren’t “real” — meaning they can be created without the victim engaging in the portrayed actions — doesn’t make the impact any less real for victims, Zoey Brewer, policy and grassroots coordinator with the Rape, Abuse & Incest National Network, said during a House Judiciary Committee hearing Wednesday.
“You may have never even been in the same room as the person who created this photo or video of you, but the damage and harm is just as pervasive as if the violence had occurred in person,” Brewer said. “You feel humiliated and embarrassed. You’re dreading every single notification that pops up on your phone. It’s harder to face your peers, your community, you don’t go to work or class anymore.”
The legislation before lawmakers now, a reintroduction of a package that drew broad bipartisan support but died last term, looks to recognize the impact nonconsensual deepfake pornography can have on a person’s life and provide avenues for individuals to seek justice through the court system, bill sponsor Rep. Matt Bierlein (R-Vassar) said during the committee meeting.
Creating or distributing media that falsely portrays an identifiable person in a sexual manner without that person’s permission would be a crime under House Bill 4047 and House Bill 4048. Victims would be able to sue for damages stemming from the creation of such an image, and criminal penalties would range from a one-year misdemeanor to a three-year felony.
“As artificial intelligence continues to outpace our legal system, this initiative demonstrates our state’s commitment to protecting its citizens from the harmful consequences of nonconsensual deepfake creation and distribution,” Bierlein said. “Such content can inflict severe psychological trauma, damage relationships and even result in financial loss or physical harm. By specifically targeting deepfakes that depict intimate parts or sexual acts, this bill acknowledges the need for tailored protection in these highly sensitive areas.”
Authorities have noted that the creation of deepfake sexually explicit images is growing, while extorting individuals with the threat of releasing sexually explicit materials, known as “sextortion,” is also on the rise.
The consequences of not updating laws to keep up with the ever-evolving digital age are too great to ignore, especially for women and children, said Ilana Beller of the democracy team at Public Citizen, a consumer advocacy organization. She noted that experts and studies find the vast majority of deepfake pornography portrays women and girls.
The feelings of shame and hopelessness that victims, particularly younger victims, experience can become unbearable as they worry what sexually explicit images could mean for their future relationships, job prospects and educational opportunities, Beller said, lamenting that in some cases teenagers have killed themselves after a perpetrator attempted to extort them with sexually explicit images.
The vast majority of states are beginning to address these emerging forms of sexual violence, and bills targeting deepfake pornography typically draw bipartisan support, as the Michigan bills do, Beller said. What matters now, she added, is that Michigan sees these bills through.