Artificial intelligence, social media and a sprawling network of influencers helped spread propaganda and misinformation in the final weeks of the 2024 election campaign, an election technology expert says. (Melissa Sue Gerrits | Getty Images)
Advancements in AI technology and the changing “information environment” undoubtedly influenced how campaigns operated and how voters made decisions in the 2024 election, an elections and democracy expert said.
Technologists and election academics warned a few months ago that mis- and disinformation would play an even larger role in 2024 than they did in 2020 and 2016. What exactly that disinformation would look like became clearer in the two weeks leading up to the election, said Tim Harper, senior policy analyst for democracy and elections at the Center for Democracy and Technology.
“I think a lot of folks kind of maybe prematurely claimed that generative AI’s impact was overblown,” Harper said. “And then, you know, in short order, in the last week, we saw several kinds of disinformation campaigns emerge.”
Harper specifically mentioned false claims that vice presidential nominee Tim Walz had perpetrated an act of sexual misconduct, and a deepfake video of election officials ripping up ballots, both of which have been shown to be Russian disinformation campaigns.
AI also played a role in attempted voter suppression, Harper said, not just by foreign governments but by domestic actors as well. EagleAI, a database tool that scrapes public voter data, was used by a 2,000-person North Carolina group that aimed to challenge the ballots of “suspicious voters.”
Emails obtained by Wired last month show that voters the group aimed to challenge include “same-day registrants, U.S. service members overseas, or people with homestead exemptions, a home tax exemption for vulnerable individuals, such as elderly or disabled people, in cases where there are anomalies with their registration or address.”
The group also aimed to target people who voted from a college dorm, people who registered using a P.O. box address, and people with “inactive” voter status.
Another shift Harper noted from the 2020 election was a rollback of enforcement of misinformation policies on social media platforms. Many platforms feared being seen as “influencing the election” if they flagged or challenged content as misinformation.
Last year, Meta, the parent company of Facebook and Instagram, as well as X, began allowing political advertisements that promoted denial of the 2020 election results.
YouTube also changed its policy to allow election misinformation, saying: “In the current environment, we find that while removing this content does curb some misinformation, it could also have the unintended effect of curtailing political speech without meaningfully reducing the risk of violence or other real-world harm.”
But there are real-world risks from rampant misinformation, Harper said. Federal investigative agencies have made clear that misinformation narratives that delegitimize past elections directly contribute to a higher risk of political violence.
Platforms with less well-established trust and safety teams, such as Discord and Twitch, also play a role. They experienced their “first rodeo” of mass disinformation this election cycle, Harper said.
“They were tested, and I think we’re still evaluating how they did at preventing this content,” he said.
Podcasters and social media influencers also increasingly shaped the political opinions of their followers this year, often under murky ethical guidelines. Influencers are not bound by the ethical standards and rules for sharing information that journalists follow, yet Americans have increasingly relied on social media for their news.
There’s also a lack of transparency between influencers and the political campaigns and candidates they speak about. Some have reportedly taken under-the-table payments from campaigns, or have made sponsored content for their followers without disclosing the agreement to viewers.
The Federal Election Commission decided late last year that while campaigns have to disclose spending on influencers, influencers do not have to disclose such payments to their audiences.
“In terms of kind of the balkanization of the internet, of the information environment, … I think this election cycle may end up being seen kind of as ‘the influencer election,’” Harper said.