
Voters back AI rules as campaigns’ fake videos, deepfakes prompt concerns

AJC poll finds there’s common ground among Georgia Republicans, Democrats that the government should do more to regulate AI.
(Photo Illustration: By the AJC | Source: Pexels)

Georgia has become a laboratory for AI politics as the technology reshapes both the strategy and ethics of modern campaigning.

The depth and breadth of the technology’s adoption are striking. Candidates are using generative artificial intelligence to craft polished biographical content, manufacture fictional statements by opponents and open two-way conversations with voters — all fast, cheap and with few guardrails.

Nowhere has that experiment turned uglier than in the GOP lieutenant governor’s race, where state Sen. Greg Dolezal has deployed the technology to inflame cultural anxieties in ways that would have been far more costly — or simply impossible — just two years ago.

There is broad public support behind restrictions. Two-thirds of Georgia Republicans and nearly 4 in 5 Democrats say in an Atlanta Journal-Constitution poll that the government should do more to regulate AI. It’s a rare patch of bipartisan common ground.

“The challenge before us is not whether AI will impact elections — it already has,” said state Rep. Todd Jones, a Republican who chairs the House Technology Committee. “The question is whether we will put responsible safeguards in place.”

Here’s how the technology is playing out on the campaign trail in Georgia.

Culture-war provocateur

An AI ad created by the campaign of state Sen. Greg Dolezal, a GOP candidate for lieutenant governor, depicted Sharia law coming to the suburbs. (Screenshot)

Dolezal’s AI ads mark a different frontier.

Rather than use the technology to tell his story or attack an opponent, his campaign has deployed it to depict Muslims invading suburban Georgia neighborhoods, including armed figures terrorizing a woman resembling the actress Claire Danes from the HBO spy thriller “Homeland.” The ad ends with this message: “Keep Georgia Sharia free.”

The backlash was immediate and bipartisan. State Sen. Sheikh Rahman, D-Lawrenceville, condemned it from the Senate floor as “pathetic” and demanded a public apology. Former Senate GOP leader John F. Kennedy, a rival in the May 19 primary, called it “bizarre” and “outlandish.”

Former Democratic state Sen. Nabilah Parkes, who is Muslim, said she was so incensed she abandoned her campaign for insurance commissioner to run for lieutenant governor instead, setting up a possible clash with Dolezal in November.

Dolezal refused to apologize. He said he wouldn’t “take campaign advice from the Democrats.” A second ad, released weeks later, pushed the same Islamophobic message further, but this time without the AI graphics.

In a crowded downballot race against four other sitting or former state legislators, Dolezal made the calculation that the AI spots were this cycle’s version of Brian Kemp’s “shotgun” ad — a provocation engineered to burn his name into voters’ minds before they head to the polls.

Deepfake debate

U.S. Rep. Mike Collins’ Senate campaign used AI to create a fake video of U.S. Sen. Jon Ossoff with fabricated comments about his vote during a government shutdown last year.

Last fall, U.S. Rep. Mike Collins became the first major Georgia candidate to deploy an AI deepfake against an opponent when his Senate campaign released a video showing Democratic incumbent Jon Ossoff appearing to say he knows his vote during last year’s government shutdown would hurt farmers.

“But I wouldn’t know,” says the faux Ossoff. “I’ve only seen a farm on Instagram.”

Ossoff never said any of it. The video, built from his official Senate portrait, included a small on-screen disclaimer marking it as a spoof, though critics noted it was barely legible on mobile devices.

The backlash came from unexpected quarters. Republican rival Derek Dooley’s campaign mocked Collins for “shooting himself in the foot.” Conservative WSB radio host Eric Von Haessler called Collins a “dirtbag” on the air.

The AJC asked each of Georgia’s leading Senate campaigns whether they would pledge not to use deepfakes to misattribute words or actions to opponents. Only Ossoff said yes.

Collins’ campaign dug in, saying it “will be at the forefront embracing new tactics and strategies that pierce through lopsided legacy media coverage” and would “continue to use all methods permissible under the law.”

Ossoff has since become a frequent target of AI-generated attacks from outside groups as well.

The National Republican Senatorial Committee released a digital ad featuring two realistic-looking AI-generated sportscasters, timed to coincide with UGA’s appearance in the SEC title game last December.

Honky-tonk hype

A screenshot from a music video ad for Rick Jackson, a Republican candidate for governor in Georgia. (Screenshot)

Rick Jackson, the billionaire founder of Jackson Healthcare who entered the governor’s race in February with a pledge to open his checkbook, has frequently deployed AI in his campaign, including a sing-song attack depicting Lt. Gov. Burt Jones as Johnny Cash.

But the most prominent example is a four-minute AI-generated country music video, released in March, tracing his journey from Atlanta’s Techwood Homes to wealthy healthcare executive to candidate for governor.

The lyrics are what you might expect, though catchy in a country-pop way. “When the world says stay down, you get up and stand your ground,” is one example.

His rivals mocked it. Jackson joked at a recent forum that he’s appeared in “an awful lot of AI ads.” And he’s said, as governor, he would make “rational decisions” about governing AI in campaigns.

But the video highlighted a new reality. Even deep-pocketed contenders like Jackson no longer need a pricey production crew or a studio to produce polished content.

Campaign chatbot

Republican candidate for governor Clark Dean, a commercial real estate executive, speaks at the Atlanta Press Club Loudermilk-Young debates at Georgia Public Broadcasting in Atlanta last week. (Arvin Temkar/AJC)

Clark Dean, an Atlanta businessman running a long-shot campaign in the Republican gubernatorial primary, rolled out something different in April: a chatbot he calls the Dean Machine. Trained on his policy positions, it’s designed to let voters ask him questions and get answers.

The potential rewards are obvious. An always-on bot can engage countless voters simultaneously while feeding the campaign data on who’s asking and what they care about. Indeed, many campaigns are already using AI tools behind the scenes to target voters and recalibrate messaging, but only Dean has taken this step.

Dean framed his model as a genuine “two-way conversation” between candidates and voters — the kind of direct exchange, he argues, that traditional campaigning rarely allows.

The risks are just as real. A flawed response on a charged issue — imagine a misstep on abortion — could go viral and hobble a campaign in a flash.

And the technology can be quirky. When the Dean Machine was prompted about the candidate’s low polling numbers and told to “be honest” about his chances, it replied that Dean is doing the best he can to promote his message.

“If I lose having made that case honestly, I can live with that,” Dean’s machine said. “What I couldn’t live with is not trying.”

What the law says

Georgia State Rep. Todd Jones, R-Cumming, has been advocating for government regulation of artificial intelligence. (Arvin Temkar/AJC)

The legal system has struggled to keep pace. A patchwork of states has enacted laws requiring disclosures or restricting deceptive political deepfakes near elections. Georgia lawmakers have debated their own proposals for years.

Jones pioneered one of the more memorable arguments for action back in 2024, when he opened a committee hearing with an AI-generated video that mimicked the voices of two far-right activists, falsely making it appear they endorsed a bill restricting such deceptive content that they actually opposed.

The demonstration made his point more vividly than any speech could.

This session, Georgia ended with several AI-related bills on Gov. Brian Kemp’s desk, including a chatbot disclosure and child safety measure. Kemp signed a law earlier this week banning health insurance companies from solely relying on AI to make coverage decisions.

But lawmakers stopped short of creating a broad framework governing AI use in political campaigns over concerns about free speech restrictions.

That leaves voters navigating a new political landscape where campaign-altering content can be manufactured with startling ease.

Jones said he aims to revive the legislation next year to push “clear disclosure standards for AI-generated political content, defining intent-based violations and ensuring voters can confidently distinguish fact from fabrication.”

He added: “Protecting voter trust is essential to preserving the legitimacy of our electoral system.”

About the Author

Greg Bluestein is the Atlanta Journal-Constitution's chief political reporter. He is also an author, TV analyst and co-host of the Politically Georgia podcast.
