Political Consultant Acquitted in AI-Generated Biden Robocall Case: Implications for AI Regulation

Reviewed by Nidhi Govil


A New Hampshire jury acquitted Steven Kramer, a political consultant, of all charges related to AI-generated robocalls mimicking President Biden. The case highlights the challenges in regulating AI use in political campaigns and raises questions about the future of AI governance.

AI-Generated Robocalls and Legal Acquittal

In a landmark case involving artificial intelligence in political campaigns, Steven Kramer, a 56-year-old political consultant from New Orleans, was acquitted of all charges related to AI-generated robocalls mimicking President Joe Biden. The jury's decision came after a trial in Belknap County Superior Court, New Hampshire, where Kramer faced 11 felony voter suppression charges and 11 candidate impersonation charges.

Source: AP NEWS

The case centered on robocalls sent to thousands of New Hampshire Democrats two days before the state's January 23, 2024, presidential primary. The AI-generated voice, which resembled Biden's, used his catchphrase "What a bunch of malarkey" and suggested that voting in the primary would prevent participation in the November election.

Kramer's Defense and Motivation

Kramer admitted to orchestrating the calls but claimed his intention was to raise awareness about the potential dangers of AI in political campaigns. He testified that he paid a New Orleans magician $150 to create the recording, describing it as his "one good deed this year."

The defense argued that the primary was a "meaningless straw poll" unsanctioned by the Democratic National Committee (DNC) and therefore not subject to state voter suppression laws. They also contended that Kramer did not impersonate a candidate, because the message never mentioned Biden's name and Biden was not a declared candidate in the primary.

Implications for AI Regulation

The acquittal raises significant questions about the regulation of AI in political campaigns. New Hampshire Attorney General John M. Formella stated, "We will continue to work diligently to address the challenges posed by emerging technologies, including artificial intelligence, to protect the integrity of our elections."

Despite the acquittal, Kramer still faces a $6 million fine from the Federal Communications Commission (FCC), which he has said he will not pay. Lingo Telecom, the company that transmitted the calls, agreed to a $1 million settlement with the FCC in August 2024.

Broader Context of AI Regulation

The case highlights the complex landscape of AI regulation in the United States:

  1. Many states have enacted legislation to regulate AI deepfakes in political campaigns.

  2. The FCC was developing AI-related rules before Donald Trump's presidency but has since shown signs of potentially loosening regulations.

  3. House Republicans recently added a clause to their signature tax bill that would ban states and localities from regulating artificial intelligence for a decade, further complicating the regulatory landscape.

As AI technology continues to advance, this case underscores the urgent need for clear guidelines and regulations to govern its use in political campaigns and protect the integrity of democratic processes.
