$6M fine for Biden robocall

The perpetrator behind the President Biden deepfake robocall scandal has been charged and fined $6M

Martin Crowley
May 24, 2024

The Federal Communications Commission (FCC) has fined Steve Kramer $6M. Kramer, a political consultant, sent a deepfake robocall impersonating President Biden to New Hampshire residents, telling them "not to vote" in the state's upcoming primary. The FCC has also fined Lingo Telecom, the carrier that transmitted the robocalls, $2M for its part in the scam.

What happened with the Biden deepfake robocall?

Back in February, Kramer, who had been working on Democratic candidate Dean Phillips' campaign, commissioned a deepfake robocall that sounded just like Biden, complete with typical Biden phrases like "what a bunch of malarkey". He then sent the deepfake, carried over Lingo Telecom's network, to voters in New Hampshire two days before the state's primary elections. The call falsely told New Hampshire residents that voting in the primary would stop them from voting in the general election in November.

What will happen to Kramer?

On top of the FCC's $6M fine, Kramer faces 26 criminal counts from New Hampshire's attorney general, including impersonating a presidential candidate and attempting to deter people from voting using misleading information.

In the US, voter suppression can carry a sentence of up to seven years in prison, and impersonating a candidate can carry up to a year in jail.

How have Kramer and Lingo Telecom responded?

Lingo Telecom has strongly denied the FCC's charges:

“Lingo Telecom was not involved whatsoever in the production of these calls and the actions it took complied with all applicable federal regulations and industry standards.”

Kramer, who is scheduled to appear in court on June 5th, says he is "ready for the fight", claiming that his scheme was never about influencing the outcome of the election, but simply a way of highlighting the potential dangers of AI:

“Maybe I’m a villain today, but I think in the end we get a better country and better democracy because of what I’ve done, deliberately,”

What does this mean for AI content in political campaigns?

Since the deepfake robocall scandal in February, the FCC has confirmed that AI voice-cloning tools used in robocalls are banned under existing law and, just yesterday, announced a proposal requiring political advertisers to disclose when content in any TV or radio ad has been generated by AI.

This shows the FCC is taking steps to prevent AI-generated misinformation from misleading or suppressing voters:

"We will act swiftly and decisively to ensure that bad actors cannot use U.S. telecommunications networks to facilitate the misuse of generative AI technology to interfere with elections." — Loyaan Egal, Chair of the FCC's Privacy and Data Protection Task Force