
Robocall to New Hampshire voters reportedly faked Biden's voice

New Hampshire Attorney General John Formella said any such message would be an illegal attempt to suppress the vote.
Posted at 6:28 PM, Jan 22, 2024

Officials in New Hampshire say they are investigating a reported robocall that appeared to fabricate President Joe Biden's voice with AI, in which he told voters not to turn out for the state's primary election on Tuesday.

New Hampshire Attorney General John Formella said any such message would be an illegal attempt to suppress the vote, and warned residents to disregard the message entirely.

According to The Associated Press, the message included President Biden's "What a bunch of malarkey" catchphrase and went on to urge recipients to "save your vote for the November election."

Voting in Tuesday's primary does not affect a voter's ability to vote in the November general election.

White House press secretary Karine Jean-Pierre said Monday that President Biden had not recorded any message like the one heard in the call.

President Biden himself is not campaigning in New Hampshire for the primary, but there is a write-in campaign underway for him in the state.

The reported robocalls falsely appeared to be sent from Kathy Sullivan, a former New Hampshire Democratic Party chair who is associated with the write-in campaign.

Sullivan said she has alerted law enforcement and the state's attorney general.

It's not clear how many people received the message. The state attorney general's office has told any recipients to report the call to the state's election law division.


The incident is the latest to show the influence artificial intelligence is already having on political and election discourse.

Earlier in the Republican race, AI-generated content appeared in at least one of Florida Gov. Ron DeSantis' campaign ads.

Some states, including Texas and California, have enacted their own laws to regulate deepfake content in political ads.

On Monday, U.S. Sen. Amy Klobuchar, chairwoman of the Senate Committee on Rules and Administration, which has oversight over federal elections, issued a statement on the incident.

"Whether you are a Democrat or a Republican, no one wants to see fake ads or robocalls where you cannot even tell if it’s your candidate or not," her statement read. "That’s why I’m leading a bipartisan bill in the Senate to ban deceptive AI-generated content in our elections. We need federal action to ensure this powerful technology is not used to deceive voters and spread disinformation."

That bill, introduced in September of 2023, would "prohibit the distribution of materially deceptive AI-generated audio, images, or video relating to federal candidates in political ads or certain issue ads to influence a federal election or fundraise."