The FEC kicks AI down the road
With the Federal Election Commission punting on new rules for AI in political ads, the job of keeping deepfakes out of those ads will largely fall to tech platforms and AI developers. Tech companies signed an agreement at the Munich Security Conference in February, vowing to take “reasonable precautions” to prevent their AI tools from being used to disrupt elections. The task could also fall to broadcasters: The Federal Communications Commission is still considering new rules for AI-generated content in political ads on broadcast television and radio stations. That has caused tension between the two agencies, too: The FEC doesn’t believe the FCC has the statutory authority to act, but the FCC maintains that it does.
After a deepfake version of Joe Biden’s voice was used in a robocall in the run-up to the New Hampshire Democratic primary, intended to trick voters into staying home, the FCC asserted that AI-generated robocalls were illegal under existing law. But the clock is ticking for further action, since other AI-manipulated media may not currently be covered under the law. At this point, serious regulation from either agency seems likely to come only after Donald Trump and Kamala Harris square off in November, and perhaps only if Harris wins, as another Trump presidency might mean a further rollback of election rules.

Get AI out of my robocalls
Ahead of the New Hampshire presidential primary, many voters got a suspicious robocall that sounded like Joe Biden urging them not to vote. Perhaps unsurprisingly, it was an AI-generated version of his voice, custom-made to confuse voters.
Now, the Federal Communications Commission wants to make AI-generated voice-cloning calls illegal under the Telephone Consumer Protection Act.
“AI-generated voice cloning and images are already sowing confusion by tricking consumers into thinking scams and frauds are legitimate,” FCC chairwoman Jessica Rosenworcel wrote in a statement. “We could all be a target of these faked calls,” she warned.
While AI has been used to create images and videos for political advertising this election cycle, deepfake voices, especially over the phone, are arguably tougher to detect. Everyone sounds a little weird over the phone, right?
The FCC, wanting to act promptly amid primaries and before this November’s election, is set to vote on the proposed rule change in the coming weeks.
Deepfakes are on the campaign trail too
The Dean Phillips chatbot isn’t the only artificial intelligence in the race.
Ahead of presidential primaries Tuesday night in the Granite State, the New Hampshire Justice Department said it is investigating reports of robocalls impersonating President Joe Biden. The calls, allegedly featuring an AI version of Biden’s voice, encourage voters to stay home on Tuesday and instead save their vote for November.
“Your vote makes a difference in November, not this Tuesday,” the faux Biden said. It’s the first-known case of someone using generative AI to suppress the vote in a presidential election. The robocall was also “spoofed” to seem like it was sent by a New Hampshire Democratic operative, the government said in a press release. The state justice department reminded voters that voting on Tuesday doesn’t preclude them from voting in November’s general election.
Biden’s likely opponent, former President Donald Trump, has meanwhile resorted to telling his supporters that an advertisement showing his gaffes is artificially generated, even though it is not. “The perverts and losers at the failed and once disbanded Lincoln Project, and others, are using AI (Artificial Intelligence) in their Fake television commercials in order to make me look as bad and pathetic as Crooked Joe Biden, not an easy thing to do,” Trump posted on his social network Truth Social, a claim the Lincoln Project, the ad’s maker, vehemently denied.
In an interview with The Washington Post, UC Berkeley professor Hany Farid said AI creates a “liar’s dividend,” giving candidates plausible deniability: they can claim that anything they don’t like, or wish they hadn’t said or done, is actually AI-generated.