AI will upset democracies, dictatorships, and elections
There’s no mistaking it: Artificial intelligence is here, and it’s already playing a major role in elections around the globe. In a year with national elections in 64 countries, the world’s governments are seeing the immediate impact of this nascent technology in real time.
In Pakistan, former Prime Minister Imran Khan – behind bars, with his party banned – used deepfake technology to simulate his voice and image to galvanize supporters. Khan’s allies (running as independents) took the greatest share of the vote, shocking the military-political establishment in Islamabad.
In Indonesia, Defense Minister Prabowo Subianto used a “chubby-cheeked AI avatar” to appeal to younger voters on TikTok — and it worked. Official tallies are still pending, but Subianto is the presumed winner of the race, though watchdogs have criticized the conduct of the polls.
Meanwhile, another political party supporting Subianto used deepfake technology to portray former Indonesian dictator Suharto – who’s been dead for 16 years – urging citizens to vote. Fellow candidate Anies Baswedan saw both sides of the technology: He deployed an AI chatbot to communicate with voters, but he was also the target of AI-generated audio that falsely portrayed a political backer chastising him.
In the US, AI-generated images have appeared in political campaign videos – from the Republican National Committee attacking President Joe Biden, and from Florida Gov. Ron DeSantis targeting former President Donald Trump. And in New Hampshire’s Democratic primary, voters received a robocall featuring a fake Biden voice telling them not to vote – a call we’ve since learned came from an associate of longshot challenger Dean Phillips.
“Politicians have to win the AI race before they win the election,” says Xiaomeng Lu, director of geo-technology at the Eurasia Group. Some of that work is defensive: Taiwan reportedly used AI tools to debunk disinformation campaigns coming from China ahead of its election in January.
Of course, AI isn’t just a factor in elections but in activism and pro-democracy movements as well. That means autocrats worldwide have to watch their digital backs.
In a recent GZERO panel conversation at the Munich Security Conference, former National Security Council official Fiona Hill said that there are innovative ways for the technology to be used in protest movements. “Someone like Alexei Navalny … would have been able to use AI in extraordinarily creative ways, in the case of the Russian elections, which is something of a foregone conclusion,” she said, adding that we need to consider how these technologies can be used for good by legitimate opposition leaders.
But in countries like Russia, the immense power imbalance means those trying to use AI for political reform still face a dangerous, uphill battle, according to Justin Sherman, founder and CEO of Global Cyber Strategies. “Dictators certainly may worry about AI’s implications for their rule, but the reality of AI in those contexts is much more complex and messy.”
With regulation lagging far behind the spread of cheap, high-quality generative AI, expect voluntary commitments from AI firms to arrive well before effective regulation does. In February, a group of 20 leading tech companies — including Amazon, Google, Meta, and Microsoft — pledged to combat election-related misinformation. These are voluntary commitments, but commitments nonetheless: The companies promised to conduct risk assessments for their models; develop watermarking, detection, and labeling systems; and educate the public about AI.
Will it be enough? We’re about to find out.