Deepfake recordings make a point in Georgia
A Georgia lawmaker used a novel approach to help pass legislation banning deepfakes in politics: he used a deepfake. Republican state representative Brad Thomas played an AI-generated recording of two of his bill's opponents—state senator Colton Moore and activist Mallory Staples—seemingly endorsing the bill.
Thomas presented the convincing audio to his peers but cautioned that he had made the fake recording on the cheap: “The particular one we used is, like, $50. With a $1,000 version, your own mother wouldn’t be able to tell the difference,” he said. The bill subsequently passed out of committee by an 8-1 vote.
Fake audio like this recently reared its head in US politics at the national level when an ally of then-Democratic presidential candidate Dean Phillips released a fake robocall of President Joe Biden telling New Hampshire voters to stay home during the state’s primary. In the aftermath of that incident, the Federal Communications Commission moved quickly to declare that AI-generated robocalls are illegal under federal law.