
Deepfake recordings make a point in Georgia

A view of the Georgia State Capitol in Atlanta, Georgia, U.S., May 11, 2021. Picture taken May 11, 2021.

REUTERS/Linda So
Scott Nover, Contributing Writer

A Georgia lawmaker used a novel approach to help pass legislation banning deepfakes in politics: he used a deepfake. Republican state representative Brad Thomas played an AI-generated recording of two of his bill's opponents—state senator Colton Moore and activist Mallory Staples—endorsing the bill.


Thomas presented the convincing audio to his peers, but cautioned that he made this fake recording on the cheap: “The particular one we used is, like, $50. With a $1,000 version, your own mother wouldn’t be able to tell the difference,” he said. The bill subsequently passed out of committee by an 8-1 vote.

Fake audio like this recently reared its head in US politics at the national level when an ally of then-Democratic presidential candidate Dean Phillips released a fake robocall of President Joe Biden telling New Hampshire voters to stay home during the state's primary. The Federal Communications Commission moved quickly in the aftermath of the incident to declare that AI-generated robocalls are illegal under federal law.