Generative AI models are known to hallucinate – to make things up and state them as facts. But new research suggests that, despite that shortcoming, AI could be a key tool for determining whether someone – a human – is telling the truth.

An economist at the University of Würzburg in Germany found that an algorithm trained with Google’s BERT language model was better at detecting lies than human evaluators. AI might not be able to power a faultless polygraph – a notoriously unreliable device – but it may be able to sift fact from fiction in large datasets, for example by flagging disinformation on the internet.

Maybe the next US presidential debate could use an AI fact-checker to keep the candidates honest.

Russian President Vladimir Putin meets with journalists in Moscow, Russia, on October 23, 2025, to comment on new US sanctions targeting two of Russia’s major oil producers, as well as other international issues.
Sputnik/Alexander Shcherbak/Pool via REUTERS

The US has paused Russian oil sanctions in a bid to stabilize energy markets rocked by the war with Iran. Administration officials stress that it’s a “tailored” measure, applying only to oil already loaded onto tankers, but it’s still a gift to Russia, which has already been clocking an extra $150 million daily in oil revenues since the war began.

A US Air Force Boeing KC-135 Stratotanker aerial-refueling aircraft, tail number 63-8003, spotted over Venlo, the Netherlands, on March 2, 2026, en route from the mainland US to Tel Aviv in support of Operation Epic Fury, the US Department of Defense’s name for its role in the Israel-Iran war.
Photo by Nicolas Economou/NurPhoto

4: The number of crew members – out of six total – who died after a US refueling plane crashed in neighboring Iraq on Thursday, US Central Command said this morning.