Courting AI opportunities (and hallucinations)
The AI boom, Roberts said, brings both opportunities and concerns. He noted that legal research may soon be “unimaginable” without the assistance of AI. “AI obviously has great potential to dramatically increase access to key information for lawyers and non-lawyers alike,” he wrote.
But he also urged “humility,” noting how “one of AI’s prominent applications made headlines this year for a shortcoming known as ‘hallucination,’ which caused the lawyers using the application to submit briefs with citations to nonexistent cases. (Always a bad idea.)” Indeed, AI chatbots tend to make stuff up, or hallucinate, a problem that has recurred since the debut of ChatGPT.
So far, US federal courts have taken a decentralized approach, with 14 of the 196 courts publishing their own guidance on how AI tools can and cannot be used in litigation.
Meanwhile, across the pond, the United Kingdom recently took the first step toward allowing AI as an assistive tool in legal opinion writing. “Judges do not need to shun the careful use of AI,” high-ranking judge Geoffrey Vos wrote. “But they must ensure that they protect confidence and take full personal responsibility for everything they produce.”
So British courts will begin allowing AI to be used in legal writing, but not research — because of the aforementioned tendency to hallucinate.
Will AI judges take over? Roberts made an eloquent case against this and an impassioned defense of the humanity central to being an effective judge.
“Machines cannot fully replace key actors in court,” he wrote. “Judges, for example, measure the sincerity of a defendant’s allocution at sentencing. Nuance matters: Much can turn on a shaking hand, a quivering voice, a change of inflection, a bead of sweat, a moment’s hesitation, a fleeting break in eye contact. And most people still trust humans more than machines to perceive and draw the right inferences from these clues.”