FTC cracks down on deceptive claims with AI
The US Federal Trade Commission, the agency chiefly responsible for consumer protection, is taking a tougher stance on false business claims involving artificial intelligence.
On Sept. 25, the agency announced Operation AI Comply, a set of five law enforcement actions against companies inappropriately using AI to hype or sell products. Under the FTC Act of 1914, the agency has broad authority to crack down on unfair and deceptive marketing practices. “Using AI tools to trick, mislead, or defraud people is illegal,” FTC Chair Lina M. Khan wrote in a statement. “The FTC’s enforcement actions make clear that there is no AI exemption from the laws on the books.”
The FTC sued DoNotPay, which billed itself as the “world’s first robot lawyer,” purportedly using artificial intelligence to replace real legal advice. The company couldn’t deliver on what it marketed and has settled with the FTC for $193,000. The feds also took action against Rytr, an AI writing assistant that could generate fake “detailed customer reviews,” which the agency deemed inherently deceptive. The FTC is pursuing a consent agreement with the company.
The FTC also sued three e-commerce companies — Ascend Ecom, Ecommerce Empire Builders, and FBA Machine — which promised customers passive income from operating online AI-powered storefronts. Those cases are headed to litigation in federal court.
“The FTC has been particularly focused on AI the past few years, and one of its stated approaches is to use its existing authorities to address issues it sees related to AI – for example, to address deceptive practices under the FTC Act,” said Duane Pozza, a partner at the law firm Wiley Rein and former assistant director of the FTC’s Bureau of Consumer Protection.
“A key message from these actions is that companies need to be careful in making claims about what AI can do, and follow existing FTC guidance on properly supporting their claims,” he said. “Even if companies are not engaging in the exact conduct addressed in these actions, the FTC is sending a signal that it is looking at AI-related claims throughout the marketplace.”
Keeping your promises
In 2022, a grieving passenger went on Air Canada’s website and asked its AI-powered chatbot about the airline’s bereavement policy. The chatbot said yes, reduced fares are available if you’re traveling after the death of a loved one, and you have 90 days after taking the flight to file a claim. The problem: That’s not Air Canada’s policy. The airline requires passengers to apply for and receive the discount ahead of time, not after the flight.
Now, a Canadian court says that Air Canada has to honor the promises made by its AI chatbot, even though they were incorrect and inconsistent with the airline’s policies.
“While a chatbot has an interactive component, it is still just a part of Air Canada’s website,” the judge in the case wrote. “It should be obvious to Air Canada that it is responsible for all the information on its website. It makes no difference whether the information comes from a static page or a chatbot.”
It’s a big ruling that could set a new precedent, at least in Canada, that AI companies — or their clients — are legally liable for the accuracy of their chatbots’ claims. And that’s no simple thing to fix: Generative AI models are notorious for hallucinating, or making stuff up. If using AI becomes a major liability, it could drastically change how AI companies act, train their models, and lawyer up.
And it would immediately make AI a tough product to sell.