When AI makes mistakes, who can be held responsible?
In this episode of GZERO AI, Taylor Owen, professor at the Max Bell School of Public Policy at McGill University and director of its Centre for Media, Technology & Democracy, explores the questions of responsibility and trust raised by the widespread deployment of AI. Who bears responsibility when AI makes errors? And can we rely on AI, and should we trust it?
So last week, a Canadian airline made headlines when a customer sued it over its chatbot's advice. Not only is this story totally weird, but I think it might give us a hint at who will ultimately be responsible when AI messes up. So, this all started when Jake Moffatt's grandmother passed away and he went to the Air Canada website to see if they had a bereavement policy. He asked the chatbot this question, and it told him to book the flight and that he had 90 days to request a refund. It turns out, though, that you can't request bereavement refunds retroactively, a policy stated elsewhere on the Air Canada website. But here's where it gets interesting. Moffatt took Air Canada and their AI chatbot to British Columbia's Civil Resolution Tribunal, a sort of small claims court. Air Canada argued that the chatbot is a separate legal entity responsible for its own actions, that the AI, not the airline, was responsible here. They lost, though, and were forced to honor a policy that a chatbot made up. They've since deleted their chatbot.
This case is so interesting because I think it strikes at two questions at the very core of our AI conversation: responsibility and trust.
First, who's responsible when AI gets things wrong? Is Tesla responsible when their full self-driving car kills somebody? Is a newspaper liable when its AI makes things up and defames somebody? Is a government responsible for false arrests using facial recognition AI? I think the answer is likely to be yes for all of these, and this has huge implications.
Second, and maybe more profound, is the question of whether we can and should trust AI. Anyone who watched the Super Bowl ads this year will know that AI companies are worried about this. AI has officially kicked off its PR campaign, and at the core of that campaign is the question of trust.
According to a recent Pew study, 52% of Americans are more concerned than they are excited about the growth of AI. So, for the people selling AI tools, this could be a real problem. A lot of these ads, then, seek to build public trust in the tools themselves. The ad for Microsoft Copilot, for example, shows people using an AI assistant to help them write a business plan and to draft storyboards for a film, to make their jobs better, not take them away. The message is clear: "We're going to help you do your job better. Trust us." Stepping back, though, the risk of being negligent, of moving fast and breaking things, is that trust is really hard to earn back once you've lost it. Just ask Facebook.
In Jake Moffatt's Air Canada case, all that was at stake was a $650 refund, but with AI starting to permeate every facet of our lives, it's only a matter of time before the stakes are much, much higher.
I'm Taylor Owen, and thanks for watching.
US summer travel may be easier than you think, says Pete Buttigieg
Memorial Day weekend signals that the unofficial start of the summer travel season is upon us. And if last year’s travel woes were any indication (paging: Southwest Airlines), we can expect long lines at TSA, full planes stranded on the tarmac, and lots and lots of cancellations. But, according to US Transportation Secretary Pete Buttigieg, things are not as dire as they may seem.
“The good news is that after a very disruptive year last year in terms of the struggles that the airlines had, things are catching up this year. In 2023, the preliminary data show cancellation rates under 2%.” In an extensive interview with Ian Bremmer for this week’s GZERO World, Secretary Buttigieg pointed the finger at airline companies for many of the travel hiccups that made news last year. “Issues like staffing and air traffic control are not the main cause, not even close to being the main cause, of flight cancellations and delays. We've been working with the airlines, pressing the airlines, and they have delivered a lot of improvements with what's under their control.”
And, it turns out, quite a bit is under the airlines’ control, including the legal requirement that they reimburse passengers for canceled flights. If that’s news to you, you’re not alone.
Watch the full episode of GZERO World: The road to repair: Pete Buttigieg & crumbling US infrastructure
Biden’s executive order cracks down on Big Tech and protects consumers
Marietje Schaake, International Policy Director at Stanford's Cyber Policy Center, Eurasia Group senior advisor and former MEP, discusses tech policy in the United States and the new White House executive order, which contains no fewer than 72 competition-enhancing measures.
How will Biden's executive order crack down on big tech?
The answer is in almost every way. The order clearly seeks stronger antitrust enforcement, with specific provisions on data and the privacy impact of amassing it. The order asks the FTC for new rules on surveillance but will also allow for assessments of not only future but also past mergers. And that is important because the very wealthy, very powerful tech companies are known to buy up competitors they may fear, and through those mergers grow their data piles. So, the executive order must cause concern in Silicon Valley. The order goes on to restore net neutrality, which is crucial for smaller companies and noncommercial websites. And the position of consumers improves with the possibility of having products repaired, by themselves or by third parties, a practice that is often banned today. So once these various measures are in place, the public interest, innovation, consumer rights, and privacy protection should be better safeguarded from abuse of power by big tech.
Ireland's responses to ransomware attack; cryptocurrency scams
Marietje Schaake, International Policy Director at Stanford's Cyber Policy Center, Eurasia Group senior advisor and former MEP, discusses trends in big tech, privacy protection and cyberspace:
What options does Ireland have responding to the ransomware attack on the country's healthcare system?
Well, authorities are making resources available to decrypt and restore systems, which is a good step. And they also insist on not paying ransom to the criminals. But after the immediate fallout, they should scan for weaknesses in the legacy software systems used across the country, to make clear who is expected to protect what and where vulnerabilities might exist. Then, imposing information-sharing standards could help bring the needed facts together and facilitate both resilience and damage control in the future. There's also an opportunity to cooperate on attribution and accountability with like-minded countries. That should really push to end the impunity with which these crimes are perpetrated.
How can consumers protect themselves from cryptocurrency scams?
Well here, my best advice is to use common sense. If a deal seems too good to be true, it probably is. And if there is no way to verify who runs a Bitcoin operation, then you have to ask yourself what an acceptable level of risk is in relation to your precious savings.
What is Coinbase, the first major cryptocurrency company to go public?
Marietje Schaake, International Policy Director at Stanford's Cyber Policy Center, Eurasia Group senior advisor and former MEP, discusses trends in big tech, privacy protection and cyberspace:
What is Coinbase and why is it such a big deal that it's going public?
Now, Coinbase runs the US's largest cryptocurrency exchange and holds tens of billions of dollars' worth of bitcoins. When it went public on Wednesday, it was the first major cryptocurrency company to do so.
Are there any privacy issues you see looming around cryptocurrency and digital assets in general?
I would say trust is a big question with cryptocurrencies, and one that goes beyond matters of privacy protection. Some customers of Coinbase, for example, reported losing their assets through account-takeover attacks and were very disappointed with the lack of support the company offered. Because digital currency transactions are impossible to reverse or to trace to a person, they are an attractive asset to steal. And the security rules that we know from banks do not yet apply equally to digital currency exchanges. That has clearly had its pluses for some users, but steep downsides for those who lose their savings in the blink of an eye.
GameStop stock rally gives policymakers opportunity for legislation
Jon Lieber, who leads Eurasia Group's coverage of political and policy developments in Washington, offers insights on US politics:
First question. Stonks! Will the GameStop stock rally result in new regulation on Wall Street?
The answer is probably, but we have no idea what form that regulation might take. The interesting thing about this GameStop storyline is that the run-up in prices, driven by social media chatter on Reddit and aimed at hedge funds that had shorted the stock, opens up a whole can of worms for how you want to solve the issue, and it's most likely going to be an outlet for members of Congress's preexisting biases. If you want to regulate hedge funds, well, here's an excuse to do so. If you want to implement a financial transaction tax, this is your opportunity. If you're concerned about consumer protection or data privacy, this could be a hook to get into those issues as well. So, this is a headline-grabbing event that will probably fade out of the news in a week or so, but it's going to stay relevant to policymakers for several more months and could potentially result in new legislation, or new regulation from the SEC, around investor protection and market structure. So, stay tuned. We're going to be hearing about GameStop for a long time.
Second question. What is the future for the legislative filibuster?
Well, as part of the Senate's organizing resolutions, two Democratic senators made public commitments that they will not vote to change the legislative filibuster, the 60-vote threshold in the Senate to pass legislation. With 50 Democratic members in the Senate, you need all 50 to agree to change the rules, and they have at most 48 votes right now. That means it's probably not going to happen. Now, these two senators, Kyrsten Sinema of Arizona and Joe Manchin of West Virginia, could change their minds down the road, and there are probably a handful of other senators who also oppose changing the rules but haven't said so publicly. The other thing Democrats could do is pass legislation using the reconciliation process, which has guardrails that limit it to budget and tax legislation, but change the rules of that process to supercharge their ability to pass legislation with only 50 votes without touching the legislative filibuster. Something to keep an eye on: we expect at least two big pieces of legislation this year to pass through reconciliation, giving multiple opportunities to try to change those rules and erode the norms around reconciliation.