Exclusive Poll: AI rules wanted, but can you trust the digital cops?
A new poll on AI raises one of the most critical questions of 2024: Do people want to regulate AI, and if so, who should do it?
For all the wars, elections, and crises going on, the most profound long-term transition underway right now is the light-speed development of AI and its voracious new capabilities. Nothing says a new technology has arrived more than when OpenAI CEO Sam Altman claimed he needs to fabricate more semiconductor chips so urgently that … he requires $7 trillion.
Seven. Trillion. Dollars. A moment of perspective, please.
$7 trillion is more than three times the entire GDP of Canada and more than twice the GDP of France or the UK. So … it may be pocket change to the Silicon Valley technocrat class, but it’s a pretty big number to the rest of us.
Seven trillion dollars has a way of focusing the mind, even if the arrogance of floating that number is staggering and, as we covered this week, preposterous. Still, it does give you a real sense of what is happening here: You will either be the AI bulldozer or the AI road. Which is it? So how do people feel about those options?
Conflicted is the answer: GZERO got access to a new survey from our partners at Data Sciences, which asked people in Canada about Big Tech, the government, and the AI boom. Should AI be regulated or not? Will it lead to job losses or gains? What about privacy? The results jibe very closely with similar polls in the US.
In general, the poll found people appreciate the economic and job opportunities that AI and tech are creating … but issues of anxiety and trust break down along generational lines, with younger people more trusting of technology companies than older people. That’s to be expected. I may be bewildered by my mom’s discomfort when I try to explain to her how to dictate a voice message on her phone, but then my kids roll their eyes at my attempts to tell them about issues relating to TikTok or Insta (Insta!, IG, On the ‘Gram, whatevs …). Technology, like music, is by nature generational.
But not all tech companies are equal. Social media companies score much lower when it comes to trust. For example, most Canadians say they trust tech companies like Microsoft, Amazon, or Apple, but fewer than 25% say they trust TikTok, Meta, or Alibaba. Why?
First, it’s about power. 75% of people agree that tech companies are “gaining excessive power,” according to the survey. Second, people believe there is a lack of transparency, accountability, and competition, so they want someone to do something about it. “A significant majority feel these companies are gaining excessive power (75% agree) and should face stronger government regulations (70% agree),” the DS survey says. “This call for government oversight is universal across the spectrum of AI usage.”
This echoes a Pew Research poll done in the US in November 2023 in which 67% of Americans said they fear the government will NOT go far enough in regulating AI tech like ChatGPT.
So, while there is some consensus regarding the need to regulate AI, there is a diminishing number of people who actually trust the government to regulate it. Another Pew survey last September found that trust in government is the lowest it has been in 70 years of polling. “Currently, fewer than two-in-ten Americans say they trust the government in Washington to do what is right ‘just about always’ (1%) or ‘most of the time’ (15%).”
Canada fares slightly better on this score, but still, if you don’t trust the digital cops, how do you keep the AI streets safe?
As we covered in our 2024 Top Risks, Ungoverned AI, there are multiple attempts to regulate AI right now all over the world, from the US, the UN, and the EU, but there are two major obstacles to any of this working: speed and smarts. AI technology is moving like a Formula One car, while regulation is moving like a tricycle. And since governments struggle to keep up with the actual innovation in software engineering, they need to recruit the tech industry itself to help write the regulations. The obvious risk here is regulatory capture, where industry-influenced policies become self-serving. Will new rules protect profits or the public good, or, in the best-case scenario, both? Or will any regulations, no matter who makes them, be so leaky that they are essentially meaningless?
All this is a massive downside risk, but on the upside, it’s also a massive opportunity. If governments can get this right – and help make this powerful new technology more beneficial than harmful, more equitable than elitist, more job-creating than job-killing – they might regain the thing they need most to function productively: public trust.
Your own little Davos: Why trust is failing and what to do about it
With the world’s Most Powerful People™ busily pondering the fate of the rest of us at Davos this week, I thought to myself I’ll be damned if I’m not gonna go skiing too. So last weekend, I went with the family to Belleayre, a small mountain in upstate New York.
It’s not quite the same as Davos. The Eastern Catskills are not the Swiss Alps. I have it on good authority that the cost of a single schnitzel at Davos comfortably buys lunch for a family of four – maybe even six – at Belleayre.
But when it comes to places for thinking deep thoughts about the world, one mountain is as good as another. And since the Davoisie have dedicated their high-altitude gathering to the theme of “Rebuilding Trust,” I figured I also could think about trust while hitting the slopes.
Trust, as we keep hearing, is broken. Only 16% of Americans trust “government” – that’s down more than 60 points from “peak trust” in the mid-1960s. Fewer than a third of Americans trust each other, down from nearly 47% in the early 1970s. Meanwhile, half of Americans say “the media” deliberately misleads them, and fewer than one-quarter say journalists have society’s interests at heart.
But these data sometimes feel abstract. Like something that’s happening out there rather than right here. Well, a ski mountain is a good place to observe trust in action – a microcosm of the thousand little leaps of faith in people and things that get us through our days.
Consider the following: When you read the ski report and believe it, you are trusting the media. When you allow yourself to be whisked up the side of a mountain by a giant metal hanger with seats on it, you are trusting the institutions and experts who design and run ski lifts. (A sudden gust of wind will quickly heighten this trust.) And when you hit the lodge for lunch or the après, leaving your skis or snowboard on a rack unattended, you are showing social trust.
You can do this experiment anywhere, by the way. On your commute, where the subway conductor will not crash the train. On the highway, where the person driving toward you will not cross the double-yellow line. At the café, where the employees have washed their hands before returning to work. At the gym, where your spotter can, in fact, spot.
Why does trust seem to work at the mountain, the subway, the café, or the gym – but not in our national politics? Come closer. Trust works best when the stakes are immediate and observable. Where you can verify, you can trust. The farther things get from what you can see with your own eyes, the harder it is to believe in anything. Our online experiences only heighten this, of course: They are algorithmically engineered to feel close, personalized, and personal.
The data bear this out: Even amid the broader black diamond descent of trust, local institutions still shine. Polling by the Knight Foundation shows Americans are 17 points more likely to trust local news sources than national ones (which makes the well-documented decline of local news 17 points more alarming).
The same is true of government. Gallup found that while only a third of Americans trust the federal government, nearly 70% trust local government, where practical results are usually more important than partisan smackdowns.
With all this, it’s no wonder – as we head into a crucial global election year – that populism and nationalism are so appealing again. They’re each, in their ways, responses to falling trust in distant institutions. Populism seizes on our perfectly understandable lack of trust in distant institutions: Those people up there on the mountain are lying to you, let’s fight back. Nationalism and nativism propose a solution of their own, artificially shrinking the boundaries of society to tighten its bonds: It’s us vs. them. Let’s trust us.
What’s the solution? Much has been written about this. But one place to start is by focusing on the places where things do work: invigorating good local government, reversing the decline of local media, and emphasizing the experiences of actual people rather than online avatars.
There’s no one solution, but, to flip a phrase from someone who knew a thing or two about trust: Keep your friends close, and your institutions closer.
See you on the slopes!