How should artificial intelligence be governed?

The drumbeat for regulating artificial intelligence (AI) is growing louder. Earlier this week, Sundar Pichai, the CEO of Google's parent company, Alphabet, became the latest high-profile Silicon Valley figure to call for governments to put guardrails around technologies that use huge amounts of (sometimes personal) data to teach computers how to identify faces, make decisions about mortgage applications, and perform myriad other tasks that previously relied on human brainpower.

"AI governance" was also a big topic of discussion at Davos this week. With big business lining up behind privacy and human-rights campaigners to call for rules for AI, it feels like the political ground is shifting.

But agreeing on the need for rules is one thing. Regulating a fuzzily defined technology that has the potential to be used for everything from detecting breast cancer, to preventing traffic jams, to monitoring people's movements and even their emotions, will be a massive challenge.

Here are two arguments you're likely to hear more often as governments debate how to approach it:

Leave AI (mostly) alone: AI's enormous potential to benefit human health, the environment, and economic growth could be lost or delayed if the companies, researchers, and entrepreneurs working to roll out the technology end up stifled by onerous rules and regulations. Of course, governments should ensure that AI isn't used in ways that are unsafe or violate people's rights, but they should try to do it mainly by enforcing existing laws around privacy and product safety. They should encourage companies to adopt voluntary best practices around AI ethics, but they shouldn't over-regulate. This is the approach that the US is most likely to take.

AI needs its own special rules: Relying on under-enforced (or non-existent) digital privacy laws and trusting companies and public authorities to abide by voluntary ethics guidelines isn't enough. Years of light-touch regulation of digital technologies have already been disastrous for personal privacy. The potential harm to people or damage to democracy from biased data, shoddy programming, or misuse by companies and governments could be even greater without rules for AI. Also, we don't really know most of the ways AI will be used in the future, for good or for ill. To have any chance of keeping up, governments should try to create broad rules to ensure that whatever use it is put to, AI is developed and used safely and in ways that respect democratic principles and human rights. Some influential voices in Europe are in this camp.

There's plenty of room between these two poles. For example, governments could avoid overly broad rules while reserving the right to step in in narrower cases — regulating the use of facial recognition in surveillance, say, or in sensitive sectors like finance or healthcare, where they see bigger risks if something goes wrong.

This story is about to get a lot bigger. The EU is preparing to unveil its first big push for AI regulation, possibly as soon as next month. The Trump administration has already warned the Europeans not to impose tough new rules on Silicon Valley companies. China – the world's other emerging AI superpower – is also in the mix. Beijing's ambitious plans to harness AI to transform its economy, boost its military capabilities, and ensure the Communist Party's grip on power have contributed greatly to rising tensions between Washington and Beijing. But China is also working on its own approach to AI safety and ethics amid worries that other countries could get out ahead in setting the global agenda on regulation.

As with digital privacy, 5G data networks, and other high-tech fields, there is a risk that countries' competing priorities and intensifying geopolitical rivalries over technology will make it hard to find a common international approach.
