California wants to prevent an AI “catastrophe”


The Golden State may be close to passing AI safety regulation — and Silicon Valley isn’t pleased.

The proposed AI safety bill, SB 1047, also known as the Safe and Secure Innovation for Frontier Artificial Intelligence Models Act, aims to establish “common sense safety standards” for powerful AI models.

The bill would require companies developing high-powered AI models to implement safety measures, conduct rigorous testing, and provide assurances against "critical harms," such as the use of models to execute mass-casualty events or cyberattacks that cause $500 million in damages. It stipulates that the California attorney general can take civil action against violators, though the rules would only apply to models that cost at least $100 million to train and exceed a certain computing threshold.

A group of prominent academics, including AI pioneers Geoffrey Hinton and Yoshua Bengio, published a letter last week to California’s political leaders supporting the bill. “There are fewer regulations on AI systems that could pose catastrophic risks than on sandwich shops or hairdressers,” they wrote, saying that regulations are necessary not only to rein in the potential harms of AI but also to restore public confidence in the emerging technology.

Critics, including many in Silicon Valley, argue the bill is overly vague and could stifle innovation. In June, the influential startup incubator Y Combinator wrote a public letter outlining its concerns. It said that liability should lie with those who abuse AI tools, not developers; that the threshold for inclusion under the law is arbitrary; and that a requirement that developers include a “kill switch” allowing them to turn off the model would be a “de facto ban on open-source AI development.”

Steven Tiell, a nonresident senior fellow with the Atlantic Council's GeoTech Center, thinks the bill is “a good start” but points to “some pitfalls.” He appreciates that it only applies to the largest models but has concerns about the bill’s approach to “full shutdown” capabilities – aka the kill switch.

“The way SB 1047 talks about the ability for a ‘full shutdown’ of a model – and derivative models – seems to assume foundation models would have some ability to control derivative models,” Tiell says. He warned this could “materially impact the commercial viability of foundation models across wide swaths of the industry.”

Hayley Tsukayama, associate director of legislative activism at the Electronic Frontier Foundation, acknowledges the tech industry’s concerns. “AI is changing rapidly, so it’s hard to know whether — even with the flexibility in the bill — the regulation it’s proposing will age well with the industry,” she says.

“The whole idea of open-source is that you’re making a tool for people to use as they see fit,” she says, emphasizing the burden on open-source developers. “And it’s both harder to make that assurance and also less likely that you’ll be able to deal with penalties in the bill because open-source projects are often less funded and less able to spend money on compliance.”

State Sen. Scott Wiener, the bill’s sponsor, told Bloomberg he’s heard industry criticisms and made adjustments to its language to clarify that open-source developers aren’t entirely liable for all the ways their models are adapted, but he stood by the bill’s intentions. “I’m a strong supporter of AI. I’m a strong supporter of open source. I’m not looking in any way to impede that innovation,” Wiener said. “But I think it’s important, as these developments happen, for people to be mindful of safety.” Spokespeople for Wiener did not respond to GZERO’s request for comment.

In the past few months, Utah and Colorado have passed their own AI laws, but both focus on consumer protection rather than liability for catastrophic results of the technology. California, home to many of the biggest companies in AI, has broader ambitions. But while California has been able to lead the nation — and the federal government — on data privacy, it might need industry support to get its AI bill fully approved in the legislature and signed into law. California’s Senate passed the bill last month, and the Assembly is set to vote on it before the end of August.

California Gov. Gavin Newsom hasn’t signaled whether he’ll sign the bill should it pass both houses of the legislature, but in May, he publicly warned against over-regulating AI and ceding America’s advantage to rival nations: “If we over-regulate, if we overindulge, if we chase the shiny object, we could put ourselves in a perilous position.”
