Biden pushes forward on AI


Joe Biden is starting to walk the talk on artificial intelligence. Federal agencies have until December to get a handle on how to use — and minimize the risks from — AI, thanks to new instructions from the White House Office of Management and Budget. The policies mark the next step along the path laid out by Biden’s October AI executive order, adding specific goals after a period of evaluation.

What’s new

Federal agencies will need to “assess, test, and monitor” the impact of AI, “mitigate the risks of algorithmic discrimination,” and provide “transparency into how the government uses AI.”

It’s unclear to what extent AI currently factors into government work. The Defense Department already has key AI investments, while other agencies may only be toying with the new technology. Under Biden’s new rules, agencies seeking to use AI must create an “impact assessment” for the tools they use, conduct real-world testing before deployment, obtain independent evaluation from an oversight board or another body, perform regular monitoring and risk assessment, and work to mitigate any associated risks.

Adam Conner, vice president of technology policy at the Center for American Progress, says that the OMB guidance is “an important step in articulating that AI should be used by federal agencies in a responsible way.”

The OMB policy isn’t solely aimed at protecting against AI’s harms. It mandates that federal agencies name a Chief AI Officer charged with implementing the new standards. These new government AI czars are meant to work across agencies, coordinate the administration’s AI goals, and remove barriers to innovation within government.

What it means

Dev Saxena, director of Eurasia Group's geo-technology practice, said the policies are “precedent-setting,” especially in the absence of comprehensive artificial intelligence legislation like the law the European Union recently passed.

Saxena noted that the policies will move the government further along than industry in terms of safety and transparency standards for AI since there’s no federal law governing this technology specifically. While many industry leaders have cooperated with the Biden administration and signed a voluntary pledge to manage the risks of AI, the new OMB policies could also serve as a form of “soft law,” forcing higher standards of testing, risk assessment, and transparency on private-sector companies that want to sell their technology and services to the federal government.

However, there’s a notable carveout for the national security and defense agencies, which could be targets for the most dangerous and insidious uses of AI. We’ve previously written about America’s AI militarization and goal of maintaining a strategic advantage over rivals such as China. While these agencies are exempted from the new rules, a separate track of defense and national-security guidelines is expected later this year.

Fears and concerns

Still, public interest groups are concerned about the ways in which citizens’ liberties could be curtailed when the government uses AI. The American Civil Liberties Union called on governments to do more to protect citizens from AI. “OMB has taken an important step, but only a step, in protecting us from abuses by AI. Federal uses of AI should not be permitted to undermine rights and safety, but harmful and discriminatory uses of AI by national security agencies, state governments, and more remain largely unchecked,” wrote Cody Venzke, ACLU senior policy counsel, in a statement.

Of course, the biggest risk to the implementation of these policies is the upcoming presidential election. Former President Donald Trump, if reelected, might keep some of the policies aimed at China and other political adversaries, Saxena says, but could significantly pull back from the rights- and safety-focused protections.

Beyond the uncertainty of election season, the Biden administration has a real challenge going from zero to full speed. “The administration should be commended on its work so far,” Conner says, “but now comes the hard part: implementation.”
