The Biden administration prepared new action Monday to regulate artificial intelligence before President-elect Donald Trump takes office next week, forcing the incoming administration to make quick decisions about restricting emerging technology.
The Interim Final Rule on Artificial Intelligence Diffusion lays out new controls on advanced AI chips and AI model weights, with the intention of keeping top technology out of foreign adversaries’ hands while sharing it with U.S. allies.
Mr. Trump can undercut the regulation, but President Biden’s team is aiming to force the incoming administration to work from a blueprint that has the semiconductor industry on edge.
Commerce Secretary Gina Raimondo told reporters on Sunday that she intended for “ordinary commercial business in AI to continue uninterrupted” and was focused on the most advanced AI developers.
She said the Biden administration scheduled 120 days for regulators to solicit feedback from the public, leaving Mr. Trump’s team four months to decide the fate of the new regulations.
“We hope that the next administration takes full advantage of those 120 days to listen to experts, industry, industry players, partner countries, [and] consider their input,” Ms. Raimondo said. “And I fully expect the next administration may make changes as a result of that input so we’ve provided for 120 days, which is a very long comment period, and we’ve provided for one year, 365 days for compliance for the standards at AI data centers.”
Comment periods for proposed rules vary, often running one to two months but lasting six months or longer for more complex regulations, according to the Office of the Federal Register.
Mr. Trump has pledged to undo Mr. Biden’s AI executive order, and the Republican Party included a push to roll back Mr. Biden’s AI regulatory policy in its 2024 platform.
The Biden administration believes its new AI rule will incentivize responsible use of and equitable access to AI.
White House National Security Adviser Jake Sullivan said the Biden administration’s regulation was a “balanced and practical approach.”
“The rule makes it hard for our strategic competitors to use smuggling and remote access to evade our export controls,” Mr. Sullivan told reporters. “It creates incentives for our friends and partners around the world to use trusted vendors for advanced AI, both U.S. vendors and local vendors who meet strong security requirements.”
The semiconductor sector fears that the Biden administration’s rush to regulate, rather than allowing for a smoother transition, could hurt its business.
The Semiconductor Industry Association said last week that it was “deeply concerned” about the potential AI regulation’s scope and complexity.
Oracle Executive Vice President Ken Glueck wrote on his company’s website that he feared the Biden administration’s AI restrictions were poised to “go down as one of the most destructive to ever hit the U.S. technology industry.”
“To retroactively and surreptitiously issue a final rule of this magnitude without industry consultation and only days before the change in administration is highly consequential,” Mr. Glueck wrote. “For the first time, we are applying draconian new regulations to largely unregulated public, commercial cloud.”
Ms. Raimondo was adamant that federal regulators carved out exceptions for businesses and friendly nations.
“Supply chain activities are explicitly excluded, so chips can move where they need to be packaged or tested,” she said. “We’ve also been crystal clear that this does not apply to gaming chips; even gaming chips which can approach the capability of AI chips are carved out from the rule.”
Countries on arms-embargoed lists such as Cuba, Iran, and Russia would all face new restrictions on transfers of powerful AI models, according to a senior administration official. Other nations, including Taiwan, South Korea, and Japan, would face no such restrictions.
Biden administration officials said such export controls were necessary to allow the secure spread of AI capabilities while countering their use in adversaries’ weapons systems.
Oracle is not a fan of heavy-handed AI rules, and Mr. Glueck wrote that the tech sector does not want federal regulators’ protection.
“Apart from the previously articulated and agreed upon national security concerns, we don’t need a ride, we need government to get out of the way,” Mr. Glueck wrote.
• Ryan Lovelace can be reached at rlovelace@washingtontimes.com.