Against the Federal Moratorium on State-Level Regulation of AI

Cast your mind back to May of this year: Congress was in the throes of debate over the massive budget bill. Amidst the many seismic provisions, Senator Ted Cruz dropped a ticking time bomb of tech policy: a ten-year moratorium on the ability of states to regulate artificial intelligence. To many, this was catastrophic. The few massive AI companies seem to be swallowing our economy whole: their energy demands are overriding household needs, their data demands are overriding creators’ copyright, and their products are triggering mass unemployment as well as new types of clinical psychoses. In a moment where Congress is seemingly unable to act to pass any meaningful consumer protections or market regulations, why would we hamstring the one entity evidently capable of doing so—the states? States that have already enacted consumer protections and other AI regulations, like California, and those actively debating them, like Massachusetts, were alarmed. Seventeen Republican governors wrote a letter decrying the idea, and it was ultimately killed in a rare vote of bipartisan near-unanimity.

The idea is back. Before Thanksgiving, a House Republican leader suggested they might slip it into the annual defense spending bill. Then, a draft document leaked outlining the Trump administration’s intent to enforce the state regulatory ban through executive powers. An outpouring of opposition (including from some Republican state leaders) beat back that notion for a few weeks, but on Monday, Trump posted on social media that the promised Executive Order is indeed coming soon. That would put a growing cohort of states, including California and New York, as well as Republican strongholds like Utah and Texas, in jeopardy.

The constellation of motivations behind this proposal is clear: conservative ideology, cash, and China.

The intellectual argument in favor of the moratorium is that “freedom”-killing state regulation of AI would create a patchwork that would be difficult for AI companies to comply with, which would slow the pace of innovation needed to win an AI arms race with China. AI companies and their investors have been aggressively peddling this narrative for years now, and are increasingly backing it with exorbitant lobbying dollars. It’s a handy argument, useful not only to kill regulatory constraints, but also—companies hope—to win federal bailouts and energy subsidies.

Citizens should parse that argument from their own point of view, not Big Tech’s. Preventing states from regulating AI means that those companies get to tell Washington what they want, but your state representatives are powerless to represent your own interests. Which freedom is more important to you: the freedom for a few near-monopolies to profit from AI, or the freedom for you and your neighbors to demand protections from its abuses?

There is an element of this that is more partisan than ideological. Vice President J.D. Vance argued that federal preemption is needed to prevent “progressive” states from controlling AI’s future. This is an indicator of creeping polarization, where Democrats decry the monopolism, bias, and harms attendant to corporate AI and Republicans reflexively take the opposite side. It doesn’t help that some in the parties also have direct financial interests in the AI supply chain.

But this does not need to be a partisan wedge issue: both Democrats and Republicans have strong reasons to support state-level AI legislation. Everyone shares an interest in protecting consumers from harm created by Big Tech companies. In leading the charge to kill Cruz’s initial AI moratorium proposal, Republican Senator Marsha Blackburn explained that “This provision could allow Big Tech to continue to exploit kids, creators, and conservatives… we can’t block states from making laws that protect their citizens.” More recently, Florida Governor Ron DeSantis has said he wants to regulate AI in his state.

The often-heard complaint that it is hard to comply with a patchwork of state regulations rings hollow. Pretty much every other consumer-facing industry has managed to deal with local regulation—automobiles, children’s toys, food, and drugs—and those regulations have been effective consumer protections. The AI industry includes some of the most valuable companies globally and has demonstrated the ability to comply with differing regulations around the world, including the EU’s AI and data privacy regulations, which are substantially more onerous than those so far adopted by US states. If we can’t leverage state regulatory power to shape the AI industry, to what industry could it possibly apply?

The regulatory superpower that states have here is not size and force, but rather speed and locality. We need the “laboratories of democracy” to experiment with different types of regulation that fit the specific needs and interests of their constituents and evolve responsively to the concerns they raise, especially in an area as consequential and rapidly changing as AI.

We should embrace the ability of regulation to be a driver—not a limiter—of innovation. Regulations don’t restrict companies from building better products or making more profit; they help channel that innovation in specific ways that protect the public interest. Drug safety regulations don’t prevent pharma companies from inventing drugs; they force them to invent drugs that are safe and efficacious. States can direct private innovation to serve the public.

But, most importantly, regulations are needed to prevent the most dangerous impact of AI today: the concentration of power associated with trillion-dollar AI companies and the power-amplifying technologies they are producing. We outline the specific ways that the use of AI in governance can disrupt existing balances of power, and how to steer those applications towards more equitable balances, in our new book, Rewiring Democracy. In the nearly complete absence of Congressional action on AI over the years in which it has swept the world’s attention, it has become clear that states are the only effective policy levers we have against that concentration of power.

Instead of impeding states from regulating AI, the federal government should support them to drive AI innovation. If proponents of a moratorium worry that the private sector won’t deliver what they think is needed to compete in the new global economy, then we should engage government to help generate AI innovations that serve the public and solve the problems most important to people. Following the lead of countries like Switzerland, France, and Singapore, the US could invest in developing and deploying AI models designed as public goods: transparent, open, and useful for tasks in public administration and governance.

Maybe you don’t trust the federal government to build or operate an AI tool that acts in the public interest? We don’t either. States are a much better place for this innovation to happen because they are closer to the people, they are charged with delivering most government services, they are better aligned with local political sentiments, and they have achieved greater trust. They’re where we can test, iterate, compare, and contrast regulatory approaches that could inform eventual and better federal policy. And, while the costs of training and operating performant AI tools like large language models have declined precipitously, the federal government can play a valuable role here in funding cash-strapped states to lead this kind of innovation.

This essay was written with Nathan E. Sanders, and originally appeared in Gizmodo.

EDITED TO ADD: Trump signed an executive order banning state-level AI regulations hours after this was published. This is not going to be the last word on the subject.

Posted on December 15, 2025 at 7:02 AM • 13 Comments

Comments

Rontea December 15, 2025 10:26 AM

In the manner of Plato, I would observe thus: America, though adorned with the mantle of a singular republic, is in truth a chorus of many city-states, each with its own laws, customs, and guardians of the public good. If the federal hand were to forbid these states from governing the new artifices of mind—these engines of intelligence—it would be as if Athens forbade Corinth or Thebes from shaping their own destinies. For the health of the polis lies not in silencing its many voices, but in permitting each city to seek wisdom through its own trials. Just as the Greek city-states once pursued diverse paths to virtue, so too must the states of America be allowed to test and refine the governance of this new power, lest the commonwealth drift into the tyranny of a single will or the folly of unexamined rule.

KC December 15, 2025 11:40 AM

The ‘Scaling Laws’ podcast also weighed in on this EO. An excerpt:

“Again, I will make my usual disclaimer whenever I talk about executive orders, that executive orders are not themselves generally binding legal documents, right? I like to describe them as Truth Social posts on nicer stationery.”

The episode links to further commentary on legal issues raised.

Kevin December 15, 2025 7:41 PM

The argument of “states are closer to their people so they should make the laws” can be true for almost ANY law. And if we allow that, then why do you need a federal government at all??? Why not get rid of federal and make each state its own ruling body???

Personally (and I’m an Aussie so…) I don’t think the people of California are that much different to the people of Kansas that you need different laws in each state. How much time, energy and money are wasted with each state having to come up with its own laws that are 90% the same as each other state, and could easily be the same.

With the world going global in terms of manufacturing, surely it makes more sense to align the laws of all the states, even try to align the laws of all countries, to make things simpler.

JG5 December 15, 2025 8:15 PM

I like the diversity of city and state regulations as a natural experiment in how to get things right. The states have a strong grip on some short hairs. I would be forcing the data centers to finance their own generation capacity, which generally falls under state regulatory agencies.

I have been busy and distracted again. Wanted to touch base with Clive about quinine. I used to love the stuff, but long before I gave it up, I read a disturbing article in the aviation literature. It is implicated in vertigo and the bad things that follow. I have searched for the literature twice. Came up squatburger the first time. Had better luck ploughing with Gemini this time.

The main reason I am stopping by is to provide the best “AI” video that I ever have seen. I may have posted the 250th Anniversary of TroubleMaking in the Colonies earlier this year – it wasn’t tainted by AI. Those re-enactments are ripe for AI to correct the buildings back to 1775. And get rid of the cars and paving. And thin the crowd of tourists while correcting their attire.

Please don’t hurt yourself laughing:

Redneck Star Trek – Beam Me Up, Bubba | AI Country Star Trek Parody
Neural Derp
https://youtu.be/1eqYswiW4eo

anon December 16, 2025 7:12 AM

Forget the feds vs the states. What if the counties were to regulate AI? Instead of fighting for 50 sets of regulations, you’re now fighting for 3100 conflicting sets of regulations. Good luck to you.

Paul Sagi December 16, 2025 7:13 AM

Regarding
https://www.schneier.com/blog/archives/2025/12/against-the-federal-moratorium-on-state-level-regulation-of-ai.html
won’t state-level regulation maybe result in 50 different regulations?
Wouldn’t that be an unworkable mess? (AI developed in one state might be illegal in another state.)
A Federal law would give more consistency, and the US could then work with other nations for an international law regulating AI. That would help international business and trade; otherwise, an AI legal in one country might be illegal in other countries.

Clive Robinson December 16, 2025 10:26 AM

@ JG5,

I gave a long answer to your question, but it appears to have vanished.

The answer can be found by doing a DuckDuck search on “quinine inner ear” and you will get research papers that answer it.

eli December 17, 2025 12:03 PM

there is nothing “conservative” about consolidation of power over the sovereignty of the individual or of the individual states.

we should all be opposed on this point alone.

Executive Orders are being used (to the extent possible) to bypass a Congress that is united in stone-walling the sitting President. This is not unique to the current Administration.

To the subject of this article, the $600 Billion+ that Oracle, NVidia, and OpenAI have generated amongst themselves will then be used to influence one Executive as opposed to working its way through the individual states.
