I. Introduction
The Trump Administration's Executive Order Ensuring a National Policy Framework for Artificial Intelligence (the "Order") goes beyond the headlines, signaling a new era of federal executive involvement in AI governance and policy, an area that has been largely shaped by state and local regulation.
In contrast to prior federal approaches that emphasized risk‑mitigation or safety‑first frameworks, this Order openly prioritizes national competitiveness and streamlined regulatory oversight. It establishes new federal coordination mechanisms, including an AI Litigation Task Force, and signals the Administration's intent to reshape the balance of authority between federal and state regulators. But whether the Administration can use federal executive tools to preempt state law is a different, and largely unsettled, question.
As a result, companies developing, deploying, or investing in AI technologies should expect increased federal activity, new compliance expectations, and litigation developments that could directly affect their operations.
II. Brief Overview
The Order on artificial intelligence, issued on December 11, 2025, outlines an assertive federal strategy focused primarily on accelerating US AI innovation and reducing regulatory burdens that the Administration views as harmful to national competitiveness. The Order seeks to counter divergent state AI laws by directing federal agencies to evaluate and challenge state requirements deemed inconsistent with federal policy. It establishes an AI Litigation Task Force to coordinate these efforts, signaling elevated federal intervention in what has historically been a state‑driven regulatory space.
The Order will likely influence how federal oversight will evolve across the American economy. In the infrastructure and energy space, it aligns with ongoing Federal Energy Regulatory Commission (FERC) proceedings on large‑load interconnection and prioritizes expedited development of data‑center capacity essential for AI workloads. In financial services, the Order reinforces a principles‑based supervisory approach while increasing expectations for accurate AI‑related disclosures and governance. Although federal preemption efforts may curtail certain state‑law claims, companies remain exposed to actions under federal law and common‑law theories.
The Order also has significant political implications. Congress remains divided over federal preemption of state AI laws, and early reactions from state officials indicate likely legal challenges to any withholding of federal funds or federal efforts to invalidate state statutes. The Order, therefore, sets up a multi‑front confrontation involving federal agencies, state governments, the courts, and regulated companies, even as it accelerates federal standard‑setting and encourages deployment of AI across government functions.
Overall, the Order reflects a decisive executive shift toward centralized federal authority over AI governance, prioritizing innovation and national security over a distributed regulatory model. Companies should expect a heightened federal role, increased litigation, and rapid policy evolution with long‑term implications for compliance, investment, and operational strategy.
III. Key Takeaways
- A clear signal that federal AI policy will focus on accelerating private-sector development by reducing regulatory burdens and aggressively pursuing global dominance. Companies should monitor implementation closely, particularly in areas related to procurement, permitting, and supply chain security.
- Although the Order offers broad directives and goals, how those directives are implemented under existing federal law and potential new federal legislation will determine the Order's success.
Our interdisciplinary AI team has identified the following key takeaways for companies focused on specific aspects of the Order and its implications for key industry sectors.
A. Conflicts between State and Federal Requirements
The Administration appears to view state‑by‑state AI regulation as incompatible with national AI competitiveness and federal constitutional principles governing interstate commerce. The Order directs federal agencies to challenge state laws that impose constraints on AI development, require alterations to "truthful outputs," or mandate disclosures that may conflict with First Amendment considerations.
Despite its sweeping language, the Order cannot, on its own, preempt state law. Only Congress may enact statutes with preemptive effect, and federal agencies may preempt state requirements only when acting pursuant to valid statutory authority. An executive order itself cannot create new legal powers or override state legislative enactments; it can merely direct federal agencies to use existing authority more aggressively or, where permissible, to condition federal grants or pursue litigation aimed at invalidating conflicting state laws. As a result, the Order's preemption‑oriented directives operate primarily as policy signals and strategic instructions to federal agencies—not as legally binding overrides of state statutes. States will likely invoke these constitutional limits as they defend their AI laws in court, challenging both the scope of federal authority and the legality of any federal funding restrictions or preemptive rulemaking efforts.
The expected escalation in federal-state conflict will likely create significant uncertainty. Courts will be asked to determine the boundaries of federal preemption, agencies' authority to condition federal funds, and whether particular state laws—including those addressing deepfakes, algorithmic discrimination, or transparency—improperly burden interstate commerce. Companies should prepare for a fluid regulatory environment in which federal challenges may limit the enforceability of state rules, but do not eliminate compliance risks.
B. Impact on Specific Fields
In this section, we take a closer look at five areas where the Order's impact will be felt most acutely: (1) Consumer Protection and Privacy; (2) Data Centers and Energy; (3) Financial Services; (4) Government Data and Procurement; and (5) Intellectual Property. Understanding these pressure points now will help organizations anticipate regulatory friction and align their operations with emerging federal expectations.
1. Consumer Protection and Privacy
The Order cannot substantively expand federal consumer‑protection or privacy obligations; instead, it signals that federal agencies will increasingly scrutinize inconsistent state rules in these areas. The Federal Trade Commission (FTC) is directed to issue a policy statement clarifying when state laws requiring changes to AI model outputs constitute deceptive practices preempted under the FTC Act.
While the Order frames state‑level consumer‑protection mandates as potentially burdensome to innovation, federal agencies remain committed to enforcing existing privacy, transparency, and anti‑fraud requirements. Clients should expect increased examination of claims made to consumers about AI capabilities and limitations, even as the Administration seeks to narrow the scope of permissible state‑level interventions.
2. Data Centers and Energy
The Administration has identified data centers and the energy infrastructure necessary to support them as essential to national AI competitiveness. While state laws affecting data centers generally remain outside the scope of the Order, it includes a call for a uniform federal policy framework that could preempt generally applicable state permitting processes that affect data center infrastructure development. Processes for interconnecting energy-intensive data centers to the grid are already under review by FERC at the federal level.
FERC, at the direction of the Department of Energy, is actively engaged in an Advance Notice of Proposed Rulemaking (ANOPR) on large-load interconnection. That proceeding is considering the extent of FERC's jurisdiction over data center interconnection to the grid and how FERC may exercise that jurisdiction, as well as how any FERC action would interact with existing state laws and regulations that govern the service of power to large data center loads. Many states have laws or regulations in place that govern how large loads such as data centers interconnect to the grid and procure energy. FERC's emphasis on data centers aligns with the Administration's priorities—at her first open meeting as FERC Chair, Laura Swett stated, "In addition to our core mission of keeping the lights on for all Americans at reasonable costs, my priority as chairman is to ensure that our country can connect and power data centers as quickly and as durably as possible." Stakeholders should keep a close eye on FERC's actions in this space, as they will have nationwide implications for data center development and energy infrastructure.
3. Financial Services
The Order impacts financial market participants by accelerating federal standard-setting and by seeking to reduce reliance on state-focused AI compliance programs. It aims to establish a unified federal framework for AI governance, define what constitutes "reasonable" AI governance, and strengthen federal enforcement. These changes heighten the risks associated with regulatory examinations, enforcement actions, disclosure obligations, and litigation.
Financial regulators appear likely to proceed with a principles-based approach to AI. The Securities and Exchange Commission (SEC) Investor Advisory Committee recently recommended that the SEC issue AI disclosure guidance based on materiality. This recommendation, along with the SEC's withdrawal of the "predictive data analytics" conflicts-of-interest rulemaking and the Order, increases the likelihood of a principles-based approach focused on existing fiduciary duties, Regulation Best Interest, anti-fraud provisions, supervision, and recordkeeping.
Guidance will continue to define what constitutes "reasonable" controls. The Consumer Financial Protection Bureau (CFPB) and Federal Deposit Insurance Corporation (FDIC) maintain AI governance and compliance frameworks aligned with federal priorities. While not binding, these materials establish baseline governance concepts that appear in examinations and supervisory discussions across agencies.
Centralized federal AI policy could advance cross-agency coordination on examination and enforcement. The SEC Division of Examinations' Fiscal Year 2026 priorities include assessing firm AI policies, supervision, and the accuracy of registrant claims. The SEC has previously addressed high-profile AI-related misstatements. Recent developments indicate continued scrutiny of the accuracy of AI statements by advisers and issuers, including marketing claims, offering materials, periodic reports, and risk factors. The SEC AI Task Force plans to use AI tools to improve efficiency and accuracy.
The Order should prompt further consideration of compliance strategies designed to meet the strictest state regimes. Given ongoing legal uncertainty about preemption and the stability of state regimes, financial services firms should separate core investor protection controls from jurisdiction-specific requirements to maintain defensible governance as state requirements evolve.
4. Government Data and Procurement
The Order's broader emphasis on national competitiveness suggests that procurement reforms may ultimately encompass expedited acquisition pathways, increased reliance on commercial AI products, and enhanced interagency coordination. These developments should benefit vendors seeking clearer, faster entry points into federal markets.
However, the Order also indicates that recommendations for establishing a uniform federal framework should not preempt state rules on government procurement and use of AI. Accordingly, vendors that sell to both federal and state governments may continue to face a patchwork of requirements tailored to the nuances of individual states and their agencies. States may therefore retain influence over AI standards, as vendors account for individual state requirements in their offerings to facilitate sales across multiple jurisdictions.
5. Intellectual Property
Intellectual property, with some exceptions, is largely a federal enterprise, and the Order appears intent on keeping it that way. One of the Order's stated goals is that "copyrights are respected." Efforts under the Order will likely seek to preempt state-level regulation of copyright-related issues. For example, state contract-law claims preventing the use of training data in violation of Terms of Use might be preempted by federal copyright law, and the Administration might support such preemption arguments. We expect federal copyright law to continue to develop, as might federal policies and regimes around AI training data licensing.
Additionally, various state-level IP laws in the domains of trade secrets and rights of publicity are increasingly being discussed in the context of AI products and services. While the federal Defend Trade Secrets Act is largely coextensive with trade secret obligations under state law, there is significant variability in state-level right-of-publicity statutes and common-law rules. The right of publicity will become increasingly relevant as products incorporate AI-based imagery and voices for profit. While there is no federal right of publicity today, relevant federal legislation has been proposed and discussed as recently as last year. The ongoing dynamics between federal and state IP laws will therefore likely be a focal point for the Administration. Moreover, because so much of IP law implicates AI training data and model outputs, the Administration's recommendation that the Federal Communications Commission (FCC) adopt a preemptive "Federal reporting and disclosure standard for AI models" will be relevant to ongoing IP fights over AI models and products.
* * *
Litigation will be the primary mechanism through which the federal-state dynamics are pressure-tested, as courts begin to weigh in on how far federal authority reaches and where state laws still hold sway.
AI litigation is expanding across multiple fronts, including energy, privacy, intellectual property, and state enforcement actions. Plaintiffs rely on a mix of state statutes, federal law, and common law theories to challenge, among other things, automated decision-making, deceptive AI practices, AI-generated content, and AI responses that affirm or encourage harmful behavior.
The Administration's Order and new AI Litigation Task Force directly target state efforts to impose AI-specific duties and liability. This means that state regulations—whether enforced by state attorneys general (like the Colorado AI Act) or via private rights of action (like New Jersey's deepfake law)—will now face increased scrutiny. However, the Order does not materially reduce overall AI litigation exposure, as core AI-related claims will continue to proceed under existing legal frameworks, including federal copyright actions challenging how large language models are trained (see, e.g., The New York Times Company, et al. v. Microsoft Corporation, et al., No. 1:23-cv-11195 (S.D.N.Y. filed Dec. 27, 2023)), traditional tort claims like product liability (see, e.g., P.J. v. Character Technologies, Inc., No. 1:25-cv-01295 (N.D.N.Y. filed Sept. 16, 2025)), consumer protection and fraud actions targeting misleading AI representations (see, e.g., Federal Trade Commission v. Air Ai Technologies, Inc., et al., No. 2:25-cv-03068 (D. Ariz. filed Aug. 25, 2025)), and employment and civil rights claims based on discriminatory AI-driven decision-making (see, e.g., Mobley v. Workday, Inc., No. 3:23-cv-00770 (N.D. Cal. filed Feb. 21, 2023)). In short, the Order may limit certain state law theories of liability, but it leaves companies exposed to litigation arising under federal law and longstanding common law doctrines.
IV. How Congress Might React
Although Congress has yet to pass comprehensive AI legislation, its repeated attempts and growing bipartisan interest suggest that the Order may serve as a catalyst for renewed legislative activity. Unlike an executive order, legislation could change the preemption landscape.
In June 2025, Senator Ted Cruz (R‑TX), chair of the Senate Committee on Commerce, Science, and Transportation, believed he had secured an agreement with Senator Marsha Blackburn (R‑TN) to amend and preserve a provision in the FY2026 Budget Resolution to limit federal broadband grants to states that enforced AI regulations. Blackburn, a champion of legislation to protect children online, ultimately withdrew her support, killing the proposal and dealing a setback to Republican efforts to block state-level AI rules. A subsequent attempt in December 2025 to include an AI moratorium in the National Defense Authorization Act also failed. These episodes are among several unsuccessful attempts by Congress to address not just AI policy but tech policy more broadly, including measures as seemingly bipartisan as protecting children online.
There is a slight possibility that the Order could break through the legislative gridlock. It directs the Special Advisor for AI and Crypto, together with the Assistant to the President for Science and Technology, to "develop a legislative framework for AI that preempts State AI laws that conflict with the policy set forth in this order," which is to maintain American AI dominance through minimally burdensome regulation.
While the primary purpose of the legislative framework is to preempt state law—an explosive issue on Capitol Hill—if the Administration approaches the task with serious intent to get legislation passed, its framework could catalyze substantive policy discussions regarding national AI policy.
However, it is more likely that members of Congress will act on, or opine about, the Order's funding restrictions. The Order includes two such limitations.
First, "States with onerous AI laws identified pursuant to [this EO] are ineligible for [Broadband Equity, Access, and Deployment (BEAD) grants] non-deployment funds, to the maximum extent allowed by federal law." Second, agencies are to review their grants to determine whether they can condition such grants on "States either enacting an AI law that conflicts with the policy of this order…or for those States that enacted such laws," their entering into a binding agreement not to enforce them.
However, it remains uncertain whether the Administration has the legal authority to withhold BEAD funds or other grants in this manner. This is because Congress is unlikely to have included, at the time these programs were created, any condition requiring alignment with policies that did not yet exist. Further, regarding BEAD, the Administration announced in its November 21, 2025 Frequently Asked Questions that it has already reversed its approval of all non-deployment uses.
Primarily Democratic members of Congress are expected to raise this issue during upcoming budget and appropriations hearings. Indeed, after seeing a draft of the Order in November, Members of the House Committee on Energy and Commerce wrote to the National Telecommunications and Information Administration (NTIA). At the same time, given the Administration's uncertain legal footing, Republicans in Congress may choose to let the matter be resolved in the courts rather than engage in direct conflict with the White House.
It is their colleagues in the states who are most likely to pursue legal action against the Administration if their grant funding is denied or their laws are challenged. Republican governors and state legislators have resisted federal efforts to limit their ability to regulate Big Tech and, in their view, fulfill their responsibility to oversee AI.
For example, Governor Ron DeSantis (R-FL), who recently proposed an "AI Bill of Rights," noted in opposition that "An executive order doesn't/can't preempt state legislative action."
Governor Spencer Cox (R-UT), whose state also has a law aimed at regulating AI, noted on X, "An alternative AI executive order focused on human flourishing would strike the balance we need: safeguard our kids, preserve our values, and strengthen American competitiveness."
Utah Republican state Rep. Doug Fiefia criticized the order as "an overreaching act that fundamentally disregards the Tenth Amendment and the necessary role of the States in technology governance."
While the Order clearly outlines the Administration's AI priorities, the extent to which the Order will spur Congress to establish a federal AI framework is far from settled. The divisions on the Hill remain deep, particularly around preemption of state laws, and the Administration's Order did nothing to ease those tensions.
V. Conclusion
We will continue tracking regulatory developments across federal agencies and legal developments at the state level through Steptoe's AI Legislative Tracker.
As the federal government moves quickly to consolidate authority over AI governance and challenge state‑level requirements, the regulatory landscape will continue to evolve at an unprecedented pace. Our team will closely monitor federal implementation efforts, agency rulemaking, enforcement trends, and the inevitable state and judicial responses.
Given the Executive Order's sweeping implications—from federal preemption efforts and funding conditions to sector‑specific impacts on energy, financial services, litigation risk, and beyond—companies should proactively assess how the shifting federal-state dynamic may affect their operations, compliance posture, and strategic planning. Our interdisciplinary artificial intelligence team, spanning Commercial, Consumer, and Government Litigation; Energy; Financial Innovation and Regulation; Intellectual Property; International Trade and Regulatory Compliance; Investigations and White Collar Defense; and Telecom and Technology, is prepared to help organizations navigate these developments.
In addition, our Government Affairs and Public Policy team stands ready to support companies engaging with federal agencies charged with implementing this Order, leveraging deep experience with the Administration, Congress, and the complex intersection of emerging technology and public policy. We can assist in identifying risks, shaping strategy, and positioning your organization effectively as the federal government accelerates its push toward a national AI framework.