Introduction
Artificial intelligence (AI) is transforming many sectors, including energy, but the rapid growth of data centers, cloud computing and digital services is driving a sharp increase in electricity demand, raising concerns about infrastructure capacity and sustainability. Beyond energy use, AI also relies on significant resources across its value chain – including water, rare metals and land – resulting in a clear environmental footprint. At the same time, AI can support the energy transition by improving efficiency, integrating renewables and optimizing electricity networks, while also creating complex legal and regulatory challenges.
According to the International Energy Agency, data centers currently account for around 1.5% of global electricity consumption, with demand expected to more than double by 2030, while the global AI sector could use at least 10 times more energy in 2026 than in 2023. By way of comparison, a single ChatGPT query can consume up to 10 times more electricity than a standard Google search, and a large data center may use as much electricity annually as around 100,000 households, with additional pressure arising from substantial water-cooling requirements.
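The household comparison above can be reproduced with a back-of-the-envelope calculation. A minimal sketch, assuming illustrative figures (a 45 MW average facility draw and 4,000 kWh of annual household consumption) that are not taken from the sources cited:

```python
# Back-of-the-envelope check of the "100,000 households" comparison.
# Assumed figures (illustrative only): a large data center drawing an
# average of 45 MW, and a household using 4,000 kWh per year.

HOURS_PER_YEAR = 8_760

def household_equivalents(avg_draw_mw: float, household_kwh_per_year: float) -> float:
    """Return how many average households a facility's annual electricity use equals."""
    annual_kwh = avg_draw_mw * 1_000 * HOURS_PER_YEAR  # MW -> kW, then kWh per year
    return annual_kwh / household_kwh_per_year

print(round(household_equivalents(45, 4_000)))  # -> 98550, i.e. roughly 100,000 households
```

With different assumptions about facility size and household consumption, the equivalence scales linearly, which is why published comparisons vary.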
The development of data centers raises a series of competition issues, some of which are closely linked to energy use and environmental impact. These include:
- Information sharing in colocation
- Restricted third-party access, arising from coordinated behavior and potential foreclosure linked to concentration (mergers or strategic non-merger joint ventures)
- AI service efficiency as a competitive factor
- Standardizing the environmental impact of AI
- State aid, where member states deploy public support for national investment initiatives
Competition Issues Related to Data Centers
1. Information Sharing Risk Due to Colocation
Data centers are often interlinked to provide the resilience, speed, and networking required by AI systems. Data centers, particularly carrier-neutral colocation facilities hosting multiple clients, including direct competitors, face distinct antitrust risks due to the potential for inadvertent sharing of competitively sensitive information. These risks go beyond overt collusion or price fixing to include subtle or unintentional exchanges that could influence market behavior, reduce competition, and trigger regulatory scrutiny.
Information sharing can occur in various circumstances. Employees managing shared facilities may inadvertently access or share client information if robust information barriers and training are not in place. Participation in joint ventures, industry consortia, or collaborative infrastructure projects can create opportunities for improper exchanges of sensitive information during meetings or through shared documentation, making legal oversight and clear procedural guardrails essential. Outsourcing maintenance, IT, or security services introduces additional risk, as third-party providers may have access to confidential client data without proper controls. Centralized management platforms or cloud-based monitoring tools can inadvertently aggregate data across clients, while mergers or acquisitions may require sharing detailed business information, creating risks if clean team procedures and safeguards are not followed.
To mitigate risks of data leakage and access restrictions, data center operators should implement robust antitrust and competition compliance programs. Key measures include strict access controls, information barriers between business units and clients, technical and organizational firewalls, employee training on competition law and confidentiality, regular audits, and contractual safeguards with clients and partners. These steps help prevent abuse of dominance, ensure fair access, and maintain compliance with European Union competition rules.
2. Big Tech Coordination – Restrictions on Third-Party Access
Competition regulators have raised concerns about the financing and structure of the AI ecosystem, including high concentration across the supply chain, massive investment requirements, limited interoperability, and foreclosure risks arising from mergers, joint ventures, and partnership arrangements. Major technology companies – such as Google, Amazon, Microsoft, and ASML – are investing heavily in emerging AI firms like OpenAI and France's Mistral AI. These investments create close financial ties between dominant incumbents and newer innovators, forming tightly interconnected networks rather than independent competing actors, a phenomenon Jonathan Kanter[1] has described as "circular investments." Such interdependencies may generate systemic risks, where financial harm or anticompetitive conduct by one major player could ripple across the ecosystem, entrenching market power and reducing competition and innovation.
Large incumbents also control critical infrastructure, including data centers and associated energy and grid connections, giving them the ability to restrict or condition access for smaller players. This control may extend to differential treatment in maintenance or recovery during outages. Competition authorities[2] are increasingly examining how access to energy and data center infrastructure may influence competition in the AI sector. Meanwhile, global AI infrastructure investments – surpassing even the oil and gas sector in 2023 – are reshaping regional power grids and raising concerns about energy availability, allocation, and competitive fairness.
From a competition law perspective, denying or restricting access to essential infrastructure – or providing inferior support – could constitute abuse of dominance under Article 102 of the Treaty on the Functioning of the European Union (TFEU), particularly where it forecloses market entry or strengthens incumbents' market power. Coordination among major players to manage infrastructure collectively may also raise Article 101 TFEU concerns, potentially amounting to collusion if it restricts third-party access or limits competition. Regulators have also highlighted risks across the broader AI supply chain, including high concentration in chip design and relevant software, increasing cross-holdings and partnerships, and the speed of technological developments, which can lead to potential vertical foreclosure (see, for example, the UK's Competition and Markets Authority update paper AI Foundation Models, 2024)[3]. As energy demand from AI infrastructure rises, smaller players and local communities may face higher costs or limited access, further amplifying systemic competition risks across the ecosystem.
3. AI Service Efficiency as a Competitive Factor
The level of frugality of AI services – understood as optimizing resource use to minimize environmental impact – is increasingly emerging as a competitive parameter in response to AI's growing energy and environmental footprint. Signals on both the demand side (greater interest from public and private buyers in frugal solutions) and the supply side (development of smaller models and increased transparency on environmental footprint) indicate this trend, which also extends upstream to more sustainable data center operations. According to the recent study by the French Competition Authority[4], frugality can stimulate competition by lowering costs through efficiency, enabling differentiated quality suited to smaller-scale or existing infrastructures, and steering innovation towards more diverse and resource-efficient solutions. At the same time, the French authority warned of competition risks, including misleading claims about frugality (a form of 'greenwashing'), inadequate disclosure of environmental information where there is demand, and practices that may restrict innovation in this area, whether arising from coordination between competitors or strategies by dominant players.
4. Standardizing the Environmental Impact of AI
AI's environmental impact is significant, but transparency is limited due to scarce data, inconsistent reporting, and hard-to-compare measures. Initiatives such as eco-design and frugal AI frameworks (ARCEP, ARCOM, AFNOR, French Ministry of Ecological Transition) and tools like Green Algorithms, Carbone 4, CodeCarbon, and CarbonTracker are helping to measure and improve sustainability. From a competition perspective, standardization can make environmental efficiency a competitive factor, but risks include unequal access, non-robust methodologies, and misuse of sensitive data. To ensure fair competition, tools and standards should be scientifically robust, transparent, third-party verified, and accessible to all operators.
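The measurement tools mentioned above broadly share one accounting logic: hardware power draw multiplied by runtime and the facility's power usage effectiveness (PUE) gives energy consumed, which a grid carbon-intensity factor converts into CO2-equivalent emissions. A minimal sketch of that calculation follows; all constants are illustrative assumptions, not figures drawn from CodeCarbon, Green Algorithms, or any other named tool:

```python
# Simplified version of the accounting logic used by emissions-measurement
# tools: energy = power x time x PUE; emissions = energy x grid intensity.
# All default values below are illustrative assumptions.

def compute_emissions_kg(
    hardware_kw: float,                # average IT power draw of the job (kW)
    runtime_hours: float,              # wall-clock duration of the job
    pue: float = 1.4,                  # facility power usage effectiveness (assumed)
    grid_kgco2_per_kwh: float = 0.3,   # grid carbon intensity (assumed)
) -> float:
    """Estimate CO2-equivalent emissions (kg) of a compute job."""
    energy_kwh = hardware_kw * runtime_hours * pue
    return energy_kwh * grid_kgco2_per_kwh

# e.g. a 10 kW GPU node running for 100 hours:
print(compute_emissions_kg(10, 100))  # 10 * 100 * 1.4 * 0.3 -> about 420 kg CO2e
```

The standardization debate described above largely concerns the inputs to this formula: which PUE and carbon-intensity values are used, how they are verified, and whether all operators can access comparable, scientifically robust figures.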
5. State Aid Rules
Competition law also matters from a state aid perspective, as EU funding and public financing are key to supporting innovative AI and expanding capital-intensive digital infrastructure. The upcoming EU Cloud and AI Development Act notes that financial support – subject to State Aid rules – could benefit data centers with high sustainability impact, promoting both capacity growth and eco-friendly development. Broader EU frameworks, like the Clean Industrial Deal State Aid Framework, may indirectly support the sector by encouraging energy-efficient design, renewable energy use, and advanced cooling or power-management solutions in line with EU digital and sustainability goals.
Data Centers in the AI Act
As data centers increasingly use AI to optimize operations, the EU AI regulatory framework becomes directly relevant. General-purpose AI (GPAI) models can improve efficiency, reliability, and costs – for example, by optimizing cooling, power management, and traffic flows – potentially yielding significant energy savings. While the EU AI Act (Regulation 2024/1689) does not regulate data centers directly, it promotes transparency obligations for GPAI providers, including documenting or estimating energy use, encourages voluntary codes of conduct on data center efficiency, and requires the Commission to monitor standards for energy-efficient GPAI deployment.
Certain GPAI models may be classified as posing "systemic risk" due to potential impacts on public health, safety, security, fundamental rights, or society. Providers of such models face enhanced compliance obligations, and energy consumption is a factor in this assessment. These obligations complement broader EU energy rules, including the Energy Efficiency Directive (2023/1791), which imposes efficiency requirements on energy-intensive digital infrastructure. Together, these regimes create overlapping incentives for data centers and GPAI providers to reduce energy use, supporting both systemic-risk mitigation and compliance with general efficiency standards.
Conclusion
The rapid growth of AI, and of the data centers that support AI services, is raising significant competition concerns, particularly around the concentration of infrastructure and preferential access to energy-efficient or renewable-powered facilities. The EU is responding with measures such as the Apply AI Strategy (October 2025), the upcoming Cloud and AI Development Act, and the Data Centre Energy Efficiency Package (2026), alongside scrutiny of AI-related mergers and partnerships to safeguard fair access to cloud and computing resources. These efforts intersect with broader energy initiatives, including the Digitalizing the Energy System – EU Action Plan and the Strategic Roadmap for Digitalization and AI in the Energy Sector, highlighting the link between energy efficiency, AI expansion, and competitive neutrality.
Data center operators should implement strong compliance programs – access controls, audits, training, and contractual safeguards – to mitigate competition and regulatory risks. Businesses relying on AI infrastructure must monitor these developments closely to protect operations and future-proof investments.
Please turn to the authors, Charles Whiddington and Domniki Mari, or to other members of Steptoe's antitrust team if you wish to discuss any of these developments and their potential impact on your business.
[1] Former head of the US Justice Department’s antitrust division
[2] Study by the French Competition Authority on competition issues surrounding the energy and environmental impact of artificial intelligence, published December 17, 2025
[3] https://www.gov.uk/government/publications/ai-foundation-models-update-paper
[4] See the French Competition Authority study cited at note 2 above