11 November 2025
Iliana Portugues
What ADIPEC 2025 Really Revealed About AI's Energy Future
ADIPEC 2025 broke every imaginable record: 239,709 attendees, 35,000 cross-sector agreements, and $46 billion in energy deals [1]. Yet hidden in the small AI Zone was the paradox that undermines every one of those investments: the technology meant to enable the energy transition is accelerating energy consumption beyond what any transition can absorb.
It was my first time attending: an overwhelming convergence of capital, optimism, and ambition. Yet walking through the halls of the world's largest energy conference, one thought kept echoing:
AI, the technology meant to save energy, is now consuming it faster than oil ever did.
The AI Zone was small, dominated by a handful of major corporations, and though organisers promised it would triple in size next year, what struck me wasn't the scale but the contradiction. Between the panels on carbon capture, renewable forecasting, and grid optimisation, no one was discussing the elephant in the convention centre: the AI systems on display are triggering the most significant surge in electricity demand since the Industrial Revolution.
The Numbers Nobody Wants to Discuss
During my time at ADIPEC, three data points kept emerging from grid operators and utility executives:
Dominion Energy Virginia: Data centre load will triple to 13.3 GW by 2038 — from 5% to 40% of total demand [2]
PJM Interconnection: 2025-26 capacity auction prices jumped 833% to $269.92/MW-day due to AI-driven demand [13]
International Energy Agency: Global data centre electricity consumption will hit 1,580 TWh by 2030 — more than India's entire grid [3]
These aren't projections anymore. They're locked-in infrastructure commitments.
The Reality Check
Let me show you what's actually happening in three of my favourite energy AI platforms right now:
Tomorrow.io's Weather Intelligence Platform processes millions of satellite images daily to forecast wind and solar generation. Their models help National Grid UK reduce balancing costs by 17%. Impressive, really, until you realise their NVIDIA-powered infrastructure consumes 2.4 MW continuously: enough to power 1,800 European homes, and, at industrial rates of €0.12/kWh, roughly €2.5M a year just for compute.
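That €2.5M figure is easy to verify. A back-of-envelope sketch using only the numbers quoted above (this is my arithmetic, not Tomorrow.io's own accounting):

```python
# Annual energy and cost of a 2.4 MW continuous load at €0.12/kWh,
# using the figures quoted in the text above.

POWER_MW = 2.4            # continuous draw
HOURS_PER_YEAR = 8760
PRICE_EUR_PER_KWH = 0.12  # quoted industrial rate

annual_mwh = POWER_MW * HOURS_PER_YEAR                 # 21,024 MWh ≈ 21 GWh
annual_cost_eur = annual_mwh * 1000 * PRICE_EUR_PER_KWH

print(f"Annual energy: {annual_mwh:,.0f} MWh")
print(f"Annual compute bill: €{annual_cost_eur / 1e6:.2f}M")  # ≈ €2.52M
```

A "modest" 2.4 MW load, run around the clock, is a 21 GWh-a-year commitment before a single forecast is sold.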
Electricity Maps provides real-time carbon intensity data to Google, Microsoft, and Salesforce for their "24/7 carbon-free energy" goals. Their API handles 100 million requests daily. The irony? Those requests run through AWS data centres in Ireland, drawing 300 MW from a grid that's 40% gas-powered.
Kayrros detected 152 methane super-emitter events in the US last year using Sentinel satellite data. Their AI prevented emissions equivalent to 3 Mt of methane. But their image processing infrastructure — analysing data from ESA's entire Copernicus constellation — burns through 18 GWh annually.
What the Energy Giants Are Actually Building
Let's examine what actually happened at ADIPEC beyond the press releases:
Shell's Quantum-AI Platform claims to optimise carbon capture at the molecular level. Impressive, certainly. Unfortunately, its quantum computing infrastructure requires continuous cooling to near absolute zero, consuming 4.2 MW just to maintain operational temperature [4].
TotalEnergies' Continental Grid AI manages renewable flows across seven European countries. The system prevented 2.3 Mt of emissions last year. But their data centres in Ireland and the Netherlands consumed 127 GWh, powered by grids that are 42% fossil-fuelled [5].
Saudi Aramco's Autonomous Exploration reduces exploration costs by 60%. Each autonomous drilling operation requires 8 MW of continuous compute power. Multiply that across 147 active rigs [6].
The pattern is consistent: Every efficiency gain requires exponential increases in computing power.
What 2025's Data Actually Shows
The International Energy Agency's April report revealed the scale of our self-deception [7]:
Training Costs: GPT-4 consumed 50 GWh—enough to power San Francisco for 3 days.
Inference Reality: ChatGPT handles 1 billion queries daily, consuming 109 GWh annually.
Growth Trajectory: US data centres alone will hit 426 TWh by 2030, an increase of 133% from today.
Notably, buried in footnote 47, the IEA concedes that AI now drives 35-50% of data centre growth, up sharply from 5-15% just two years ago [7].
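The inference figures above imply a per-query energy cost worth spelling out. A quick sketch (the query volume and annual total are the numbers quoted above; the per-query result is my derivation, not an IEA figure):

```python
# Implied energy per ChatGPT query, from the figures cited above:
# 1 billion queries/day and 109 GWh of annual inference energy.

ANNUAL_INFERENCE_GWH = 109
QUERIES_PER_DAY = 1_000_000_000

wh_per_query = ANNUAL_INFERENCE_GWH * 1e9 / (QUERIES_PER_DAY * 365)
print(f"~{wh_per_query:.2f} Wh per query")  # ≈ 0.30 Wh
```

Three-tenths of a watt-hour sounds trivial, which is exactly the trap: multiplied by a billion queries a day, the triviality disappears.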
The Uncomfortable Truth
We're burning fossil fuels at unprecedented rates to build AI systems that promise to reduce those very consumption levels.
Dr Sultan Al Jaber, ADNOC's CEO, stated at ADIPEC's opening that "electricity demand will keep surging through 2040, as power for data centres grows fourfold" [8]. The same speech promised AI would enable the energy transition. Holding both positions at once is the paradox the industry has yet to confront.
This isn't just a transition. It's an alarming acceleration in the wrong direction. Consider the evidence:
Google's emissions increased 48% since 2019, despite net-zero pledges [9].
Microsoft's AI infrastructure is projected to require 50 GW by 2027 [10].
OpenAI's Stargate initiative needs five data centres, each consuming 5 GW, more than New Hampshire's total demand [11].
Where Europe Gets It Wrong
European companies at ADIPEC expressed concern about "high energy costs making AI development uncompetitive". They are missing the crucial point. The EU's approach, balancing AI innovation with sustainability through regulatory frameworks, assumes such a balance is feasible. It is not. Every major AI deployment I observed at ADIPEC increases net energy consumption.
Take predictive maintenance, the poster child for AI efficiency:
Argonne National Laboratory reports 43-56% maintenance cost reductions [12]
But their AI system requires 3.2 MW of continuous power
Net benefit? Marginal at best once you factor in the compute infrastructure
The Market's $143 Billion Delusion
ADIPEC speakers identified three "opportunities" for 2026:
Grid Flexibility ($47B market): AI-powered demand response sounds promising, but the AI systems themselves now drive peak demand. The PJM market saw $9.3 billion in price increases attributable solely to data centre loads [13].
Carbon Management ($28B market): AI is being used to track emissions even as it drives them up. Microsoft's carbon capture platform runs on Virginia's grid, which is 48% gas-powered [14].
Energy Trading ($15B market): The same AI that optimises markets is destabilising them through massive demand spikes.
The Physical Constraints Nobody Discusses
Walking through ADIPEC's technical sessions, I heard one point made repeatedly: companies succeeding with AI are privatising the efficiency gains while socialising the computational costs and infrastructure burdens. The physical limits are already showing:
Transformer Shortage: Minimum of 52-week lead times for equipment rated above 100 MVA
Cooling Crisis: Each MW of compute requires 1.5 million gallons of water annually
Grid Reality: 20% of planned data centres face multi-year interconnection delays
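The cooling figure scales brutally. A quick sketch using the 1.5 million US gallons per MW per year quoted above (the 100 MW campus size is a hypothetical example, not an ADIPEC figure):

```python
# Annual cooling water for a hypothetical 100 MW data centre campus,
# at the rate quoted above: 1.5 million US gallons per MW per year.

GALLONS_PER_MW_YEAR = 1.5e6
LITRES_PER_GALLON = 3.785   # US gallon to litres
campus_mw = 100             # hypothetical campus size

gallons = campus_mw * GALLONS_PER_MW_YEAR
litres = gallons * LITRES_PER_GALLON
print(f"{gallons / 1e6:.0f}M gallons/year ≈ {litres / 1e9:.2f} billion litres")
```

Over half a billion litres a year for a single campus, often in regions that are already water-stressed.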
The bottleneck isn't technology. It's physics.
What This Means for Founders
Some decisions that matter:
1. The Efficiency Trap: Stop optimising systems that shouldn't exist. If your AI solution requires a 5 MW data centre to save 10 MW of grid power, you're part of the problem; improve your algorithms.
2. The Honest Accounting: Calculate total lifecycle energy: training, inference, cooling, infrastructure. If you can't show 10x efficiency gains, pivot now.
3. The Constraint Advantage: Build for edge computing. Design for intermittent compute. The companies that thrive will be those that learn to operate without massive centralised computing.
4. Complexity Arbitrage: Do not deploy 50-layer neural networks on problems that linear regression could solve. Yes, AI attracts investment, complexity commands premium fees, and few buyers can assess a neural network's actual efficiency and effectiveness. But sooner or later that bet fails, because the price of compute will inevitably rise.
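The honest accounting in point 2 can be sketched as a single ratio: claimed grid savings divided by total lifecycle energy, with cooling overhead folded in via PUE (power usage effectiveness). All numbers below are hypothetical placeholders, not data from any company named above:

```python
# Sketch of point 2's lifecycle accounting: claimed savings vs total energy
# consumed (training + inference, scaled by cooling overhead via PUE).
# Every figure here is a hypothetical placeholder for illustration.

def lifecycle_ratio(training_mwh, inference_mwh_per_year, years, pue,
                    grid_savings_mwh_per_year):
    """Claimed grid savings divided by total facility energy over the horizon."""
    consumed = (training_mwh + inference_mwh_per_year * years) * pue
    saved = grid_savings_mwh_per_year * years
    return saved / consumed

# Hypothetical deployment: 500 MWh of training, 200 MWh/yr of inference,
# PUE of 1.4, claiming 12,000 MWh/yr of grid savings over a 5-year life.
ratio = lifecycle_ratio(500, 200, 5, 1.4, 12_000)
print(f"Efficiency ratio: {ratio:.1f}x")  # point 2's bar: pivot if below 10x
```

If that ratio doesn't clear 10x with honest inputs, the product is consuming the transition it claims to enable.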
The Path Forward
Abdulmunim Al Kindy, ADIPEC's chairman, called for "an intelligent and pragmatic approach that embraces all viable sources and technologies" [15]. Here's pragmatic:
Stop pretending AI will save us from AI's energy consumption.
The winners at ADIPEC weren't the companies making the biggest AI announcements. They were the ones honest about the trade-offs:
Enlitia's approach: Running inference locally on wind turbines, eliminating 90% of data transmission
Clean Connect AI: Triggers sampling only when anomalies are detected, cutting compute by 85%
Gecko Robotics: Physical inspection drones that gather data without continuous processing
The Bottom Line
ADIPEC 2025 mobilised $46 billion for energy infrastructure, much of which will be consumed by the very AI systems meant to optimise it.
And so, the energy sector faces a choice: Continue the unstructured AI arms race that's driving us toward 1.7 gigatons of additional emissions by 2030, or acknowledge that burning coal to optimise solar farms isn't true innovation; it's merely a means to sell more infrastructure.
It's the energy equivalent of high-frequency trading: individually rational, collectively catastrophic.
CurrentWorks: We do actual intelligence. Not artificial. You do the building.
Founders (Pre-Seed to Series A): Join our January cohort for 12 weeks. We help you build and grow your business with energy-focused intelligence, tools, and support, so you leave with a sustainable value proposition and a clear commercialisation pathway. [Apply → here]
Everyone else: Please leave a comment with your views. I can only improve if you give me some feedback.
Photo of the Abu Dhabi National Library and Archives
References
ADIPEC, "ADIPEC 2025 Conference Report," Abu Dhabi International Petroleum Exhibition & Conference, Abu Dhabi, UAE, Nov. 4-7, 2025.
Dominion Energy, "Integrated Resource Plan," Richmond, VA, USA, Tech. Rep. Q3-2024, Oct. 2024.
International Energy Agency, "Electricity 2025: Analysis and Forecast to 2030," IEA, Paris, France, Jan. 2025.
Royal Dutch Shell, "Quantum Computing Applications in Carbon Capture," presented at ADIPEC 2025, Abu Dhabi, UAE, Nov. 5, 2025.
TotalEnergies, "Continental Grid AI Performance Metrics Q3 2025," Paris, France, Oct. 2025.
Saudi Aramco, "Autonomous Drilling Operations Report," presented at ADIPEC 2025, Abu Dhabi, UAE, Nov. 6, 2025.
International Energy Agency, "Data Centres and AI: Energy Implications," Paris, France, Apr. 2025, p. 47, footnote 47.
S. A. Al Jaber, "Opening Keynote Address," ADIPEC 2025, Abu Dhabi, UAE, Nov. 4, 2025.
Google, "2024 Environmental Report," Mountain View, CA, USA, Mar. 2025.
Microsoft Corporation, "Infrastructure Roadmap 2025-2027," Redmond, WA, USA, Internal Report, Sept. 2025.
OpenAI, "Project Stargate Infrastructure Requirements," San Francisco, CA, USA, Oct. 2025.
Argonne National Laboratory, "AI in Predictive Maintenance: Energy Cost Analysis," Lemont, IL, USA, Tech. Rep. ANL-25/03, Mar. 2025.
PJM Interconnection, "2025/2026 Base Residual Auction Results," Valley Forge, PA, USA, July 2025.
Microsoft Corporation, "Carbon Capture Platform Energy Analysis," Redmond, WA, USA, Q2 2025.
A. Al Kindy, "Chairman's Address," ADIPEC 2025, Abu Dhabi, UAE, Nov. 4, 2025.