Artificial intelligence has become the growth engine of the tech sector, but its rapid expansion comes with a steep environmental bill. A new industry-wide forecast of electricity and water use in hyperscale data centres suggests that—even with aggressive efficiency gains—most AI-heavy companies will struggle to meet their self-imposed net-zero targets for 2030.
The Promise of Net Zero by 2030
Over the past three years, virtually every major cloud or AI service provider has announced a net-zero commitment. The pledges typically include:
- Full decarbonisation of on-site operations.
- 100 per cent renewable electricity procurement.
- Elimination of residual emissions through offsets after 2030.
These declarations have attracted investor confidence and public goodwill, yet few firms have disclosed concrete, year-by-year pathways that reconcile rising AI demand with declining emissions.
How AI Workloads Inflate Energy Demand
AI workloads differ from traditional cloud applications in two critical ways:
- Training phase: Complex models such as large language models (LLMs) can require thousands of GPUs running for weeks. Training a single state-of-the-art model today can consume upwards of 1 GWh, roughly the annual electricity use of 100 U.S. homes (a back-of-envelope version of this arithmetic appears after this list).
- Inference phase: After deployment, the model continues to draw power every time a user prompts it. Widespread adoption therefore keeps energy demand permanently elevated.
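The training figure above can be sanity-checked with simple arithmetic. The sketch below is a back-of-envelope estimate only; the GPU count, per-device power draw, and training duration are assumed values, not numbers taken from any specific model or from the forecast.

```python
# Back-of-envelope training energy estimate. All inputs are illustrative
# assumptions, not figures from the forecast.
gpu_count = 2_000            # accelerators in the training cluster (assumed)
power_per_gpu_kw = 0.5       # average draw per GPU incl. cooling overhead (assumed)
training_days = 42           # wall-clock training duration (assumed)

energy_kwh = gpu_count * power_per_gpu_kw * training_days * 24
energy_gwh = energy_kwh / 1_000_000

us_home_kwh_per_year = 10_500          # rough average annual U.S. household use
home_equivalents = energy_kwh / us_home_kwh_per_year

print(f"Training energy: {energy_gwh:.1f} GWh "
      f"(~{home_equivalents:.0f} U.S. homes for a year)")
```

With these assumptions the run lands at roughly 1 GWh, or close to 100 households' annual use; larger clusters or longer runs scale the figure linearly.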
Headline Numbers From the Latest Forecast
The new forecast—compiled by independent researchers using public disclosures, grid data, and machine-learning growth trends—projects the following by 2030:
- Electricity demand: AI data centres could require between 85 TWh and 134 TWh annually, a 3- to 5-fold increase over 2023 consumption.
- Water use: Cooling systems may draw 4.2–6.6 million cubic metres of water per day—roughly the daily needs of a city the size of London.
- Carbon gap: Even if companies procure enough renewable energy certificates (RECs), hourly mismatches between supply and demand leave an estimated 40–60 Mt of residual CO₂e in 2030.
These figures illustrate a widening disparity between ambition and projected reality.
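A quick conversion puts the electricity range in context. The sketch below uses only the figures quoted above; the implied 2023 baseline is derived purely from the stated 3- to 5-fold multiple and is illustrative.

```python
# Rough conversions of the headline electricity forecast (illustrative only).
low_twh, high_twh = 85, 134           # projected 2030 AI data-centre demand
hours_per_year = 8_760

# Average continuous power draw implied by the annual figures.
low_gw = low_twh * 1_000 / hours_per_year
high_gw = high_twh * 1_000 / hours_per_year
print(f"Implied average load: {low_gw:.1f}-{high_gw:.1f} GW running around the clock")

# The 3x-5x multiple implies a 2023 baseline somewhere in this band.
baseline_low = low_twh / 5
baseline_high = high_twh / 3
print(f"Implied 2023 baseline: roughly {baseline_low:.0f}-{baseline_high:.0f} TWh")
```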
Barriers to Closing the Gap
1. Renewable Capacity Shortfalls
Solar and wind installations are growing, but not at the pace needed to serve both general grid decarbonisation and the extra AI load. Bottlenecks in grid-interconnection queues and transmission approvals further impede progress.
2. Hardware Efficiency Limits
GPUs and specialised AI accelerators improve with every generation, yet generation-over-generation performance gains have slowed to single-digit percentages. Meanwhile, model sizes often double within 6–12 months, outpacing hardware advances.
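The mismatch compounds quickly. The sketch below is illustrative only: the 8 per cent annual efficiency gain and the nine-month doubling time are assumptions chosen to match the ranges described above, not measured values.

```python
# Illustrative comparison of compounding growth rates over four years.
# The 8% yearly efficiency gain and 9-month doubling time are assumptions.
years = 4
hw_gain_per_year = 1.08               # single-digit hardware efficiency improvement
model_doubling_months = 9             # model compute doubles roughly this often

hw_factor = hw_gain_per_year ** years
model_factor = 2 ** (years * 12 / model_doubling_months)
energy_factor = model_factor / hw_factor

print(f"Hardware efficiency after {years} years: {hw_factor:.1f}x")
print(f"Model compute after {years} years:       {model_factor:.0f}x")
print(f"Net energy per model:                    ~{energy_factor:.0f}x higher")
```

Under these assumptions, compute demand grows roughly 40-fold while hardware efficiency improves by barely a third, leaving energy per model around 30 times higher.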
3. Water-Intensive Cooling
Most hyperscale facilities rely on evaporative cooling, which is efficient but water-hungry. Alternatives such as liquid immersion or refrigerant-based systems reduce water use but raise capital and operational costs.
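For a sense of scale, the sketch below estimates daily water draw from an assumed facility size and an assumed water-usage effectiveness (WUE) typical of evaporative cooling; neither number comes from the forecast.

```python
# Rough sense of cooling water volumes. Facility size and WUE are assumed,
# illustrative values, not figures from the forecast.
facility_mw = 100                 # average IT load (assumed)
wue_litres_per_kwh = 1.8          # assumed WUE for evaporative cooling

daily_kwh = facility_mw * 1_000 * 24
daily_m3 = daily_kwh * wue_litres_per_kwh / 1_000
print(f"~{daily_m3:,.0f} m³ of water per day for a {facility_mw} MW facility")
```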
4. Temporal Mismatch of Renewables
Data centres require constant power, but solar output falls to zero at night, when inference demand continues, and large-scale storage remains expensive. Hourly accounting methods, now favoured by regulators, expose the real emissions that traditional annual REC strategies mask.
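The difference between the two accounting methods is easy to see with a toy example. The sketch below uses a synthetic one-day profile: a flat load, solar-only procurement, and an assumed grid intensity.

```python
# Minimal sketch of annual (volumetric) vs hourly carbon accounting for one
# day of data-centre load. All numbers are synthetic and illustrative.
load_mwh = [100] * 24                       # flat 100 MWh load each hour
# Solar-heavy procurement: generation concentrated in daylight hours.
solar_mwh = [0]*7 + [150, 250, 300, 320, 330, 330, 320, 300, 250, 150] + [0]*7
grid_intensity_t_per_mwh = 0.4              # assumed fossil-heavy grid mix

# Volumetric matching: total renewables cover total load, so reported
# residual emissions are zero.
volumetrically_matched = sum(solar_mwh) >= sum(load_mwh)

# Hourly matching: any hour where load exceeds renewable supply falls back
# on the grid and incurs emissions.
residual_mwh = sum(max(l - s, 0) for l, s in zip(load_mwh, solar_mwh))
residual_t_co2 = residual_mwh * grid_intensity_t_per_mwh

print(f"Matched over the whole day:        {volumetrically_matched}")
print(f"Unmatched load on an hourly basis: {residual_mwh} MWh")
print(f"Residual emissions:                {residual_t_co2:.0f} t CO2e")
```

On paper the day is fully matched, yet on an hourly basis more than half the load is served by the grid, which is exactly the gap annual REC strategies hide.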
The Hidden Environmental Cost of Model Training
Beyond electricity and water, training infrastructure has an embedded carbon footprint from manufacturing semiconductors, building servers, and constructing facilities. Life-cycle analyses estimate that embodied emissions can account for 15–25 per cent of a data centre’s total climate impact over its first decade.
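To see what a 15–25 per cent share implies, the sketch below backs out embodied emissions from an assumed decade of operational emissions; the 2 Mt operational figure is purely illustrative.

```python
# Illustrative split of a facility's first-decade climate impact into
# embodied and operational emissions, using the 15-25% share quoted above.
operational_mt = 2.0                      # Mt CO2e over the first decade (assumed)

for embodied_share in (0.15, 0.25):
    # embodied / (embodied + operational) = share  ->  solve for embodied
    embodied_mt = operational_mt * embodied_share / (1 - embodied_share)
    total_mt = operational_mt + embodied_mt
    print(f"{embodied_share:.0%} embodied share -> "
          f"{embodied_mt:.2f} Mt embodied, {total_mt:.2f} Mt total")
```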
What Would It Take to Get Back on Track?
Efficiency-First Development
Refactoring algorithms, optimising code, and implementing sparsity techniques can cut training energy by up to 40 per cent without sacrificing accuracy.
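One example of a sparsity technique is magnitude pruning, shown in the minimal sketch below using PyTorch's built-in pruning utilities. Note that zeroing 40 per cent of weights does not by itself cut energy use by 40 per cent; realised savings depend on sparse-aware kernels and hardware support.

```python
# Minimal sketch of one sparsity technique: L1 magnitude pruning with
# torch.nn.utils.prune. The layer size and pruning amount are arbitrary.
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

layer = nn.Linear(4096, 4096)                              # stand-in for a model layer
prune.l1_unstructured(layer, name="weight", amount=0.4)    # zero the smallest 40% of weights
prune.remove(layer, "weight")                              # bake the mask into the weights

sparsity = (layer.weight == 0).float().mean().item()
print(f"Weight sparsity after pruning: {sparsity:.0%}")
```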
Grid-Connected Storage & Demand Shifting
Large-scale battery deployment and time-shifted workloads could align compute demand with periods of renewable oversupply, reducing reliance on fossil-fuel peaker plants.
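A carbon-aware scheduler can be surprisingly simple. The sketch below greedily places deferrable batch jobs into the hours with the largest forecast renewable surplus; the surplus profile, job count, and capacity cap are all assumed values.

```python
# Minimal sketch of carbon-aware demand shifting: deferrable batch jobs are
# greedily assigned to the hours with the most forecast renewable surplus.
# The forecast values and capacity limit are synthetic.
renewable_surplus_mwh = [5, 2, 0, 0, 1, 3, 8, 20, 35, 40, 42, 41,
                         38, 30, 22, 12, 6, 2, 0, 0, 0, 1, 3, 4]
deferrable_jobs = 8            # number of 1-hour batch jobs to place today
max_jobs_per_hour = 2          # rack/power capacity limit (assumed)

# Rank hours by surplus, then fill greedily subject to the capacity cap.
ranked_hours = sorted(range(24), key=lambda h: renewable_surplus_mwh[h], reverse=True)
schedule = {}
for hour in ranked_hours:
    if deferrable_jobs == 0:
        break
    take = min(max_jobs_per_hour, deferrable_jobs)
    schedule[hour] = take
    deferrable_jobs -= take

print("Jobs per hour:", dict(sorted(schedule.items())))
```

In this toy profile all eight jobs land in the midday hours, when solar surplus peaks, rather than running overnight on grid power.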
Water-Free Cooling Technologies
Adoption of closed-loop liquid cooling or direct-to-chip systems can eliminate nearly all evaporative water loss, albeit with higher upfront costs.
Transparent Reporting
Mandatory, standardised disclosure of hourly energy use and water withdrawals would enable stakeholders to verify progress and apply pressure where companies fall short.
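As an illustration of what such a disclosure might contain, the sketch below defines a hypothetical hourly record; the field names are assumptions, not an existing reporting standard.

```python
# Hypothetical hourly disclosure record; field names are illustrative only.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class HourlyDisclosure:
    facility_id: str
    hour_start_utc: datetime
    electricity_mwh: float            # metered consumption for the hour
    matched_clean_mwh: float          # clean supply matched to this hour
    grid_intensity_kg_per_mwh: float  # emissions factor of the local grid
    water_withdrawal_m3: float

    @property
    def residual_kg_co2e(self) -> float:
        unmatched = max(self.electricity_mwh - self.matched_clean_mwh, 0.0)
        return unmatched * self.grid_intensity_kg_per_mwh

record = HourlyDisclosure("dc-eu-01", datetime(2030, 6, 1, 13), 120.0, 90.0, 350.0, 210.0)
print(f"Residual emissions this hour: {record.residual_kg_co2e:,.0f} kg CO2e")
```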
Policy Levers on the Horizon
Governments are starting to react. Proposed measures include:
- Performance-based energy codes for new data centres.
- Tiered water pricing in drought-prone regions.
- Clean-energy purchasing mandates tied to actual hourly generation.
Failure to comply could result in significant fines, restricted expansion permits, or public-market penalties.
Conclusion: Ambition vs. Physics
AI’s transformative promise does not exempt it from the laws of thermodynamics. Unless companies couple technological innovation with radical transparency and aggressive procurement of verifiably clean energy, the net-zero goals loudly proclaimed for 2030 will remain aspirational slogans rather than achievable targets.