The data center industry lies at the core of the AI revolution, playing a crucial role in delivering the infrastructure needed to fuel its transformative power. The boom in generative AI, cloud adoption, and edge computing is transforming the power demands placed on data centers, driving significant transitions across the supporting ecosystem, primarily the grid and utilities.
Estimates from the International Energy Agency (IEA) put data center electricity consumption at a record of approximately 460 TWh in 2022, a figure anticipated to double by 2026, potentially exceeding 1,000 TWh, owing to the surge in data center development across the globe. That projected consumption approaches the total annual electricity demand of Japan, underscoring the urgency of adopting innovative energy strategies and sustainable solutions to support this exponential growth while safeguarding the reliability and resilience of power systems.
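A quick back-of-envelope check puts these IEA figures in perspective: growing from roughly 460 TWh in 2022 to roughly 1,000 TWh by 2026 implies a compound annual growth rate of about 21% per year. A minimal sketch of that arithmetic (the function name is illustrative):

```python
def implied_cagr(start_twh: float, end_twh: float, years: int) -> float:
    """Compound annual growth rate implied by two consumption levels."""
    return (end_twh / start_twh) ** (1 / years) - 1

# ~460 TWh (2022) doubling to ~1,000 TWh (2026), per the IEA estimates above
cagr = implied_cagr(460, 1_000, 2026 - 2022)
print(f"Implied growth rate: {cagr:.1%} per year")  # roughly 21% per year
```

Sustaining a growth rate of that magnitude for a sector already consuming country-scale amounts of electricity is what makes the planning problem so acute.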
Are Utilities Ready for the Massive Energy Demands of AI and Hyperscale Data Centers?
Advancements in the development and adoption of AI models such as ChatGPT, Grok, Gemini, and many others are reshaping the digital landscape across the globe. These models are far more energy-intensive than past data center applications, demanding up to ten times the electricity of a traditional Google search. This places significant strain on electrical and water infrastructure and requires utilities and Independent Power Producers (IPPs) to serve hyperscale customers with:
- Custom grid balancing solutions for large loads
- Voltage stabilization systems to avoid disruptions
- Flexible infrastructure planning to support next-gen data centers
AI chips, now widely deployed in modern data centers, are the major contributors to the surge in power consumption, as they require vast energy to process large-scale data in real time. AI-driven data centers, primarily powered by graphics processing units (GPUs), now consume more electricity than entire nations such as South Africa and Indonesia. For instance, a single query to OpenAI's ChatGPT consumes 2.9 Wh of electricity, nearly 10 times that of a Google search.
| Category | Power Consumption | Description |
| --- | --- | --- |
| ChatGPT query | 2.9 Wh | ~10x a general Google search (0.3 Wh) |
| AI data centers | ~260 TWh in 2024 | Comparable to South Africa (210 TWh) or Indonesia (270 TWh) |
| AI data centers | ~500 TWh in 2027 | Comparable to France (470 TWh) or Germany (500 TWh) |
Source: World Economic Forum
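The per-query figures above compound quickly at scale. A hedged illustration: the 2.9 Wh and 0.3 Wh values come from the table, but the one-billion-queries-per-day volume is an assumed round number for illustration, not a reported statistic.

```python
CHATGPT_WH_PER_QUERY = 2.9   # Wh per query, per the table above
GOOGLE_WH_PER_QUERY = 0.3    # Wh per general Google search

# Ratio behind the "~10x" claim
ratio = CHATGPT_WH_PER_QUERY / GOOGLE_WH_PER_QUERY

# Hypothetical scale-up: one billion AI queries per day for a year
queries_per_day = 1_000_000_000
annual_gwh = CHATGPT_WH_PER_QUERY * queries_per_day * 365 / 1e9  # Wh -> GWh

print(f"ratio: {ratio:.1f}x, annual energy: {annual_gwh:,.0f} GWh")
```

Under these assumptions, query serving alone would approach a terawatt-hour per year, before counting model training or cooling overhead.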
This massive power demand has raised environmental concerns, emphasizing the need for sustainable industry practices. The manufacturing sector alone accounts for over 40% of global power consumption, making it a focal point for energy efficiency improvements.
Governments across the globe are implementing strategic measures to control data center power consumption and adopting renewable energy generation frameworks to meet the heightened demand arising from the AI data center boom. For instance, Singapore introduced regulations limiting data center capacity due to energy shortages. The country operates more than 70 data centers, accounting for 60% of Southeast Asia's total data center capacity, and ceased new approvals between 2019 and 2022, with projects under review subject to capacity restraints.
AI leaders have acknowledged the challenge. At the 2024 World Economic Forum in Davos, Switzerland, OpenAI CEO Sam Altman stated: “An energy breakthrough is necessary for future artificial intelligence, which will consume vastly more power than people have expected”, signalling that technological innovation, primarily more efficient data centers or new renewable energy solutions, will be needed to support the future growth of AI data centers.
Is Scaling AI Worth the Cost? The Hidden Energy Price of Bigger Models
Back in 2019, AI researcher Rich Sutton sparked a major shift in how the machine learning community approached progress with his essay "The Bitter Lesson." His message was clear: the path to better-performing AI lies in scaling, that is, bigger models, more data, and more compute power. The industry took that idea to heart, and the principle of "bigger is better" has driven the development of ever-larger AI models since.
But this rapid scaling comes at a steep cost: training today's AI models is not just a technical feat but a costly, energy-intensive process. In some cases, the compute needed for a single training run can cost hundreds of millions of dollars and consume thousands of megawatt-hours of electricity. The table below illustrates just how massive these energy demands can be.
Energy Consumption of Large-Scale AI Models - 2024
| Model Name | Number of Parameters | Energy Consumption |
| --- | --- | --- |
| GPT-3 | 175 billion | 1,287 MWh |
| Gopher | 280 billion | 1,066 MWh |
| OPT | 175 billion | 324 MWh |
| BLOOM | 176 billion | 433 MWh |
Source: Cornell Tech
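Training-energy figures like those above are often translated into approximate carbon footprints. A minimal sketch of that conversion, assuming a grid carbon intensity of 0.4 tCO2e per MWh (roughly a global-average figure; this intensity is an assumption for illustration, not a number from the table, and actual footprints depend heavily on the local grid mix):

```python
# Assumption: ~400 gCO2e/kWh (0.4 tCO2e/MWh), an illustrative global average
GRID_TCO2_PER_MWH = 0.4

# Training energy per model, in MWh, from the table above
training_energy_mwh = {
    "GPT-3": 1_287,
    "Gopher": 1_066,
    "OPT": 324,
    "BLOOM": 433,
}

# Approximate footprint in tonnes CO2e under the assumed grid intensity
footprints = {m: mwh * GRID_TCO2_PER_MWH for m, mwh in training_energy_mwh.items()}

for model, tco2 in footprints.items():
    print(f"{model}: ~{tco2:,.0f} tCO2e")
```

The spread between OPT and GPT-3, despite identical parameter counts, shows why energy use is driven by training recipe and hardware efficiency, not parameter count alone.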
How Will This Impact the Growth of Data Centers?
The intersection of AI, cloud, and power demand is no longer a future concern; it is today's reality. Constrained power resources for energy-hungry data center facilities are a major bottleneck in the expansion of data centers across the globe. Economies, together with industry players, are adopting innovative power generation methods to transition from conventional generation toward more sustainable, greener sources. The shift from traditional utility reliance to a power-forward mindset will allow data centers to scale confidently, cost-effectively, and sustainably.

Industry veterans predict that nuclear power will become the prime energy source for tackling the AI boom, as is evident from strategic developments undertaken by industry stakeholders: Microsoft's plan to restart Three Mile Island, Sam Altman's investments in Helion, a startup working to develop and commercialize nuclear fusion, and Google's agreement with Kairos Power to purchase nuclear energy from small modular reactors (SMRs) for its data centers. These are notable instances of AI data centers slowly moving away from conventional utility dependency toward self-sustained, next-generation energy ecosystems that are resilient, scalable, and carbon-conscious.