Investing in AI's Growing Energy Needs

20 Oct, 2025
 

The rise of AI is fuelling a structural transformation in global energy demand—creating powerful, long-term investment opportunities across utilities, infrastructure, and the broader power ecosystem.

Since 2023, Jennison has published insights exploring investment opportunities across the AI value chain—from semiconductors and cloud infrastructure to software applications. A fast-emerging theme is the surging demand for electric power driven by artificial intelligence. As AI models scale, so too does the energy required to train and run them, creating ripple effects across utilities, data centres, and infrastructure providers.

 

Powering the Intelligence Economy

The rapid deployment of generative AI has triggered a global race to build high-density data centres—facilities that consume significantly more power than those designed for traditional workloads. The International Energy Agency (IEA) projects that global data centre electricity consumption will double by 2030 (Exhibit 1).

Exhibit 1: Data Centre Electricity Consumption is Expected to Soar
Source: International Energy Agency (IEA), Energy and AI, April 2025

Similarly, Goldman Sachs research projects that global power demand from data centres will increase by 50% by 2027 and by as much as 165% by the end of the decade (compared with 2023).1

 

Why Does AI Require So Much Power?

AI workloads are far more power-hungry than traditional computing tasks. The growing use of artificial intelligence—particularly large language models (LLMs)—requires massive amounts of electricity due to the intensity of both training and inference workloads. As a result, hyperscalers, data centre operators, and asset managers are committing significant capital to construct larger, high-capacity, next-generation data centres designed to meet the growing performance and power requirements of AI workloads. Ten years ago, a 30 megawatt (MW) data centre was considered large; today, a 200 MW data centre is considered normal, and several hyperscalers are planning AI data centre campuses with power demands of 1 gigawatt (GW) or more.

For perspective, McKinsey estimates that 18 GW of additional power capacity will be needed to serve U.S. data centres by 2030.

For comparison, the total power demand of New York City is currently around 6 GW. In other words, to meet the growing power demands of AI, the U.S. is expected to add the equivalent of “three New York Cities” to its power grid by 2030.2

 

Is Demand for AI Sustainable?

In our view, AI represents a generational paradigm shift in how consumers and enterprises interact with and use computing services. For enterprises, AI offers enhanced efficiency, superior execution, strategic differentiation, and deeper insights. For consumers, AI provides instantaneous access to information, personalised content experiences, and advanced problem-solving capabilities. 

The latest AI models—known as inference-time scaling or reasoning models—have the potential to deliver these capabilities at new levels of efficiency and effectiveness. These models can reflect, reassess, and revise answers, making them far more sophisticated and capable of handling complex, real-world tasks. From an energy perspective, these reasoning models require significantly more compute power, as they engage in longer, more resource-intensive inference cycles. As these models become the standard for AI interactions, they are expected to meaningfully accelerate demand for power and infrastructure.

 

What About DeepSeek?

The launch of DeepSeek R1, a generative AI model from a Chinese startup, challenged assumptions about China's competitiveness in AI by matching top-tier U.S. models in performance while operating on less powerful—and less expensive—hardware. While DeepSeek's performance relative to its cost is impressive, the company's claimed training cost advantages can be misleading, as they are not directly comparable to those of models developed by leading U.S. companies.

Nonetheless, as efficiency improves, we believe that AI will become more affordable and accessible, accelerating adoption across consumers, enterprises, and the broader tech ecosystem. This dynamic also illustrates the Jevons Paradox—the idea that as technological efficiency increases, total consumption can rise rather than fall—suggesting that lower AI costs may ultimately drive greater demand for compute and power, not less.

 

The Opportunity Set Is Much Broader than Nuclear Energy

The power demands of AI are creating a wide and expanding opportunity set for investors. While nuclear energy often grabs headlines, the infrastructure required to support AI extends far beyond nuclear power generation alone. Utilities are already aligning capital investments with tech-driven demand. New solar, wind, and natural gas-powered generation, some of it built alongside data centres, is expected to play an even larger combined role than nuclear in meeting the growing power demands of AI. Utilities are also investing in modernising their transmission and distribution (T&D) grids to leverage underutilised generation capacity and ensure grid stability, especially for AI data centres, which typically require very high levels of reliability.

But meeting AI’s electricity needs also requires a broader ecosystem: data centres depend heavily on advanced HVAC systems to manage heat from high-density compute workloads, creating opportunities for companies specialising in cooling technologies. Additionally, natural gas is expected to play a key role in bridging near-term energy needs due to the scalability and reliability of natural gas-fired power plants (Exhibit 2). As AI’s growth accelerates, a diverse set of energy and infrastructure providers—large and small—stands to benefit from this structural shift.

Exhibit 2: Data Centre Capacity Additions Need Natural Gas
Source: Wells Fargo Equity Research – AI Power Surge: Gas Pipeline and Data Centre Project Tracker, 9/12/24

Conclusion

As AI adoption accelerates, so too does its demand for electricity—reshaping the landscape of global infrastructure and opening a broad array of investment opportunities. While nuclear power plays a role, the buildout required to support AI touches everything from natural gas, renewable electricity, and grid expansion to data centre cooling and high-efficiency HVAC systems. This shift is not cyclical but structural, driven by a new generation of AI models that require more compute, more power, and more infrastructure. For long-term investors, the rise of AI represents not just a technological revolution—but a foundational transformation of the global energy economy.

1 Source: Goldman Sachs

2 Source: McKinsey, Investing in the rising data center economy, January 17, 2023.


4833727