DeepSeek, Artificial Intelligence, And Power Markets - What Does It All Mean?

It seems that nearly every day in the past year, there’s been yet another announcement of a large data center investment that will be hooked up to the grid. And utilities across the country have made some eye-popping announcements about the amount of load they may eventually serve.


Utility Announcements and Forecasts:

Last year, Texas utility Oncor announced it had received 59,000 MW of new data center interconnection requests. [1] Meanwhile, AEP reported 63,000 MW in the interconnection queue for its four utilities, including almost 18,000 MW deemed advanced enough to be included in its recently developed 2030 load forecast. [2]

For its part, Dominion Energy, whose service territory includes the reigning data center capital of the world (it currently serves about 3,500 MW of data center load), foresees an additional 6,900 MW of data center demand by 2030 [4] and 17,200 MW by 2045. [5]

Not every service territory is under such pressure. Texas gains favor by virtue of its relatively easy permitting process and available power (although that may be changing). AEP (especially AEP Ohio) attracts large loads in part because of its accessible high-voltage power infrastructure. And Dominion’s service territory sits astride the highest concentration of high-speed fiber anywhere in the world, making it an attractive location. [7] However, many other utilities are facing similar issues. For example, Entergy is planning to build three new gas turbines with 2,260 MW of capacity, much of which will be used by Meta’s recently announced $10 billion data center in Louisiana. [8] Arizona Public Service may host a data center complex of more than 1,000 MW. Duke is in negotiations for 2,000 MW of data center load. And the list goes on.


What Are These Companies Doing With All This Power?

Some of the expected future load is for normal data center processing dedicated to the services we traditionally use. However, a growing portion is employed to train AI large language models (LLMs) that process staggering amounts of publicly available data (this blog will likely be ingested by AI models before you ever read it). Powerful graphics processing unit (GPU) chips, like those used in video games, tear through that data and make connections, essentially teaching themselves complex logic. The ultimate goal is to create what is known as “generative AI.”

When queried, Perplexity.AI defined generative AI as “a type of artificial intelligence that creates original content such as text, images, audio, video, and code in response to user prompts…” Perplexity.AI went on to say that it “works by:

  1. Learning from extensive datasets of human-created content.

  2. Identifying patterns and structures within the training data.

  3. Using these learned patterns to generate new, original content based on user input.” [9]
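
To make that three-step loop concrete, here is a deliberately tiny sketch in Python - a character-level bigram model, not anything Perplexity.AI actually runs. It “learns” which character tends to follow which in a training string, then “generates” new text by sampling from those learned patterns. Production LLMs do the same thing at vastly larger scale, predicting tokens with billions of parameters on GPUs.

```python
import random
from collections import defaultdict

# Steps 1-2: "learn" by counting which character follows which
# in a (very small) corpus of human-created text.
corpus = "the grid powers the data centers that train the models"
counts = defaultdict(lambda: defaultdict(int))
for current, nxt in zip(corpus, corpus[1:]):
    counts[current][nxt] += 1

# Step 3: generate new text by sampling each next character in
# proportion to how often it followed the current one in training.
def generate(seed: str, length: int = 40) -> str:
    out = seed
    for _ in range(length):
        followers = counts.get(out[-1])
        if not followers:  # dead end: no observed successor
            break
        chars, weights = zip(*followers.items())
        out += random.choices(chars, weights=weights)[0]
    return out

print(generate("t"))  # new text that mimics the training data's patterns
```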

The hardware and chips that do all of that are not cheap, and they are not frugal with energy. The most up-to-date Blackwell chip from Nvidia costs over $30,000 [10] and draws 1.2 kilowatts of power. [11] For comparison, that is roughly what the average U.S. household would draw from the grid if its annual power consumption were spread evenly across all hours of the year.
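
That comparison survives a back-of-the-envelope check, assuming an average U.S. household uses roughly 10,500 kWh per year (close to the figure EIA reports):

```python
# Back-of-the-envelope: a household's average continuous draw,
# assuming ~10,500 kWh of annual consumption (roughly the EIA average).
annual_kwh = 10_500
hours_per_year = 365 * 24        # 8,760
avg_draw_kw = annual_kwh / hours_per_year
print(f"{avg_draw_kw:.2f} kW")   # ~1.20 kW, on par with one Blackwell chip
```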

Companies and countries (there is a heated geopolitical tussle between the U.S. and China) are racing to become the dominant players in the space. That race fuels the huge and rapidly growing demand for more electricity.


An Uninvited Guest At The Data Center Party:

On Monday, January 27, earthshaking news hit the AI world: Chinese AI company DeepSeek announced it had created a powerful model, called R1, for about $6 million - roughly a tenth the cost of comparable LLMs. DeepSeek claimed that it had used only a fraction of the chips (and older models, at that) and far less power than competitors; the previous week, its chatbot app had been the most downloaded app in the U.S. [12] [13] That news sent leading chip-maker Nvidia reeling: its stock posted the largest one-day market-value loss in U.S. history, at almost $600 billion. Numerous energy stocks - from nuclear companies to fuel-cell makers - also plummeted on the expectation that far less power might be needed to create our AI future.

Within days, these claims faced heavy scrutiny and considerable skepticism. Forensic work from OpenAI suggested that the DeepSeek model had involved “distillation” - the practice of transferring knowledge from a large model to a smaller, less complex one (one doesn’t need to steal the code, but simply to ask the more complex model enough queries to train the smaller model). In addition, reports began to filter out suggesting that DeepSeek may in fact have spent as much as $500 million on hardware for R1. [14]
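
For readers unfamiliar with the term, below is a minimal sketch of the distillation idea using Python and PyTorch. The model sizes and hyperparameters are illustrative placeholders, not anything specific to DeepSeek or OpenAI. The key point is the information flow: the student never sees the teacher’s code or weights, only the answers it returns.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Illustrative stand-ins: a bigger "teacher" and a smaller "student".
teacher = nn.Sequential(nn.Linear(128, 512), nn.ReLU(), nn.Linear(512, 10))
student = nn.Sequential(nn.Linear(128, 32), nn.ReLU(), nn.Linear(32, 10))

optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)
T = 2.0  # temperature: softens the teacher's output distribution

for step in range(200):
    # "Querying" the teacher: send inputs, keep only its outputs.
    x = torch.randn(64, 128)
    with torch.no_grad():
        teacher_probs = F.softmax(teacher(x) / T, dim=-1)

    # Train the student to match those outputs (KL divergence loss).
    student_log_probs = F.log_softmax(student(x) / T, dim=-1)
    loss = F.kl_div(student_log_probs, teacher_probs,
                    reduction="batchmean") * (T * T)

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```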


AI and Future Power Demand

Given this bombshell and future uncertainties, what does all this mean for those of us who buy power from the grid? While it’s still hard to predict where this may go - it was challenging enough even before DeepSeek - a few things merit consideration.

The data center companies appear committed to plowing ahead, throwing more money, chips, and electricity at developing AI capabilities. Since the DeepSeek announcement, none of the larger players has backed off its financial commitments to build out more computing capability. In recent earnings calls - all occurring after the announcement - the big three (Google, Meta, and Microsoft) announced intentions to spend a combined total of about $230 billion, with much of this devoted to new data centers. [15] The other emerging AI giants, like OpenAI and xAI, will also undoubtedly try to keep up.

Then there’s the question of what it means if DeepSeek’s announcement heralds a lower cost of creating advanced AI capabilities. Almost immediately after the announcement, Microsoft CEO Satya Nadella tweeted, “Jevons paradox strikes again! As AI gets more efficient and accessible, we will see its use skyrocket, turning it into a commodity we just can’t get enough of.” [17] Put another way, if the large language models become less expensive, and perhaps use less electricity in the training process, then the outputs and services they sell could become less expensive.
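
The mechanism is easy to see with a toy calculation (every number below is made up purely for illustration): if efficiency gains cut the energy per AI query by 10x, but cheaper queries drive 20x the usage, total electricity demand still doubles.

```python
# Toy Jevons-paradox arithmetic; all figures are illustrative, not measured.
wh_per_query = 3.0                                    # energy per query today
queries_per_day = 1_000_000_000
baseline_mwh = wh_per_query * queries_per_day / 1e6   # 3,000 MWh/day

# Suppose efficiency improves 10x, but cheaper queries drive 20x usage.
new_wh_per_query = wh_per_query / 10
new_queries_per_day = queries_per_day * 20
new_mwh = new_wh_per_query * new_queries_per_day / 1e6

print(baseline_mwh, new_mwh)   # 3000.0 vs 6000.0: demand doubles anyway
```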

That in turn would likely mean greater use of AI throughout society - more queries in Gemini, ChatGPT, or Perplexity.AI, increased deployment of autonomous cars, and integration of AI throughout numerous commercial activities. And that, in turn, would translate into more electricity use and the likelihood of increased volatility in a supply-constrained market.

Irrespective of what DeepSeek itself presages, data center energy use is going to keep accelerating, putting more pressure on energy supply and power markets. It’s likely going to be a bumpy ride, characterized by more market volatility. That will especially be the case if we don’t quickly get more supply built and connected to the grid - a topic we will address in the very near future.
