Barclays: NVIDIA's AI growth will require significant energy


Barclays analysts say the rise of AI, driven largely by NVIDIA's GPUs, will significantly increase energy demand. By 2030, US data centers could consume more than 9% of the country's electricity.

In a recent thematic investing report, Barclays analysts examined how the rise of artificial intelligence (AI) technologies will increase energy demand, with a focus on NVIDIA's role.

The analysts argue that the projected energy needs of AI are an important factor in NVIDIA's market outlook.

According to Barclays research, data centers could consume more than 9% of all electricity in the United States by 2030.

This is primarily because AI workloads require substantial power. The analysts cited the "AI power baked into the NVIDIA consensus" as a main driver of this large energy forecast.

The report also notes that AI models are growing in size and complexity very quickly, even as efficiency improves with each new generation of GPUs. Major large language models (LLMs), for example, have grown roughly 3.5x per year.

Despite these efficiency gains, the expanding use of AI across industries will push global energy demand higher.

GPUs such as NVIDIA's Hopper and Blackwell series become more power-efficient with each generation. Even so, larger and more complex AI models still demand enormous amounts of compute.

"For real-time performance, large language models (LLMs) need a lot of computing power," the report says. That compute demand translates directly into higher energy use, since more memory, accelerators, and servers are needed to fit, train, and run inference on these models.

Barclays said, “Organisations that want to use LLMs for real-time inference must deal with these problems.”

Barclays estimates that nearly 8 million GPUs will require about 14.5 gigawatts of power, equivalent to roughly 110 terawatt-hours (TWh) of energy per year, assuming an average load factor of 85%. This illustrates the scale of the energy demand.

By the end of 2027, about 70% of these GPUs will be deployed in the U.S., implying over 10 gigawatts of power and 75 TWh of energy for AI in the U.S. alone over the next three years.
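The figures above are consistent with a simple power-to-energy conversion. A minimal sketch of the arithmetic, assuming the article's inputs (14.5 GW of GPU power, 85% average load factor, 70% U.S. share):

```python
# Sketch of the GW-to-TWh conversion implied by Barclays' figures.
# Inputs (from the article): 14.5 GW total GPU power, 85% load factor,
# 70% of GPUs deployed in the U.S. The function is illustrative, not Barclays' model.
HOURS_PER_YEAR = 8760

def annual_energy_twh(power_gw: float, load_factor: float) -> float:
    """Annual energy in TWh for a given power draw and average load factor."""
    return power_gw * load_factor * HOURS_PER_YEAR / 1000  # GWh -> TWh

total_twh = annual_energy_twh(14.5, 0.85)      # ~108 TWh, i.e. roughly the 110 TWh cited
us_power_gw = 0.70 * 14.5                      # ~10.2 GW, matching "over 10 gigawatts"
us_twh = annual_energy_twh(us_power_gw, 0.85)  # ~75.6 TWh, matching the 75 TWh figure

print(f"Total: {total_twh:.1f} TWh/yr; US: {us_power_gw:.2f} GW, {us_twh:.1f} TWh/yr")
```

The rounding in the article (110 vs. ~108 TWh) suggests Barclays quotes figures to the nearest convenient value.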

NVIDIA’s Rising Market Value

Analysts say NVIDIA's market value suggests this is only the beginning of AI-driven power demand.

As the chipmaker continues to develop and ship GPUs, data center energy consumption is expected to rise sharply.

Data centers also depend on electricity from the power grid, which makes managing peak power demand even more important. Because they operate around the clock, data centers require a reliably balanced power supply.

At the World Economic Forum in Davos, OpenAI CEO Sam Altman made a telling remark, quoted in the report: "We do need way more energy in the world than I think we thought we needed before." He added that the energy needs of this technology are still not fully appreciated.


