It has rallied again, but it's scary to think about!

Mondo Entertainment Updated on 2024-03-05

Energy stocks soared today.

The reason: Musk said that AI consumes a lot of energy, that there will be an electricity shortage down the road, and that energy may become AI's future bottleneck.

Then all the big Vs (influencers) in China copied it mindlessly, passing Musk's view off as their own.

Fueled by all these voices, AI-plus-energy and power stocks were all the rage today.

Well, it's just old rice reheated; the same story already made the rounds last year.

At the time, I also felt there was some truth to it.

Because last year I built myself a desktop with a top-end CPU and graphics card, which cost tens of thousands of yuan. The power supply had to go straight to 1000W+, and sure enough the electricity bill went up noticeably. One look at that and you can tell: computing power really does burn electricity.

But I've done the math again today: AI does consume a lot of power, but not that much.

For accuracy and reliability, I pulled together several sets of data:

1) In January 2023, OpenAI's power consumption for the month was reportedly equivalent to the annual electricity consumption of 175,000 Danish households.

I checked: a Danish household uses about 4,000 kWh of electricity a year. On that basis, OpenAI consumed about 700 million kWh in that one month of 2023, or roughly 8.4 billion kWh over a full year.
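To make the conversion explicit, here is a minimal Python sketch of that back-of-envelope calculation (the household count and per-household usage are simply the figures quoted above, not measured data):

```python
# Back-of-envelope check of the Danish-household comparison quoted above.
danish_households = 175_000            # households cited in the report
kwh_per_household_per_year = 4_000     # annual usage of one Danish household, in kWh

openai_kwh_per_month = danish_households * kwh_per_household_per_year  # 700 million kWh
openai_kwh_per_year = openai_kwh_per_month * 12                        # ~8.4 billion kWh

print(f"Monthly: {openai_kwh_per_month / 1e8:.0f} hundred million kWh")
print(f"Yearly:  {openai_kwh_per_year / 1e9:.1f} billion kWh")
```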

2) Google's AI consumes about 2.3 TWh a year, equivalent to the annual electricity consumption of all households in Atlanta.

1 TWh = 10^12 Wh = 10^9 kWh, so Google's AI consumes about 2.3 billion kWh of electricity per year. Owing to differences in compute scale, whether water cooling is counted, and the year of the data, Google's AI consumes less per year than OpenAI, but the two figures are on the same order of magnitude.
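The unit conversion itself is trivial, but for completeness, a sketch using the 2.3 TWh figure quoted above:

```python
# 1 TWh = 1e12 Wh = 1e9 kWh, i.e. 1 billion kWh
google_ai_twh = 2.3                          # annual estimate quoted above
google_ai_kwh = google_ai_twh * 1e12 / 1e3   # Wh -> kWh
print(f"Google AI: {google_ai_kwh / 1e9:.1f} billion kWh per year")  # 2.3
```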

3) According to France's Schneider Electric, the average power draw of global AI workloads in 2023 was about 4.3 GW, slightly less than Cyprus's power consumption in 2021 (4.7 GW).

I calculated it: total global AI energy consumption in 2023 is about 4.3 * 10^6 kW * 24 h * 365 d ≈ 3.77 * 10^10 kWh, i.e. roughly 37.6 billion kWh.
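A quick sketch of that power-to-energy conversion, taking the 4.3 GW Schneider figure as read:

```python
# Average power (GW) -> annual energy (kWh): convert GW to kW, multiply by hours in a year.
ai_load_gw_2023 = 4.3
hours_per_year = 24 * 365

annual_kwh_2023 = ai_load_gw_2023 * 1e6 * hours_per_year   # ~3.77e10 kWh
print(f"2023 global AI: {annual_kwh_2023 / 1e9:.1f} billion kWh")  # ~37.7
```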

OpenAI comes to about 8.4 billion kWh a year, Google's AI about 2.3 billion kWh a year, and global AI about 37.6 billion kWh a year. The three data sets come from different sources and methodologies, but they broadly corroborate one another.

Now let's look at the forecasts:

4) According to a forecast from France's Schneider Electric, the power consumption of global AI workloads will grow at a compound annual growth rate (CAGR) of 26% to 36%, which at the upper end would put AI workloads at about 20 GW by 2028.

Again I ran the numbers: at that upper end, total global AI energy consumption in 2028 would be about 175.2 billion kWh.
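To show where that 2028 figure comes from, here is the compounding worked out in the same sketch style (five years of growth at the quoted 26-36% CAGR, starting from the 4.3 GW 2023 baseline):

```python
base_gw_2023 = 4.3          # Schneider's 2023 estimate
hours_per_year = 24 * 365

for cagr in (0.26, 0.36):
    gw_2028 = base_gw_2023 * (1 + cagr) ** 5            # five years of compound growth
    billion_kwh = gw_2028 * 1e6 * hours_per_year / 1e9  # GW -> kW -> kWh -> billion kWh
    print(f"CAGR {cagr:.0%}: {gw_2028:.1f} GW -> {billion_kwh:.0f} billion kWh in 2028")
```

The 20 GW / 175.2 billion kWh figure corresponds to the 36% upper end; the 26% lower end lands around 120 billion kWh.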

5) I also found research literature in Joule, a top journal in the field of energy.

The study shows that by 2027, the energy consumed by AI could power a country the size of the Netherlands for a year, equivalent to about 85-134 TWh.

After conversion, that means global AI energy consumption in 2027 would be roughly 85 to 134 billion kWh.

This cross-checks against the figures from France's Schneider Electric; Schneider is a bit more optimistic.

To sum up: we have gathered 5 different sets of AI energy consumption data, and after unit conversion they broadly corroborate one another.

And these 5 sets of data show that global AI power consumption in 2023 was under 40 billion kWh, and that by 2028 it will climb to roughly 150 to 200 billion kWh.

What level is this?

Not low, indeed. For reference, Australia generated about 270 billion kWh of electricity in 2022, ranking 19th in the world. So by 2028, global AI energy consumption may be on par with a medium-sized country.

But here's the thing: China is in a different league. China generates as much as 9.1 trillion kWh a year, the most in the world, roughly the combined total of the countries ranked 2nd through 5th, and about 33 times Australia's output.

Given China's size, that's another story:

Even if all of the world's AI moved to China in 2028, it would account for only about 2% of China's power generation.

And considering that China's power generation grows by about 5% a year, a single year's growth alone already exceeds AI's entire projected demand; next to that, AI's power consumption is small change.
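A rough sanity check of the 2% claim and the growth comparison, using only the figures quoted above (9.1 trillion kWh of generation, ~5% annual growth, 150-200 billion kWh of projected AI demand):

```python
china_generation_billion_kwh = 9_100        # ~9.1 trillion kWh = 9,100 billion kWh
annual_growth_rate = 0.05
ai_2028_range_billion_kwh = (150, 200)      # projected global AI consumption in 2028

for ai in ai_2028_range_billion_kwh:
    share = ai / china_generation_billion_kwh
    print(f"AI at {ai} billion kWh = {share:.1%} of China's generation")  # ~1.6% to ~2.2%

# One year of ~5% growth in China's generation alone dwarfs the entire projected AI load.
print(f"One year's growth: ~{china_generation_billion_kwh * annual_growth_rate:.0f} billion kWh")
```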

So a more likely scenario is this: the future bottleneck for AI may indeed be energy, but mainly for countries other than China.

For a medium-sized country, a few AI models could eat up a significant share of its entire electricity generation. That game really can't be played; you don't even have a seat at the table.

But for China, energy is not a problem at all, and the bottleneck is still in chips.

In short, today's hype about AI energy consumption is mostly nonsense. As for Musk's words, take a look and move on; don't take them too seriously. After all, Musk pushed the AI threat theory last year, then turned around and started an AI company. A few years ago he also said lithium would be the oil of the future, and we all know how that turned out.

Put another way, energy stocks now have to lean on AI for their hype. It's scary to think about.

The next grid order points to 26. It's not easy to write substantive posts like this, so please like, view, and support.
