Data Centers' Carbon Time Bomb: How Google Is Using Software to Reduce Emissions

Mondo Technology Updated on 2024-02-26

Imagine how inconvenient and anxious you would feel if your phone, computer, tablet, and other devices suddenly could not reach the Internet. The connectivity of these devices depends on data centers. Data centers are the brains of the Internet: they store, process, and transmit massive amounts of data, and they power services such as search, social media, and gaming.

However, what you may not know is that running a data center is also very energy- and emission-intensive. According to the International Energy Agency, data centers and data transmission networks each account for roughly 1-1.5% of global electricity use. Together they emit an amount of carbon dioxide each year comparable to that of Brazil.

As tech giants race to build massive data centers around the world, the ticking carbon time bomb they create has become a real problem. As more power-hungry artificial intelligence (AI) workloads come online, the energy consumption and carbon emissions of data centers will increase dramatically. So is there a way to make data centers more environmentally friendly and energy-efficient?

Artificial intelligence is one of the hottest technologies today. It allows computers to simulate intelligent human behavior, such as recognizing images, speech, and text; generating content; recommending products; and playing games. AI has penetrated almost every aspect of our lives, bringing us convenience and fun.

However, running artificial intelligence is also very energy-intensive. Much of today's AI is built on large language models, which require massive amounts of data and computing resources to train and deploy. These models typically run on graphics processing units (GPUs) to speed up computation, but GPUs draw far more power than the central processing units (CPUs) used in most other computing. The International Energy Agency estimates that training a single AI model can consume more electricity than 100 homes use in a year.

Energy consumption of AI models

The climate risks posed by AI-driven computing demand are far-reaching, and they will worsen if we do not switch from fossil-fuel electricity to clean energy. Nvidia CEO Jensen Huang has said AI has reached a "tipping point," and that the cost of data centers will double within five years.

Faced with pressure over data center energy consumption and carbon emissions, hyperscalers – the world's largest data center owners, such as Google, Microsoft, and Amazon – have set climate targets and face internal and external pressure to meet them. These lofty goals include decarbonizing their businesses.

Google, for example, has pledged to reach net-zero emissions across its operations and value chain by 2030. Microsoft has pledged to be carbon negative by 2030 and, by 2050, to remove all of the carbon it has emitted since its founding in 1975. Amazon has pledged to reach net-zero carbon across its operations by 2040 and to power them with 100% renewable energy.

However, the rise of artificial intelligence has made these goals harder to reach. AI's energy consumption is volatile, more akin to a zigzag than the smooth line most data center operators are used to. That makes decarbonization a challenge, not to mention keeping the power grid stable.

According to Dave Sterlace, global data center account director at Hitachi Energy, the growth of AI is being driven by North American companies, which has concentrated computing power and energy use there – a trend he did not anticipate two years ago.

To cut data centers' CO2 emissions, hyperscalers and large data center operators have funded large solar and wind farms and used the resulting credits to offset their emissions.

However, this alone is not enough, especially as the use of artificial intelligence grows. That is why operators are turning to the "load shifting" strategy employed by Google, a subsidiary of Alphabet Inc. The idea is to reduce emissions by changing the way data centers operate.

Today, most data centers seek to operate in a "steady state," keeping power demand and supply as balanced as possible to avoid fluctuations and waste. However, this approach is not conducive to cutting carbon emissions, because it cannot take full advantage of renewable energy.

Renewable sources such as solar and wind are clean, but they are also variable: their output depends on the weather and the seasons. If data centers simply swapped renewable energy in for fossil fuels, they would need large amounts of batteries or other energy storage to ride out that variability, which adds cost and complexity.

Google's "load shifting" strategy breaks this "steady state" thinking and makes data center operation more flexible and intelligent, so it can adapt to swings in renewable supply. Concretely, Google uses artificial intelligence and cloud computing to monitor, in real time, the energy consumption and carbon emissions of its data centers around the world, along with the availability of renewable energy. Based on that information, it dynamically adjusts each data center's load, moving computing tasks to wherever energy is cleaner and cheaper, thereby lowering overall energy consumption and carbon emissions.

For example, if a wind farm in one region is generating a lot of electricity while the local data center is lightly loaded, Google can move computing tasks from other regions into that region to take advantage of its low-carbon power. Conversely, if renewable energy is scarce in a region and the local data center is heavily loaded, Google can move that region's computing tasks to data centers elsewhere, reducing its use of high-carbon electricity.
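The shifting logic described above can be sketched in a few lines of code. This is only an illustrative toy, not Google's actual scheduler: the region names, carbon-intensity numbers, and capacities are all invented, and the greedy placement rule is an assumption about how such a system might prioritize clean grids.

```python
# Hedged sketch of carbon-aware load shifting. All names and numbers
# below are illustrative assumptions, not Google's real data or system.

from dataclasses import dataclass

@dataclass
class Region:
    name: str
    carbon_intensity: float  # assumed gCO2 per kWh of local grid power
    capacity: int            # compute slots available in this region
    load: int = 0            # slots currently assigned

def shift_load(tasks: int, regions: list[Region]) -> dict[str, int]:
    """Greedily place movable tasks in the cleanest regions first."""
    placement: dict[str, int] = {}
    # Fill the lowest-carbon grid before dirtier ones.
    for region in sorted(regions, key=lambda r: r.carbon_intensity):
        if tasks == 0:
            break
        take = min(tasks, region.capacity - region.load)
        if take > 0:
            region.load += take
            placement[region.name] = take
            tasks -= take
    if tasks > 0:
        raise RuntimeError(f"{tasks} tasks could not be placed")
    return placement

regions = [
    Region("windy-north", carbon_intensity=50.0, capacity=100),
    Region("sunny-south", carbon_intensity=120.0, capacity=80),
    Region("coal-east", carbon_intensity=700.0, capacity=200),
]
print(shift_load(150, regions))
# → {'windy-north': 100, 'sunny-south': 50}  (coal-east stays idle)
```

A real system would also weigh electricity prices, network latency, and whether a task is actually movable; batch jobs like video transcoding or model training shift easily, while user-facing requests usually cannot.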

This "load shifting" strategy not only reduces a data center's carbon emissions but also its operating costs, because it lets the data center use the cheapest power available at different times and places. And the strategy has little to no impact on the user experience, since Google's cloud platform guarantees data security and service continuity.

Google's "load shifting" strategy is an innovative solution that demonstrates the great potential of artificial intelligence and cloud computing to improve data center energy efficiency and cut emissions. If more data center operators adopt it, the carbon footprint of data centers could fall significantly, an important contribution to global climate goals.
