Dell reveals: Nvidia's B200 will launch next year with power consumption up to 1000W, and the ultimate beneficiary of AI is energy

Mondo Technology · Updated on 2024-03-04

After the "chip shortage," will the artificial intelligence market also face a "power shortage"? Dell, one of the world's largest server manufacturers, has seen its share price more than double over the past 12 months thanks to surging demand for AI servers. After the earnings report, Dell chief operating officer Jeff Clarke revealed in a press release that Nvidia will launch a B200 product on the "Blackwell" architecture in 2025, with power consumption that may reach 1000W. Clarke also said that Dell's flagship PowerEdge XE9680 rack server, which features Nvidia GPUs, is the "fastest" solution in the company's history.

The B200's power consumption is more than 40% higher than the H100's

At present, Nvidia has not revealed details of the Blackwell architecture. From a chip-manufacturing perspective, a basic rule of thumb for heat dissipation is that roughly 1W can be dissipated per square millimeter of die area. Nvidia's H100, built on a custom 4nm-class process, consumes about 700W (including HBM memory power); with a die area of 814 mm², its power density is actually just under 1 W/mm². A 1000W B200 would therefore represent a roughly 40% increase in power consumption over the H100.

According to some institutional analysis, the B200 is likely to be built on a further performance-enhancing process technology, such as a 3nm-class node. And given the power the chip consumes and the heat it must shed, the B100 could be the company's first GPU with a dual-chip design, giving it a larger surface area to dissipate heat. AMD and Intel have reportedly already adopted multi-chip GPU architectures, which may become an industry trend.

Beyond the demands that energy consumption places on chip design, AI and high-performance computing (HPC) applications must also balance the high power required to deliver these FLOPS against the heat released at the same time. FLOPS (floating-point operations per second) is a standard measure of hardware performance. For software developers, what matters is how to use those FLOPS efficiently; for hardware developers, what matters is how to cool the processors that produce them. That is where Dell sees its role with Blackwell processors. Clarke said:
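The rule-of-thumb arithmetic above can be checked with a quick sketch. The 700W, 814 mm², and 1000W figures are the ones quoted in this article, not official specifications:

```python
# Back-of-envelope check of the figures quoted above (assumed values
# from the article, not official Nvidia specs).
h100_power_w = 700    # H100 power incl. HBM memory, per the article
h100_die_mm2 = 814    # H100 die area, per the article
b200_power_w = 1000   # rumored B200 peak power

# Power density: the rule of thumb says ~1 W/mm^2 is dissipatable.
power_density = h100_power_w / h100_die_mm2
print(f"H100 power density: {power_density:.2f} W/mm^2")  # ~0.86, under 1 W/mm^2

# Relative increase of the rumored B200 figure over the H100.
increase = (b200_power_w - h100_power_w) / h100_power_w
print(f"B200 vs H100 power increase: {increase:.0%}")  # ~43%
```

A 1000W B200 at the H100's ~0.86 W/mm² density would need roughly 1160 mm² of silicon, which is above the reticle limit of a single die and is one reason a dual-chip design is plausible.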

"[NVIDIA's next-generation AI and HPC GPUs] will be implemented on next year's B200.

We'll have the opportunity to showcase our engineering and our speed of action, and the work we're doing as an industry leader, putting our expertise to work on liquid cooling at scale: the fluid chemistry and performance work, our interconnect work, the telemetry work we're doing, and the power management work we're doing. That really sets us up to bring it to market at scale and take advantage of this incredible computing power, strength, and capacity that is going to be present in the market."

The B200 did not appear in the technology roadmap Nvidia released last October. Nvidia has yet to announce details about the B100 either, but both are likely to be unveiled at the company's upcoming developer conference later this month.

The ultimate beneficiary of AI is energy

As artificial intelligence technology develops, the market is currently experiencing a surge in demand for chips, but a surge in electricity demand will follow. From an industry perspective, the AI boom has all but reshaped the already hot data center market. By some estimates, the global data center market consumed about 10 gigawatts of electricity a decade ago, while 100 gigawatts is common now, even though AI still accounts for only a small fraction of data center capacity worldwide. According to the Uptime Institute, AI's share of global data center electricity consumption will soar from 2% to 10% by 2025. Some strategists say the development of AI technology is a positive for energy stocks:

"More and more people are starting to realize that large AI server farms will require a lot of energy, which is increasing the interest of some investors in expanding into related energy sectors such as power and oil and gas; nuclear energy is also starting to gain traction."

Musk has also previously expressed concern about the energy outlook. Late last year, he said on a podcast that the United States has a chip shortage now, will face a transformer shortage in a year, and a power shortage in about two years. According to one report, U.S. demand for transformers is currently met mainly by imports; as the transition to a cleaner power system continues to expand, demand for transformers will surge, and without further action the U.S. will face an insurmountable domestic gap by 2030.

This article does not constitute personal investment advice and does not represent the views of the platform. Markets carry risk; invest with caution and make independent judgments and decisions.
