Nvidia's special-edition chip has seen its performance cut by roughly 80%, yet its price is anything but low, and the move has drawn the industry's attention.
The United States keeps changing its chip export rules to restrict shipments of Nvidia's high-end artificial-intelligence chips to the Chinese mainland. Nvidia must be frustrated: it stood to win orders worth billions or even tens of billions of yuan in the Chinese mainland and to make considerable profits.
However, U.S. chip rules have forced Nvidia to change course, adopting a long-term strategy of cutting chip performance and bringing special-specification chips to market. What has drawn the most attention in the industry is that Nvidia's special-specification chip has finally arrived: its performance has been reduced by roughly 80%, yet its price is anything but cheap.
Nvidia's special-edition chip
If you ask what the hottest, most talked-about technology trend is right now, the answer is artificial intelligence. High-tech companies are scrambling to build large models and develop the best AI products. Nvidia holds about 90% of the global AI chip market, putting it squarely in the spotlight with an overwhelming advantage.
Almost all of the high-end AI chips on the market are Nvidia products, and although these products are excellent, Nvidia cannot sell them to China even if it wants to. Under the relevant U.S. regulations, Nvidia's A100, H100, A800, H800 and other high-end AI chips are barred from export.
Does a situation that is increasingly beyond its control mean Nvidia will abandon the Chinese market? Not quite. That is why the H20 series exists: it was developed on the basis of the H100 design, with its computing power cut back yet again to meet the export standards.
As Reuters reported in early February, Nvidia has begun taking pre-orders for its new China-specific AI chip, pricing it in the same range as a rival product from Huawei.
Compare Nvidia's H20 with Huawei's Ascend 910B: in overall performance the H20 sits about 80% below the H100, and in some key metrics such as FP32 it even trails Huawei's product. The competitiveness of such a product in the Chinese market is therefore open to question, which is also reflected in the H20's pricing, reported to start from around US$12,000.
Through dealers it is expected to cost around US$17,000, roughly 120,000 yuan, which puts it in the same range as Huawei's Ascend 910B.
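As a rough sanity check, the quoted dealer price does land in the 910B's range: at about 7.1 yuan to the dollar, US$17,000 works out to roughly 120,700 yuan. A minimal sketch of that arithmetic is below; the exchange rate is an assumption, and all prices are the figures quoted above, not official list prices.

# Rough comparison of the quoted H20 and Ascend 910B prices.
# The exchange rate and all prices are assumptions taken from the
# reporting above, not official list prices.
USD_TO_CNY = 7.1           # assumed exchange rate, yuan per US dollar
h20_list_usd = 12_000      # quoted starting price for the H20
h20_dealer_usd = 17_000    # quoted dealer price for the H20
ascend_910b_cny = 120_000  # quoted price range for Huawei's Ascend 910B

h20_dealer_cny = h20_dealer_usd * USD_TO_CNY
print(f"H20 starting price: ~{h20_list_usd * USD_TO_CNY:,.0f} yuan")
print(f"H20 dealer price:   ~{h20_dealer_cny:,.0f} yuan")   # about 120,700 yuan
print(f"Ascend 910B price:  ~{ascend_910b_cny:,} yuan")
print(f"Gap to the 910B:    ~{abs(h20_dealer_cny - ascend_910b_cny):,.0f} yuan")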
Huawei's opportunity?
The H20 series marks the second time Nvidia has developed a China-specific version of a chip. The United States remains wary of China's high-tech industry and keeps tightening export-control rules for advanced chips, forcing Nvidia to cut its chips' computing power and thereby lower their performance level.
Meanwhile, customers have more and more alternatives to choose from. After all, with performance compressed this much relative to comparable products, Nvidia's high-end chips lose their core competitiveness, and customers will not necessarily keep giving Nvidia their business.
Jiang Tao, vice president of iFlytek, has said that Huawei's Ascend 910B is comparable to the Nvidia A100. So what level is the Ascend 910B actually at?
According to publicly available information, the AI compute of the Ascend 910B is 256 TFLOPS, and the Atlas 300T product built on this chip integrates 30 AI cores. The Nvidia A100 delivers 19.5 TFLOPS of FP32 floating-point performance with a maximum power consumption of 400 W, making it one of the most powerful AI chips on the market.
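Put side by side, those publicly quoted numbers look roughly as follows. This is only a minimal sketch: the figures are simply the ones cited above, and since the 910B's headline AI-compute number and the A100's FP32 rate are not measured at the same precision, the comparison is illustrative rather than a like-for-like benchmark.

# Side-by-side view of the publicly quoted compute figures cited above.
# Caveat: the 910B number is a headline AI-compute figure, while 19.5
# TFLOPS is the A100's FP32 rate, so the precisions differ.
chips = {
    "Huawei Ascend 910B": (256.0, "headline AI compute figure"),
    "Nvidia A100": (19.5, "FP32, max power 400 W"),
}
for name, (tflops, note) in chips.items():
    print(f"{name}: {tflops} TFLOPS ({note})")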
Obviously, if Nvidia's high-end AI chips can no longer be sold to the Chinese mainland, Chinese manufacturers such as Huawei will have more opportunities to supply AI chips, and releasing yet another special version such as the H20 may not count for much. Moreover, U.S. regulations change all the time, and who knows whether sales of Nvidia's H20 will be restricted again after this release?
It is always better to master the fundamental technologies and solve problems yourself than to rely on others.
If you agree, please like, comment and share.