In the global AI chip market, AMD (Advanced Micro Devices) recently unveiled two flagship AI chips, the MI300X and MI300A, at its Advancing AI event, formally announcing a strategic upgrade of its AI ambitions. The move has drawn widespread attention across the industry and could reshape the competitive landscape of the AI chip market.
The MI300X is AMD's latest data center GPU, designed for generative AI computing, and is widely viewed in the industry as a serious challenge to NVIDIA's near-monopoly in the AI field.
The product is built on AMD's latest third-generation CDNA architecture and packs a staggering 153 billion transistors, surpassing the roughly 80 billion of NVIDIA's H100. For 8-bit floating-point (FP8) workloads, AMD's figures put an eight-GPU MI300X platform at roughly 42 petaflops, a clear compute advantage over the roughly 32 petaflops of a comparable eight-GPU H100 system. On memory, the MI300X carries 192GB of HBM3, far exceeding the 80GB on NVIDIA's standard H100. This larger memory capacity is expected to help handle bigger, more complex models, underscoring AMD's sustained commitment to technological innovation.
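To put the memory figures in perspective, here is a minimal back-of-envelope sketch estimating how many model parameters fit in a single accelerator's memory at different numeric precisions. The capacities are the ones cited above; the 20% allowance for activations and runtime buffers is a purely hypothetical assumption for illustration.

```python
# Back-of-envelope estimate: how large a model fits in one accelerator's memory.
# Memory capacities are the figures cited above; the 20% overhead reserved for
# activations, KV caches, and runtime buffers is a hypothetical assumption.

GB = 10**9  # decimal gigabytes, as used in marketing specs

def max_params(memory_gb: float, bytes_per_param: float, overhead: float = 0.2) -> float:
    """Approximate number of model parameters that fit in `memory_gb` of HBM."""
    usable_bytes = memory_gb * GB * (1.0 - overhead)
    return usable_bytes / bytes_per_param

for name, mem_gb in [("MI300X (192GB)", 192), ("H100 (80GB)", 80)]:
    for precision, nbytes in [("FP16", 2), ("FP8/INT8", 1)]:
        print(f"{name:16s} {precision:9s} ~{max_params(mem_gb, nbytes) / 1e9:6.1f}B parameters")
```

Under these assumptions, a single 192GB MI300X could hold a model on the order of 75 billion parameters at FP16, which is the sense in which the larger memory pool helps with larger model sets.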
According to AMD, the MI300X is manufactured by TSMC using AMD's most advanced process technology. On the strength of these performance figures and technical specifications, the launch of the MI300X has drawn strong interest from giants such as Meta and Microsoft, which were previously the largest buyers of NVIDIA's H100 GPUs, with around 150,000 units purchased. Kevin Scott, Microsoft's CTO, said at the launch event that Microsoft plans to use the MI300X in its Azure cloud services, a strong market endorsement of AMD's new products.
However, AMD did not publicly disclose the price of the MI300X at the event. Market rumors put the price of a single NVIDIA chip at about $40,000. On this point, AMD CEO Lisa Su has said that AMD's new products must cost less to buy and operate than NVIDIA's in order to convince customers. The statement underscores AMD's confidence in the competition and suggests careful deliberation over how the MI300X will be priced.
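As a rough illustration of the "cheaper to buy and operate" argument, the sketch below compares total cost of ownership (TCO) for two hypothetical accelerators over a deployment lifetime. Every input here (purchase prices, power draws, electricity rate, lifespan) is an illustrative assumption, not a disclosed figure.

```python
# Illustrative total-cost-of-ownership (TCO) comparison for an accelerator.
# All inputs are hypothetical assumptions for illustration; AMD has not
# disclosed MI300X pricing, and none of these values are official figures.

def tco(purchase_usd: float, power_watts: float, usd_per_kwh: float,
        years: float, utilization: float = 0.8) -> float:
    """Purchase price plus electricity cost over the deployment lifetime."""
    hours = years * 365 * 24 * utilization
    energy_kwh = power_watts / 1000 * hours
    return purchase_usd + energy_kwh * usd_per_kwh

# Hypothetical example: a $40,000 incumbent chip vs. a lower-priced challenger
# that draws slightly more power; both run for 4 years at $0.10/kWh.
incumbent = tco(purchase_usd=40_000, power_watts=700, usd_per_kwh=0.10, years=4)
challenger = tco(purchase_usd=30_000, power_watts=750, usd_per_kwh=0.10, years=4)
print(f"incumbent TCO:  ${incumbent:,.0f}")
print(f"challenger TCO: ${challenger:,.0f}")
```

The point of the sketch is simply that both the sticker price and the electricity bill feed into the figure buyers compare, which is the framing behind Su's remark.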
Going hand in hand with the MI300X is the MI300A, AMD's APU designed specifically for supercomputers. The APU approach, which fuses the CPU and GPU onto a single chip, is a direction AMD has pursued for several years, and the release of the MI300A marks another step forward in that effort.
The MI300A is equipped with 128GB of HBM3 memory and succeeds the MI250X, delivering nearly double the performance per watt. It is designed to meet the growing demand for high-performance computing in supercomputers and reflects AMD's continued investment in that field, giving the global HPC community a broader set of options.
The MI300A also drew attention from industry heavyweights at the event. Meta and Oracle executives said on the spot that they would use the MI300A accelerator in their own AI and data center services, winning AMD further support in the supercomputing space.
AMD's release of the MI300 series of AI chips has prompted the industry to rethink the shape of the AI chip market. With NVIDIA currently holding nearly 90% of the global AI computing market, the launch of the MI300X and MI300A signals AMD's strong determination to compete for a share of it.
Judging from the market reaction, the positive responses from giants such as Meta, Microsoft, and Oracle suggest that AMD's influence in this field is steadily growing. Meta and Microsoft were previously the main buyers of NVIDIA's H100 GPUs, and their strong interest in the MI300X bodes well for AMD's chances of capturing a larger share of the global data center market.
AMD CEO Lisa Su said at the event that the data center AI chip market will exceed $400 billion within the next four years, more than double the company's previous forecast. This optimistic outlook reflects AMD's confidence in the future of the market and lays a solid foundation for its development in the AI field.
Disclaimer: The above content is compiled from the Internet and is intended only for informational and learning purposes. If you have any content or copyright concerns, please leave a message and contact us so it can be removed.