Micron Technology said Monday that it had begun mass production of its HBM3E memory. The company's HBM3E known good stack die (KGSD) will be used in NVIDIA's H200 compute GPU for artificial intelligence (AI) and high-performance computing (HPC) applications, which is set to ship in the second quarter of 2024.
Micron revealed that it is mass-producing 24 GB 8-Hi HBM3E devices with a data transfer rate of 9.2 GT/s and a peak memory bandwidth of more than 1.2 TB/s per device. Compared to HBM3, HBM3E increases data transfer rates and peak memory bandwidth by 44%, which is especially important for bandwidth-hungry processors like NVIDIA's H200.
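As a back-of-the-envelope check of the figures above, the per-stack bandwidth follows from the pin transfer rate and the stack's interface width. The 1024-bit interface assumed here is the standard HBM stack width and is not stated in the article; this is a sketch, not a spec.

```python
def hbm_stack_bandwidth_gbs(rate_gts: float, bus_width_bits: int = 1024) -> float:
    """Peak bandwidth of one HBM stack in GB/s: pin rate x bus width / 8 bits per byte."""
    return rate_gts * bus_width_bits / 8

hbm3e = hbm_stack_bandwidth_gbs(9.2)  # ~1177.6 GB/s; pin speeds above 9.2 GT/s push this past 1.2 TB/s
hbm3 = hbm_stack_bandwidth_gbs(6.4)   # HBM3's 6.4 GT/s baseline, ~819.2 GB/s
gain = hbm3e / hbm3 - 1               # ~0.4375, matching the ~44% uplift cited above
```

The 44% figure in the text falls straight out of the 6.4 GT/s to 9.2 GT/s pin-rate jump, since bandwidth scales linearly with transfer rate at a fixed bus width.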
NVIDIA's H200 is based on the Hopper architecture and offers the same compute performance as the H100. However, it comes with 141 GB of HBM3E memory delivering up to 4.8 TB/s of bandwidth, a significant increase over the H100's 80 GB of HBM3 and 3.35 TB/s of bandwidth.
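The H200-versus-H100 memory deltas implied above can be checked with simple arithmetic on the published specs:

```python
# Memory specs cited in the article: capacity in GB, bandwidth in TB/s.
h200_capacity, h200_bandwidth = 141, 4.8   # H200 with HBM3E
h100_capacity, h100_bandwidth = 80, 3.35   # H100 with HBM3

capacity_gain_gb = h200_capacity - h100_capacity       # 61 GB of additional memory
bandwidth_gain = h200_bandwidth / h100_bandwidth - 1   # ~0.43, roughly a 43% bandwidth uplift
```

The bandwidth uplift at the GPU level is close to the per-stack 44% gain of HBM3E over HBM3, which is expected given that both GPUs use the same number of memory stacks.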
Micron produces its HBM3E on its 1β (1-beta) process technology, a notable achievement for the company: using its latest production node for data center-class products is a real test of its manufacturing capability. With the planned sampling of 36 GB 12-Hi HBM3E in March 2024, Micron's AI memory roadmap is further solidified, though it remains to be seen where those devices will land.
Starting mass production of HBM3E memory ahead of competitors SK Hynix and Samsung is a major achievement for Micron, which currently holds about a 10% share of the HBM market. The move is crucial because it lets Micron bring a high-end product to market earlier than its rivals, potentially boosting revenue and margins while capturing a larger share.
Sumit Sadana, Executive Vice President and Chief Business Officer of Micron Technology, said: "Micron is delivering a trifecta with this HBM3E milestone: time-to-market leadership, best-in-class industry performance, and differentiated energy efficiency. AI workloads rely heavily on memory bandwidth and capacity, and Micron is well-positioned to support the massive growth of AI in the future through our industry-leading HBM3E and HBM4 roadmaps, as well as our full portfolio of DRAM and NAND solutions for AI applications."