To secure a stable supply of HBM, Nvidia has paid hundreds of millions of dollars in advance payments to SK hynix and Micron, effectively locking in the contracts. Samsung Electronics also recently completed product testing and signed an HBM supply agreement with Nvidia.
Inference and training of large models are memory-intensive, so the ability to move data in and out of memory limits the performance of AI chips — the so-called "memory wall." HBM addresses the memory wall by dramatically widening the interface beyond traditional DDR memory and on-chip cache: each HBM stack has a 1024-bit-wide memory bus, 16 times the 64-bit width of a DDR5 DIMM channel. According to Omdia research, DRAM market revenue is expected to grow at an annual rate of 21% from 2023 to 2027, while the HBM market is expected to soar by 52% annually. HBM's share of DRAM market revenue is expected to exceed 10% this year and approach 20% by 2027.
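The bandwidth advantage of the wider bus can be sketched with simple arithmetic. The sketch below assumes, for illustration only, that both interfaces run at the same per-pin transfer rate of 6.4 GT/s; actual rates vary by product generation.

```python
# Sketch: why HBM's 1024-bit bus beats a DDR5 DIMM's 64-bit channel.
# The 6.4 GT/s transfer rate is an illustrative assumption, not a vendor spec.

def peak_bandwidth_gb_s(bus_width_bits: int, transfer_rate_gt_s: float) -> float:
    """Peak bandwidth in GB/s = (bus width in bytes) x (billions of transfers/s)."""
    return (bus_width_bits / 8) * transfer_rate_gt_s

hbm_stack = peak_bandwidth_gb_s(1024, 6.4)  # one HBM stack, assumed 6.4 GT/s
ddr5_dimm = peak_bandwidth_gb_s(64, 6.4)    # one DDR5 channel at the same rate

print(hbm_stack)              # 819.2 (GB/s per stack)
print(ddr5_dimm)              # 51.2 (GB/s per channel)
print(hbm_stack / ddr5_dimm)  # 16.0 -- the 16x width advantage cited above
```

At equal per-pin speed, bandwidth scales directly with bus width, which is why stacking dies and multiplying pins relieves the memory wall.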
According to the theme database of the Financial Associated Press, the relevant listed companies include:
Guoxin Technology: together with partners, the company is carrying out high-performance interconnect IP work for chiplet designs, including tape-out verification on advanced process nodes, and is actively pursuing advanced chip-packaging cooperation, including HBM technology, with upstream and downstream manufacturers.
Weiteou: the company's microelectronic materials can be used in HBM chip stacking and high-speed serial connections.