According to CEO Jeong-Ho Park of SK hynix, the company's shipments of high-bandwidth memory are expected to reach 100 million units per year by 2030, compared to 500,000 units in 2023.
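For scale, here is a quick back-of-the-envelope check of what that forecast implies; the seven-year window and the compound-growth framing are my own arithmetic, not figures from Park's statement:

```python
# Implied growth behind the shipment forecast above: 500,000 units in 2023
# to 100 million units in 2030 is roughly a 200x increase, which works out
# to a compound annual growth rate (CAGR) of about 113%.
units_2023 = 500_000        # reported 2023 HBM shipments
units_2030 = 100_000_000    # forecast 2030 HBM shipments
years = 2030 - 2023

growth_multiple = units_2030 / units_2023
cagr = growth_multiple ** (1 / years) - 1

print(f"Total growth: {growth_multiple:.0f}x over {years} years")
print(f"Implied CAGR: {cagr:.0%} per year")
```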
It all started with NVIDIA's announcement of the H200, billed as its most powerful AI chip to date and the first to use HBM3E memory, built to accelerate generative AI and large language models while advancing scientific computing for HPC workloads.
HBM3E is what lets the NVIDIA H200 deliver up to 4.8 TB/s of memory bandwidth and up to 141 GB of capacity, nearly double the capacity and 2.4 times the bandwidth of the A100.
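As a quick sanity check on those ratios, the short sketch below recomputes them; the A100 reference figures (80 GB of HBM2e at roughly 2.0 TB/s, i.e. the 80 GB SXM variant) are assumptions of mine, not numbers from the article:

```python
# Recompute the H200 vs. A100 memory ratios quoted above.
# A100 figures are assumed (80 GB SXM variant, ~2.0 TB/s), not from the article.
h200_capacity_gb = 141       # H200 HBM3E capacity
h200_bandwidth_tbs = 4.8     # H200 HBM3E bandwidth

a100_capacity_gb = 80        # assumed: A100 80 GB SXM
a100_bandwidth_tbs = 2.0     # assumed: ~2 TB/s for the A100 80 GB

print(f"Capacity ratio:  {h200_capacity_gb / a100_capacity_gb:.2f}x")     # ~1.76x, i.e. nearly double
print(f"Bandwidth ratio: {h200_bandwidth_tbs / a100_bandwidth_tbs:.2f}x")  # ~2.4x
```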
With the rapid development of artificial intelligence, demand for HBM and DDR5 continues to grow, and SK hynix expects to double its sales of both in 2024. Given how quickly AI is advancing, that demand looks set to keep climbing, and the prospects for this field are extremely promising. It will be worth watching how it develops!
A roundup of A-share listed HBM concept stocks:
Shannon Xinchuang: a major partner of Hynix;
Deep Tech: has 8-layer and 16-layer DRAM stacking processes and is expected to enter the HBM packaging market;
Huahai Chengke: its granular epoxy molding compounds (GMCs) can be used for HBM encapsulation and other HBM-related materials;
Guoxin Technology: has 2.5D chip packaging technology and is actively advancing the R&D and application of chiplet technology;
Jacques Technologies: its core customers are Hynix and Hefei Changxin, with Hynix accounting for 50% of its revenue in 2022;
Yawei shares: indirectly holds shares in South Korea's GSI, runs a technically demanding memory chip testing machine business, and is a stable supplier to industry leaders such as Hynix and Amkor;
Related New Materials: Hynix is a major customer of the company.
Risk warning: the above data is public information from the Internet; the views expressed are for reference only and do not constitute investment advice.