HBM stands for High Bandwidth Memory; its application scenarios are concentrated in server GPU memory
High Bandwidth Memory (HBM) is a high-bandwidth memory used for high-speed data transmission between GPUs and CPUs. Its main application scenario is the GPU memory of high-performance servers in data centers, with a small portion used in CPU memory. HBM is a 3D structure: multiple layers of DRAM die are stacked vertically, each DRAM die is connected to the logic die through TSV (through-silicon via) plus micro-bump technology, and the stack is then connected to the GPU/CPU SoC through an interposer, so that 4-layer, 8-layer, 12-layer and other die stacks are packaged into a small volume.
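The bandwidth advantage of this stacked structure comes from the very wide interface each stack exposes. A minimal sketch of the arithmetic, using commonly cited interface parameters (1024-bit bus per stack, generation-dependent pin speed) that are assumptions for illustration and not figures from this report:

```python
# Rough per-stack HBM bandwidth estimate.
# Assumed parameters (commonly cited, not from this report):
# each HBM stack exposes a 1024-bit interface; pin speed varies by generation.
def stack_bandwidth_gbs(bus_width_bits: int, pin_rate_gbps: float) -> float:
    """Peak bandwidth of one HBM stack in GB/s (bits -> bytes via /8)."""
    return bus_width_bits * pin_rate_gbps / 8

hbm2_bw = stack_bandwidth_gbs(1024, 2.0)  # ~256 GB/s per HBM2 stack
hbm3_bw = stack_bandwidth_gbs(1024, 6.4)  # ~819 GB/s per HBM3 stack
print(hbm2_bw, hbm3_bw)
```

The wide-but-slow interface is what lets HBM reach high aggregate bandwidth at modest per-pin signaling rates, which is only practical because the TSV stack and interposer keep interconnect lengths short.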
HBM helps AI servers upgrade to higher bandwidth and capacity.
At present, the memory used by mainstream graphics cards is GDDR5, but it still has pain points in application: 1) It consumes a lot of PCB area: 12 to 16 GDDR5 chips are distributed around the GPU core, the size of GDDR chips cannot continue to shrink, they occupy ever more space, and a larger voltage regulation module is also needed. 2) GDDR5 power consumption has reached an inflection point that negatively impacts GPU performance: the platform and device must balance power between the logic chip and DRAM, and as GDDR enters the inefficient zone of its power-performance curve, growing memory power consumption will hinder GPU performance improvement in the future.
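The area and power pain points can be made concrete with a back-of-the-envelope comparison. The figures below (32-bit bus and 8 Gbps pins per GDDR5 chip, 1024-bit bus and 2 Gbps pins per HBM2 stack, and the device counts) are typical published values assumed for illustration, not figures from this report:

```python
# Why HBM replaces a board full of GDDR5 chips: a few wide, slow stacks on an
# interposer deliver more bandwidth than many narrow, fast chips on the PCB.
def total_bandwidth_gbs(devices: int, bus_bits: int, pin_gbps: float) -> float:
    """Aggregate peak bandwidth in GB/s across all memory devices."""
    return devices * bus_bits * pin_gbps / 8

gddr5 = total_bandwidth_gbs(12, 32, 8.0)    # 12 chips around the GPU core
hbm2  = total_bandwidth_gbs(4, 1024, 2.0)   # 4 stacks next to the GPU die
print(gddr5, hbm2)  # GDDR5: 384.0 GB/s vs HBM2: 1024.0 GB/s
```

The HBM configuration reaches roughly 2.7x the bandwidth with a quarter of the device count and a quarter of the per-pin speed, which is the power-efficiency argument made above.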
On the demand side: HBM is widely used in AI servers, and mainstream servers in the market have adopted HBM solutions.
HBM is widely used in AI server scenarios due to its high bandwidth, low power consumption, and small size. Its application is concentrated in high-performance servers: it was first adopted in the NVIDIA P100 GPU (HBM2) in 2016, then in the V100 (HBM2) in 2017, the A100 (HBM2) in 2020, and the H100 (HBM2E/HBM3) in 2022; the latest generation, HBM3E, is used in the H200 released by NVIDIA in 2023, providing faster speed and higher capacity for servers.
On the demand side: The growth of AI server shipments has catalyzed the explosion of HBM demand, and the market size has grown rapidly
The growth of AI server shipments has catalyzed an explosion in HBM demand, and the HBM market is estimated to exceed US$15 billion in 2025, with a growth rate of more than 50%. AI server shipments reached about 860,000 units in 2022, roughly 10% of overall servers, and the penetration rate continues to rise; shipments are expected to exceed 2 million units in 2026, a compound annual growth rate of about 29%. At the same time, the HBM capacity of mainstream AI servers has been upgraded from 40 GB to 80 GB and then 141 GB, driving the average HBM capacity per server upward. We estimate the global HBM market will reach US$15 billion in 2025, up 68% year-on-year.
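As a quick check of the arithmetic implied by these estimates, a 2025 market of ~US$15bn after 68% year-on-year growth fixes the implied 2024 base:

```python
# Implied 2024 HBM market size from the report's 2025 estimate.
size_2025 = 15.0           # US$ bn, 2025 estimate from the report
yoy_growth = 0.68          # 68% year-on-year growth into 2025
size_2024 = size_2025 / (1 + yoy_growth)
print(round(size_2024, 1))  # ~8.9 (US$ bn implied for 2024)
```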
Supply side: The three major storage manufacturers occupy the main market share.
HBM suppliers are concentrated in three major manufacturers, SK hynix, Samsung, and Micron, with SK hynix as the leader. SK hynix cooperated with AMD to release the world's first HBM and took the lead with new-generation HBM3E in 2023, establishing its market position first; its main customers include NVIDIA and other major manufacturers. According to TrendForce data, in 2022 SK hynix held a 50% market share, Samsung 40%, and Micron about 10%; in 2023, SK hynix is expected to hold 53%, Samsung 38%, and Micron 9%.
Supply side: Memory makers have expanded HBM production capacity, which is expected to increase by about 2.5x in 2024.
Memory makers have expanded HBM production capacity, and SK hynix plans to double its capacity in 2024. SK hynix's HBM3E will be mass-produced in the first half of 2024, with the goal of doubling HBM capacity over the year; although its 2024 capital expenditure plan is roughly flat versus 2023, TSV-related investment will more than double year-on-year. Micron's HBM3E will enter mass production in early 2024, and its 2024 capital expenditure is expected to be US$7.5-8 billion, slightly higher than a year earlier, mainly for HBM mass production. Samsung plans to build a new packaging line at its Cheonan plant for large-scale HBM production, with an additional investment of US$700 million.
Changes on the supply side: changes in core processes bring about increments
The changes in HBM's packaging process are mainly in CoWoS and TSV. 1) CoWoS: the DRAM dies are placed together on a silicon interposer and connected to the underlying substrate; that is, the chips are first attached to the wafer through a Chip-on-Wafer (CoW) packaging process, and the CoW assembly is then bonded to the substrate to form CoWoS (Chip-on-Wafer-on-Substrate). At present, the mainstream solution for integrating HBM with GPUs is TSMC's CoWoS, which has been widely used in computing chips such as the A100 and GH200 and achieves higher-speed data transmission by shortening interconnect length. 2) TSV: through-silicon vias are at the heart of capacity and bandwidth expansion, creating thousands of vertical interconnects between the front and back sides of a chip by drilling holes through the full thickness of the silicon wafer. In HBM, multiple layers of DRAM die are stacked and connected via through-silicon vias and solder bumps; only the bottommost die connects outward to the memory controller, while the remaining dies are interconnected through internal TSVs.
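The capacity side of TSV stacking is simple multiplication: stacks per package, dies per stack, and density per die. A minimal sketch, where the per-die density and stack counts are assumptions based on commonly cited HBM3E configurations rather than figures from this report:

```python
# Sketch of how TSV stacking drives HBM capacity.
# Assumed example configuration (not from this report): 6 stacks of
# 8-high HBM3E using 24 Gb (3 GB) DRAM dies.
def hbm_capacity_gb(stacks: int, dies_per_stack: int, gb_per_die: int) -> int:
    """Total raw HBM capacity: stacks x dies per stack x density per die."""
    return stacks * dies_per_stack * gb_per_die

raw_capacity = hbm_capacity_gb(6, 8, 3)
print(raw_capacity)  # 144 GB raw
```

A 144 GB raw configuration is consistent with the ~141 GB usable capacity cited above for the latest AI servers, which is why adding layers (8-high to 12-high) and stacks directly scales both capacity and TSV process demand.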
Supporting chain – packaging and testing.
Tongfu Microelectronics: The company has a leading domestic 2.5D/3D packaging platform and an ultra-large-size FCBGA R&D platform, has completed the development of high-layer-count redistribution technology, provides customers with wafer-level and substrate-level chiplet packaging and testing solutions, has mass-produced multi-layer stacked NAND Flash and LPDDR packages, and is the first packaging and testing house in China to complete the development of 3DS DRAM packaging based on TSV technology. AMD launched the MI300 in 2023, with shipments beginning in 23Q4, and a significant volume ramp is expected in 2024, from which the company will fully benefit.
JCET: The company and its customers jointly develop 2.5D FCBGA products, and its TSV heterogeneous-bonding 3D SoC FCBGA has been certified. Its packaging and testing services cover DRAM, NAND Flash, etc.; with more than 20 years in the industry, it leads in 16-layer NAND Flash stacking, 35 μm ultra-thin chip process capability, and hybrid irregular-die stacking. The company's XDFOI technology platform is deployed in AI, 5G, automotive, industrial and other fields, and XDFOI chiplets are in mass production.
Taiji Industrial: The company's subsidiary Haitai Semiconductor signed a five-year cooperation agreement with SK hynix, and SK hynix holds 45% of Haitai's equity, deeply binding the two; Haitai provides DRAM packaging services for SK hynix. SK hynix held about 50% of the HBM market in 2023, and with the explosion of HBM shipments in 2024, the company is expected to take on overflow packaging and testing demand.
Deep Technology: The company entered memory packaging and testing through the acquisition of Peidun Technology, which focuses on high-end packaging and testing and has production capacity for DDR5 and LPDDR5 packaging and testing. Peidun's bumping project has passed small-batch trial production; the company focuses on the flip-chip (FC) process, R&D of PoP stacked packaging technology, and optimization of 16-layer ultra-thin chip stacking technology.
Supporting chain – equipment.
Saiteng Co.: The company entered the wafer testing equipment field through the acquisition of Optima, a leading Japanese wafer inspection equipment supplier. Its products include die bonders, sorters, wafer packaging machines, wafer defect inspection machines, chamfer and roughness measurement tools, wafer character inspection machines, wafer laser marking machines, and wafer laser grooving machines. Through Optima it has entered the supply chains of major customers such as Samsung, SK Siltron, and SUMCO. Samsung has raised its 2024 capital expenditure and plans to expand its HBM production line in 2024 to more than double the 2023 level, increasing demand for inspection and metrology equipment, from which the company is expected to benefit.
With the launch of the Nanova HP and Nanova Lux, the verified process scope of the company's ICP etching equipment continues to expand, and the coverage of ICP-verified etching processes in advanced logic chips, advanced DRAM, and 3D NAND is expected to reach 50%-70%. The company's 8-inch and 12-inch Primo TSV 200E and Primo TSV 300E are used in wafer-level advanced packaging, with abundant orders from 2.5D packaging and MEMS chip production lines, and its through-silicon via etching process for 12-inch 3D chips has been successfully verified.
As the mainstream manufacturing process for memory chips has developed from 2D NAND to 3D NAND structures, the complexity of the structure has gradually increased demand for thin-film deposition equipment; the number of stacked layers in 3D NAND flash chips keeps rising, developing from 32 to 64 layers and on to more layers and more advanced processes, and the trend of growing demand for thin-film deposition equipment will continue.
Supporting chain – material.
Jacques Technology: In 2016, the company officially entered the precursor industry through the acquisition of UP Chemical, which has been a core precursor supplier to SK hynix since 2004, forming a deep, long-standing relationship. SK hynix holds about 50% of the HBM market and began exclusively supplying HBM3 to NVIDIA in 22Q3. The company's products cover silicon precursors, high-k precursors, and metal precursors, and it will fully benefit from rising HBM demand and growing SK hynix shipments.
Novoray New Materials: The company has worked in the inorganic filler and particle carrier industry for nearly 40 years and is China's leading silica powder supplier. It continues to focus on advanced packaging for high-end AI, 5G, and HPC chips, heterogeneous chiplet integration, HBM, and a new generation of high-frequency, high-speed copper-clad laminates, and has launched various specifications of low-α micron and sub-micron spherical silica powder, low-α micron and sub-micron spherical alumina powder, low-loss and ultra-low-loss spherical silica powder for high-frequency, high-speed copper-clad laminates, and high-thermal-conductivity micron and sub-micron spherical alumina powder for new energy batteries.
Yishitong: For memory packaging, the company has laid out core technologies such as low-α high-purity quartz and alumina preparation, low-radioactivity powder preparation, and spheroidization processes, and has industrialization capability for low-α-ray spherical alumina. The company plans to build a new low-α spherical alumina project with an annual output of 200 tons for high-end chip packaging, expected to be partially put into production in the second half of 2023. Low-α spherical alumina powder accounts for roughly 80%-90% of the volumetric filling in EMC or GMC, so the company is expected to benefit from increased HBM shipments.
Huahai Chengke: The company is China's leading supplier of epoxy molding compound. In advanced packaging, it has successfully developed packaging materials for QFN, BGA, FC, SiP, FOWLP/FOPLP and other packaging forms. Its GMC granular epoxy molding compound can be used for HBM encapsulation; related products have been verified by customers and are now at the sample delivery stage, and the company is expected to benefit from domestic material substitution.
This is an abridged excerpt from the report. Original report: Information Technology - Computing Power Series Report (1): AI Server Catalyzes the Explosion of HBM Demand, and Core Process Changes Bring Supply-side Increments - Pacific Securities [Zhang Shijie, Li Juehan] - 2024-01-25 (26 pages).
Report source: Value Catalog