Samsung's and Micron's HBM3E chips will ship soon

Mondo Technology | Updated on 2024-03-01

HBM3E DRAM will improve generative AI services and reduce costs.

The AI boom is in full swing, and chipmakers are rolling out new, advanced memory technologies. The next generation of high-bandwidth memory (HBM) is expected to significantly increase bandwidth and capacity, and Samsung aims to lead the industry.

Despite being a bit late in the HBM3E market, Samsung introduced the HBM3E 12H DRAM chip as a groundbreaking achievement in 3D layered memory technology. The South Korean giant's latest memory chip features a novel 12-layer stack that offers a 50% increase in performance and capacity compared to the HBM3E chip with an 8-layer stack.

Samsung claims that the HBM3E 12H chip can achieve up to 1,280 gigabytes per second of bandwidth while providing an unprecedented 36-gigabyte capacity. To stack 12 layers within the same height specification as the 8-layer chip, Samsung uses an advanced thermal compression non-conductive film (TC NCF), keeping the part compatible with current HBM memory packaging requirements.
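As a rough sanity check, those headline numbers follow directly from the stack height and the memory interface. The sketch below assumes a standard 1024-bit HBM interface and 3GB (24Gb) DRAM dies, neither of which is stated in Samsung's announcement, and shows how the quoted 36GB capacity, the 50% uplift over 8 layers, and the 1,280GB/s bandwidth fall out:

```python
# Back-of-the-envelope HBM3E math: capacity scales with stack height,
# bandwidth with per-pin speed times interface width.
# Assumptions (not from Samsung's announcement): 1024-bit HBM interface,
# 3 GB (24 Gb) per DRAM die, and a ~10 Gb/s per-pin data rate.

DIE_CAPACITY_GB = 3      # assumed capacity of one DRAM die in the stack
BUS_WIDTH_BITS = 1024    # standard HBM interface width
PIN_SPEED_GBPS = 10.0    # per-pin rate implied by the 1,280 GB/s figure

def stack_capacity_gb(layers: int) -> int:
    """Total stack capacity: dies per stack times capacity per die."""
    return layers * DIE_CAPACITY_GB

def stack_bandwidth_gbs(pin_speed_gbps: float, bus_width_bits: int) -> float:
    """Peak bandwidth in GB/s: total bits per second across the bus, divided by 8."""
    return pin_speed_gbps * bus_width_bits / 8

print(stack_capacity_gb(12))  # 36 GB, as quoted for the 12H part
print(stack_capacity_gb(8))   # 24 GB, so 12H is a 50% capacity increase
print(stack_bandwidth_gbs(PIN_SPEED_GBPS, BUS_WIDTH_BITS))  # 1280.0 GB/s
```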

TC NCF offers the added benefit of the industry's smallest gap between chips, at seven micrometers, while also reducing voids between layers. Compared to the HBM3E 8H chip, vertical DRAM density increases by up to 20%. The manufacturing-process improvements reportedly also provide better thermal performance and higher product yields.

The Seoul-based company expects its latest-generation HBM3E 12H chips to provide a "best-in-class" solution for AI accelerators as demand for high-capacity DRAM continues to grow. Compared to the HBM3 8H chip, Samsung says HBM3E 12H memory is on average 34% faster for AI model training, and that an inference service can support more than 11.5 times as many concurrent users.

Samsung is currently sampling the first HBM3E 12H chips to select customers, with mass production expected in the first half of 2024. Meanwhile, Micron, another major player in the HBM3E market, announced volume production of its latest 3D-stacked memory chips. The Idaho-based company is betting on a conventional 8-layer HBM3E design to bolster its financial results for fiscal year 2024.

Micron will supply its 24GB 8-high HBM3E chips for NVIDIA's upcoming H200 Tensor Core GPU, a powerful AI accelerator set to begin shipping in the second quarter of 2024. Like Samsung, Micron is positioning its HBM3E technology as a leading memory solution for compute-intensive applications and generative AI services.

The HBM3E chip delivers a pin speed of more than 9.2 gigabits per second (Gb/s), providing more than 1.2 terabytes per second (TB/s) of memory bandwidth. The company says it consumes 30% less power than competing products, and its 24GB capacity enables data center operators to "scale seamlessly" across a wide range of AI applications.
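The same arithmetic used above connects Micron's pin speed to its bandwidth figure. A minimal check, again assuming the standard 1024-bit HBM interface (not stated in Micron's announcement):

```python
# Micron's quoted figures: >9.2 Gb/s per pin and >1.2 TB/s per stack.
# Assumption (not from Micron's announcement): 1024-bit HBM interface.

def bandwidth_tbs(pin_speed_gbps: float, bus_width_bits: int = 1024) -> float:
    """Peak stack bandwidth in TB/s."""
    return pin_speed_gbps * bus_width_bits / 8 / 1000

print(round(bandwidth_tbs(9.2), 2))  # 1.18 TB/s at exactly 9.2 Gb/s
print(round(bandwidth_tbs(9.6), 2))  # 1.23 TB/s -- ">1.2 TB/s" implies pins run somewhat above 9.2
```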

Sumit Sadana, Micron's executive vice president and chief commercial officer, highlighted that the company's new HBM3E chip could support business growth amid surging demand for AI. Looking ahead, Micron is preparing to provide the first sample of the 36GB 12-high HBM3E chip in March.
