Samsung to double HBM chip production to lead on-device AI chip era
Jeong-Soo Hwang
Jan 12, 2024 (GMT+09:00)
LAS VEGAS – Samsung Electronics Co., the world’s largest memory chipmaker, plans to more than double its high bandwidth memory (HBM) chip production volume as it aims to take the lead in the artificial intelligence chip segment.
Han Jin-man, executive vice president responsible for Samsung's US semiconductor business, said on Thursday the company is pinning high hopes on high-capacity memory chips, including the HBM series, to lead the fast-growing AI chip segment.
“We will raise our HBM chip production volume by 2.5 times this year compared to last year’s output. The pace will continue with another twofold increase next year,” he told reporters during a media session at CES 2024.
“Memory chips will play the leading role in the AI era. Samsung will not be influenced by the industry’s ups and downs. We will steadily expand our investment in the growth sector,” he said.
Han is the highest-level Samsung executive to unveil the company’s HBM chip production plans for this year and next.
HBM is a high-capacity, high-performance memory chip, demand for which is soaring as it is used to power generative AI services such as ChatGPT, high-performance data centers and machine learning platforms.
The HBM series of DRAM is the talk of the town these days as electronics makers are unveiling products equipped with on-device AI technology, which enables customized and personalized AI functions on smartphones and other smart gadgets.
HBM3, one of the most advanced such chips currently available, is said to offer 12 times the capacity and 13 times the bandwidth of GDDR6, the latest graphics DRAM product.
According to market tracker TrendForce, the global HBM market is forecast to grow to $8.9 billion by 2027 from an estimated $3.9 billion this year.
Samsung said it aims to raise its HBM competitiveness by offering clients a turnkey service, in which the company packages a graphics processing unit (GPU) made by Samsung Foundry and HBM chips into a single chipset.
“We’re positively considering producing next-generation HBM chips on our foundry process rather than our memory process to maximize business efficiency, as we do both memory and foundry,” Han said at CES 2024.
Leading chipmakers such as foundry leader Taiwan Semiconductor Manufacturing Co. (TSMC) and Intel Corp. are fiercely competing in advanced packaging, which enhances chip performance without having to shrink process nodes through ultra-fine processing that is technologically challenging and more time consuming.
At this year’s electronics show, Samsung is showcasing several of its latest memory chips, either under development or already being supplied to clients.
To meet growing demand from generative AI chip users, the company has put on display 12-nanometer 32-gigabyte double data rate 5 (DDR5) DRAM chips; Shinebolt, its HBM3E chip; and CMM-D, a Compute Express Link (CXL) DRAM module.
For on-device AI functions, Samsung is showcasing LPDDR5X-PIM, an advanced processing-in-memory DRAM chip that can process data like a central processing unit (CPU).