
SK Hynix chief says no delay in 12-layer HBM3E supply as demand soars

TrendForce researcher Avril Wu expects the HBM market to grow 156% on-year to $46.7 billion in 2025

By Eui-Myung Park | Oct 23, 2024 (GMT+09:00)

SK Hynix CEO Kwak Noh-jung unveils its HBM chip development roadmap during a press conference at its headquarters on May 2, 2024

Kwak Noh-jung, chief executive of SK Hynix Inc., the world’s second-largest memory chipmaker, has said there will be no delay in the company’s planned mass production and supply of its most advanced AI chip, 12-layer HBM3E, to its clients, including Nvidia Corp.

“There’s no change in our schedule for the mass production of the 12-layer HBM3E by the end of the year. Everything’s going well in terms of shipment and supply timing,” he told reporters Tuesday on the sidelines of a Semiconductor Day event in Seoul.

His comments come as a senior researcher at TrendForce, the Taiwan-based semiconductor research firm, expects the market for HBM, or high-bandwidth memory, to continue to post strong growth next year, driven by soaring demand from Nvidia and other AI chipmakers.

(Graphics by Dongbeom Yun)

ROBUST HBM CHIP DEMAND GROWTH

During the TrendForce Roadshow Korea held in Seoul on Tuesday, Avril Wu, senior vice president of research operations at TrendForce, said she expects the global HBM market to grow 156% next year, reaching $46.7 billion, from $18.2 billion this year. Its share of the overall DRAM market is forecast to rise from 20% this year to 34% in 2025.

From the perspective of major AI solution providers, there will be a significant shift in HBM specification requirements toward HBM3E, with an increase in 12-layer stack products anticipated, according to TrendForce. This shift, it added, is expected to drive up HBM capacity per chip.

Among HBM products, Wu said the share of HBM3E, the fifth-generation HBM chip, is forecast to increase to 85% in 2025 from 46% this year, mainly driven by Nvidia's Blackwell GPU.

Avril Wu, senior vice president of research operations at TrendForce, speaks at the TrendForce Roadshow Korea 2024 in Seoul (Courtesy of Edaily)

She said Nvidia will continue to dominate the AI market next year, intensifying competition among major memory chipmakers such as Samsung Electronics Co., SK Hynix and Micron Technology Inc. to secure the US AI chip designer as a customer.

Even as companies such as Advanced Micro Devices Inc. (AMD) ramp up their AI chip sales, Nvidia is expected to consume 73% of total HBM output in 2025, up from 58% this year, Wu said.

Google's HBM intake is expected to drop to 11% from 18%, and AMD's share to 7% from 8%, she said.

The TrendForce researcher predicted that samples of the sixth-generation HBM4 will be released later next year, with official adoption by companies like Nvidia expected in 2026.

Nvidia's Blackwell GPU architecture (Photo captured from its website)

SK HYNIX TIES UP WITH TSMC

Among memory chipmakers, SK Hynix is the biggest beneficiary of the explosive increase in AI adoption, as it dominates the production of HBM, critical to generative AI computing, and is the top supplier of AI chips to Nvidia.

SK Hynix controls about 52.5% of the overall HBM market. Its crosstown rival Samsung, the world’s top memory chipmaker, has a 42.4% market share.

In May, SK Hynix CEO Kwak said its HBM chips were sold out for this year and its capacity was almost fully booked for 2025.

To hold its lead, SK Hynix joined hands with Taiwan Semiconductor Manufacturing Co. (TSMC), the world’s top contract chipmaker, in April to develop the sixth-generation AI chip, HBM4.

AI chip (Courtesy of Getty Images)

SAMSUNG STRUGGLES

Currently, Samsung supplies fourth-generation 8-layer HBM3 chips to clients such as Nvidia, Google, AMD and Amazon Web Services Inc. and HBM2E chips to Chinese companies.

Samsung is still striving to gain Nvidia’s approval for its HBM3E chips.

As Samsung struggles to keep pace with its rivals in the HBM segment, Vice Chairman Jun Young-hyun, head of Samsung’s DS division, which oversees its semiconductor business, has vowed to drastically cut chip executive posts and restructure semiconductor-related operations.

Samsung, which set up a dedicated HBM chip development team, has also forged a tie-up with TSMC for HBM4 chips.

Write to Eui-Myung Park at uimyung@hankyung.com

In-Soo Nam edited this article.