SK Hynix has already locked down most of its 2026 high-bandwidth memory (HBM) sales, CEO Kwak Noh-jung revealed at the company’s annual shareholder meeting. The South Korean chipmaker expects to finalize deals for next year’s HBM output soon, after selling out this year’s supply to AI giants like Nvidia.
Kwak highlighted “explosive” growth in HBM demand, driven by AI’s hunger for ultra-fast memory. The company also noted a recent spike in orders as clients rush to secure supplies before U.S. semiconductor tariffs hit. President Trump has threatened 25%, 50%, or even 100% duties on imported chips, with a major announcement expected April 2nd.
When asked about Chinese AI startup DeepSeek’s cost-efficient models, Kwak framed the competition as a net positive: more players adopting AI means more demand for SK Hynix’s chips. The response echoes other industry leaders who have downplayed AI “bubble” risks by betting on continued adoption growth.
SK Hynix is preparing to mass-produce 12-layer HBM4 by year-end, having already shipped samples ahead of schedule. The new memory handles more than two terabytes of data per second, 60% faster than its HBM3E predecessor, positioning it for next-generation AI and supercomputing workloads.