SEOUL (Reuters) — SK Hynix Inc said on Tuesday it has begun mass production of next-generation high-bandwidth memory (HBM) chips used in artificial intelligence chipsets, with sources saying initial shipments will go to Nvidia (NASDAQ:NVDA) this month.
The new type of chip — called the HBM3E — is a focal point of intense competition. Last month, Micron Technology (NASDAQ:MU) said it had started mass production of the chips while Samsung Electronics (KS:005930) said it had developed the industry's first 12-stack HBM3E chips.
SK Hynix has, however, led the HBM chip market by virtue of being the sole supplier of the current-generation version, the HBM3, to Nvidia, which holds 80% of the market for AI chips.
"The company expects successful mass production of HBM3E and with our experience… as the industry's first provider of HBM3, we expect to cement our leadership in the AI memory space," SK Hynix said in a statement.
The new HBM3E chip from the world's second-largest memory chipmaker offers a 10% improvement in heat dissipation and processes up to 1.18 terabytes of data per second.
SK Hynix's HBM capacity is fully booked for 2024, analysts said, as explosive demand for AI chipsets lifts orders for the high-end memory chips used in them.
"SK Hynix has secured an absolute market position… and its volume increase in high-end memory chips is also expected to be the most aggressive among chipmakers," said Kim Un-ho, an analyst at IBK Investment & Securities.
Nvidia unveiled on Monday its latest flagship AI chip, the B200, said to be 30 times speedier at some tasks than its predecessor as it seeks to maintain its dominant position in the artificial-intelligence industry.
Shares in SK Hynix have doubled in value over the past 12 months.