South Korean technology giant Samsung Electronics has fallen behind in the artificial-intelligence race—at least in the first heat. It would be foolish to count it out, though. Recent signs indicate it might be narrowing the technological gap with rivals SK Hynix and Micron in high-performance AI memory chips.
Even if it takes longer than expected to catch up, a tighter overall memory market thanks to the AI boom could still be a significant tailwind for Samsung. Nvidia’s AI chips have been selling like hot cakes since the rise of generative AI apps such as ChatGPT. Memory-chip makers have, in turn, sold out their high-performance products to Nvidia and others.
High-bandwidth memory, or HBM, offers enhanced data-processing speed, which is crucial for AI number crunching. Korea’s SK Hynix has taken an early lead in HBM. It is virtually the only supplier to Nvidia for the latest-generation memory chip, called HBM3.
Samsung only started mass-producing HBM3 in the second half of last year. It does produce earlier generations of HBM used in some less advanced AI processors, rather than in the most cutting-edge ones made by Nvidia. And now SK Hynix has started mass-producing its next-generation chips, called HBM3E.
SK’s smaller rival Micron, which essentially skipped the previous generation, is doing the same. Both companies say they have sold out this year’s entire HBM production and are already taking orders for next year. Even so, Samsung is working hard to catch up.
The company expects to mass-produce next-generation HBM chips in the first half of this year. That would leave it roughly one fiscal quarter behind the competition, rather than a full year, as was the case with the previous generation of HBM chips.