Demand for Nvidia’s graphics-processing units (GPUs), AI-modellers’ favourite type of processor, is so insatiable that the chips are in short supply. No matter. Nvidia has announced the launch later this year of a new generation of superchips, named Blackwell, that are many times more powerful than its existing GPUs, promising bigger and cleverer AIs.
Thanks to AI, global spending on data centres reached $250bn last year, says Jensen Huang, Nvidia’s boss, and is growing at 20% a year. His firm intends to capture much of that growth. To make it harder for rivals to catch up, Nvidia is pricing Blackwell GPUs at $30,000-40,000 apiece, which Wall Street deems conservative.
To reap the fruits of this “accelerated computing”, Nvidia wants to vastly expand its customer base. Currently the big users of its GPUs are the cloud-computing giants, such as Alphabet, Amazon and Microsoft, as well as builders of gen-AI models, such as OpenAI, maker of ChatGPT. But Nvidia sees great opportunity in demand from firms across all industries: health care, retail, manufacturing, you name it.
It believes that many businesses will soon move on from toying with ChatGPT to deploying their own gen-AIs. For that, Nvidia will provide self-contained software packages that can either be acquired off the shelf or tailored to a company’s needs. It calls them NIMs, or Nvidia Inference Microservices.
Crucially, they will rely on (mostly rented) Nvidia GPUs, further tying customers into the firm’s hardware-software ecosystem. So far, so star-spangled. But it is not all peace and love at Woodstock.