Future computer chips may be able to help ease the alarming energy demands of generative artificial intelligence, but chip makers say they need something from AI first: a slowdown in the sizzling pace of change. Graphics processing units have so far handled the bulk of training and running large-scale AI models.
The chips, originally built for gaming graphics, offer a unique blend of high performance with the flexibility and programmability required to keep up with today’s constantly shifting swirl of AI models. Nvidia’s dominance in the GPU market has propelled it to a trillion-dollar valuation, but others, including Advanced Micro Devices, also make the chips. As the industry coalesces around more standardized model designs, however, there will be an opportunity to build more custom chips that don’t require as much programmability and flexibility, said Lisa Su, chief executive at AMD.
That will make them more energy-efficient, smaller and cheaper. “GPUs right now are the architecture of choice for large language models, because they’re very, very efficient for parallel processing, but they give you just a little bit of programmability,” Su said. “Do I believe that that’s going to be the architecture of choice in five-plus years? I think it will change.” What Su expects in five or seven years’ time isn’t a shift away from GPUs, but rather a broadening beyond GPUs.
Nvidia and AMD haven’t been vocal about specific plans here. Nvidia declined to comment for this article. Some custom chips are already hard at work handling aspects of AI.