(Reuters) — Nvidia (NASDAQ:NVDA) unveiled its next big AI chip along with several partnerships and software tools at its annual developer conference this week, seeking to maintain its lead in the race to power the artificial intelligence boom.
Here are the key takeaways from the event:
BLACKWELL CHIPS
The B200 "Blackwell" AI chip is 30 times faster than its predecessor, the Hopper series, at tasks including helping chatbots such as OpenAI's ChatGPT and Google (NASDAQ:GOOGL)'s Gemini deliver answers.
The chip can run large language models while using 25 times less energy, and at 25 times lower cost, than the hugely popular Hopper series that helped turn Nvidia into a $2 trillion company.
The new processor combines two silicon dies, each the size of the company's previous offering, into a single chip.
The chips are expected to hit the market later in 2024, according to CFO Colette Kress.
Nvidia is working with contract chip manufacturer TSMC to avoid holdups in packaging chips, CEO Jensen Huang said.
Huang said in an interview with CNBC on Tuesday that the new chip will be priced between $30,000 and $40,000.
SOFTWARE TOOLS
The company unveiled new software tools aimed at making it easier for businesses to incorporate AI systems into their work.
The tools include cloud-based Nvidia Inference Microservices (NIM), which businesses can use to create and deploy custom applications on their own platforms while retaining full ownership and control of their intellectual property.
PARTNERSHIPS
Nvidia and Oracle (NYSE:ORCL) said they had expanded their collaboration to help governments and enterprises run "AI factories" on local cloud services.
The chip designer also said engineering and chip design software makers Ansys (NASDAQ:ANSS), Cadence (NASDAQ:CDNS) and Synopsys (NASDAQ:SNPS) will use its chips to speed up their products.