- Nvidia announced on Monday that it will release a new version of its top-tier AI chip.
- The chip is expected to surpass Nvidia’s current flagship H100 chip.
- South Korea's SK Hynix is another memory supplier to Nvidia.
Nvidia announced on Monday that it will release a new version of its top-tier AI chip, which will roll out with Oracle, Google, and Amazon.com early next year.
The chip, dubbed the H200, is expected to outperform Nvidia's current flagship H100. The main improvement is more high-bandwidth memory, one of the most expensive parts of the chip, which determines how much data it can process quickly.
Flagship chip
Nvidia leads the market for AI chips, which power generative AI services such as OpenAI's ChatGPT, a service that answers questions with human-like written responses. The additional high-bandwidth memory and a faster connection to the chip's processing elements will let such services respond more quickly.
The H200 has 141 gigabytes of high-bandwidth memory, up from 80 gigabytes in the H100. Nvidia did not identify the companies supplying the memory on the new chip, though Micron Technology said in September that it was seeking to become an Nvidia supplier.
Nvidia also buys memory from South Korea's SK Hynix, which said last month that AI chips are boosting its sales.
Nvidia said on Wednesday that Microsoft Azure, Oracle Cloud Infrastructure, Google Cloud, and Amazon Web Services will be the first cloud service providers to offer access to H200 chips, alongside specialized AI cloud providers CoreWeave, Lambda, and Vultr.