BEST OF THE WEB

Nvidia launches GH200 chip for AI inference market

Nvidia has announced the GH200, a new chip designed to run artificial intelligence models. It uses the same GPU as the company’s current highest-end AI chip, the H100, but pairs that GPU with 141 gigabytes of cutting-edge memory and a 72-core Arm-based central processor.

The GH200 is designed for inference, the process of using a trained AI model to make predictions or generate content. Inference is computationally expensive, demanding substantial processing power every time the software runs. Nvidia says the GH200 will deliver significantly faster inference speeds, making it possible to run larger and more complex AI models.

The GH200 is also designed for scale-out deployment, meaning it can be used in large data centers to power multiple AI models simultaneously. This makes it well suited for cloud computing providers and other businesses that need to run AI workloads at scale.

Nvidia’s announcement comes as the company faces increasing competition in the AI chip market from AMD and Google. AMD recently announced its own AI-oriented chip, the MI300X, which can support 192GB of memory. Google is also developing its own custom AI chips for inference.

The sources for this piece include an article in CNBC.

IT World Canada Staff

