Anirban Ghoshal
Senior Writer

Nvidia’s new Grace Hopper superchip to fuel its DGX GH200 AI supercomputer

News
30 May 2023 | 2 mins
Artificial Intelligence | Computer Components | Data Center

The DGX GH200 AI supercomputer is targeted toward developing and supporting large language models. Google Cloud, Meta, and Microsoft already have access to it.

Nvidia DGX GH200. Credit: Nvidia

Nvidia has unveiled a new DGX GH200 AI supercomputer, underpinned by its new Grace Hopper superchip and targeted toward developing and supporting large language models.

“DGX GH200 AI supercomputers integrate Nvidia’s most advanced accelerated computing and networking technologies to expand the frontier of AI,” Nvidia CEO Jensen Huang said in a blog post.

The supercomputer, according to Huang, combines the company’s GH200 Grace Hopper superchips with its NVLink Switch System to enable the development of large language models for generative AI applications, recommender systems, and data analytics workloads.

Nvidia’s DGX GH200 uses NVLink interconnect technology to link 256 Grace Hopper superchips so they work as a single graphics processing unit (GPU), delivering “1 exaflop of performance and 144 terabytes of shared memory — nearly 500 times more memory than the previous generation NVIDIA DGX A100, which was introduced in 2020.”
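For context, the “nearly 500 times” figure roughly checks out if one assumes the DGX A100’s original 320 GB of total GPU memory, a number that is not stated in this article; a minimal sketch of that arithmetic:

```python
# Rough sanity check of the memory comparison. The 320 GB figure is an
# assumption based on the DGX A100's 2020 launch configuration (8 x 40 GB
# A100 GPUs); it does not appear in the article itself.
dgx_a100_memory_tb = 320 / 1000   # 320 GB of GPU memory, expressed in TB
dgx_gh200_memory_tb = 144         # 144 TB of shared memory across 256 superchips

ratio = dgx_gh200_memory_tb / dgx_a100_memory_tb
print(f"DGX GH200 offers roughly {ratio:.0f}x the memory of a DGX A100")
# Prints roughly 450x, consistent with the "nearly 500 times" claim.
```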

In making the new supercomputer available, the chip maker will emulate the strategy it took with its DGX Pods.

In March, Huang said the company had struck a deal to make its DGX systems available through multiple cloud providers rather than requiring customers to install the hardware on-premises.

Currently, Microsoft, Meta, and Google Cloud have access to the new supercomputer, the company said.

Nvidia also said the Grace Hopper superchip that powers the DGX GH200 is in full production, and systems built with it are expected to be available later this year.

The company also said that it was using the new Grace Hopper superchip to help SoftBank design next-generation distributed data centers that will be capable of handling generative AI and 6G applications.

These data centers will be distributed across Japan, the companies said in a blog post.

Also in March, the company launched new data processing units (DPUs) and GPUs, including the BlueField-3 DPU.
