Q3 FY24 Earnings Summary
New NVIDIA HGX H200 Supercharges Hopper
NVIDIA H200 is the first GPU to offer HBM3e — faster, larger memory to fuel the acceleration of generative AI and large language models, while advancing scientific computing for HPC workloads
H200 delivers 141GB of memory at 4.8 terabytes per second, nearly double the capacity and 2.4X more bandwidth compared with its predecessor, the NVIDIA A100
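The capacity and bandwidth multipliers above can be sanity-checked with quick arithmetic. The A100 reference figures used here (80 GB of HBM2e at roughly 2.0 TB/s) are an assumption drawn from public datasheets, not from this summary; only the H200 numbers appear above.

```python
# Back-of-envelope check of the H200 vs. A100 figures quoted above.
# A100 specs (80 GB, ~2.0 TB/s) are assumed from public datasheets.
h200_capacity_gb, h200_bandwidth_tbs = 141, 4.8
a100_capacity_gb, a100_bandwidth_tbs = 80, 2.0

capacity_ratio = h200_capacity_gb / a100_capacity_gb       # ~1.76x, i.e. "nearly double"
bandwidth_ratio = h200_bandwidth_tbs / a100_bandwidth_tbs  # 2.4x

print(f"capacity: {capacity_ratio:.2f}x, bandwidth: {bandwidth_ratio:.1f}x")
```

Both ratios line up with the claims: about 1.76x the capacity and exactly 2.4x the bandwidth.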
Boosts inference speed by up to 2X compared to H100 GPUs when handling LLMs such as Llama 2
Microsoft announced plans to add the H200 to Azure next year for larger model inference with no increase in latency
H200-powered systems from the world's leading server manufacturers and cloud service providers are expected to begin shipping in the second quarter of 2024