NVIDIA Investor Presentation Deck

[Chart] "Exploding Model Complexity: Doubling Every 2 Months" (credit: @ctnzr)
Y-axis: PetaFLOP/s-days of training compute, log scale from 1.E-03 to 1.E+04; X-axis: 2012 to 2020.
Models plotted: AlexNet (2012), ResNet, BERT, GPT-2, Megatron-BERT, Megatron-GPT2, Turing NLG, GPT-3; the recent cluster is labeled "Large Language Models".
Callout: compute required to train the 175B-parameter OpenAI GPT-3 is 314 ZettaFLOPs (3,640 PFLOP/s-days). NVIDIA projects a 100-trillion-parameter single model by 2023.
Slide title: Language Models Constrained by Economics (NVIDIA)
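The slide quotes the same GPT-3 training budget in two units, 314 ZettaFLOPs and 3,640 PFLOP/s-days. A minimal sketch of the unit conversion that links them (variable names are illustrative, not from the slide):

```python
# Sanity-check the two compute figures quoted on the slide:
# 3,640 PFLOP/s-days should come out to roughly 314 ZettaFLOPs.
PFLOP_PER_SECOND = 1e15  # FLOP/s in one PFLOP/s
SECONDS_PER_DAY = 86_400
ZETTA = 1e21             # FLOPs in one ZettaFLOP

pflops_days = 3_640
total_flop = pflops_days * PFLOP_PER_SECOND * SECONDS_PER_DAY
zettaflop = total_flop / ZETTA
print(f"{zettaflop:.1f} ZettaFLOPs")  # ~314.5, matching the slide's "314 ZettaFLOP"
```

The two figures agree to within rounding, so the slide is internally consistent.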