Question

Generative Pretrained Transformer - How big is GPT-3's pretraining data?

Answer

With 175 billion parameters, GPT-3 dwarfed all prior language models by a wide margin. Its pretraining corpus drew on roughly 45 terabytes of raw text from several sources, including a filtered version of Common Crawl, WebText2, two book corpora, and English Wikipedia.
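To put the 175-billion-parameter figure in perspective, a quick back-of-the-envelope sketch (illustrative only, not from the source) shows how much memory is needed just to store the weights at common numeric precisions:

```python
# Illustrative estimate: storage for 175B parameters at common precisions.
NUM_PARAMS = 175e9  # GPT-3's parameter count

BYTES_PER_PARAM = {"fp32": 4, "fp16": 2, "int8": 1}

for precision, nbytes in BYTES_PER_PARAM.items():
    gigabytes = NUM_PARAMS * nbytes / 1e9  # decimal gigabytes
    print(f"{precision}: ~{gigabytes:,.0f} GB")
```

Even at half precision (fp16), the weights alone occupy around 350 GB, which is why serving GPT-3 requires splitting the model across many accelerators.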