Question

Generative Pre-trained Transformer - How big is GPT-3's pretraining data?

Answer

With 175 billion parameters, GPT-3 dwarfs all prior language models by a wide margin. Its pretraining corpus started from roughly 45 terabytes of raw text, most of it from Common Crawl, which was filtered down to about 570 GB of higher-quality data and supplemented with WebText2, two books corpora, and English Wikipedia. In total, the model was trained on roughly 300 billion tokens.
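
To put those numbers in perspective, here is a small back-of-envelope sketch in Python. The constants are the published figures from the GPT-3 paper (Brown et al., 2020); the calculation itself is just an illustration, not anything from the paper:

```python
# Back-of-envelope figures from the GPT-3 paper (Brown et al., 2020).
PARAMS = 175e9            # model parameters
BYTES_FP16 = 2            # bytes per parameter at half precision

RAW_COMMON_CRAWL_TB = 45  # raw Common Crawl text before filtering
FILTERED_GB = 570         # plain text remaining after quality filtering
TRAIN_TOKENS = 300e9      # tokens actually seen during training

# Storing the weights alone in fp16 takes ~350 GB.
print(f"fp16 weights: ~{PARAMS * BYTES_FP16 / 1e9:.0f} GB")

# Quality filtering kept only a small fraction of the raw crawl.
print(f"filtering kept ~{FILTERED_GB / (RAW_COMMON_CRAWL_TB * 1000):.1%} of the raw text")

print(f"training tokens: ~{TRAIN_TOKENS / 1e9:.0f} billion")
```

The takeaway: the headline "45 terabytes" refers to the raw crawl, while the data the model actually trained on was closer to 570 GB after aggressive filtering.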