Question
Generative Pre-trained Transformer - How is GPT pretrained?
Answer
GPT is pretrained with causal (autoregressive) language modeling: the model is trained to predict the next word in a sequence given all the words that came before it. At each position, an attention mask hides every future token, so the model must infer the next word from the preceding context alone. (Masked language modeling, in which randomly hidden words are reconstructed from context on both sides, is the related objective used by BERT-style models rather than GPT.)
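The next-word objective can be illustrated with a short sketch. This is a hypothetical minimal example (the function names `make_lm_pairs` and `causal_mask` are illustrative, not from any GPT codebase): it shows how a token sequence is shifted into (input, target) training pairs, and how a lower-triangular mask keeps each position from seeing future tokens.

```python
def make_lm_pairs(tokens):
    """Shift a token sequence to build next-token training pairs:
    the input at position i is token i, and its target is token i+1."""
    inputs = tokens[:-1]   # everything except the last token
    targets = tokens[1:]   # each target is the following token
    return inputs, targets


def causal_mask(n):
    """True where position i may attend to position j (only j <= i),
    i.e. a lower-triangular mask that hides all future tokens."""
    return [[j <= i for j in range(n)] for i in range(n)]


tokens = ["The", "cat", "sat", "on", "the", "mat"]
inputs, targets = make_lm_pairs(tokens)
for inp, tgt in zip(inputs, targets):
    print(f"seen {inp!r} -> predict {tgt!r}")
```

During pretraining, a cross-entropy loss over every such pair is minimized simultaneously across the whole sequence, with the causal mask ensuring no position can cheat by looking ahead.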