Question

Pre-Trained Model - Is BERT a Pretrained model?

Answer

Yes, BERT is a pretrained model. It differs from earlier language models in that it is pretrained on a large unlabeled raw text corpus using unsupervised objectives (masked language modeling and next-sentence prediction), which lets it learn deep bidirectional language representations rather than the left-to-right representations of its predecessors. The pretrained model can then be fine-tuned on labeled data for downstream tasks.
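To make the masked-language-model objective concrete, here is a minimal sketch of how BERT-style input corruption works: roughly 15% of tokens are selected for prediction, and each selected token is replaced by a [MASK] token 80% of the time, a random token 10% of the time, and left unchanged 10% of the time. The function name and vocabulary here are illustrative, not from any specific library.

```python
import random

def mask_tokens(tokens, vocab, mask_prob=0.15, seed=0):
    """Sketch of BERT-style masked-LM input corruption (illustrative).

    Returns (corrupted_tokens, labels), where labels hold the original
    token at selected positions and None everywhere else, so the model
    is only trained to predict the selected tokens.
    """
    rng = random.Random(seed)
    corrupted, labels = [], []
    for tok in tokens:
        if rng.random() < mask_prob:
            labels.append(tok)          # this position must be predicted
            r = rng.random()
            if r < 0.8:
                corrupted.append("[MASK]")        # 80%: mask it
            elif r < 0.9:
                corrupted.append(rng.choice(vocab))  # 10%: random token
            else:
                corrupted.append(tok)             # 10%: keep original
        else:
            labels.append(None)
            corrupted.append(tok)
    return corrupted, labels

tokens = ["the", "cat", "sat", "on", "the", "mat"] * 10
corrupted, labels = mask_tokens(tokens, vocab=["dog", "tree"], seed=0)
```

Because the objective only needs raw text (the labels are the tokens themselves), no human annotation is required, which is what makes this pretraining unsupervised.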