
Tokenization - What is tokenizing in NLP?


Tokenization is the process of dividing a string of text into smaller units called tokens, a common first step in machine learning and Natural Language Processing (NLP). These tokens can range in size from single characters to entire words.
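As a minimal sketch of the idea, the two ends of that size range can be illustrated with plain Python: a simple regex-based word tokenizer and a character-level tokenizer. The function names and the regex pattern here are illustrative choices, not part of any particular library.

```python
import re

def word_tokenize(text):
    # Word-level tokens: runs of word characters, with each
    # punctuation mark kept as its own token.
    return re.findall(r"\w+|[^\w\s]", text)

def char_tokenize(text):
    # Character-level tokens: every non-whitespace character.
    return [ch for ch in text if not ch.isspace()]

sentence = "Tokenization splits text into tokens."
print(word_tokenize(sentence))
# → ['Tokenization', 'splits', 'text', 'into', 'tokens', '.']
print(char_tokenize(sentence)[:5])
# → ['T', 'o', 'k', 'e', 'n']
```

Real NLP pipelines typically use more sophisticated tokenizers (for example, subword schemes), but the input/output shape is the same: a string goes in, a list of tokens comes out.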