tokenize
/tɒkənaɪz/
Definitions
1. verb
to divide (text or speech) into words or tokens, especially for analysis or processing by a computer.
“The software will tokenize the text to analyze its sentiment.”
2. verb
to represent (text or speech) as a sequence of tokens, such as words or symbols.
“The program will tokenize the input string for further processing.”
3. noun (as the derived form "tokenization")
a process or algorithm for dividing text or speech into words or tokens.
“The team developed a new tokenization algorithm for their NLP system.”
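A minimal sketch of senses 1 and 2 in Python. The regex-based splitting shown here is an illustrative assumption; real NLP tokenizers typically handle punctuation, contractions, and subword units in more sophisticated ways.

    import re

    def tokenize(text):
        """Split text into word tokens (simplified, regex-based sketch)."""
        # Treat runs of word characters as tokens; punctuation is dropped.
        return re.findall(r"\w+", text.lower())

    print(tokenize("The software will tokenize the text to analyze its sentiment."))
    # ['the', 'software', 'will', 'tokenize', 'the', 'text', 'to', 'analyze', 'its', 'sentiment']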