Multiple Choice Question
Tokenization is a fundamental step in text preprocessing.
In natural language processing, what does "tokenization" refer to?
A. Splitting text into smaller units, such as words or subwords
B. Encrypting text to protect sensitive information
C. Translating text from one language to another
D. Compressing text to reduce its storage size

Answer: A
Explanation:
Tokenization involves splitting text into smaller units called tokens, such as words or subword pieces, which can then be analyzed and processed.
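As an illustrative sketch of the idea (the `tokenize` helper below is a hypothetical example, not taken from any particular library), a naive word-level tokenizer can be written as a regex split:

```python
import re

def tokenize(text):
    # Naive word-level tokenizer: lowercase the text, then split on
    # runs of non-alphanumeric characters, discarding empty strings.
    return [tok for tok in re.split(r"\W+", text.lower()) if tok]

print(tokenize("Tokenization splits text into smaller units."))
# → ['tokenization', 'splits', 'text', 'into', 'smaller', 'units']
```

Real NLP pipelines typically use more sophisticated tokenizers (e.g. rule-based or subword tokenizers) that handle punctuation, contractions, and out-of-vocabulary words more carefully than a plain regex split.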