Tokenization
Tokenization may refer to:
- Tokenization (lexical analysis) in language processing
- Tokenization (data security), the substitution of sensitive data with non-sensitive surrogates
- Word segmentation
- Tokenism of minorities
This article is issued from Wikipedia, version of Sunday, March 15, 2015. The text is available under the Creative Commons Attribution/Share Alike license; additional terms may apply for the media files.