Talk:Tokenization
Tokenizing is the operation of replacing one set of symbols with another, typically to make the resulting set of symbols smaller.
This is not the common usage of the term. In computer science it normally means to split a string into tokens (e.g. keywords, separators, etc.), not to replace a list of tokens with smaller tokens (see the sketch after this comment).
I am not familiar with the usage of tokenizing given in the previous version of the article, but I will leave it in as an alternative meaning of tokenizing until I can verify whether or not it is correct.
Steve-o 03:47, 15 Apr 2004 (UTC)
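A minimal sketch of the string-splitting sense described above, in Python. The token categories, the regular expressions, and the tokenize helper are illustrative assumptions, not taken from the article or from any particular lexer.

import re

# Illustrative token categories; a real lexer would define its own.
TOKEN_SPEC = [
    ("NUMBER",     r"\d+"),
    ("IDENTIFIER", r"[A-Za-z_]\w*"),
    ("OPERATOR",   r"[+\-*/=]"),
    ("SEPARATOR",  r"[(),;]"),
    ("WHITESPACE", r"\s+"),
]

def tokenize(text):
    """Yield (kind, value) pairs for each token found in text."""
    pattern = "|".join(f"(?P<{name}>{regex})" for name, regex in TOKEN_SPEC)
    for match in re.finditer(pattern, text):
        kind = match.lastgroup
        if kind != "WHITESPACE":   # whitespace only separates tokens
            yield kind, match.group()

print(list(tokenize("total = price + 2")))
# [('IDENTIFIER', 'total'), ('OPERATOR', '='), ('IDENTIFIER', 'price'),
#  ('OPERATOR', '+'), ('NUMBER', '2')]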
Tokenizing in politics has a different meaning that would be worth adding to this article, or putting into another one.--Lizzard 22:47, 13 September 2006 (UTC)
The section on human perception
- is remote enough from the topic to belong in a different article
- needs a citation
- has been left in for the time being, pending suggestions for a more appropriate location