Tokenizer
From Wikipedia, the free encyclopedia
For computer science usage, see Lexical analysis.
Tokenizer is an experimental web crawler and price comparison engine with heuristic product categorization. As initially planned, Tokenizer was also intended to provide real-time financial indicators for different economies.
History
In September 2006, Tokenizer Inc. started an experimental crawler under the agent name Bambarbia. A free public service became available in January 2007, after heuristic analysis and generic detection of Canadian online shops.