LVQ
From Wikipedia, the free encyclopedia
Learning Vector Quantization (LVQ) is a prototype-based supervised classification algorithm.
LVQ can be understood as a special case of an artificial neural network; more precisely, it applies a winner-take-all, Hebbian-learning-based approach. It is a precursor to self-organizing maps (SOM) and is related to the neural gas algorithm and to the k-nearest neighbor algorithm (k-NN). LVQ was invented by Teuvo Kohonen.
The network has two layers: a layer of input neurons and a layer of output neurons. The network is given by a set of prototypes W = (w(1), ..., w(n)). It changes the weights of the network in order to classify the data correctly. For each data point, the prototype (neuron) closest to it is determined; this is called the winner neuron. The weight vector of the winner is then adapted: it is moved closer to the data point if it classifies the data point correctly, or moved away from it if it classifies the data point incorrectly.
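The winner-take-all update described above can be sketched in Python as follows. This is a minimal illustration of the basic LVQ1 rule, not a reference implementation; the function names, the learning rate, and the epoch count are illustrative assumptions.

```python
import math

def dist(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def lvq1_train(prototypes, proto_labels, data, data_labels, lr=0.1, epochs=20):
    """Basic LVQ1 training (illustrative sketch).

    For each data point, find the winner (closest prototype) and move it
    toward the point if its label matches, away from it otherwise.
    Modifies `prototypes` in place and returns it.
    """
    for _ in range(epochs):
        for x, y in zip(data, data_labels):
            # Winner neuron: index of the closest prototype.
            i = min(range(len(prototypes)), key=lambda j: dist(prototypes[j], x))
            # +1 pulls the winner toward x, -1 pushes it away.
            sign = 1.0 if proto_labels[i] == y else -1.0
            prototypes[i] = [w + sign * lr * (xk - w)
                             for w, xk in zip(prototypes[i], x)]
    return prototypes

def lvq1_classify(prototypes, proto_labels, x):
    """Assign x the label of its nearest prototype."""
    i = min(range(len(prototypes)), key=lambda j: dist(prototypes[j], x))
    return proto_labels[i]
```

For example, training two prototypes (one per class) on two well-separated clusters pulls each prototype into its own cluster, after which nearest-prototype classification separates the clusters correctly.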
An advantage of LVQ is that it creates prototypes that are easy to interpret for experts in the field.
LVQ has also been applied successfully to the classification of text documents.
References
- Self-Organizing Maps and Learning Vector Quantization for Feature Sequences, Somervuo and Kohonen. 2004
- Classification of Textual Documents using LVQ, Fahad and Sikander. 2007