Generalization error
The generalization error of a machine learning model is a function that measures, after each iteration of the learning process, how far the student machine is from the teacher machine on average over the entire set of possible data that the teacher can generate. It has this name because the function indicates the capacity of a machine that learns with the specified algorithm to infer (or generalize), from only a few examples, the rule used by the teacher machine to generate data.
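In symbols, one common formalization (the notation T, S_t, d and P here is chosen for illustration) writes the generalization error after t iterations of learning as the expected disagreement between teacher and student over the teacher's data distribution:

    \varepsilon_g(t) = \mathbb{E}_{x \sim P}\left[\, d\big(T(x),\, S_t(x)\big) \,\right]

where T(x) is the teacher's output on input x, S_t(x) is the student's output after t iterations, d is a loss function measuring their disagreement, and P is the distribution from which the teacher's data are drawn.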
The performance of a machine learning algorithm is measured by plotting the generalization error values through the learning process; such plots are called learning curves.
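A minimal sketch of how a learning curve can be estimated in practice, assuming scikit-learn and matplotlib are available (the synthetic dataset and the choice of a Perceptron model are placeholders for illustration, not part of the framework described above):

import numpy as np
import matplotlib.pyplot as plt
from sklearn.datasets import make_classification
from sklearn.linear_model import Perceptron
from sklearn.model_selection import learning_curve

# Synthetic classification data standing in for examples generated by a teacher.
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

# Mean accuracy on held-out folds for a range of training-set sizes.
train_sizes, train_scores, val_scores = learning_curve(
    Perceptron(), X, y, train_sizes=np.linspace(0.1, 1.0, 10), cv=5)

# The held-out error (1 - accuracy) estimates the generalization error.
plt.plot(train_sizes, 1 - val_scores.mean(axis=1), label="estimated generalization error")
plt.plot(train_sizes, 1 - train_scores.mean(axis=1), label="training error")
plt.xlabel("number of training examples")
plt.ylabel("error")
plt.legend()
plt.show()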
The generalization error of a perceptron is the probability that the student perceptron classifies an example differently from the teacher. It is determined by the overlap of the student and teacher synaptic vectors, and it is a function of their scalar product.
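Concretely, for inputs drawn from a rotationally invariant distribution, a standard result of the statistical mechanics of learning expresses this error through the normalized overlap R of the student and teacher synaptic vectors (written w_s and w_t here):

    \varepsilon_g = \frac{1}{\pi} \arccos R, \qquad R = \frac{\mathbf{w}_s \cdot \mathbf{w}_t}{\lVert \mathbf{w}_s \rVert \, \lVert \mathbf{w}_t \rVert}

so that the error is 1/2 when the two vectors are uncorrelated (R = 0) and vanishes as the student aligns with the teacher (R → 1).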