Normalization (statistics)
In one usage in statistics, normalization is the process of adjusting repeated measurements to remove sources of statistical error. A normalization is sometimes based on a particular property of the measurements; quantile normalization, for instance, normalizes on the basis of the quantiles (ranked magnitudes) of the measures.
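As an illustrative sketch not drawn from the article (the function name, the NumPy dependency, and the example values are assumptions), quantile normalization of several columns of measurements can be carried out roughly as follows:

    import numpy as np

    def quantile_normalize(data):
        """Quantile-normalize the columns of a 2-D array.

        Each column (e.g. one sample) is forced onto a common distribution:
        the mean of the values at each rank across columns becomes the value
        assigned to that rank in every column. Ties are broken arbitrarily,
        which is acceptable for a sketch.
        """
        data = np.asarray(data, dtype=float)
        # Sort each column independently, then average across columns
        # to obtain the reference value for each rank.
        sorted_cols = np.sort(data, axis=0)
        rank_means = sorted_cols.mean(axis=1)
        # Replace each original value by the reference value of its rank.
        ranks = data.argsort(axis=0).argsort(axis=0)
        return rank_means[ranks]

    # Hypothetical example: three samples (columns), four items (rows).
    samples = np.array([[5.0, 4.0, 3.0],
                        [2.0, 1.0, 4.0],
                        [3.0, 4.0, 6.0],
                        [4.0, 2.0, 8.0]])
    print(quantile_normalize(samples))

After normalization every column contains the same set of values, so any remaining differences between samples lie only in how those values are ordered.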
In another usage in statistics, normalization refers to the division of multiple sets of data by a common variable in order to negate that variable's effect on the data, thus allowing underlying characteristics of the data sets to be compared.
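A hedged illustration of this usage (the regions, counts, and populations are hypothetical): dividing each region's raw count by its population removes the effect of population size, leaving per-capita rates that can be compared directly.

    # Hypothetical raw event counts from regions of different size.
    counts = {"A": 1200, "B": 300, "C": 4500}
    population = {"A": 100000, "B": 20000, "C": 500000}

    # Dividing by the common variable (population) negates its effect,
    # so the underlying rates can be compared across regions.
    rate_per_1000 = {k: 1000 * counts[k] / population[k] for k in counts}
    print(rate_per_1000)  # {'A': 12.0, 'B': 15.0, 'C': 9.0}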
In an experimental context, normalization is used to standardize microarray data so that real (biological) variation in gene expression levels can be distinguished from variation introduced by the measurement process. In microarray analysis, normalization refers to identifying and removing these systematic effects and bringing the data from different microarrays onto a common scale.
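One simple way to bring arrays onto a common scale, sketched here under the assumption that the intensities are held in a NumPy array with one column per microarray (the function name is hypothetical), is to rescale each array so that all arrays share the same median intensity; quantile normalization, as sketched above, is a stronger alternative that equalizes the entire distributions.

    import numpy as np

    def median_scale(intensities):
        """Rescale each column (one microarray) so that all columns share
        the same median intensity, a simple common-scale normalization."""
        intensities = np.asarray(intensities, dtype=float)
        col_medians = np.median(intensities, axis=0)
        target = col_medians.mean()
        return intensities * (target / col_medians)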
For another usage in statistics and probability theory, see normalizing constant.