Variational Bayesian methods
Variational Bayesian methods, also called ensemble learning, are a family of techniques for approximating intractable integrals arising in Bayesian statistics and machine learning. They can be used to lower-bound the marginal likelihood (i.e. the "evidence") of several models, with a view to performing model selection, and they often provide an analytical approximation to the posterior distribution over the parameters, which is useful for prediction. They are an alternative to Monte Carlo sampling methods for making use of a posterior distribution that is difficult to sample from directly.
Mathematical derivation
In variational inference, the posterior distribution over a set of latent variables X given some data D is approximated by a variational distribution Q(X):

P(X \mid D) \approx Q(X).
The variational distribution Q(X) is restricted to belong to a family of distributions of simpler form than P(X | D). This family is selected with the intention that Q can be made very similar to the true posterior. The difference between Q and the true posterior is measured by a dissimilarity function d(Q; P), and inference is performed by selecting the distribution Q that minimises d. One choice of dissimilarity function for which this minimisation is tractable is the Kullback–Leibler divergence (KL divergence), defined as

D_{\mathrm{KL}}(Q \parallel P) = \sum_X Q(X) \log \frac{Q(X)}{P(X \mid D)}.
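To make the definition concrete, the following minimal Python sketch (an illustration added here, not part of the original article) evaluates D_KL(Q ∥ P) for a discrete latent variable; the posterior and candidate distributions are made-up numbers chosen only to show that a Q closer to P gives a smaller divergence.

```python
import numpy as np

def kl_divergence(q, p):
    """D_KL(Q || P) = sum_x Q(x) * log(Q(x) / P(x)).

    Assumes q and p are strictly positive and each sums to 1."""
    q = np.asarray(q, dtype=float)
    p = np.asarray(p, dtype=float)
    return float(np.sum(q * np.log(q / p)))

# Hypothetical true posterior P(X | D) over a 3-valued latent variable.
p_posterior = np.array([0.70, 0.20, 0.10])

q_close = np.array([0.65, 0.25, 0.10])  # a variational distribution close to P
q_far   = np.array([0.10, 0.30, 0.60])  # a poor approximation

print(kl_divergence(q_close, p_posterior))  # small (about 0.01)
print(kl_divergence(q_far, p_posterior))    # much larger (about 1.0)
```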
We can write the log evidence as

\log P(D) = D_{\mathrm{KL}}(Q \parallel P) - \sum_X Q(X) \log \frac{Q(X)}{P(X, D)}
          = D_{\mathrm{KL}}(Q \parallel P) + \mathcal{L}(Q),

where

\mathcal{L}(Q) = \sum_X Q(X) \log \frac{P(X, D)}{Q(X)}

is a lower bound on the log evidence, since the KL divergence is non-negative.
As the log evidence \log P(D) is fixed with respect to Q, maximising the final term \mathcal{L}(Q) will minimise the KL divergence between Q and P. By appropriate choice of Q, \mathcal{L}(Q) can be made tractable to compute and to maximise. Hence we have both a lower bound \mathcal{L}(Q) on the evidence and an analytical approximation Q to the posterior.
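As a worked sketch (not from the article, and making several assumptions: a univariate Gaussian likelihood, a conjugate Normal-Gamma prior, arbitrary hyperparameter values, and a fully factorised Q(mu, tau) = q(mu) q(tau)), the following Python code performs the coordinate-ascent updates that maximise \mathcal{L}(Q) for this toy model. Each sweep re-estimates q(mu) and q(tau) in closed form, and the bound increases until the factors stop changing.

```python
import numpy as np

# Toy model: x_i ~ N(mu, 1/tau), with prior mu | tau ~ N(mu0, 1/(lambda0*tau))
# and tau ~ Gamma(a0, b0). The factorised approximation Q(mu, tau) = q(mu) q(tau)
# is optimised by coordinate ascent on the lower bound L(Q).

rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=1.5, size=200)   # synthetic data for the sketch
N, xbar = x.size, x.mean()

# Prior hyperparameters (assumed values chosen for illustration).
mu0, lambda0, a0, b0 = 0.0, 1.0, 1.0, 1.0

E_tau = 1.0                                    # initial guess for E[tau]
for _ in range(100):
    # Update q(mu) = N(mu | mu_N, 1/lambda_N), holding q(tau) fixed.
    mu_N = (lambda0 * mu0 + N * xbar) / (lambda0 + N)
    lambda_N = (lambda0 + N) * E_tau

    # Update q(tau) = Gamma(tau | a_N, b_N), using E[mu] = mu_N, Var[mu] = 1/lambda_N.
    a_N = a0 + (N + 1) / 2
    E_sq_data = np.sum((x - mu_N) ** 2) + N / lambda_N     # E_mu[sum_i (x_i - mu)^2]
    E_sq_prior = (mu_N - mu0) ** 2 + 1.0 / lambda_N        # E_mu[(mu - mu0)^2]
    b_N = b0 + 0.5 * (E_sq_data + lambda0 * E_sq_prior)
    E_tau = a_N / b_N

print("q(mu):  mean =", mu_N, " variance =", 1.0 / lambda_N)
print("q(tau): mean =", E_tau)
```

The two updates are coupled through the expectation E[tau], so iterating them to convergence is what produces the final analytical approximation to the posterior.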
See also
- Variational message passing: a modular algorithm for variational Bayesian inference.
- Expectation-maximization algorithm: a related approach which corresponds to a special case of variational Bayesian inference.
External links
- Variational-Bayes.org - a repository of papers, software, and links related to the use of variational Bayesian methods.
- The on-line textbook Information Theory, Inference, and Learning Algorithms, by David J.C. MacKay, provides an introduction to variational methods (p. 422).