Junction tree algorithm

The junction tree algorithm (also known as the 'clique tree' algorithm) is a method used in machine learning to perform exact marginalization in general graphs. In essence, it entails performing belief propagation on a modified graph called a junction tree. The basic premise is to eliminate cycles by clustering them into single nodes.

Hugin algorithm

The Hugin algorithm proceeds by moralizing the graph, introducing the evidence, triangulating the graph to make it chordal, constructing a junction tree from the triangulated graph, and finally propagating the probabilities along the junction tree by passing messages between its supernodes. Note that this last step is inefficient for graphs of large treewidth: computing the messages to pass between supernodes involves doing exact marginalization over the variables in both supernodes, so performing this algorithm for a graph with treewidth k will involve at least one computation which takes time exponential in k.
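
To make the cost concrete, the following is a minimal sketch of a single message pass between two adjacent cliques, with made-up potential tables over binary variables (the full Hugin update also divides by the previous separator potential, which is omitted here):

    import numpy as np

    # Toy message pass between two adjacent cliques (supernodes).
    # Clique A holds (x1, x2, x3), clique B holds (x3, x4); the separator is {x3}.
    # All variables are binary, so a clique of size k has a table with 2**k entries.
    rng = np.random.default_rng(0)
    phi_A = rng.random((2, 2, 2))   # potential over (x1, x2, x3)
    phi_B = rng.random((2, 2))      # potential over (x3, x4)

    # Message A -> B: marginalize phi_A onto the separator {x3}.
    # Every entry of phi_A is touched, i.e. time exponential in the clique size.
    msg = phi_A.sum(axis=(0, 1))    # shape (2,), indexed by x3

    # Absorb the message into phi_B along its x3 axis.
    phi_B = phi_B * msg[:, None]
    print(phi_B)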

Underlying theory

The first step, known as moralization, concerns only Bayesian networks: it turns the directed graph into an undirected one by connecting ('marrying') the parents of each node and dropping edge directions. We do this so that the rest of the algorithm applies uniformly, regardless of edge direction.
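
As a rough sketch of this moralization step, the function below (the dictionary-of-parents representation and the example network are illustrative choices, not part of any particular library) marries the parents of every node and drops edge directions:

    from itertools import combinations

    def moralize(parents):
        """Turn a DAG, given as {node: set of parents}, into an undirected
        moral graph, returned as {node: set of neighbours}."""
        nodes = set(parents) | {p for ps in parents.values() for p in ps}
        undirected = {v: set() for v in nodes}
        for child, ps in parents.items():
            # Keep every original edge, ignoring its direction.
            for p in ps:
                undirected[child].add(p)
                undirected[p].add(child)
            # "Marry" the parents: connect every pair of parents of the child.
            for a, b in combinations(ps, 2):
                undirected[a].add(b)
                undirected[b].add(a)
        return undirected

    # Example: the collider A -> C <- B gains the moral edge A - B.
    print(moralize({"A": set(), "B": set(), "C": {"A", "B"}}))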

The second step is to set variables to their observed values. This is usually needed when we want to compute conditional probabilities, so we fix the values of the random variables we condition on. Those variables are also said to be clamped to their particular values.
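
In terms of the potential tables manipulated later in the algorithm, clamping a variable amounts to keeping only the slice of each table that is consistent with the observed value. A minimal sketch with a made-up factor:

    import numpy as np

    # Factor over (x1, x2, x3), all binary; axis i corresponds to x_{i+1}.
    phi = np.arange(8, dtype=float).reshape(2, 2, 2)

    # Observe x2 = 1: keep only the slice consistent with the evidence.
    clamped = phi[:, 1, :]          # now a factor over (x1, x3) only
    print(clamped)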

The third step is to make the graph chordal if it is not already chordal; this is known as triangulation. This is the first essential step of the algorithm. It makes use of the following theorem:[1]

Theorem: For an undirected graph, G, the following properties are equivalent:

  * Graph G is triangulated (chordal).
  * The clique graph of G has a junction tree.
  * There is an elimination ordering for G that does not lead to any added edges.

Thus, by triangulating a graph, we make sure that the corresponding junction tree exists. A usual way to do this is to decide on an elimination order for its nodes and then run the Variable elimination algorithm. This will result in additional edges being added to the initial graph, in such a way that the output is a chordal graph. The next step is to construct the junction tree. To do so, we use the graph from the previous step and form its corresponding clique graph.[2] The following theorem then gives us a way to find a junction tree:[1]

Theorem: Given a triangulated graph, weight each edge of the clique graph by the cardinality |A∩B| of the intersection of the adjacent cliques A and B. Then any maximum-weight spanning tree of the clique graph is a junction tree.

So, to construct a junction tree we just have to extract a maximum-weight spanning tree from the clique graph. This can be done efficiently by, for example, modifying Kruskal's algorithm. The last step is to apply belief propagation to the obtained junction tree.[3]
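
Putting the triangulation and the junction tree construction together, here is a compact sketch; the example graph and the elimination order are made up, and maximal-clique extraction and the maximum-weight spanning tree are delegated to networkx for brevity:

    import networkx as nx
    from itertools import combinations

    def triangulate(graph, order):
        """Add fill-in edges by eliminating nodes in the given order
        (as in variable elimination); returns a chordal copy of the graph."""
        g = graph.copy()
        chordal = graph.copy()
        for v in order:
            neighbours = list(g.neighbors(v))
            # Connect all remaining neighbours of v before removing it.
            for a, b in combinations(neighbours, 2):
                g.add_edge(a, b)
                chordal.add_edge(a, b)
            g.remove_node(v)
        return chordal

    def junction_tree(chordal):
        """Build the clique graph of a chordal graph, weight each edge by
        the separator size |A ∩ B|, and return a maximum-weight spanning
        tree, which by the theorem above is a junction tree."""
        cliques = [frozenset(c) for c in nx.find_cliques(chordal)]
        clique_graph = nx.Graph()
        clique_graph.add_nodes_from(cliques)
        for a, b in combinations(cliques, 2):
            if a & b:
                clique_graph.add_edge(a, b, weight=len(a & b))
        return nx.maximum_spanning_tree(clique_graph, weight="weight")

    # Toy example: a 4-cycle A-B-C-D-A, which is not chordal.
    g = nx.cycle_graph(["A", "B", "C", "D"])
    jt = junction_tree(triangulate(g, ["A", "B", "C", "D"]))
    print(list(jt.edges(data=True)))

For the 4-cycle in this example, eliminating A adds the fill-in edge B-D, the chordal graph has maximal cliques {A, B, D} and {B, C, D}, and the resulting junction tree is the single edge between them with separator {B, D}.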

References

  1. Wainwright, Martin (31 March 2008). "Graphical models, message-passing algorithms, and variational methods: Part I" (PDF). Berkeley EECS. Retrieved 16 November 2016.
  2. "Clique Graph". Retrieved 16 November 2016.
  3. Barber, David (28 January 2014). "Probabilistic Modelling and Reasoning, The Junction Tree Algorithm" (PDF). University of Helsinki. Retrieved 16 November 2016.

