Kruskal's algorithm
Kruskal's algorithm is an algorithm in graph theory that finds a minimum spanning tree for a connected weighted graph. This means it finds a subset of the edges that forms a tree that includes every vertex, where the total weight of all the edges in the tree is minimized. If the graph is not connected, then it finds a minimum spanning forest (a minimum spanning tree for each connected component). Kruskal's algorithm is an example of a greedy algorithm.
It works as follows:
- create a forest F (a set of trees), where each vertex in the graph is a separate tree
- create a set S containing all the edges in the graph
- while S is nonempty:
  - remove an edge with minimum weight from S
  - if that edge connects two different trees, then add it to the forest, combining two trees into a single tree
  - otherwise discard that edge
At the termination of the algorithm, the forest forms a minimum spanning forest of the graph. If the graph is connected, the forest has a single component and forms a minimum spanning tree of the graph.
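As an illustration of these steps, the following is a minimal Python sketch that follows the description above directly; the graph representation (an iterable of vertex labels plus a list of (weight, u, v) tuples) and the function name kruskal are assumptions of this sketch, not part of the original description.

    # A direct, unoptimized sketch of the steps above: the forest F is recorded
    # as a mapping from each vertex to the identifier of the tree containing it.
    def kruskal(vertices, edges):
        tree_of = {v: v for v in vertices}      # each vertex starts as its own tree
        forest_edges = []                       # edges accepted into the forest
        # Sorting S once by weight is equivalent to repeatedly removing an
        # edge of minimum weight from it.
        for weight, u, v in sorted(edges, key=lambda e: e[0]):
            if tree_of[u] != tree_of[v]:        # the edge connects two different trees
                forest_edges.append((u, v, weight))
                old, new = tree_of[v], tree_of[u]
                for x, t in tree_of.items():    # combine the two trees into one
                    if t == old:
                        tree_of[x] = new
            # otherwise the edge would close a cycle, so it is discarded
        return forest_edges

The relabelling loop makes this sketch quadratic in the worst case; the disjoint-set data structure described under Performance below removes that overhead.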
The algorithm was first published by Joseph Kruskal in 1956 in the Proceedings of the American Mathematical Society (pp. 48–50).
Other algorithms for this problem include Prim's algorithm and Borůvka's algorithm.
Performance
Where E is the number of edges in the graph and V is the number of vertices, Kruskal's algorithm can be shown to run in O(E log E) time, or equivalently, O(E log V) time, all with simple data structures. These running times are equivalent because:
- E is at most V² and log V² = 2 log V is O(log V).
- If we ignore isolated vertices, which will each be their own component of the minimum spanning tree anyway, V ≤ 2E, so log V is O(log E).
We can achieve this bound as follows: first sort the edges by weight using a comparison sort in O(E log E) time; this allows the step "remove an edge with minimum weight from S" to operate in constant time. Next, we use a disjoint-set data structure to keep track of which vertices are in which components. We need to perform O(E) disjoint-set operations in total: for each edge, two find operations and possibly one union. Even a simple disjoint-set data structure such as disjoint-set forests with union by rank can perform O(E) operations in O(E log V) time. Thus the total time is O(E log E) = O(E log V).
Provided that the edges are either already sorted or can be sorted in linear time (for example with counting sort or radix sort), the algorithm can use more sophisticated disjoint-set data structures to run in O(E α(V)) time, where α is the extremely slowly-growing inverse of the single-valued Ackermann function.
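As a sketch of the disjoint-set forest mentioned above, the following Python class implements union by rank (enough for the O(E log V) bound) together with path compression in find, which is the kind of refinement behind the O(E α(V)) bound; the class and method names are illustrative rather than any standard API.

    class DisjointSet:
        """Disjoint-set forest with union by rank and path compression."""
        def __init__(self, vertices):
            self.parent = {v: v for v in vertices}   # every vertex is initially a root
            self.rank = {v: 0 for v in vertices}     # upper bound on tree height

        def find(self, v):
            # Path compression: point v and its ancestors directly at the root.
            if self.parent[v] != v:
                self.parent[v] = self.find(self.parent[v])
            return self.parent[v]

        def union(self, u, v):
            # Union by rank: attach the shallower tree under the deeper one.
            ru, rv = self.find(u), self.find(v)
            if ru == rv:
                return False                         # u and v are already in one component
            if self.rank[ru] < self.rank[rv]:
                ru, rv = rv, ru
            self.parent[rv] = ru
            if self.rank[ru] == self.rank[rv]:
                self.rank[ru] += 1
            return True

In Kruskal's algorithm each edge (u, v) costs two find calls and at most one union, so once the edges are sorted these O(E) operations no longer dominate the running time.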
Example
Proof of correctness
Let P be a connected, weighted graph and let Y be the subgraph of P produced by the algorithm. Y cannot have a cycle, since the last edge added to that cycle would have been within one subtree and not between two different trees. Y cannot be disconnected, since the first encountered edge that joins two components of Y would have been added by the algorithm. Thus, Y is a spanning tree of P.
For simplicity, assume that all edges have distinct weights. Let Y1 be a minimum spanning tree. If Y1 = Y, then Y is a minimum spanning tree. Otherwise, let e be the first edge considered by the algorithm that is in Y but not in Y1. The graph Y1 + e contains a cycle, because adding an edge to a spanning tree cannot leave it a tree. This cycle contains an edge f that has not yet been considered at the stage of the algorithm where e is added to Y; otherwise e would not connect two different trees but two branches of the same tree. Since the algorithm considers edges in order of increasing weight, all weights are distinct, and e is considered before f, we have w(e) < w(f). Then Y2 = Y1 + e − f is also a spanning tree, and its total weight is w(Y1) + w(e) − w(f), which is less than the total weight of Y1. This contradicts the assumption that Y1 is a minimum spanning tree, so the assumption that there exists an edge in Y but not in Y1 was incorrect. This proves that Y = Y1, i.e., Y is a minimum spanning tree.
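In symbols, writing w(·) for edge weight, the exchange step above amounts to the following (a restatement of the argument, not an additional assumption):

    \[
      w(e) < w(f)
      \quad\Longrightarrow\quad
      w(Y_2) = w(Y_1) + w(e) - w(f) < w(Y_1).
    \]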
Pseudocode
function Kruskal(G)
    for each vertex v in G do
        Define an elementary cluster C(v) ← {v}
    Initialize a priority queue Q to contain all edges in G, using the weights as keys
    Define a tree T ← Ø    // T will ultimately contain the edges of the MST
    while T has fewer than n-1 edges do    // n is the number of vertices in G
        (u,v) ← Q.removeMin()
        Let C(v) be the cluster containing v, and let C(u) be the cluster containing u
        if C(v) ≠ C(u) then
            Add edge (v,u) to T
            Merge C(v) and C(u) into one cluster, that is, union C(v) and C(u)
    return tree T
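For concreteness, here is a possible Python rendering of the pseudocode above, using the standard heapq module as the priority queue Q and shared set objects as the clusters C(v); the graph representation (a list of vertices plus (weight, u, v) edge tuples) and the function name kruskal_pq are assumptions of this sketch.

    import heapq

    def kruskal_pq(vertices, edges):
        # Elementary cluster C(v) = {v}; merged vertices share one set object.
        cluster = {v: {v} for v in vertices}
        # Priority queue Q of all edges, keyed by weight; the running index
        # breaks ties so vertex labels never need to be comparable.
        q = [(w, i, u, v) for i, (w, u, v) in enumerate(edges)]
        heapq.heapify(q)
        tree = []                                   # T: the edges of the MST
        # The extra "while q" guard stops cleanly on disconnected inputs.
        while q and len(tree) < len(vertices) - 1:
            w, _, u, v = heapq.heappop(q)           # Q.removeMin()
            if cluster[u] is not cluster[v]:        # C(u) and C(v) differ
                tree.append((u, v, w))
                # Merge the smaller cluster into the larger one and make every
                # vertex of the smaller cluster point at the merged set.
                small, big = sorted((cluster[u], cluster[v]), key=len)
                big |= small
                for x in small:
                    cluster[x] = big
        return tree

For example, under this representation kruskal_pq(['a', 'b', 'c'], [(1, 'a', 'b'), (2, 'b', 'c'), (3, 'a', 'c')]) returns [('a', 'b', 1), ('b', 'c', 2)].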
References
- J. B. Kruskal: On the shortest spanning subtree of a graph and the traveling salesman problem. In: Proceedings of the American Mathematical Society 7 (1956), pp. 48–50.
- Thomas H. Cormen, Charles E. Leiserson, Ronald L. Rivest, and Clifford Stein. Introduction to Algorithms, Second Edition. MIT Press and McGraw-Hill, 2001. ISBN 0-262-03293-7. Section 23.2: The algorithms of Kruskal and Prim, pp. 567–574.
- Michael T. Goodrich and Roberto Tamassia. Data Structures and Algorithms in Java, Fourth Edition. John Wiley & Sons, Inc., 2006. ISBN 0-471-73884-0. Section 13.7.1: Kruskal's Algorithm, p. 632.