Branch and bound
Branch and bound (BB or B&B) is an algorithm design paradigm for discrete and combinatorial optimization problems, as well as general real-valued problems. A branch-and-bound algorithm consists of a systematic enumeration of candidate solutions by means of state space search: the set of candidate solutions is thought of as forming a rooted tree with the full set at the root. The algorithm explores branches of this tree, which represent subsets of the solution set. Before enumerating the candidate solutions of a branch, the branch is checked against upper and lower estimated bounds on the optimal solution, and is discarded if it cannot produce a better solution than the best one found so far by the algorithm.
The algorithm depends on the efficient estimation of the lower and upper bounds of a region/branch of the search space and approaches exhaustive enumeration as the size (n-dimensional volume) of the region tends to zero.
The method was first proposed by A. H. Land and A. G. Doig[1] in 1960 for discrete programming, and has become the most commonly used tool for solving NP-hard optimization problems.[2] The name "branch and bound" first occurred in the work of Little et al. on the traveling salesman problem.[3][4]
Overview
The goal of a branch-and-bound algorithm is to find a value x that maximizes or minimizes the value of a real-valued function f(x), called an objective function, among some set S of admissible, or candidate solutions. The set S is called the search space, or feasible region. The rest of this section assumes that minimization of f(x) is desired; this assumption comes without loss of generality, since one can find the maximum value of f(x) by finding the minimum of g(x) = −f(x). A B&B algorithm operates according to two principles:
- It recursively splits the search space into smaller spaces, then minimizes f(x) on these smaller spaces; the splitting is called branching.
- Branching alone would amount to brute-force enumeration of candidate solutions and testing them all. To improve on the performance of brute-force search, a B&B algorithm keeps track of bounds on the minimum that it is trying to find, and uses these bounds to "prune" the search space, eliminating candidate solutions that it can prove will not contain an optimal solution.
Turning these principles into a concrete algorithm for a specific optimization problem requires some kind of data structure that represents sets of candidate solutions. Such a representation is called an instance of the problem. Denote the set of candidate solutions of an instance I by SI. The instance representation has to come with two operations:
- branch(I) produces two or more instances that each represent a subset of SI. (Typically, the subsets are disjoint to prevent the algorithm from visiting the same candidate solution twice, but this is not required. The only requirement for a correct B&B algorithm is that the optimal solution among SI is contained in at least one of the subsets.[5])
- bound(I) computes a lower bound on the value of any candidate solution in the space represented by I, that is, bound(I) ≤ f(x) for all x in SI.
- solution(I) determines whether I represents a single candidate solution. (Optionally, if it does not, the operation may choose to return some feasible solution from among SI.[5])
Using these operations, a B&B algorithm performs a top-down recursive search through the tree of instances formed by the branch operation. Upon visiting an instance I, it checks whether bound(I) is greater than the upper bound for some other instance that it already visited; if so, I may be safely discarded from the search and the recursion stops. This pruning step is usually implemented by maintaining a global variable that records the minimum upper bound seen among all instances examined so far.
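As a concrete illustration, the following sketch shows one possible instance representation for the 0/1 knapsack problem (listed under Applications below), phrased as minimization of f(x) = −(total value) to match the convention above. The class name KnapsackInstance, its fields, and the fractional-relaxation bound are choices made for this sketch rather than any standard API, and the items are assumed to be sorted by value-to-weight ratio so that the relaxation is a valid lower bound.

```python
from dataclasses import dataclass

# Hypothetical instance representation for a 0/1 knapsack problem, written as
# minimization of f(x) = -(total value of the chosen items).
@dataclass(frozen=True)
class KnapsackInstance:
    items: tuple        # all (value, weight) pairs, sorted by value/weight descending
    capacity: float     # remaining capacity
    taken_value: float  # total value of the items already committed to the knapsack
    next_item: int      # index of the first item that has not been decided yet

def branch(inst):
    """Split the candidate set in two: skip or take the next undecided item."""
    value, weight = inst.items[inst.next_item]
    children = [KnapsackInstance(inst.items, inst.capacity,
                                 inst.taken_value, inst.next_item + 1)]   # skip it
    if weight <= inst.capacity:                                           # take it, if it fits
        children.append(KnapsackInstance(inst.items, inst.capacity - weight,
                                         inst.taken_value + value, inst.next_item + 1))
    return children

def bound(inst):
    """Lower bound on f = -(total value): fill the remaining capacity fractionally."""
    extra, room = 0.0, inst.capacity
    for value, weight in inst.items[inst.next_item:]:
        if weight <= room:
            extra, room = extra + value, room - weight
        else:
            extra += value * room / weight   # a fraction of the next item still fits
            break
    return -(inst.taken_value + extra)

def solution(inst):
    """Return f(x) if inst represents a single candidate solution, otherwise None."""
    return -inst.taken_value if inst.next_item == len(inst.items) else None
```

Because each child fixes one more item as taken or skipped, the subsets produced by branch are disjoint, and the fractional relaxation never exceeds the true minimum of f on a subset, which is all that correctness requires.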
Generic version
The following is the skeleton of a generic branch and bound algorithm for minimizing an arbitrary objective function f.[2] To obtain an actual algorithm from this, one requires a bounding function g that computes lower bounds of f on nodes of the search tree, as well as a problem-specific branching rule.
- Using a heuristic, find a solution xh to the optimization problem. Store its value, B = f(xh). (If no heuristic is available, set B to infinity.) B will denote the best solution found so far, and will be used as an upper bound on candidate solutions.
- Initialize a queue to hold a partial solution with none of the variables of the problem assigned.
- Loop until the queue is empty:
- Take a node N off the queue.
- If N represents a single candidate solution x and f(x) < B, then x is the best solution so far. Record it and set B ← f(x).
- Else, branch on N to produce new nodes Ni. For each of these:
- If g(Ni) > B, do nothing; since the lower bound on this node is greater than the upper bound of the problem, it will never lead to the optimal solution, and can be discarded.
- Else, store Ni on the queue.
Several different queue data structures can be used. A stack (LIFO queue) will yield a depth-first algorithm. A best-first branch and bound algorithm can be obtained by using a priority queue that sorts nodes on their g-value.[2] The depth-first variant is recommended when no good heuristic is available for producing an initial solution, because it quickly produces full solutions, and therefore upper bounds.[6]
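The following Python sketch implements this skeleton, assuming branch, bound and solution operations as described in the Overview, with solution(I) returning f(x) when I is a single candidate solution and None otherwise; the function name branch_and_bound and its signature are illustrative rather than a standard API. It uses a priority queue keyed on the bounding function, i.e. the best-first variant; replacing the heap with a plain stack would give the depth-first variant.

```python
import heapq

def branch_and_bound(root, branch, bound, solution, best_value=float("inf")):
    """Minimize f by best-first search using user-supplied branch/bound/solution.
    best_value plays the role of B: an upper bound from a heuristic, or infinity."""
    best_node, counter = None, 0
    queue = [(bound(root), counter, root)]        # priority queue keyed on the bound g
    while queue:
        _, _, node = heapq.heappop(queue)
        value = solution(node)
        if value is not None:                     # node is a single candidate solution
            if value < best_value:                # better than the best found so far
                best_node, best_value = node, value
        else:
            for child in branch(node):            # branch on the node
                counter += 1                      # tie-breaker so nodes are never compared
                g = bound(child)
                if g <= best_value:               # otherwise the child cannot improve on B
                    heapq.heappush(queue, (g, counter, child))
    return best_node, best_value
```

With the hypothetical knapsack representation sketched in the Overview, it could be used as follows:

```python
items = ((60, 10), (100, 20), (120, 30))   # (value, weight), sorted by value/weight
root = KnapsackInstance(items, capacity=50, taken_value=0.0, next_item=0)
best, f_min = branch_and_bound(root, branch, bound, solution)
print(-f_min)                              # 220: take the second and third items
```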
Improvements
When x is a vector of ℝⁿ, branch and bound algorithms can be combined with interval analysis[7] and contractor techniques in order to provide guaranteed enclosures of the global minimum.[8][9]
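As a rough one-dimensional sketch of this idea, the code below bisects boxes (branching) and bounds the objective with a naive interval extension; the Interval class and the function interval_minimize are illustrative, and a rigorous implementation would rely on an interval library with outward rounding rather than plain floating-point arithmetic.

```python
import heapq

class Interval:
    """Toy interval arithmetic (no outward rounding)."""
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi
    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)
    def __sub__(self, other):
        return Interval(self.lo - other.hi, self.hi - other.lo)
    def __mul__(self, other):
        p = (self.lo * other.lo, self.lo * other.hi, self.hi * other.lo, self.hi * other.hi)
        return Interval(min(p), max(p))
    def width(self):
        return self.hi - self.lo

def f(x):
    """Interval extension of f(t) = t*t - 2*t, whose global minimum is -1 at t = 1."""
    return x * x - Interval(2.0, 2.0) * x

def interval_minimize(domain, tol=1e-6):
    """Return (lower, upper): a guaranteed enclosure of the global minimum of f."""
    mid = (domain.lo + domain.hi) / 2
    best_upper = f(Interval(mid, mid)).hi       # f at any single point is an upper bound
    heap, count = [(f(domain).lo, 0, domain)], 1
    lower = f(domain).lo
    while heap:
        lower, _, box = heapq.heappop(heap)     # most promising box first
        if lower > best_upper:                  # bound: the box cannot contain the minimum
            continue
        if box.width() < tol:                   # enclosure is tight enough
            break
        m = (box.lo + box.hi) / 2               # branch: bisect the box
        for sub in (Interval(box.lo, m), Interval(m, box.hi)):
            c = (sub.lo + sub.hi) / 2
            best_upper = min(best_upper, f(Interval(c, c)).hi)
            image = f(sub)
            if image.lo <= best_upper:          # keep only boxes that may contain the minimum
                heapq.heappush(heap, (image.lo, count, sub))
                count += 1
    return lower, best_upper

print(interval_minimize(Interval(-4.0, 4.0)))   # tight enclosure of the true minimum -1
```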
Applications
This approach is used for a number of NP-hard problems:
- Integer programming
- Nonlinear programming
- Travelling salesman problem (TSP)[3][10]
- Quadratic assignment problem (QAP)
- Maximum satisfiability problem (MAX-SAT)
- Nearest neighbor search[11]
- Cutting stock problem
- False noise analysis (FNA)
- Computational phylogenetics
- Set inversion
- Parameter estimation
- 0/1 knapsack problem
- Feature selection in machine learning[12]
- Structured prediction in computer vision[13]:267–276
Branch-and-bound may also be a base of various heuristics. For example, one may wish to stop branching when the gap between the upper and lower bounds becomes smaller than a certain threshold. This is used when the solution is "good enough for practical purposes" and can greatly reduce the computations required. This type of solution is particularly applicable when the cost function used is noisy or is the result of statistical estimates and so is not known precisely but rather only known to lie within a range of values with a specific probability.
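As a sketch of how the generic skeleton above could be relaxed in this way, the hypothetical variant below prunes any node that cannot improve the incumbent by more than a tolerance eps, so the value it returns is guaranteed to lie within eps of the true minimum; the name branch_and_bound_approx and the parameter eps are illustrative, and the branch, bound and solution operations are the same as in the earlier sketch.

```python
def branch_and_bound_approx(root, branch, bound, solution, eps=1e-2,
                            best_value=float("inf")):
    """Depth-first branch and bound that stops refining a node once it cannot
    improve the incumbent by more than eps."""
    best_node = None
    stack = [root]
    while stack:
        node = stack.pop()
        value = solution(node)
        if value is not None:
            if value < best_value:                    # new incumbent
                best_node, best_value = node, value
        else:
            for child in branch(node):
                if bound(child) < best_value - eps:   # must improve by more than eps
                    stack.append(child)
    return best_node, best_value                      # within eps of the optimum
```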
Relation to other algorithms
Nau et al. present a generalization of branch and bound that also subsumes the A*, B* and alpha-beta search algorithms from artificial intelligence.[14]
See also
- Backtracking
- Branch-and-cut, a hybrid between branch-and-bound and the cutting plane methods that is used extensively for solving integer linear programs.
References
1. Land, A. H.; Doig, A. G. (1960). "An automatic method of solving discrete programming problems". Econometrica 28 (3): 497–520. doi:10.2307/1910129.
2. Clausen, Jens (1999). Branch and Bound Algorithms—Principles and Examples (PDF) (Technical report). University of Copenhagen.
3. Little, John D. C.; Murty, Katta G.; Sweeney, Dura W.; Karel, Caroline (1963). "An algorithm for the traveling salesman problem" (PDF). Operations Research 11 (6): 972–989. doi:10.1287/opre.11.6.972.
4. Balas, Egon; Toth, Paolo (1983). Branch and bound methods for the traveling salesman problem (PDF) (Report). Carnegie Mellon University Graduate School of Industrial Administration.
5. Bader, David A.; Hart, William E.; Phillips, Cynthia A. (2004). "Parallel Algorithm Design for Branch and Bound" (PDF). In Greenberg, H. J. Tutorials on Emerging Methodologies and Applications in Operations Research. Kluwer Academic Press.
6. Mehlhorn, Kurt; Sanders, Peter (2008). Algorithms and Data Structures: The Basic Toolbox (PDF). Springer. p. 249.
7. Moore, R. E. (1966). Interval Analysis. Englewood Cliffs, New Jersey: Prentice-Hall. ISBN 0-13-476853-1.
8. Jaulin, L.; Kieffer, M.; Didrit, O.; Walter, E. (2001). Applied Interval Analysis. Berlin: Springer. ISBN 1-85233-219-0.
9. Hansen, E. R. (1992). Global Optimization Using Interval Analysis. New York: Marcel Dekker.
10. Conway, Richard Walter; Maxwell, William L.; Miller, Louis W. (2003). Theory of Scheduling. Courier Dover Publications. pp. 56–61.
11. Fukunaga, Keinosuke; Narendra, Patrenahalli M. (1975). "A branch and bound algorithm for computing k-nearest neighbors". IEEE Transactions on Computers: 750–753. doi:10.1109/t-c.1975.224297.
12. Narendra, Patrenahalli M.; Fukunaga, K. (1977). "A branch and bound algorithm for feature subset selection" (PDF). IEEE Transactions on Computers C-26 (9): 917–922. doi:10.1109/TC.1977.1674939.
13. Nowozin, Sebastian; Lampert, Christoph H. (2011). "Structured Learning and Prediction in Computer Vision". Foundations and Trends in Computer Graphics and Vision 6 (3–4): 185–365. doi:10.1561/0600000033. ISBN 978-1-60198-457-9.
14. Nau, Dana S.; Kumar, Vipin; Kanal, Laveen (1984). "General branch and bound, and its relation to A∗ and AO∗" (PDF). Artificial Intelligence 23 (1): 29–58. doi:10.1016/0004-3702(84)90004-3.