Extremal optimization

From Wikipedia, the free encyclopedia

Extremal Optimization (EO) is an optimization heuristic inspired by the Bak-Sneppen model of self-organized criticality from the field of statistical physics. The heuristic was designed initially to address combinatorial optimization problems such as the travelling salesman problem and spin glasses, although the technique has since been demonstrated in other optimization domains.


Relation to self-organized criticality

Self-organized criticality (SOC) is a statistical physics concept describing a class of dynamical systems that have a critical point as an attractor. Specifically, these are non-equilibrium systems that evolve through avalanches of change and dissipation reaching up to the highest scales of the system. SOC is said to govern the dynamics behind some natural systems that exhibit such burst-like phenomena, including landscape formation, earthquakes, biological evolution, and the granular dynamics of rice and sand piles. Of special interest here is the Bak-Sneppen model of SOC, which is able to describe evolution via punctuated equilibrium (extinction events), thus modelling evolution as a self-organized critical process.

Relation to computational complexity

Another piece of the puzzle is work on computational complexity, specifically the demonstration that critical points exist in NP-complete problems, where near-optimal solutions are widely dispersed and separated by barriers in the search space, causing local search algorithms to become stuck or severely hampered. It was the evolutionary self-organized criticality model of Bak and Sneppen, together with the observation of critical points in combinatorial optimization problems, that led to the development of Extremal Optimization by Stefan Boettcher and Allon Percus.

The technique

EO was designed as a local search algorithm for combinatorial optimization problems. Unlike genetic algorithms, which work with a population of candidate solutions, EO evolves a single solution and makes local modifications to its worst components. This requires choosing a representation in which individual solution components can be assigned a quality measure ("fitness"). This differs from holistic approaches such as ant colony optimization and evolutionary computation, which assign equal fitness to all components of a solution based upon their collective evaluation against an objective function. The algorithm is initialized with a solution that can be constructed randomly or derived from another search process.
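The basic loop described above can be sketched on a toy problem. The following is a minimal illustration, not a reference implementation: a max-cut instance stands in for the combinatorial problem, each vertex is a solution component, its fitness is the fraction of its incident edges crossing the cut, and at every step the single worst component is forced to change state. The function names and the choice of problem are assumptions of this sketch.

```python
import random

def extremal_optimization(edges, n, steps=10_000, seed=0):
    """Minimal basic-EO sketch: repeatedly flip the worst vertex of a cut."""
    rng = random.Random(seed)
    side = [rng.randint(0, 1) for _ in range(n)]   # initial random solution
    adj = [[] for _ in range(n)]
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)

    def vertex_fitness(v):
        # Component fitness: fraction of v's edges that are cut (higher is better).
        return sum(side[v] != side[u] for u in adj[v]) / len(adj[v])

    def cut_size(assignment):
        return sum(assignment[u] != assignment[v] for u, v in edges)

    best = side[:]
    for _ in range(steps):
        worst = min(range(n), key=vertex_fitness)  # single worst component
        side[worst] ^= 1                           # unconditionally change it
        if cut_size(side) > cut_size(best):        # track the best solution seen
            best = side[:]
    return best, cut_size(best)
```

Note that the worst component is changed unconditionally, with no acceptance test; the best solution encountered is simply recorded on the side. On a 4-cycle, `extremal_optimization([(0, 1), (1, 2), (2, 3), (3, 0)], 4)` quickly reaches the optimal cut of size 4.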

The technique is a fine-grained search that superficially resembles a hill climbing (local search) technique. A closer examination reveals some interesting principles that may have applicability, and even some similarity, to broader population-based approaches (evolutionary computation and artificial immune systems). The governing principle behind this algorithm is improvement through selectively removing low-quality components and replacing them with randomly selected ones. This is directly at odds with genetic algorithms, the quintessential evolutionary computation algorithm, which selects good solutions in an attempt to make better solutions.

The resulting dynamics of this simple principle are, first, a robust hill climbing search behaviour and, second, a diversity mechanism resembling that of multiple-restart search. Graphing holistic solution quality over time (algorithm iterations) shows periods of improvement followed by quality crashes (avalanches), very much in the manner described by punctuated equilibrium. It is these crashes or dramatic jumps in the search space that permit the algorithm to escape local optima and that differentiate this approach from other local search procedures. Although such punctuated-equilibrium behaviour can be "designed" or "hard-coded", it should be stressed that it is an emergent effect of the negative-component-selection principle fundamental to the algorithm.

EO has primarily been applied to combinatorial problems such as graph partitioning and the travelling salesman problem, as well as problems from statistical physics such as spin glasses.

Variations on the theme and applications

Generalised Extremal Optimization (GEO) was developed to operate on bit strings, where component quality is determined by the absolute rate of change of the bit, i.e. the bit's contribution to holistic solution quality. This work includes applications to standard function optimization problems as well as engineering problem domains. Another, similar extension of EO is Continuous Extremal Optimization (CEO).
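The GEO component-fitness idea can be illustrated as follows. This is a sketch under stated assumptions, not GEO as published: the objective function `f`, maximization, and the function name are all choices of this illustration; each bit's quality is taken as the gain in the objective obtained by flipping that bit alone.

```python
def geo_bit_fitness(bits, f):
    """Sketch of a GEO-style component fitness on a bit string.

    Each bit's fitness is the change in objective f caused by flipping
    that single bit (assuming f is to be maximized).
    """
    base = f(bits)
    fitnesses = []
    for i in range(len(bits)):
        flipped = bits[:]
        flipped[i] ^= 1                      # flip only bit i
        fitnesses.append(f(flipped) - base)  # gain from flipping bit i
    return fitnesses
```

For example, with the "count the ones" objective, `geo_bit_fitness([0, 1, 0], sum)` assigns positive fitness to the zero bits (flipping them helps) and negative fitness to the one bit. An EO-style step would then select a low-fitness bit for mutation.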

EO has been applied to image rasterization and used as a local search following ant colony optimization. It has been used to identify structures in complex networks and has been applied to a multiple-target tracking problem. Finally, some work has been done on investigating the probability distribution used to control selection.
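The best-known such selection distribution is the power-law rank selection of tau-EO, in which components are ranked from worst to best and rank k is chosen with probability proportional to k to the power of minus tau, rather than always taking the worst. The sketch below assumes fitness values where higher is better; the function name and the default tau are illustrative choices (values around 1.4 are often reported in the literature, but the best tau is problem-dependent).

```python
import random

def tau_eo_pick(fitnesses, tau=1.4, rng=random):
    """Sketch of tau-EO rank selection.

    Components are ranked worst-first; rank k (1-based) is selected with
    probability proportional to k ** (-tau). Very large tau recovers
    basic EO (always the worst component); tau = 0 is a blind random pick.
    """
    n = len(fitnesses)
    ranked = sorted(range(n), key=lambda i: fitnesses[i])   # worst first
    weights = [(k + 1) ** (-tau) for k in range(n)]         # k = 1 .. n
    return rng.choices(ranked, weights=weights, k=1)[0]
```

In an EO loop, the index returned by `tau_eo_pick` replaces the deterministic "worst component" choice; the occasional selection of better-ranked components is what lets tau-EO tunnel past configurations where basic EO would cycle.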

References

  • Per Bak, Chao Tang, and Kurt Wiesenfeld, "Self-organized criticality: An explanation of the 1/f noise", Physical Review Letters 59, 381–384 (1987)
  • Per Bak and Kim Sneppen, "Punctuated equilibrium and criticality in a simple model of evolution", Physical Review Letters 71, 4083–4086 (1993)
  • P. Cheeseman, B. Kanefsky, and W. M. Taylor, "Where the really hard problems are", Proceedings of the 12th IJCAI (1991)
  • G. Istrate, "Computational complexity and phase transitions", Proceedings of the 15th Annual IEEE Conference on Computational Complexity, 104–115 (2000)
  • Stefan Boettcher and Allon G. Percus, "Extremal Optimization: Methods derived from Co-Evolution", Proceedings of the Genetic and Evolutionary Computation Conference (1999)
  • Stefan Boettcher, "Extremal optimization of graph partitioning at the percolation threshold", J. Phys. A: Math. Gen. 32, 5201–5211 (1999)
  • S. Boettcher and A. Percus, "Nature's Way of Optimizing", Artificial Intelligence 119, 275 (2000)
  • S. Boettcher, "Extremal Optimization: Heuristics via Co-Evolutionary Avalanches", Computing in Science & Engineering 2, 75–82 (2000)
  • Stefan Boettcher and Allon G. Percus, "Optimization with Extremal Dynamics", Phys. Rev. Lett. 86, 5211–5214 (2001)
  • Jesper Dall and Paolo Sibani, "Faster Monte Carlo Simulations at Low Temperatures. The Waiting Time Method", Computer Physics Communications 141, 260–267 (2001)
  • Stefan Boettcher and Michelangelo Grigni, "Jamming Model for the Extremal Optimization Heuristic", J. Phys. A: Math. Gen. 35, 1109–1123 (2002)
  • Souham Meshoul and Mohamed Batouche, "Robust Point Correspondence for Image Registration Using Optimization with Extremal Dynamics", Lecture Notes in Computer Science 2449, 330–337 (2002)
  • Roberto N. Onody and Paulo A. de Castro, "Self-Organized Criticality, Optimization and Biodiversity", Int. J. Mod. Phys. C 14, 911–916 (2002)
  • Stefan Boettcher and Allon G. Percus, "Extremal Optimization at the Phase Transition of the 3-Coloring Problem", Phys. Rev. E 69, 066703 (2004)
  • A. Alan Middleton, "Improved extremal optimization for the Ising spin glass", Phys. Rev. E 69, 055701 (2004)
  • F. Heilmann, K. H. Hoffmann, and P. Salamon, "Best possible probability distribution over extremal optimization ranks", Europhys. Lett. 66, 305–310 (2004)
  • Pontus Svenson, "Extremal optimization for sensor report pre-processing", Proc. SPIE 5429, 162–171 (2004)
  • Tao Zhou, Wen-Jie Bai, Long-Jiu Cheng, and Bing-Hong Wang, "Continuous extremal optimization for Lennard-Jones Clusters", Phys. Rev. E 72, 016702 (2004)
  • Jordi Duch and Alex Arenas, "Community detection in complex networks using extremal optimization", Phys. Rev. E 72, 027104 (2005)


Web resources

  • Stefan Boettcher's home page, which includes an explanation of the technique and demonstration applets
  • Allon Percus's home page
  • An introduction to EO with many linked references
