Extremal optimization (EO) is an optimization heuristic inspired by the Bak–Sneppen model of self-organized criticality from the field of statistical physics. This heuristic was designed initially to address combinatorial optimization problems such as the travelling salesman problem and spin glasses, although the technique has since been demonstrated to function in other optimization domains.

Relation to self-organized criticality

Self-organized criticality (SOC) is a statistical physics concept describing a class of dynamical systems that have a critical point as an attractor. Specifically, these are non-equilibrium systems that evolve through avalanches of change and dissipation that reach up to the highest scales of the system. SOC is said to govern the dynamics behind some natural systems that exhibit such burst-like phenomena, including landscape formation, earthquakes, evolution, and the granular dynamics of rice and sand piles. Of special interest here is the Bak–Sneppen model of SOC, which is able to describe evolution via punctuated equilibrium (extinction events), thus modelling evolution as a self-organized critical process.

Relation to computational complexity

Another piece of the puzzle is work on computational complexity, specifically the finding that critical points exist in NP-complete problems, where near-optimum solutions are widely dispersed and separated by barriers in the search space, causing local search algorithms to get stuck or be severely hampered. It was the evolutionary self-organized criticality model of Bak and Sneppen, together with the observation of critical points in combinatorial optimisation problems, that led to the development of extremal optimization by Stefan Boettcher and Allon Percus.

The technique

EO was designed as a local search algorithm for combinatorial optimization problems. Unlike genetic algorithms, which work with a population of candidate solutions, EO evolves a single solution and makes local modifications to the worst components. This requires that a suitable representation be selected which permits individual solution components to be assigned a quality measure ("fitness"). This differs from holistic approaches such as ant colony optimization and evolutionary computation that assign equal fitness to all components of a solution based upon their collective evaluation against an objective function. The algorithm is initialized with a single starting solution, which can be constructed randomly or derived from another search process.
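For instance, in graph bipartitioning (one of the problems EO was originally tested on), a natural per-vertex quality measure is the fraction of a vertex's edges that do not cross the cut. The Python sketch below illustrates such a component-fitness assignment; the data structures and function name are assumptions made here for illustration rather than details taken from the original papers.

```python
def vertex_fitness(vertex, partition_of, adjacency):
    """Fraction of this vertex's edges that stay inside its own partition.

    partition_of maps each vertex to 0 or 1; adjacency maps each vertex to a
    list of its neighbours. A vertex whose neighbours mostly lie in the other
    partition receives a low fitness and so becomes a candidate for change."""
    neighbours = adjacency[vertex]
    same = sum(1 for n in neighbours if partition_of[n] == partition_of[vertex])
    return same / len(neighbours) if neighbours else 1.0
```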

The technique is a fine-grained search and superficially resembles a hill climbing (local search) technique. A more detailed examination reveals some interesting principles, which may have applicability, and even some similarity, to broader population-based approaches (evolutionary computation and artificial immune systems). The governing principle behind this algorithm is that of improvement through selectively removing low-quality components and replacing them with randomly chosen values. This is directly at odds with genetic algorithms, the quintessential evolutionary computation technique, which select good solutions in an attempt to make better ones.
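To make this principle concrete, the following is a minimal sketch of a basic EO loop in Python on a toy problem (recovering a hidden target vector); the problem, the per-component fitness definition, and all names are illustrative assumptions rather than part of the original formulation.

```python
import random

# Toy problem (an illustrative assumption, not from the original papers):
# recover a hidden target vector. Each solution component i has its own
# fitness: 1 if it matches the target at position i, 0 otherwise.
TARGET = [random.randint(0, 9) for _ in range(20)]

def component_fitness(solution):
    """Per-component quality measure, as basic EO requires."""
    return [1 if s == t else 0 for s, t in zip(solution, TARGET)]

def extremal_optimization(steps=2000):
    # Initial solution: random here, but it could come from another search.
    current = [random.randint(0, 9) for _ in range(len(TARGET))]
    best, best_score = list(current), sum(component_fitness(current))

    for _ in range(steps):
        fitnesses = component_fitness(current)
        # Basic EO: always select the *worst* component ...
        worst = min(range(len(current)), key=lambda i: fitnesses[i])
        # ... and replace it with a randomly chosen value (no greedy repair).
        current[worst] = random.randint(0, 9)

        score = sum(component_fitness(current))
        # Keep the best solution seen so far, since the current one may worsen.
        if score > best_score:
            best, best_score = list(current), score
    return best, best_score

if __name__ == "__main__":
    solution, score = extremal_optimization()
    print(score, "of", len(TARGET), "components correct")
```

Tracking the best solution seen so far is important because, unlike a pure hill climber, the current solution is allowed to get worse whenever the randomly chosen replacement is poor.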

The resulting dynamics of this simple principle are, firstly, a robust hill-climbing search behaviour and, secondly, a diversity mechanism resembling that of multiple-restart search. Graphing holistic solution quality over time (algorithm iterations) shows periods of improvement followed by quality crashes (avalanches), very much in the manner described by punctuated equilibrium. It is these crashes or dramatic jumps in the search space that permit the algorithm to escape local optima and differentiate this approach from other local search procedures. Although such punctuated-equilibrium behaviour could be "designed" or "hard-coded", it should be stressed that here it is an emergent effect of the negative-component-selection principle fundamental to the algorithm.

EO has primarily been applied to combinatorial problems such as graph partitioning and the travelling salesman problem, as well as problems from statistical physics such as spin glasses.

Variations on the theme and applications

Generalised extremal optimization (GEO) was developed to operate on bit strings, where component quality is determined by the absolute rate of change of the bit, or the bit's contribution to holistic solution quality. This work includes applications to standard function optimisation problems as well as engineering problem domains. Another similar extension of EO is continuous extremal optimization (CEO).
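A GEO-style step on a bit string is sketched below. The objective function is an assumption chosen only for illustration, the per-bit quality is taken to be the change in the objective that flipping the bit would cause, and the deterministic choice of the most promising flip is a simplification: the full GEO algorithm ranks the bits and draws one from a power law over the ranks.

```python
import random

def objective(bits):
    """Illustrative objective to maximise (an assumption): the number of 1-bits."""
    return sum(bits)

def geo_step(bits):
    """One GEO-style move: rate each bit by the change in the holistic
    objective its flip would cause, then flip the least-adapted bit.
    (Full GEO ranks the bits and draws one from a power law over the ranks.)"""
    base = objective(bits)
    deltas = []
    for i in range(len(bits)):
        flipped = bits[:]
        flipped[i] = 1 - flipped[i]
        # Per-bit quality: this bit's contribution, measured as the gain
        # (or loss) in the objective if the bit were flipped.
        deltas.append(objective(flipped) - base)
    # Simplification used here: deterministically flip the most promising bit.
    i = max(range(len(bits)), key=lambda k: deltas[k])
    bits[i] = 1 - bits[i]
    return bits

if __name__ == "__main__":
    bits = [random.randint(0, 1) for _ in range(16)]
    for _ in range(50):
        bits = geo_step(bits)
    print(objective(bits), bits)
```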

EO has been applied to image rasterization, as well as serving as a local search step following ant colony optimization. It has been used to identify structures in complex networks and has been applied to a multiple-target tracking problem. Finally, some work has been done on investigating the probability distribution used to control component selection.
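The selection distribution most commonly studied in this context is a power law over the rank of the components (the τ-EO scheme of Boettcher and Percus). The sketch below shows one way such rank-based selection might be implemented; the parameter value and names are chosen here purely for illustration.

```python
import random

def tau_select(component_fitnesses, tau=1.4):
    """Power-law rank selection (tau-EO style).

    Components are ranked from worst (rank 1) to best (rank n) and rank k
    is chosen with probability proportional to k**(-tau). Small tau gives an
    almost random walk; large tau almost always picks the worst component."""
    n = len(component_fitnesses)
    order = sorted(range(n), key=lambda i: component_fitnesses[i])  # worst first
    weights = [(k + 1) ** (-tau) for k in range(n)]
    return random.choices(order, weights=weights, k=1)[0]

if __name__ == "__main__":
    fitnesses = [0.9, 0.1, 0.5, 0.3, 0.7]   # illustrative per-component qualities
    picks = [tau_select(fitnesses) for _ in range(10000)]
    # The worst component (index 1, fitness 0.1) should be chosen most often.
    print({i: picks.count(i) for i in range(len(fitnesses))})
```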

