Evolutionary algorithm

In artificial intelligence, an evolutionary algorithm (EA) is a subset of evolutionary computation, a generic population-based metaheuristic optimization algorithm. An EA uses mechanisms inspired by biological evolution, such as reproduction, mutation, recombination, and selection. Candidate solutions to the optimization problem play the role of individuals in a population, and the fitness function determines the quality of the solutions (see also loss function). Evolution of the population then takes place through the repeated application of the above operators.
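
As a concrete illustration, the following Python sketch shows what the operators named above (mutation, recombination, and selection, together with a fitness function) typically look like for a bit-string representation. The one-max fitness function and the parameter values are assumptions chosen only for illustration, not part of any particular EA.

    import random

    def fitness(individual):
        # Toy "one-max" fitness: the quality of a candidate is its number of 1-bits.
        return sum(individual)

    def mutate(individual, rate=0.05):
        # Mutation: each gene is flipped independently with a small probability.
        return [1 - g if random.random() < rate else g for g in individual]

    def crossover(parent_a, parent_b):
        # Recombination: one-point crossover swaps the tails at a random cut point.
        point = random.randint(1, len(parent_a) - 1)
        return parent_a[:point] + parent_b[point:], parent_b[:point] + parent_a[point:]

    def select(population, k=3):
        # Selection: the fittest of k randomly drawn individuals becomes a parent.
        return max(random.sample(population, k), key=fitness)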

Evolutionary algorithms often perform well approximating solutions to all types of problems because they ideally make no assumptions about the underlying fitness landscape. Techniques from evolutionary algorithms applied to the modeling of biological evolution are generally limited to explorations of microevolutionary processes and planning models based upon cellular processes. In most real applications of EAs, computational complexity is a prohibiting factor; this complexity is chiefly due to fitness function evaluation. Fitness approximation is one way to overcome this difficulty. However, a seemingly simple EA can often solve quite complex problems; therefore, there may be no direct link between algorithm complexity and problem complexity.
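
One common form of fitness approximation is to evaluate only part of the population with the expensive fitness function and to estimate the rest with a cheap surrogate derived from the exact evaluations. The Python sketch below is a minimal illustration of that idea under assumed names (expensive_fitness as a stand-in for a costly simulation, and a simple nearest-neighbour surrogate); it does not reproduce any specific published method.

    import random

    def expensive_fitness(x):
        # Stand-in for a costly simulation or experiment (assumption for illustration).
        return -sum((xi - 0.5) ** 2 for xi in x)

    def surrogate_fitness(x, archive):
        # Cheap approximation: reuse the fitness of the nearest already-evaluated point.
        nearest = min(archive, key=lambda rec: sum((a - b) ** 2 for a, b in zip(rec[0], x)))
        return nearest[1]

    population = [[random.random() for _ in range(5)] for _ in range(40)]
    archive = [(x, expensive_fitness(x)) for x in population[:10]]        # exact evaluations
    estimates = [surrogate_fitness(x, archive) for x in population[10:]]  # approximations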

Comparison to biological processes

A possible limitation of many evolutionary algorithms is their lack of a clear genotype-phenotype distinction. In nature, the fertilized egg cell undergoes a complex process known as embryogenesis to become a mature phenotype. This indirect encoding is believed to make the genetic search more robust (i.e. to reduce the probability of fatal mutations), and it may also improve the evolvability of the organism.[1][2] Such indirect (also known as generative or developmental) encodings also enable evolution to exploit the regularity in the environment.[3] Recent work in the field of artificial embryogeny, or artificial developmental systems, seeks to address these concerns. Gene expression programming, for example, successfully explores a genotype-phenotype system, where the genotype consists of linear multigenic chromosomes of fixed length and the phenotype consists of multiple expression trees or computer programs of different sizes and shapes.[4]
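
As a toy illustration of such an indirect (developmental) encoding, and not of gene expression programming or any other specific system, a short genotype can be "grown" into a much larger, regular phenotype, so that a single mutation changes a global pattern rather than a single trait:

    import math

    def develop(genotype, size=16):
        # Developmental mapping: three genes (frequency, phase, bias) are expanded
        # into a long, regular phenotype of `size` traits.
        frequency, phase, bias = genotype
        return [bias + math.sin(frequency * i + phase) for i in range(size)]

    genotype = (0.8, 0.3, 0.1)      # compact genotype: three genes
    phenotype = develop(genotype)   # phenotype: sixteen correlated traits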

Implementation

Step One: Generate the initial population of individuals randomly. (First generation)

Step Two: Evaluate the fitness of each individual in that population.

Step Three: Repeat the following regenerational steps until termination (e.g. time limit, sufficient fitness achieved):

  1. Select the best-fit individuals for reproduction. (Parents)
  2. Breed new individuals through crossover and mutation operations to produce offspring.
  3. Evaluate the individual fitness of the new individuals.
  4. Replace the least-fit individuals of the population with the new individuals.
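
Putting these steps together, the following self-contained Python sketch shows one way the generational loop above can be realized. The one-max problem, the population size, the tournament size, and the mutation rate are illustrative assumptions rather than recommended settings.

    import random

    GENES, POP_SIZE, GENERATIONS, MUTATION_RATE = 20, 30, 50, 0.02

    def fitness(ind):
        # Step Two: the fitness of a candidate (here, its number of 1-bits).
        return sum(ind)

    def select(pop):
        # Step 1: tournament selection picks the fittest of three random individuals.
        return max(random.sample(pop, 3), key=fitness)

    def breed(a, b):
        # Step 2: one-point crossover followed by bit-flip mutation.
        cut = random.randint(1, GENES - 1)
        child = a[:cut] + b[cut:]
        return [1 - g if random.random() < MUTATION_RATE else g for g in child]

    # Step One: generate the initial population randomly (first generation).
    population = [[random.randint(0, 1) for _ in range(GENES)] for _ in range(POP_SIZE)]

    # Step Three: repeat the regenerational steps until termination
    # (here simply a fixed number of generations).
    for _ in range(GENERATIONS):
        offspring = [breed(select(population), select(population)) for _ in range(POP_SIZE)]
        # Steps 3 and 4: evaluate the offspring and keep only the fittest individuals.
        population = sorted(population + offspring, key=fitness, reverse=True)[:POP_SIZE]

    print(max(fitness(ind) for ind in population))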

Types

Similar techniques differ in genetic representation and other implementation details, as well as in the nature of the particular applied problem. The major variants include genetic algorithms, genetic programming, evolutionary programming, gene expression programming, evolution strategies, and differential evolution.

Swarm algorithms, including ant colony optimization and particle swarm optimization, are closely related population-based techniques.

Other population-based metaheuristic methods include the runner-root algorithm,[6] hunting search,[7][8] and adaptive dimensional search.[9]

Examples

The computer simulations Tierra and Avida attempt to model macroevolutionary dynamics.

Evolutionary algorithms have also been used to solve constrained optimization problems, for example with estimation of distribution algorithms and two-population approaches.[10][11][12]

References

  1. G.S. Hornby and J.B. Pollack. Creating high-level components with a generative representation for body-brain evolution. Artificial Life, 8(3):223–246, 2002.
  2. Jeff Clune, Benjamin Beckmann, Charles Ofria, and Robert Pennock. "Evolving Coordinated Quadruped Gaits with the HyperNEAT Generative Encoding". Proceedings of the IEEE Congress on Evolutionary Computing Special Section on Evolutionary Robotics, 2009. Trondheim, Norway.
  3. J. Clune, C. Ofria, and R. T. Pennock, "How a generative encoding fares as problem-regularity decreases," in PPSN (G. Rudolph, T. Jansen, S. M. Lucas, C. Poloni, and N. Beume, eds.), vol. 5199 of Lecture Notes in Computer Science, pp. 358–367, Springer, 2008.
  4. Ferreira, C., 2001. Gene Expression Programming: A New Adaptive Algorithm for Solving Problems. Complex Systems, Vol. 13, issue 2: 87–129.
  5. Wayward World, by Jon Roland. Novel that uses fetura to select candidates for public office.
  6. F. Merrikh-Bayat, The runner-root algorithm: A metaheuristic for solving unimodal and multimodal optimization problems inspired by runners and roots of plants in nature, Applied Soft Computing, Vol. 33, pp. 292–303, 2015
  7. R. Oftadeh et al. (2010), A novel meta-heuristic optimization algorithm inspired by group hunting of animals: Hunting search, 60, 2087–2098.
  8. A. Agharghor and M. E. Riffi (2017), First Adaptation of Hunting Search Algorithm for the Quadratic Assignment Problem, 520, 263–267. doi:10.1007/978-3-319-46568-5_27
  9. Hasançebi, O., Kazemzadeh Azad, S. (2015), Adaptive Dimensional Search: A New Metaheuristic Algorithm for Discrete Truss Sizing Optimization, Computers and Structures, 154, 1–16.
  10. Simionescu, P.A.; Beale, D.G.; Dozier, G.V. (2004), Constrained optimization problem solving using estimation of distribution algorithms (PDF), Proc. of the 2004 Congress on Evolutionary Computation - CEC2004, Portland, OR, pp. 1647–1653, doi:10.1109/CEC.2006.1688506, retrieved 7 January 2017
  11. Simionescu, P.A.; Dozier, G.V.; Wainwright, R.L. (2006), A Two-Population Evolutionary Algorithm for Constrained Optimization Problems (PDF), Proc 2006 IEEE International Conference on Evolutionary Computation, Vancouver, Canada, pp. 1647–1653, doi:10.1109/CEC.2006.1688506, retrieved 7 January 2017
  12. Simionescu, P.A. (2014). Computer Aided Graphing and Simulation Tools for AutoCAD Users (1st ed.). Boca Raton, FL: CRC Press. ISBN 978-1-4822-5290-3.

