HyperNEAT

Hypercube-based NEAT, or HyperNEAT,[1] is a generative encoding that evolves artificial neural networks (ANNs) using the principles of the widely used NeuroEvolution of Augmenting Topologies (NEAT) algorithm.[2] It is a technique for evolving large-scale neural networks that exploits the geometric regularities of the task domain. HyperNEAT employs Compositional Pattern Producing Networks[3] (CPPNs), the same abstraction used to generate the images on Picbreeder.org and EndlessForms.com.
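The core idea can be illustrated with a minimal sketch: a CPPN is queried with the substrate coordinates of a source and a target neuron, and its output determines the weight (and existence) of the connection between them. In the sketch below the "evolved" CPPN is stood in for by a fixed composition of pattern-producing functions (a sine and a Gaussian); in real HyperNEAT this network would itself be evolved by NEAT, so the functions, grid, and threshold here are illustrative assumptions, not the authors' implementation.

```python
import math

def cppn(x1, y1, x2, y2):
    """Toy stand-in for an evolved CPPN: maps a 4-D point in the
    hypercube (source and target coordinates) to a connection weight."""
    d2 = (x1 - x2) ** 2 + (y1 - y2) ** 2      # squared distance, a common CPPN input
    return math.sin(2.0 * x1) * math.exp(-d2)  # symmetric, repeating weight pattern

def build_substrate(grid, threshold=0.2):
    """Query the CPPN for every ordered pair of substrate nodes and
    express only connections whose weight magnitude exceeds a threshold."""
    connections = {}
    for (x1, y1) in grid:
        for (x2, y2) in grid:
            w = cppn(x1, y1, x2, y2)
            if abs(w) > threshold:             # expression threshold
                connections[((x1, y1), (x2, y2))] = w
    return connections

# A 3x3 substrate laid out on [-1, 1] x [-1, 1].
coords = [(float(x), float(y)) for x in (-1, 0, 1) for y in (-1, 0, 1)]
net = build_substrate(coords)
print(len(net), "connections expressed")
```

Because the weight of every connection is generated from node geometry rather than stored individually, the same small CPPN can describe substrates of arbitrary resolution, which is what lets HyperNEAT scale to very large networks.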

Applications to Date

References

  1. ^ K. O. Stanley, D. B. D’Ambrosio, and J. Gauci, “A Hypercube-Based Indirect Encoding for Evolving Large-Scale Neural Networks.” Artificial Life, 2009.
  2. ^ Kenneth O. Stanley and Risto Miikkulainen (2002). "Evolving Neural Networks Through Augmenting Topologies". Evolutionary Computation 10 (2): 99–127.
  3. ^ K. O. Stanley, “Compositional pattern producing networks: A novel abstraction of development,” Genetic Programming and Evolvable Machines, vol. 8, pp. 131–162, June 2007.
  4. ^ D. B. D’Ambrosio and K. O. Stanley, “Generative encoding for multiagent learning,” in GECCO ’08: Proceedings of the 10th Annual Conference on Genetic and Evolutionary Computation, (New York, NY, USA), pp. 819–826, ACM, 2008.
  5. ^ J. Gauci and K. O. Stanley, “A case study on the critical role of geometric regularity in machine learning,” in AAAI (D. Fox and C. P. Gomes, eds.), pp. 628–633, AAAI Press, 2008.
  6. ^ Jeff Clune, Benjamin Beckmann, Charles Ofria, and Robert Pennock. “Evolving Coordinated Quadruped Gaits with the HyperNEAT Generative Encoding.” Proceedings of the IEEE Congress on Evolutionary Computation, Special Section on Evolutionary Robotics. Trondheim, Norway, 2009.
  7. ^ J. Clune, R. T. Pennock, and C. Ofria. “The sensitivity of HyperNEAT to different geometric representations of a problem.” Proceedings of the Genetic and Evolutionary Computation Conference (GECCO). Montreal, Canada, 2009.
  8. ^ J. Yosinski, J. Clune, D. Hidalgo, S. Nguyen, J. Cristobal Zagal, and H. Lipson (2011). “Evolving Robot Gaits in Hardware: the HyperNEAT Generative Encoding Vs. Parameter Optimization.” Proceedings of the European Conference on Artificial Life.
  9. ^ J. Clune, K. O. Stanley, R. T. Pennock, and C. Ofria. “On the performance of indirect encoding across the continuum of regularity.” IEEE Transactions on Evolutionary Computation, 2011.
  10. ^ J. Clune, C. Ofria, and R. T. Pennock, “How a generative encoding fares as problem-regularity decreases,” in PPSN (G. Rudolph, T. Jansen, S. M. Lucas, C. Poloni, and N. Beume, eds.), vol. 5199 of Lecture Notes in Computer Science, pp. 358–367, Springer, 2008.
  11. ^ J. Clune, B. E. Beckmann, R. T. Pennock, and C. Ofria. “HybrID: A Hybridization of Indirect and Direct Encodings for Evolutionary Computation.” Proceedings of the European Conference on Artificial Life (ECAL). Budapest, Hungary, 2009.
  12. ^ J. Clune, B. E. Beckmann, P. K. McKinley, and C. Ofria (2010). “Investigating whether HyperNEAT produces modular neural networks.” Proceedings of the Genetic and Evolutionary Computation Conference, pp. 635–642.
  13. ^ M. Suchorzewski and J. Clune (2011). “A Novel Generative Encoding for Evolving Modular, Regular and Scalable Networks.” Proceedings of the Genetic and Evolutionary Computation Conference, pp. 1523–1530.
  14. ^ P. Verbancsics and K. O. Stanley (2011). “Constraining Connectivity to Encourage Modularity in HyperNEAT.” Proceedings of the Genetic and Evolutionary Computation Conference (GECCO-2011). New York, NY: ACM.