Hypercomputation

Hypercomputation or super-Turing computation refers to models of computation that go beyond Turing computability. This includes various hypothetical methods for the computation of non-Turing-computable functions.

The term "super-Turing computation" emerged in the early 1990s, with at least two independent sources noted in the literature. Hava Siegelmann used it from the early 1990s in talks, her PhD thesis,[1] and even earlier technical reports, for a very particular theory (described below), and the topic became a subfield of computation with her 1995 Science paper.[2] It also appears in a 1990 talk[3] and a 1991 technical report[4] by Mike Stannett, who also published a theoretical discussion of an X-machine-based "super-Turing machine" in March 1990.[5]

The term "hypercomputation" was introduced in 1999 by Jack Copeland and Diane Proudfoot.[6]

The terms are not quite synonymous: "super-Turing computation", as used by Siegelmann, usually implies that the proposed model is likely to exist in nature (via biology and/or physics) and may thus be physically realizable, while "hypercomputation" may denote a purely mathematical or philosophical idea.

History

A computational model going beyond Turing machines was introduced by Alan Turing in his 1938 PhD dissertation Systems of Logic Based on Ordinals.[7] The dissertation investigated mathematical systems in which an oracle was available that could compute a single arbitrary (non-recursive) function from the naturals to the naturals. He used this device to prove that even in those more powerful systems, undecidability is still present. Turing's oracle machines are mathematical abstractions and are not physically realizable.[8]
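Turing's oracle construction is purely relative: ordinary machinery plus a black-box query step. A minimal sketch of the idea (the `relative_machine` and `oracle` names are illustrative choices here, not from Turing's paper):

```python
# Illustrative sketch of Turing's oracle idea: an otherwise
# ordinary program that may query a black box it could never
# compute itself. The oracle is a stand-in parameter.

def relative_machine(n, oracle):
    """A computation 'relative to' an oracle: the control logic
    is ordinary Turing-machine work, but one step per iteration
    consults the oracle."""
    total = 0
    for k in range(n):
        total += oracle(k)  # a single oracle query per step
    return total

# With a computable stand-in oracle, the whole machine stays
# computable. Turing's point is that undecidability reappears
# relative to ANY oracle: the halting problem for oracle machines
# is undecidable even by oracle machines with the same oracle.
computable_stand_in = lambda k: k % 2
print(relative_machine(5, computable_stand_in))  # 0+1+0+1+0 = 2
```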

Hypercomputation and the Church–Turing thesis

The Church–Turing thesis states that any algorithmically computable function can be computed by a Turing machine. Hypercomputers compute functions that a Turing machine cannot, and which are therefore not computable in the Church–Turing sense.

An example of a problem a Turing machine cannot solve is the halting problem. A Turing machine cannot decide if an arbitrary program halts or runs forever. Some proposed hypercomputers could simulate the program for an infinite number of steps and tell the user whether the program halted. In particular, Siegelmann showed in her 1993 PhD thesis,[1] and later in her 1998 book,[9] that the halting problem of Turing machines can be solved with analog recurrent neural networks.
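The undecidability behind the halting problem can be sketched in a few lines. Assuming a hypothetical `halts` function existed (the name is a placeholder; no Turing machine can implement it for all inputs), the classic diagonalization yields a contradiction:

```python
# Sketch of the diagonalization argument: if a total function
# halts(program, arg) were Turing-computable, we could build a
# program that contradicts it. `halts` is hypothetical.

def halts(program, arg):
    """Hypothetical halting oracle: returns True iff program(arg)
    eventually halts. Uncomputable on a Turing machine."""
    raise NotImplementedError("uncomputable on a Turing machine")

def contrary(program):
    # Loop forever exactly when `program` would halt on itself.
    if halts(program, program):
        while True:
            pass
    return "halted"

# Feeding `contrary` to itself gives the contradiction:
# contrary(contrary) halts if and only if it does not halt.
# A hypercomputer with a genuine halting oracle escapes this
# argument only because `halts` is not one of its own programs.
```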

Super-Turing computation and the super-Turing thesis

In the early 1990s, Hava Siegelmann and Eduardo Sontag proved that an artificial recurrent neural network (ARNN) could perform beyond the Turing limit, i.e., hypercomputation.[10][11]

Siegelmann and colleagues revealed a hierarchy of well-established computational classes that starts at the Turing machine and ascends to full super-Turing power.[12] Siegelmann described in various publications many routes to super-Turing computation. She first showed the existence of super-Turing computation via analog neural networks with fixed (non-learning), unbounded-precision weights, and went on to prove super-Turing power in many varied realizable machines, including small-precision-weight neural networks that obtain their power from learning in the analog domain, from stochasticity,[9] from evolving over time,[13] from using data from the environment, or from being Turing-equivalent at every computable step while changing between steps by one of the above methods. See also work on verifying properties of neural networks.
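As a rough illustration only, not Siegelmann's actual construction: the ARNN model updates its state through a saturated-linear activation, and a single infinite-precision real weight can carry an unbounded bit string in its binary expansion. The two-neuron setup and all names below are assumptions for exposition:

```python
# Toy illustration of the ARNN state update and of how one real
# weight encodes infinitely many bits. Not the published proof.

def sat(x):
    """Saturated-linear activation used in the ARNN model."""
    return max(0.0, min(1.0, x))

def step(state, weights, inp):
    """One synchronous update: x' = sat(W x + b + u)."""
    return tuple(
        sat(sum(w * s for w, s in zip(row, state)) + b + u)
        for row, b, u in zip(weights["W"], weights["b"], [inp] * len(state))
    )

# A real number in [0, 1) stores an infinite advice sequence in
# its binary expansion; reading bit k is just a shift and mask.
def advice_bit(w, k):
    return int(w * 2 ** (k + 1)) % 2

w = 0.1011  # finite stand-in; the theory permits arbitrary reals
print([advice_bit(w, k) for k in range(4)])
```

With rational weights such networks are Turing-equivalent; the super-Turing results arise when the weights may be arbitrary reals.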

Siegelmann's Science paper demonstrated a first step toward realizing this computation,[2] and efforts by physicists and engineers to build such systems are ongoing. Siegelmann's recent research shows that the state space of super-Turing computation has cardinality 2^{\aleph_0}, while the space of Turing machines has cardinality only \aleph_0, giving a new reading of the Turing test as a separation between spaces of different size.[14]
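The cardinality claim rests on a standard counting argument: Turing machines have finite descriptions over a finite alphabet, so there are only countably many of them, whereas an analog system with a real-valued parameter ranges over a continuum:

```latex
\[
  \bigl|\{\text{Turing machines}\}\bigr| = \aleph_0,
  \qquad
  \bigl|[0,1)\bigr| = \bigl|\{0,1\}^{\mathbb{N}}\bigr| = 2^{\aleph_0} > \aleph_0 .
\]
```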

The theory led to a better understanding of neural networks (the foundation of deep learning) and supported innovative applications in such sensitive areas as radar registration and the control of nuclear plants.[15][16][17][18] Siegelmann and colleagues went on to create a complexity theory for continuous-time and physical systems.[19][20] As a particular example, Siegelmann's group analyzed linear programming and other computer-science problems, showing that analog computers can solve them faster than discrete-time computers.[21][22]

A recent publication argues that Alan Turing had been searching for super-Turing computation based on brain principles, and that his suggested directions for finding it closely match Siegelmann's theories.[23] Siegelmann and Sontag proposed a new computational hypothesis: that analog, learning, and evolving systems are bounded by super-Turing computational power.

Other hypercomputer proposals

Analysis of capabilities

Many hypercomputation proposals amount to alternative ways to read an oracle or advice function embedded into an otherwise classical machine. Others allow access to some higher level of the arithmetical hierarchy. For example, supertasking Turing machines, under the usual assumptions, would be able to compute any predicate in the truth-table degree containing \Sigma^0_1 or \Pi^0_1. Limiting recursion, by contrast, can compute any predicate or function in the corresponding Turing degree, which is known to be \Delta^0_2. Gold further showed that limiting partial recursion would allow the computation of precisely the \Sigma^0_2 predicates.
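Limiting recursion can be made concrete with a small sketch. A trial-and-error machine may revise its output; only the limit of its guesses must be correct. Modeling programs as Python generators (an illustrative simplification, not Gold's formalism), a \Delta^0_2 predicate like halting becomes guessable:

```python
# Sketch of limiting (trial-and-error) computation in the sense
# of Gold and Putnam: the machine revises its answer, and only
# the LIMIT of its guesses must be correct.

def guesses(program_steps, max_rounds):
    """Yield a guess after simulating each additional step.
    The guesses stabilize at the true answer: 'halts' if the
    program halts, 'loops' if we never see it halt."""
    it = iter(program_steps)
    for _ in range(max_rounds):
        try:
            next(it)          # simulate one more step
            yield "loops"     # provisional guess, may be revised
        except StopIteration:
            yield "halts"     # once reached, never revised again

def halting_program():        # a toy program that halts in 3 steps
    yield from range(3)

print(list(guesses(halting_program(), 6)))
# → ['loops', 'loops', 'loops', 'halts', 'halts', 'halts']
# The guesses converge after finitely many revisions. A Turing
# machine can produce the guesses, but cannot in general announce
# WHEN they have converged -- which is why this sits at Delta^0_2.
```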

| Model | Computable predicates | Notes | Refs |
|---|---|---|---|
| supertasking | tt(\Sigma^0_1, \Pi^0_1) | dependent on outside observer | [47] |
| limiting/trial-and-error | \Delta^0_2 | | [25] |
| iterated limiting (k times) | \Delta^0_{k+1} | | [27] |
| Blum–Shub–Smale machine | | incomparable with traditional computable real functions | [48] |
| Malament–Hogarth spacetime | HYP | dependent on spacetime structure | [49] |
| analog recurrent neural network | \Delta^0_1[f] | f is an advice function giving connection weights; size is bounded by runtime | [2][50] |
| infinite time Turing machine | \ge T(\Sigma^1_1) | | [51] |
| classical fuzzy Turing machine | \Sigma^0_1 \cup \Pi^0_1 | for any computable t-norm | [45] |
| increasing function oracle | \Delta^1_1 for the one-sequence model; \Pi^1_1 are r.e. | | [46] |

Taxonomy of "super-recursive" computation methodologies

Mark Burgin has collected a list of what he calls "super-recursive algorithms" (from Burgin 2005: 132):

In the same book, he also presents a list of "algorithmic schemes":

Criticism

Martin Davis, in his writings on hypercomputation,[56][57] refers to the subject as "a myth" and offers counter-arguments to the physical realizability of hypercomputation. As for its theory, he argues against the claim that this is a new field founded in the 1990s, a view that relies on the long history of computability theory (degrees of unsolvability; computability over functions, real numbers, and ordinals), as also mentioned above. He further remarks that much of hypercomputation is trivial in that "if non computable inputs are permitted then non computable outputs are attainable." It is widely accepted that this criticism applies to the earliest mathematical and philosophical proposals and ignores many of the newer proposals, which are not subject to it.

Andrew Hodges wrote a critical commentary[58] on Copeland and Proudfoot's article.[6]

References

  1. Hava Siegelmann (Oct 1993). Foundations of Recurrent Neural Networks (Ph.D. thesis). Rutgers, The State University of New Jersey.
  2. H.T. Siegelmann (Apr 1995). "Computation Beyond the Turing Limit" (PDF). Science 268 (5210): 545–548. doi:10.1126/science.268.5210.545. PMID 17756722.
  3. Mike Stannett, Super-Turing Computation. Seminar presentation (scans of original slides), Department of Computer Science, University of Sheffield, 1990.
  4. Mike Stannett, An Introduction to post-Newtonian and super-Turing computation. Technical Report CS-91-02, Department of Computer Science, University of Sheffield, 1991.
  5. Mike Stannett (1990). "X-machines and the halting problem: Building a super-Turing machine". Formal Aspects of Computing 2 (1): 331–341. http://link.springer.com/article/10.1007%2FBF01888233
  6. Copeland and Proudfoot, "Alan Turing's forgotten ideas in computer science". Scientific American, April 1999.
  7. Alan Turing, 1939, Systems of Logic Based on Ordinals Proceedings London Mathematical Society Volumes 2–45, Issue 1, pp. 161–228.
  8. "Let us suppose that we are supplied with some unspecified means of solving number-theoretic problems; a kind of oracle as it were. We shall not go any further into the nature of this oracle apart from saying that it cannot be a machine" (Undecidable p. 167, a reprint of Turing's paper Systems of Logic Based On Ordinals)
  9. H.T. Siegelmann, Neural Networks and Analog Computation: Beyond the Turing Limit, Birkhäuser, Boston, December 1998.
  10. H.T. Siegelmann and E.D. Sontag (1995). "Computational Power of Neural Networks" (PDF). Journal of Computer and System Sciences 50 (1): 132–150. doi:10.1006/jcss.1995.1013. (Described as the most fundamental theorem about neural networks in Simon Haykin's book on neural networks.)
  11. H.T. Siegelmann and E.D. Sontag (1994). "Analog Computation via Neural Networks" (PDF). Theoretical Computer Science 131: 331–360. doi:10.1016/0304-3975(94)90178-3. (Described as the fundamental theorem differentiating neural networks from classical computers in Simon Haykin's book on neural networks; widely cited in the field and in the media.)
  12. J.L. Balcázar, R. Gavaldà and H.T. Siegelmann (Jul 1997). "Computational Power of Neural Networks: A Characterization in Terms of Kolmogorov Complexity" (PDF). IEEE Transactions on Information Theory 43 (4): 1175–1183. doi:10.1109/18.605580.
  13. J. Cabessa and H. T. Siegelmann (2014). "The Super-Turing Computational Power of Plastic Recurrent Neural Networks" (PDF). International Journal of Neural Systems 24 (8).
  14. J. Cabessa and H.T. Siegelmann (Apr 2012). "The Computational Power of Interactive Recurrent Neural Networks" (PDF). Neural Computation 24 (4): 996–1019. doi:10.1162/neco_a_00263.
  15. J.P. Neto, H.T. Siegelmann, and J.F. Costa, “Implementation of Programming Languages with Neural Nets,” International Journal of Computing Anticipatory Systems 1, 1997: 201-208
  16. J. Kilian and H.T. Siegelmann, “The Dynamic Universality of Sigmoidal Neural Networks,” Information and Computation 128(1), 1996: 45-56.
  17. H.T. Siegelmann, B.G. Horne, and C.L.Giles, “Computational Capabilities of Recurrent NARX Neural Networks,” IEEE Transaction on Systems, Man and Cybernetics – part B: Cybernetics 27(2), 1997: 208-215.
  18. H.T. Siegelmann, E. Nissan, and A. Galperin, “A Novel Neural/Symbolic Hybrid Approach to Heuristically Optimized Fuel Allocation and Automated Revision of Heuristics in Nuclear Engineering,” Advances in Engineering Software 28(9), 1997: 581-592.
  19. H.T. Siegelmann, A. Ben-Hur and S. Fishman, “Computational Complexity for Continuous Time Dynamics,” Physical Review Letters, 83(7), 1999: 1463-1466.
  20. H.T. Siegelmann and S. Fishman, “Computation by Dynamical Systems,” Physica D 120, 1998 (1-2): 214-235.
  21. A. Ben-Hur, J. Feinberg, S. Fishman and H. T. Siegelmann, “Random matrix theory for the analysis of the performance of an analog computer: a scaling theory,” Physics Letters A. 323(3-4), March 2004: 204-209.
  22. A. Ben-Hur, J. Feinberg, S. Fishman and H. T. Siegelmann, “Probabilistic analysis of a differential equation for linear programming,” Journal of Complexity 19(4), August 2003: 474-510.
  23. H. T. Siegelmann, “Turing on Super-Turing and Adaptivity”. J. Progress in Biophysics & Molecular Biology. April (Sep) 2013, 113(1):117-26. doi: 10.1016/j.pbiomolbio.2013.03.013.
  24. These models have been independently developed by many different authors, including Hermann Weyl (1927). Philosophie der Mathematik und Naturwissenschaft.; the model is discussed in Shagrir, O. (June 2004). "Super-tasks, accelerating Turing machines and uncomputability" (PDF). Theor. Comput. Sci. 317: 105–114. doi:10.1016/j.tcs.2003.12.007. and in Petrus H. Potgieter (July 2006). "Zeno machines and hypercomputation". Theoretical Computer Science 358 (1): 23–33. doi:10.1016/j.tcs.2005.11.040.
  25. E. M. Gold (1965). "Limiting Recursion". Journal of Symbolic Logic 30 (1): 28–48. doi:10.2307/2270580. JSTOR 2270580; E. Mark Gold (1967). "Language identification in the limit". Information and Control 10 (5): 447–474. doi:10.1016/S0019-9958(67)91165-5.
  26. Hilary Putnam (1965). "Trial and Error Predicates and the Solution to a Problem of Mostowski". Journal of Symbolic Logic 30 (1): 49–57. doi:10.2307/2270581. JSTOR 2270581.
  27. L. K. Schubert (July 1974). "Iterated Limiting Recursion and the Program Minimization Problem". Journal of the ACM 21 (3): 436–445. doi:10.1145/321832.321841.
  28. Arnold Schönhage, "On the power of random access machines", in Proc. Intl. Colloquium on Automata, Languages, and Programming (ICALP), pages 520-529, 1979. Source of citation: Scott Aaronson, "NP-complete Problems and Physical Reality" p. 12
  29. Edith Spaan, Leen Torenvliet and Peter van Emde Boas (1989). "Nondeterminism, Fairness and a Fundamental Analogy". EATCS bulletin 37: 186–193.
  30. Hajnal Andréka, István Németi and Gergely Székely, Closed Timelike Curves in Relativistic Computation Parallel Process. Lett. 22, 1240010 (2012).
  31. Todd A. Brun, Computers with closed timelike curves can solve hard problems, Found.Phys.Lett. 16 (2003) 245-253.
  32. S. Aaronson and J. Watrous. Closed Timelike Curves Make Quantum and Classical Computing Equivalent
  33. Hogarth, M., 1992, 'Does General Relativity Allow an Observer to View an Eternity in a Finite Time?', Foundations of Physics Letters, 5, 173–181.
  34. István Neméti; Hajnal Andréka (2006). "Can General Relativistic Computers Break the Turing Barrier?". Logical Approaches to Computational Barriers, Second Conference on Computability in Europe, CiE 2006, Swansea, UK, June 30-July 5, 2006. Proceedings. Lecture Notes in Computer Science 3988. Springer. doi:10.1007/11780342.
  35. Etesi, G., and Nemeti, I., 2002 'Non-Turing computations via Malament-Hogarth space-times', Int.J.Theor.Phys. 41 (2002) 341–370, Non-Turing Computations via Malament-Hogarth Space-Times:.
  36. Earman, J. and Norton, J., 1993, 'Forever is a Day: Supertasks in Pitowsky and Malament-Hogarth Spacetimes', Philosophy of Science, 5, 22–42.
  37. Joel David Hamkins and Andy Lewis, Infinite time Turing machines, Journal of Symbolic Logic, 65(2):567-604, 2000.
  38. Jan van Leeuwen; Jiří Wiedermann (September 2000). "On Algorithms and Interaction". MFCS '00: Proceedings of the 25th International Symposium on Mathematical Foundations of Computer Science. Springer-Verlag.
  39. Jürgen Schmidhuber (2000). "Algorithmic Theories of Everything". arXiv:quant-ph/0011122. Sections appear in: "Hierarchies of generalized Kolmogorov complexities and nonenumerable universal measures computable in the limit", International Journal of Foundations of Computer Science 13 (4): 587–612; Section 6 appears in: "The Speed Prior: A New Simplicity Measure Yielding Near-Optimal Computable Predictions", in J. Kivinen and R. H. Sloan, editors, Proceedings of the 15th Annual Conference on Computational Learning Theory (COLT), Sydney, Australia, Lecture Notes in Artificial Intelligence, pages 216–228, Springer.
  40. J. Schmidhuber (2002). "Hierarchies of generalized Kolmogorov complexities and nonenumerable universal measures computable in the limit". International Journal of Foundations of Computer Science 13 (4): 587–612. doi:10.1142/S0129054102001291.
  41. There have been some claims to this effect; see Tien Kieu (2003). "Quantum Algorithm for the Hilbert's Tenth Problem". Int. J. Theor. Phys. 42 (7): 1461–1478. arXiv:quant-ph/0110136. doi:10.1023/A:1025780028846. or M. Ziegler (2005). "Computational Power of Infinite Quantum Parallelism". International Journal of Theoretical Physics 44 (11): 2059–2071. doi:10.1007/s10773-005-8984-0. and the ensuing literature. For a retort see Warren D. Smith. "Three counterexamples refuting Kieu's plan for "quantum adiabatic hypercomputation"; and some uncomputable quantum mechanical tasks". Applied Mathematics and Computation 178 (1): 184–193. doi:10.1016/j.amc.2005.09.078.
  42. Bernstein and Vazirani, Quantum complexity theory, SIAM Journal on Computing, 26(5):1411-1473, 1997.
  43. Santos, Eugene S. (1970). "Fuzzy Algorithms". Information and Control 17 (4): 326–339. doi:10.1016/S0019-9958(70)80032-8.
  44. Biacino, L.; Gerla, G. (2002). "Fuzzy logic, continuity and effectiveness". Archive for Mathematical Logic 41 (7): 643–667. doi:10.1007/s001530100128. ISSN 0933-5846.
  45. Wiedermann, Jiří (2004). "Characterizing the super-Turing computing power and efficiency of classical fuzzy Turing machines". Theor. Comput. Sci. 317 (1–3): 61–69. doi:10.1016/j.tcs.2003.12.004.
  46. Dmytro Taranovsky (July 17, 2005). "Finitism and Hypercomputation". Retrieved Apr 26, 2011.
  47. Petrus H. Potgieter (July 2006). "Zeno machines and hypercomputation". Theoretical Computer Science 358 (1): 23–33. doi:10.1016/j.tcs.2005.11.040.
  48. Lenore Blum, Felipe Cucker, Michael Shub, and Stephen Smale. Complexity and Real Computation. ISBN 0-387-98281-7.
  49. P.D. Welch (2008). "The extent of computation in Malament–Hogarth spacetimes". British Journal for the Philosophy of Science 59: 659–674. arXiv:gr-qc/0609035.
  50. Hava Siegelmann; Eduardo Sontag (1994). "Analog Computation via Neural Networks". Theoretical Computer Science 131 (2): 331–360. doi:10.1016/0304-3975(94)90178-3.
  51. Joel David Hamkins; Andy Lewis (2000). "Infinite Time Turing machines". Journal of Symbolic Logic 65 (2): 567–604. doi:10.2307/2586556.
  52. Hintikka, Ja; Mutanen, A. (1998). "An Alternative Concept of Computability". Language, Truth, and Logic in Mathematics. Dordrecht. pp. 174–188.
  53. Darko Roglic (24 Jul 2007). "The universal evolutionary computer based on super-recursive algorithms of evolvability". arXiv:0708.2686 [cs.NE].
  54. Eugene Eberbach (2002). "On expressiveness of evolutionary computation: is EC algorithmic?". Computational Intelligence, WCCI 1: 564–569. doi:10.1109/CEC.2002.1006988.
  55. Borodyanskii, Yu M; Burgin, M. S. (1994). "Operations and compositions in transrecursive operators". Cybernetics and Systems Analysis 30 (4): 473–478. doi:10.1007/BF02366556.
  56. Davis, Martin (2006). "Why there is no such discipline as hypercomputation". Applied Mathematics and Computation 178 (1): 4–7. doi:10.1016/j.amc.2005.09.066.
  57. Davis, Martin (2004). "The Myth of Hypercomputation". Alan Turing: Life and Legacy of a Great Thinker. Springer.
  58. Andrew Hodges. "The Professors and the Brainstorms". The Alan Turing Home Page. Retrieved 23 September 2011.


This article is issued from Wikipedia, version of Tuesday, February 09, 2016. The text is available under the Creative Commons Attribution/Share Alike license; additional terms may apply for the media files.