Simulation hypothesis

Illustration of the brain in a vat concept

The simulation hypothesis (also called the simulation argument or simulism) proposes that reality is a simulation and that those within it are generally unaware of the fact. The concept is reminiscent of René Descartes' Evil Genius, but posits a more futuristic, technological simulated reality.

Origins

There is a long philosophical and scientific history to the underlying thesis that reality is an illusion. This skeptical hypothesis (which can be dated in Western thought back to Parmenides, Zeno of Elea and Plato and in Eastern thought to the Advaita Vedanta concept of Maya) arguably underpins the mind–body dualism of Descartes, and is closely related to phenomenalism, a stance briefly adopted by Bertrand Russell. In a narrower sense it has become an important theme in science fiction, and recently has become a serious topic of study for futurology, in particular for transhumanism through the work of Nick Bostrom. The Simulation Hypothesis is a subject of serious academic debate[1] within the field of transhumanism.

In its current form, the Simulation Argument began in 2003 with the publication of a paper by Nick Bostrom.[1] Bostrom considers that the argument goes beyond skepticism, claiming that "...we have interesting empirical reasons to believe that a certain disjunctive claim about the world is true", one of the disjunctive propositions being that we are almost certainly living in a simulation.[2] Bostrom and other writers postulate there are empirical reasons why the 'Simulation Hypothesis' might be valid.[1][3] Bostrom's trilemma is formulated as follows:[4]

"A technologically mature "posthuman" civilization would have enormous computing power. Based on this empirical fact, the simulation argument shows that at least one of the following propositions is true:
  1. The fraction of human-level civilizations that reach a posthuman stage is very close to zero;
  2. The fraction of posthuman civilizations that are interested in running ancestor-simulations is very close to zero;
  3. The fraction of all people with our kind of experiences that are living in a simulation is very close to one.
If (1) is true, then we will almost certainly go extinct before reaching posthumanity. If (2) is true, then there must be a strong convergence among the courses of advanced civilizations so that virtually none contains any relatively wealthy individuals who desire to run ancestor-simulations and are free to do so. If (3) is true, then we almost certainly live in a simulation. In the dark forest of our current ignorance, it seems sensible to apportion one’s credence roughly evenly between (1), (2), and (3).
Unless we are now living in a simulation, our descendants will almost certainly never run an ancestor-simulation."
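
The counting behind the trilemma can be made concrete. The short sketch below is a minimal illustration rather than Bostrom's own notation: the function name and parameter names are hypothetical, but the ratio it computes follows the argument of the paper, namely the fraction of observers with human-type experiences who live in simulations. Unless the fraction of civilizations reaching posthumanity or the average number of ancestor-simulations they run is made extremely small (horns 1 and 2), the ratio approaches one (horn 3).

    def simulated_fraction(f_posthuman, n_simulations, people_per_civ=1.0):
        """Fraction of observers with human-type experiences who are simulated.

        f_posthuman    -- fraction of human-level civilizations that reach a
                          posthuman stage
        n_simulations  -- average number of ancestor-simulations run by each
                          posthuman civilization
        people_per_civ -- average number of individuals in a civilization before
                          it becomes posthuman; it cancels out of the ratio, so
                          the default of 1.0 is harmless
        """
        simulated = f_posthuman * n_simulations * people_per_civ
        unsimulated = people_per_civ
        return simulated / (simulated + unsimulated)

    # Even a small chance of reaching posthumanity dominates the count...
    print(simulated_fraction(f_posthuman=0.01, n_simulations=1_000_000))   # ~0.9999
    # ...unless that chance (or the interest in running simulations) is tiny.
    print(simulated_fraction(f_posthuman=1e-12, n_simulations=1_000_000))  # ~1e-6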

Chalmers, in The Matrix as Metaphysics, agrees that this is not a skeptical hypothesis but rather a Metaphysical Hypothesis.[5] Chalmers goes on to identify three separate hypotheses which, when combined, give what he terms the Matrix Hypothesis, the notion that reality is but a computer simulation:

  • The Creation Hypothesis, that "Physical space-time and its contents were created by beings outside physical space-time".[5] It is related to the Omphalos hypothesis in theology.
  • The Computational Hypothesis, that "Microphysical processes throughout space-time are constituted by underlying computational processes"[5]
  • The Mind–Body Hypothesis, that "mind is constituted by processes outside physical space-time, and receives its perceptual inputs from and sends its outputs to processes in physical space-time".[5]

The term Simulism appears to have been coined by Ivo Jansch in September 2006.

Descartes

Descartes' Meditations

Descartes (1596–1650) is one of the first 'modern' thinkers to attempt to provide a philosophical framework of mind and the world we perceive around us, seeking a fundamental set of truths. In his writings, Descartes employs a version of methodological skepticism, the first precept of which he states is "never to accept anything for true which I did not clearly know to be such".[6]

In his work Meditations on First Philosophy, he writes that he can only be sure of one thing: that thought exists – cogito ergo sum, normally translated as "I think, therefore I am".[7] One of the fundamental ideas explored by Descartes is mind–body dualism, which impinges on the nature of reality as we perceive it and concerns the relationship between mental processes and bodily states. Descartes mused whether his perception of a body was the result of a dream, or an illusion created by an evil demon. He reasons that "The mind is a substance distinct from the body, a substance whose essence is thought."[7] From this stance, Descartes goes on to argue:
"I have a clear and distinct idea of myself as a thinking, non-extended thing, and a clear and distinct idea of body as an extended and non-thinking thing. Whatever I can conceive clearly and distinctly, God can so create." [7]
Descartes concludes that the mind, a thinking thing, can and does exist apart from its extended body. This relationship of the mind to the body is arguably one of the central issues in the philosophy of mind.[8] Descartes also discussed the existence of the external world, arguing that sensory perceptions are involuntary and not consciously directed, and as such are evidence of a world external to the mind, since God has given him the "propensity" to believe that such ideas are caused by material things.[7]

Later critics responded to Descartes' 'proof' of the external world with the brain in a vat thought experiment, suggesting that Descartes' brain might be connected to a machine which simulates all of these perceptions. However, the vat and the machine would themselves exist in an external world, so one form of external world is simply replaced by another.

Later thinkers

David Hume

Hume (1711–1776) argued for two kinds of reasoning: probable and demonstrative (Hume's fork), and applied these to the skeptical argument that reality is but an illusion. He concludes that neither of these two forms of reasoning can lead us to belief in the continued existence of an external world. Demonstration by itself cannot establish the uniformity of nature (as laid out by scientific laws and principles), and reason alone cannot establish that the future will resemble the past (e.g. that the sun will rise tomorrow). Probable reasoning, which aims to take us from the observed to the unobserved, cannot do this either, as it also depends on the uniformity of nature, and that uniformity cannot be proved, without circularity, by any appeal to uniformity itself. Hume concludes that there is no solution to the skeptical argument except to ignore it.[9]

Immanuel Kant

Kant (1724–1804) was an advocate of Transcendental Idealism, the view that there are limits on what can be understood, and that what we see as reality is merely how things appear to us, not how those things are in and of themselves. In his Critique of Pure Reason he notes:
"Everything intuited or perceived in space and time, and therefore all objects of a possible experience, are nothing but phenomenal appearances, that is, mere representations [and] have no independent, self-subsistent existence apart from our thoughts".[10]

An important theme in Kant's work is that there are fundamental features of reality that escape our direct knowledge because of the natural limits of our senses and faculties.[10]

Hegel, Husserl and Heidegger

These three philosophers form the core of Phenomenological thought.

Hegel (1770–1831) proposed a conception of knowledge, mind and reality in which the mind itself creates external forms and objects that stand outside of it or opposed to it. The mind recognizes itself in these external forms, so that they become simultaneously 'mind' and 'other-than-mind'.[11]

Husserl (1859–1938) observed that the 'natural standpoint' of our perception of the world and its objects is characterized by a belief that the objects exist and possess properties. Husserl proposed a way of looking at objects by examining how we "constitute" them as (seemingly) real objects, rather than simply figments of our imagination. In this Phenomenological standpoint, the object ceases to be "external", with mere indicators about its nature, its essence arising from the relationship between the object and the perceiver.[12]

Heidegger (1889–1976), in Being and Time, addresses the question of the meaning of Being and distinguishes it from any specific being: "'Being' is not something like a being".[13] According to Heidegger, this sense of being precedes any notions of which beings exist, as it is a primary construct.

Phenomenalism

Phenomenalism is the view that physical objects do not exist as things in themselves but only as perceptions or sensory stimuli (e.g. redness, hardness, softness, sweetness, etc.) situated in time and in space. In particular, phenomenalism reduces talk about physical objects in the external world to talk about bundles of sense-data. For a brief period, Bertrand Russell (1872–1970) held the view that all that we could be aware of was this sense data; everything else, including physical objects which generated the sense data, could only be known by description, and not known directly.[14]

Modal realism

Modal realism asserts that all possible worlds are as real as the actual world. The notion of a "possible world" goes back to Leibniz, and is used to enable logical analysis of propositions. Modal realism was first proposed in papers by David Lewis in the late 1960s,[15] and elaborated upon in Counterfactuals (1973),[16] which contained an analysis of counterfactual conditionals in terms of the theory of possible worlds and modelled counterfactuals using the possible-world semantics of modal logic. In On the Plurality of Worlds (1986), Lewis defends "the thesis that the world we are part of is but one of a plurality of worlds, ... and that we who inhabit this world are only a few out of all the inhabitants of all the worlds."

Note: The only relevance of modal realism to the simulation hypothesis is this: if all possible worlds exist, then there is some possible world in which someone has experiences just like yours as the result of a simulation, and you cannot be sure that the real world is not such a world. This, however, adds little to the simulation hypothesis for it is already presumed in the simulation hypothesis that the real world might actually be a world in which your experiences are the result of a simulation.

Constructivism

Ernst von Glasersfeld is a proponent of Radical Constructivism, which claims that knowledge is the result of a self-organizing cognitive process of the human brain. The process of constructing knowledge regulates itself, whereby knowledge is constructed rather than compiled from empirical data. It is therefore impossible in principle to know the extent to which knowledge reflects an external reality: "The function of cognition is adaptive and serves the organisation of the experiential world, not the discovery of ontological reality."[17]

Social constructivism is a sociological theory of knowledge which rose to prominence in 1966 with the publication of The Social Construction of Reality.[18] Social constructivism (or constructionism) attempts to uncover the ways in which individuals and groups participate in creating their perceived reality and negotiating a shared understanding; in this way, reality is socially constructed. Paul Ernest (1991) summarises the main foundations of social constructivism as follows:

"Knowledge is not passively received but actively built up by the cognizing subject. The personal theories which result from the organization of the experiential world must fit the constraints imposed by physical and social reality. This is achieved by a cycle of theory – prediction – test – failure – accommodation – new theory. This gives rise to socially agreed theories of the world." [19]

Computationalism

A Turing Machine consisting of an infinite tape and a tape reader.

Computationalism claims that cognition is a form of computation, and underpins much of the work in Artificial Intelligence. It is related to Functionalism, a philosophy of mind put forth by Hilary Putnam in 1960, inspired by the analogy between the mind and theoretical Turing Machines, which according to the Church–Turing Thesis are capable of carrying out any computable algorithm. Computationalism rests on two theses: (i) Computational Sufficiency, that an appropriate computational structure suffices for the possession of a mind, and (ii) Computational Explanation, that computation provides a framework for the explanation of cognitive processes.[20]
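
As a concrete illustration of the abstract machine invoked by the Church–Turing Thesis, the following minimal sketch simulates a one-tape Turing machine in Python. The representation (a transition table mapping state and symbol to a write, a head move and a next state) is standard, but the function name and the toy bit-flipping program are purely illustrative.

    def run_turing_machine(program, tape, state="start", blank="_", max_steps=10_000):
        """Run a one-tape Turing machine described by a transition table.

        program maps (state, symbol) -> (symbol_to_write, head_move, next_state),
        where head_move is -1 (left), 0 (stay) or +1 (right), and the state
        "halt" stops the machine.
        """
        cells = dict(enumerate(tape))  # sparse tape: position -> symbol
        head = 0
        for _ in range(max_steps):
            if state == "halt":
                break
            symbol = cells.get(head, blank)
            write, move, state = program[(state, symbol)]
            cells[head] = write
            head += move
        return "".join(cells[i] for i in sorted(cells))

    # Toy program: flip every bit, halting at the first blank cell.
    flip_bits = {
        ("start", "0"): ("1", +1, "start"),
        ("start", "1"): ("0", +1, "start"),
        ("start", "_"): ("_", 0, "halt"),
    }
    print(run_turing_machine(flip_bits, "10110"))  # prints 01001_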

Computationalism assumes the possibility of Strong AI, which would be required in order to establish even a theoretical possibility of a simulated reality. However, the relationship between cognition and phenomenal consciousness is disputed by Searle in an argument known as the Chinese room.[21] Further critics have argued that it is possible that consciousness requires a substrate of "real" physics, and simulated people, while behaving appropriately, would be philosophical zombies.[22]

Transhumanism

Converging Technologies (2002) explores the potential for technological improvements to human performance.

The first known use of the term "Transhumanism" was by Julian Huxley in 1957. During the 1980s a group of scientists, artists, and futurists began to organize into the transhumanist movement. Transhumanist thinkers postulate that human beings will eventually be transformed into beings with such greatly expanded abilities as to merit the label "posthuman".[23] Proponents draw on future studies and various fields of ethics such as bioethics, infoethics, nanoethics, neuroethics, roboethics, and technoethics, and are predominantly secular posthumanist and politically liberal.

Nick Bostrom, in A History of Transhumanist Thought (2005)[23] locates transhumanism's roots in Renaissance humanism and the Enlightenment. Transhumanism can be defined as:

  • The improvement of the human condition through applied reason and technology, in order to eliminate aging and greatly enhance human capacities.
  • The study of the technologies that will enable us to overcome fundamental human limitations, and the ethical issues involved in their use.[24]

The Simulation Argument[1] is part of the Transhumanist debate, located within Digital Philosophy.

Dream argument

The dream argument contends that a futuristic technology is not required to create a simulated reality, but rather, all that is needed is a human brain. More specifically, the mind's ability to create simulated realities during REM sleep affects the statistical likelihood of our own reality being simulated.

Types of reality simulation

Simulation of reality is currently a fictional technology, and non-fictional examples are limited to reality TV or computer simulations of specific events and situations. Current technology in the form of virtual, augmented or mixed reality is very limited in comparison to what would be needed to achieve a convincing simulation of reality. The following typology of the different forms of reality simulation is drawn from examples from both science fiction and futurology. One may usefully distinguish between two types of simulation: in an extrinsic simulation, the consciousness is external to the simulation, whereas in an intrinsic simulation the consciousness is entirely contained within it and has no presence in the external reality.

Physical simulation

Here, participants enter the simulation with their bodies and bodily functions intact, taking part using their normal physical bodies. Examples range from Reality TV shows such as The Big Brother House, which are social simulations, through online social network services such as Second Life and Massively Multiplayer On-Line Role Playing Games, to fictional simulations such as the Star Trek Holodeck. In the extreme case, as fictionally portrayed in the original Star Trek episode "The Menagerie", participants' minds were convinced not only of a simulated reality, but also that their physical bodies had been transformed.

Brain-computer interface

In a brain-computer interface simulation, participants enter the simulation from outside, directly connecting their brain to the simulation computer, but normally keeping their physical form intact. The computer transfers sensory data to them and reads their desires and actions back; in this manner they interact with the simulated world and receive feedback from it. The participant may even receive adjustment in order to temporarily forget that they are inside a virtual realm, sometimes called "passing through the veil", a term borrowed from Christianity, which describes the supposed passage of a soul from an earthly body to an afterlife. While inside the simulation, the participant can be represented by an avatar, which could look very different from the participant's actual appearance. The Cyberpunk genre of fiction contains many examples of brain-computer interface simulated reality, most notably featured in The Matrix trilogy.

Brain-in-a-vat

A variant of the brain-computer-interface simulation is the brain-in-a-vat. This is used in philosophy as part of thought experiments (for example, by Hilary Putnam).

Emigration

In an emigration simulation, the participant enters the simulation from an outer reality via a brain-computer interface, but with a much greater degree of immersion. On entry, the participant undergoes mind transfer, which temporarily relocates their mental processing into a virtual person that holds their consciousness; their outside-world presence remains in stasis during the simulation. After the simulation is over, the participant's mind is transferred back into their outer-reality body, along with all new memories and experiences gained. Mind transfer is portrayed in science fiction such as the novel Mindswap (1966) by Robert Sheckley and the TV series Quantum Leap; most notably, mind transfer was the primary mechanism by which consciousness was transferred in The Thirteenth Floor (1999).

Virtual world simulation

In a virtual world simulation, every inhabitant is a native of the simulated world. They do not have a 'real' body in the 'outside' reality. Rather, each is a fully simulated entity, possessing an appropriate level of consciousness that is implemented using the simulation's own logic (i.e. using its own physics). Typical of such a simulation at one extreme (but with no level of consciousness) would be an artificial life simulation such as The Sims computer game. In many computer games, inhabitants lacking consciousness are referred to as NPCs (Non-player characters), or bots (see Philosophical zombies). Where virtual entities achieve the level of artificial consciousness, they could be downloaded from one simulation to another, or even archived and resurrected at a later date. It is also possible that a simulated entity could be moved out of the simulation entirely by means of mind transfer into a synthetic body. Ancestor simulations as described by Nick Bostrom would fall into this category.

Virtual solipsistic simulation

In this type of simulation, an artificial consciousness is created; the "world" participants perceive exists only within their minds. There are two possible variants: in the first, there is only a single solipsistic conscious entity in existence, which is the sole focus of the simulation; in the second, there are multiple conscious entities, but each receives a separate yet globally consistent version of the simulation. This scenario is a counterpart of social constructivism, which concerns the ways in which groups participate in the creation of their perceived reality.

An intermingled simulation would support both extrinsic and intrinsic types of consciousness: beings from an outer reality visiting or emigrating, and virtual people who are natives of the simulation, whether artificial consciousnesses or bots, lacking any physical body in the outer reality. Sometimes this is termed a metaverse. The Matrix trilogy features an intermingled type of simulation: it contains not only human minds, but also the 'agents', sovereign software programs indigenous to the computed realm, and NPCs.

Consequences of living in a simulation

On the assumption that we are living in a simulation, philosophers have speculated about the nature of its creators. Peter S. Jenkins of York University has argued that there would be multiple motivations for creating a simulation, and that in order to prevent a simulation from creating a further simulation of its own, the first simulation would be deleted; since it is predicted that we will have the technology to create simulations by the year 2050, long-term planning beyond that point "would be futile".[25] This in turn raises questions as to why the creators of a simulation would delete it. More importantly, if our universe were one of many being simulated, the simulation argument could be applied statistically to the creators as well, implying that they too are likely to be in a simulation.

Since Nick Bostrom's 2003 publication, many articles have appeared that discuss computer-simulated realities,[26] including one short book, On Computer Simulated Universes,[27] written by Dr. Mark J. Solomon in 2013. In this work the term Matryoshkaverse was coined, meaning that our universe might be nested inside, and might itself encompass, a vast number of other universes, like a set of Russian wooden dolls each nested inside another (pp. 43–44). Solomon postulates that if there are computer-simulated universes, there must also be simulated universes contained within other simulated universes. With many active simulations, physical properties would vary widely from universe to universe. Universes with physical traits more favourable to life would provide better environments for advanced civilizations to evolve to the point where they themselves would create their own computer-simulated universes, and this process would continue. So over a long period of time, universes with physics more favourable to life would come to predominate. Solomon argues that universes have, over time, been naturally selected for particular physical properties, with the end result of producing more and more habitable, hospitable and longer-lived universes. In other words, this line of reasoning could explain how the laws of physics might actually evolve through a process somewhat similar to human evolution (pp. 21–24).

Solomon also writes that computer programs do not have the ability to make independent choices or engage in independent decision-making; in other words, computer programs do not have free will (and neither, Solomon writes, do human beings). Further, computer programs are entirely dependent on physical laws. In the absence of free will, there no longer appears to be a meaningful distinction between an individual or group of individuals running a simulated universe and one simulated universe running another simulated universe (pp. 32–35). Solomon further contends that, without free will, the distinction often made between individuals or "selves" is largely artificial or arbitrary.

Testing the hypothesis

A method to test the hypothesis was proposed in 2012 in a joint paper by physicists Silas R. Beane of the University of Bonn (now at the University of Washington, Seattle), and Zohreh Davoudi and Martin J. Savage of the University of Washington, Seattle.[28] Under the assumption of finite computational resources, the simulation of the universe would be performed by dividing the continuum of space-time into a discrete set of points. In analogy with the mini-simulations that lattice-gauge theorists run today to build up nuclei from the underlying theory of strong interactions (known as Quantum Chromodynamics), several observational consequences of a grid-like space-time have been studied in their work. Among the proposed signatures is an anisotropy in the distribution of ultra-high-energy cosmic rays which, if observed, would be consistent with the simulation hypothesis according to these physicists, but would not prove that the universe is a simulation. A multitude of physical observables would have to be explored before any such scenario could be accepted or rejected as a theory of nature.[29]
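
For a rough sense of scale, the bound quoted in the paper's abstract, an inverse lattice spacing of roughly 10^11 GeV or more, can be converted into a length using the standard value ħc ≈ 0.197 GeV·fm. The arithmetic below is an order-of-magnitude illustration only, not part of the authors' analysis:

    # Convert the quoted bound on the inverse lattice spacing, b^-1 >~ 10^11 GeV,
    # into a length scale (order-of-magnitude arithmetic only).
    HBAR_C_GEV_FM = 0.1973        # hbar * c in GeV * femtometres
    PLANCK_LENGTH_M = 1.616e-35   # metres

    inverse_spacing_gev = 1e11
    spacing_fm = HBAR_C_GEV_FM / inverse_spacing_gev   # ~2e-12 fm
    spacing_m = spacing_fm * 1e-15                     # ~2e-27 m

    print(f"lattice spacing b <~ {spacing_m:.1e} m")                      # ~2.0e-27 m
    print(f"which is ~{spacing_m / PLANCK_LENGTH_M:.0e} Planck lengths")  # ~1e+08

A lattice of this kind would therefore have to be finer than roughly 2×10^-27 metres, yet still some eight orders of magnitude coarser than the Planck length.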

In popular culture

Science fiction themes

Science fiction has highlighted themes such as virtual reality, artificial intelligence and computer gaming for decades. One of the first references to simulation occurs in the 1959 novel Time Out of Joint by Philip K. Dick, in which the central character is trapped in a "bubble" of 1950s small-town America. Simulacron-3 (1964) by Daniel F. Galouye (alternative title: Counterfeit World) tells the story of a virtual city developed as a computer simulation for market research purposes, in which the simulated inhabitants possess consciousness; all but one of the inhabitants are unaware of the true nature of their world.

Permutation City (1994) by Greg Egan explores quantum ontology via the various philosophical aspects of artificial life and simulations of intelligence. Other Egan novels, such as Diaspora (1997) and Schild's Ladder (2002), also involve simulated consciousness. In Iain M. Banks's The Algebraist, a simulist religion called "The Truth" is the dominant belief system of a considerable proportion of interstellar humanity.

In the 20th century both drama and film repeatedly explored alternative realities, notably in the Theatre of the Absurd, and the theme crops up unexpectedly in films such as It's a Wonderful Life and in the 1960s television series The Prisoner. The Truman Show (1998) is a fictional example of the logical extension of this trend, in which the central character is trapped within a physical simulation and his life controlled by a director. The idea that reality might be a computer simulation was the central thesis of The Matrix trilogy (1999–2003). However, many earlier science fiction plot lines incorporated variants of this theme and its associated elements, such as artificial intelligence.

The plot lines of numerous other feature films have also explicitly involved the simulism hypothesis.

The 2012 play 'World of Wires' was partially inspired by the Bostrom essay.[30]

Role-playing and wargaming

Role-playing simulations have a long history stretching back to ancient times, and have been used extensively in vocation-oriented higher-education courses (e.g. law, medicine, economics) as well as in politics and international relations contexts. For example, SimSoc, a "game" used to teach various aspects of sociology, political science and communication skills, was originally created by William A. Gamson in 1966 and is currently in its fifth edition. Role-play simulations can be described as "multi-agenda social-process simulations" in which "participants assume individual roles in a hypothesised social group and experience the complexity of establishing and implementing particular goals within the fabric established by the system".[31] Simulations involving role-play also have therapeutic uses within psychotherapy, in the form of psychodrama, developed by Jacob L. Moreno in the 1920s; later in the 20th century this was termed play therapy.

Role-play is also an important part of military training. The Prussian term for live-action military training exercises is "Kriegsspiel", or wargames, which are used for training and evaluation purposes. A similar use of role-playing is an essential feature of the Incident Command System (ICS), widely used by emergency response agencies to manage and evaluate responses to large or complex incidents. Battle and other historical reenactments also involve role-play, and have been practised for millennia, but with entertainment, rather than training or system evaluation, appearing to be the primary purpose.

The history of role-playing games begins with this earlier tradition of role-playing, which, combined with the rulesets of fantasy wargames, gave rise to the modern role-playing game. This can take a variety of forms: live action role-playing games, theatre-style live action role-playing, freeform role-playing games, indie role-playing games and storytelling games are all games in which the participants assume the roles of characters and collaboratively create stories using a role-playing game system. Such games may require players to remain in character, or may allow them to comment on the action by stepping out of character. The participants do not all need to be present: play-by-mail and play-by-post games both allow for asynchronous and distance game-playing. A computer version of play by mail (Yahoo! Role-Playing) became popular in the 1990s.

The GNS theory, originally developed by Ron Edwards, is an attempt to document how role-playing games work. The theory divides participants into three categories: gamists (who are concerned with competition and challenge), narrativists (who are concerned with story and theme) and simulationists (who are concerned with the gaming experience and exploration).[32]

Computer games and simulations

The history of video games begins in the late 1940s[33] when Thomas T. Goldsmith Jr. and Estle Ray Mann were granted a patent for a game to be played using a cathode ray tube. During the 1950s and the 1960s various such games were developed,[34] and by the early 1970s such games were becoming commercially viable.[35] The first generation of personal computer games were often text adventures or interactive fiction,[36] in which the player communicated with the computer by entering commands through a keyboard. By the mid-1970s, games were being developed and distributed through magazines, such as Creative Computing and Computer Gaming World.[37]

The development of role-playing video games began in the mid-1970s, when stand-alone role-playing video games were developed as an offshoot of mainframe text-based role-playing games on PDP-10 and Unix-based computers. Amongst the first of these were pedit5 and dnd,[38] whose name derives from an abbreviation of Dungeons & Dragons (D 'n' D), the original role-playing game, which had been published in 1974. This gave rise to a whole genre of dungeon crawl games. Probably the most seminal of these, Rogue, was released in 1980, inspiring a host of roguelike clones.[39] Two other notable early role-playing video games were Ultima (1980)[40] and Wizardry (1981).[41]

Innovations in these games eventually became standards of almost all role-playing video games produced. Later games such as Dungeon Master (1987) introduced real-time gameplay and several user-interface innovations, such as direct manipulation of objects and the environment with the mouse. Later developments in this genre have tended to involve on-line interaction with other players (see below), rather than play on stand-alone machines. One variant, computer-assisted gaming, is still very much alive;[42] here the games are only partially computerized, but actively regulated by a human referee.[43] It has been claimed that there are cultural differences between Eastern and Western computer and console role-playing games.[44]

Online gaming and virtual worlds

The origins of today's virtual worlds and virtual communities lie in the interactive fiction and adventure games of the 1970s. The first computer-based interactive fiction was Colossal Cave Adventure, created by Will Crowther in 1975 and later extended by Don Woods. Dungeon (1976) was a role-playing video game based on Dungeons & Dragons and set in a medieval fantasy scenario. It was followed in 1978 by Multi-User Dungeon, a text-based multi-player on-line role-playing game. However, it took the advent of Usenet in 1980, as a distributed community, for the idea to develop effectively. From these early beginnings came several variants on the gaming theme: MUCK, MUSH and MOO (collectively MU*), all developed out of TinyMUD (1989), a social game variant of the original MUD. In the early 1990s these became more sophisticated and found uses outside gaming, particularly in education.[45] In 1985 the Whole Earth 'Lectronic Link (WELL) was founded, one of the earliest and most enduring virtual communities.

Initially online games were primarily text-based; in 1994, however, WebWorlds (later called ActiveWorlds) was created as the first on-line 3D virtual reality platform, followed in 1996 by The Palace, which provided graphical chat rooms with a flexible avatar system. The 1980s and 1990s also saw the development of Massively Multiplayer Online Role-Playing Games, growing out of text-based offerings such as MUD1 (1978), developing through Rogue (1980) and similar games such as Islands of Kesmai (1984), which used ASCII graphics, to games of the 1990s such as Neverwinter Nights (1991) and the later Ultima Online (1997), which were primarily graphics-based.

Since 2000, Massively Multiplayer On-line Gaming has developed in various directions. Computer simulations such as VATSIM and IVAO offer the user the ability to fly virtual planes within a world-wide air traffic control simulation. Virtual communities such as MySpace (2003) use social software to facilitate social interaction and networking. Massively Multiplayer Online Social Games such as The Sims Online (2002), There (2003) and Second Life (2003), virtual reality environments in which the user is represented by an avatar, developed from earlier offerings such as Habbo Hotel (2000); these focus on socialization rather than objective-based gameplay, and might best be described as Multi-User Virtual Environments. MMORPGs such as World of Warcraft (2004) have also become interactive communities, but ones based on fantasy worlds rather than real-world scenarios. Such communities are sometimes called metaverses, a term taken from the 1992 novel Snow Crash by Neal Stephenson.

Cellular automata and digital physics

See John Conway's Game of Life.
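
As a concrete example of the kind of cellular automaton studied in digital physics, the sketch below implements one generation of Conway's Game of Life over a set of live-cell coordinates on an unbounded grid. The representation and names are illustrative choices, not drawn from any particular source.

    from collections import Counter

    def life_step(live_cells):
        """Compute the next generation of Conway's Game of Life.

        live_cells is a set of (x, y) coordinates of live cells; a cell is alive
        in the next generation if it has exactly three live neighbours, or if it
        is currently alive and has exactly two.
        """
        neighbour_counts = Counter(
            (x + dx, y + dy)
            for x, y in live_cells
            for dx in (-1, 0, 1)
            for dy in (-1, 0, 1)
            if (dx, dy) != (0, 0)
        )
        return {
            cell
            for cell, count in neighbour_counts.items()
            if count == 3 or (count == 2 and cell in live_cells)
        }

    # A "glider" returns to its original shape, shifted diagonally, every four steps.
    glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
    for _ in range(4):
        glider = life_step(glider)
    print(sorted(glider))  # the same pattern translated by (1, 1)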

Artificial intelligence and virtual reality

Although the idea of an automaton has existed since the time of the ancient Greeks, both in fact and in fiction, the first use of the term robot was in 1921, in the title of Karel Čapek's play R.U.R. (Rossum's Universal Robots). While Čapek's creatures have intelligence, they are biological rather than mechanical, similar to the replicants in Blade Runner.

References

  1. Bostrom, N. (2003). "Are You Living in a Computer Simulation?", Philosophical Quarterly, Vol. 53, No. 211, pp. 243–255.
  2. The Simulation Argument Website FAQ 3
  3. Chalmers, David J. The Matrix as Metaphysics, Department of Philosophy, University of Arizona; paper written for the philosophy section of The Matrix website.
  4. The Simulation Argument: Why the Probability that You Are Living in a Matrix is Quite High, Nick Bostrom, Professor of Philosophy at Oxford University, 2003
  5. Chalmers, David J. The Matrix as Metaphysics, Department of Philosophy, University of Arizona; paper written for the philosophy section of The Matrix website.
  6. Descartes, René, 1596–1650, Discourse on the Method of Rightly Conducting One's Reason and of Seeking Truth in the Sciences
  7. Descartes, R. (1641). Meditations on First Philosophy, in The Philosophical Writings of René Descartes, trans. J. Cottingham, R. Stoothoff and D. Murdoch, Cambridge: Cambridge University Press, 1984, vol. 2, pp. 1–62.
  8. Kim, J. (1995). in Honderich, Ted: Problems in the Philosophy of Mind. Oxford Companion to Philosophy. Oxford: Oxford University Press.
  9. Hume, D. (1777). An Enquiry Concerning Human Understanding, Section XII, Part 2, p. 128.
  10. Kant, Immanuel. Critique of Pure Reason, trans. and ed. Paul Guyer and Allen W. Wood, Cambridge University Press, 1988.
  11. G.W.F. Hegel, Phenomenology of Spirit, translated by A.V. Miller with analysis of the text and foreword by J. N. Findlay (Oxford: Clarendon Press, 1977) ISBN 0-19-824597-1.
  12. Woodruff Smith, D. (2007). Husserl. Routledge
  13. Martin Heidegger, Being and Time, trans. by Joan Stambaugh (Albany: State University of New York Press, 1996)
  14. Ayer, A.J., Russell, 1972, Fontana, London ISBN 0-00-632965-9.
  15. The theory of modal realism is suggested in three separate papers: Lewis, D.K. (1968), 'Counterpart Theory and Quantified Modal Logic'; Lewis, D.K. (1970), 'Anselm and Actuality'; and Lewis, D.K. (1971), 'Counterparts of Persons and their Bodies'.
  16. Lewis, D.K. (1973). Counterfactuals.
  17. Glasersfeld, E. von (1989). 'Constructivism in Education', in Husen & Postlethwaite (eds), The International Encyclopaedia of Education, Supplementary Volume, Oxford: Pergamon Press, p. 182.
  18. Peter L. Berger and Thomas Luckmann, The Social Construction of Reality : A Treatise in the Sociology of Knowledge (Anchor, 1967; ISBN 0-385-05898-5)
  19. Ernest, Paul; The Philosophy of Mathematics Education; London: RoutledgeFalmer, (1991)
  20. A Computational Foundation for Study of Cognition, Chalmers, D.J. University of Arizona
  21. Minds, Brains, and Programs John R. Searle, 1980, from The Behavioral and Brain Sciences, vol. 3.
  22. Fetzer, J. (1996). "Minds Are Not Computers: (Most) Thought Processes Are Not Computational", paper presented at the annual meeting of the Southern Society for Philosophy and Psychology, Nashville, April 5.
  23. Bostrom, Nick (2005). A history of transhumanist thought (PDF).
  24. World Transhumanist Association (2002–2005). The transhumanist FAQ (PDF). 
  25. Jenkins, Peter S. Historical Simulations – Motivational, Ethical and Legal Issues
  26. Villard, Ray. "Are we living in a computer simulation." Discovery Magazine, Dec 16, 2012
  27. Solomon, Mark. On Computer Simulated Universes. Hillsborough, NC: Lithp Preth Publishing, 2013, print.
  28. Beane, Silas; Davoudi, Zohreh; Savage, Martin J. (rev. 9 November 2012). "Constraints on the Universe as a Numerical Simulation". INT-PUB-12-046 (Cornell University Library). Archived from the original on 9 November 2012. Retrieved 28 December 2012. Lay summary: The Physics arXiv Blog (October 10, 2012). Abstract: "Observable consequences of the hypothesis that the observed universe is a numerical simulation performed on a cubic space-time lattice or grid are explored. The simulation scenario is first motivated by extrapolating current trends in computational resource requirements for lattice QCD into the future. Using the historical development of lattice gauge theory technology as a guide, we assume that our universe is an early numerical simulation with unimproved Wilson fermion discretization and investigate potentially-observable consequences. Among the observables that are considered are the muon g-2 and the current differences between determinations of alpha, but the most stringent bound on the inverse lattice spacing of the universe, b⁻¹ ≳ 10^11 GeV, is derived from the high-energy cut off of the cosmic ray spectrum. The numerical simulation scenario could reveal itself in the distributions of the highest energy cosmic rays exhibiting a degree of rotational symmetry breaking that reflects the structure of the underlying lattice."
  29. For a general audience presentation of this work see: http://www.phys.washington.edu/users/savage/Simulation/Universe/
  30. Brantley, Ben (January 16, 2012). "‘World of Wires' at the Kitchen — Review". The New York Times. 
  31. Gredler, M. (1992), Designing and Evaluating Games and Simulations: A Process Approach, Kogan Page, London
  32. GNS and Other Matters of Role-Playing Theory, Chapter 2
  33. A patent application was filed on January 25, 1947 and U.S. Patent 2,455,992 was issued on December 14, 1948 to Thomas T. Goldsmith Jr. and Estle Ray Mann
  34. For example, Tennis for Two by William Higinbotham (1958), and Spacewar! (1962); the latter probably being the first computer video game, having been created a year earlier by Martin Graetz, Alan Kotok and Stephen Russell on a PDP-1
  35. In 1971 Nolan Bushnell and Ted Dabney created Computer Space, the first commercial coin-operated video game.
  36. The first text-adventure, Adventure, was developed for the PDP-11 by Will Crowther in 1976, and expanded by Don Woods in 1977. Jerz, Dennis (2007). "Somewhere Nearby is Colossal Cave: Examining Will Crowther's Original 'Adventure' in Code and in Kentucky". Digital Humanities Quarterly. Retrieved 2007-09-29.
  37. These magazines published reader-produced game code to be typed into a computer and played, and ran software competitions. "Computer Gaming World's RobotWar Tournament" (PDF). Computer Gaming World. October 1982. p. 17. Retrieved 2006-10-22.
  38. dnd (1974) was written in the TUTOR programming language for the PLATO System by Gary Whisenhunt and Ray Wood at Southern Illinois University; enhancements were made by Dirk and Flint Pellett during the late 1970s and early 1980s.
  39. One of the most notable of these was the 1987 update, NetHack
  40. Ultima I: The First Age of Darkness (1980), created by Richard Garriott. The series has had many updates which are still being published. see: The official Ultima WWW Archive for information and files concerning the entire saga
  41. Wizardry: Proving Grounds of the Mad Overlord, the first of 8 titles published by Sir-Tech between 1981 and 2001. The game began life as a dungeon crawl written by Andrew C. Greenberg and Robert Woodhead when they were students at Cornell University.
  42. see, for example Chore Wars, launched in July 2007, which offers a new slant on the entire RPG genre – housework!
  43. see: Mac-Assisted Role-Playing, for example.
  44. see: Spy/Counterspy Case File 07: RPGs – East vs. West, The Oblivion of Western RPGs: Can Oblivion save a genre it helped bury, and Kawaisa! A Naive Glance at Western and Eastern RPGs
  45. For example, LinguaMOO is an educational MOO, created in 1995 by Cynthia Haynes of the University of Texas at Dallas and Jan Rune Holmevik of the University of Bergen. see http://web.archive.org/web/20060427061712/http://lingua.utdallas.edu:7000/

Further reading

  • "Are We Living in a Simulation?" BBC Focus magazine, March 2013, pages 43–45. Interview with physicist Silas Beane of the University of Bonn discussing a proposed test for simulated reality evidence. Three pages, 3 photos, including one of Beane and a computer-generated scene from the film The Matrix. Publisher: Immediate Media Company, Bristol, UK.
  • "Do We Live in the Matrix?" by Zeeya Merali, Discover, December 2013, pages 24–25. Subtitle: "Physicists have proposed tests to reveal whether we are part of a giant computer simulation."

This article is issued from Wikipedia. The text is available under the Creative Commons Attribution/Share Alike; additional terms may apply for the media files.