Process physics


Process physics is a speculative approach to the modeling of fundamental physics, drawing on information theory and developed mainly by Reginald T. Cahill. It aims to be a theory of everything that would resolve problems in conventional physics, such as the incompatibility between general relativity (Einstein's theory of gravity) and quantum mechanics. One of its main features is the abandonment of Einstein's space-time construct in favor of modeling time as a process. The model attempts to derive the evolution of space and matter from the behavior of a large collection of nodes with connections between them, similar to a neural network, with the connections changing over time according to a very simple iterative rule.

This model exhibits features reminiscent of both gravity and quantum-mechanical nonlocality, and it is also argued that the connections naturally tend to evolve into states reminiscent of three-dimensional space. However, the model has not been developed to the point where it can reproduce experimentally confirmed predictions of existing theories such as general relativity and the standard model, so it cannot presently be called a theory of quantum gravity, and few physicists other than Cahill have taken it up as an approach to developing such a theory. The model also does not currently make specific quantitative predictions about experimental results that would distinguish it from existing theories, so it cannot yet be called a true "theory" in the sense the term is generally used in science.


History

Process physics was first publicized in 1996 in Pre-geometric modeling, a paper by Reginald T. Cahill and Christopher M. Klinger on modeling space and time with a random matrix. It was developed further in the 1998 paper Self-Referential Noise and the Synthesis of Three-Dimensional Space. The papers Process Physics: From Quantum Foam to General Relativity (2002) and Process Physics: From Information Theory to Quantum Space and Matter (2003) took the radical step of expanding on the themes of the earlier work to encompass both quantum mechanics and general relativity.

Cahill asserts that several experiments measuring the speed of light and the "force" of gravity between 2003 and 2005 invalidate the Einstein/Hilbert theory of relativity and reduce it to the less "general" Lorentzian model, in which absolute motion is possible. He also claims that his theory rests on the same assumptions as the process philosophy of Alfred North Whitehead.[citation needed]

Modeling process physics

Pre-geometric modelling of space and time

Process physics uses the concept of self-organising criticality to explain the emergence of structure and information from random processes. It is modeled as an iterative process that uses a matrix B describing the strengths of connections between nodes. This matrix is then iterated by the formula

B_{ij} \rightarrow B_{ij} - \alpha (B + B^{-1})_{ij} + \omega_{ij}, \quad i,j = 1,2,\dots,2N;\ N \rightarrow \infty

where:

  • B_{ij} is the strength of the connection between nodes i and j,
  • α is a dimensionless constant, and
  • ω_{ij} is a Wiener random variable (random noise).

We set B_{ij} = −B_{ji} and ω_{ij} = −ω_{ji} so that no node has a connection strength to itself (B_{ii} = 0).[1] By iterating the matrix B through this equation, a tree structure emerges in which the strongly connected nodes exhibit a fractal structure whose dimensionality tends towards 3, like our three-dimensional space.[2] Further iteration shows that connections between some nodes decay while new connections are created. Over many iterations, more new connections are formed than are lost, giving an expanding space, as observed in the physical universe. Within this tree structure emerge topological defects that have more connectivity than normal and are therefore more 'sticky', giving rise to patterns that persist. It is argued that these patterns behave like matter because of their persistence and their fuzziness at smaller scales, similar to quantum particles.
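A minimal numerical sketch of this update rule, not taken from Cahill's papers, might look as follows. The matrix size (2N = 8 nodes rather than the N → ∞ limit), the value of α, and the noise scale are all arbitrary illustrative choices; the noise is kept antisymmetric so that B remains antisymmetric at every step.

    import numpy as np

    def iterate_B(B, alpha, rng, noise_scale=1e-3):
        """One step of the toy update B -> B - alpha*(B + B^-1) + omega,
        where omega is antisymmetric Gaussian ("Wiener") noise so that
        B_ij = -B_ji (and hence B_ii = 0) is preserved."""
        noise = rng.normal(0.0, noise_scale, B.shape)
        omega = (noise - noise.T) / 2.0          # antisymmetrise the noise
        return B - alpha * (B + np.linalg.inv(B)) + omega

    rng = np.random.default_rng(0)
    two_N = 8                                    # 2N nodes; illustrative only
    seed = rng.normal(0.0, 1e-3, (two_N, two_N))
    B = (seed - seed.T) / 2.0                    # start from weak, nearly zero connections

    alpha = 0.1
    for _ in range(100):
        B = iterate_B(B, alpha, rng)

    # In the model, nodes joined by unusually strong links are read as the
    # emergent (tree-like, roughly three-dimensional) spatial structure.
    strong = np.abs(B) > 2 * np.abs(B).mean()
    print("fraction of strong links:", strong.mean())

Starting from connections near zero, the B^{-1} term drives the matrix away from the trivial state while the noise seeds structure, which is the qualitative behaviour the model relies on.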

A simple overview of the model is as follows:

  • A self-organizing semantic information system (drawing on the work of Gödel, Turing and Chaitin) is modeled as a stochastic neural network (SNN) driven by randomness. Semantic information is generated and recognized within the system itself, unlike approaches that are purely syntactical rather than semantic.
  • A bootstrap mechanism inherent to the SNN triggers the iteration, with semantics generated by semantic seeds that appear during it. The iteration is therefore an essential consequence of the SNN structure rather than a convenient device needed to "get things started".
  • The iteration generates ge-bits, units of geometry that later coalesce ("glue" together) and thereby build a spatial structure equivalent to the quantum foam. Topological defects that appear as the three-dimensional fractal space grows (ge-bits glued into the space/quantum foam, composed of active nodes and in-built topological defects) give rise to quantum matter; that is, both space and matter are obtained by iterating a single mathematical formalism. Time, unlike the emergent model of space, is not modeled quasi-geometrically.
  • Gravity corresponds to the loss of relational information during the in-flow from the quantum foam into matter. Because the in-flow is turbulent, it implies the existence of gravitational waves and is claimed to explain the anomalous rotation speeds of spiral galaxies, which currently accepted physics attributes to dark matter, a concept process physics dispenses with. A quantum homotopy field theory describing the dynamics of the topological defects then yields ordinary quantum field theory and the rest of quantum physics and relativity, albeit differently interpreted.

Time

In relativistic physics, time is modelled as a geometrical dimension that is added to the three dimensions of space to construct four dimensional space-time. This is a static model that does not have an absolute concept of a past, present or future (see relativity of simultaneity). Additionally, because all modern theories feature either T symmetry or CPT symmetry, they have no natural arrow of time, and there is no fundamental difference between predicting the future based on the present and "retrodicting" the past based on the present.

Time in process physics is modelled as an iterative process, where each iteration is like the next present moment. Because of the randomness in the iterative equation, the future is not completely predictable. The iteration also cannot be inverted, so it is not possible to go back to previous moments. Thus, in the process physics model of time, there is a static past, a continually changing present moment and an unpredictable future, all of which is consistent with our subjective experience of time. However, Stephen Hawking has argued that the fact that we remember the past but are uncertain of the future can be adequately explained by existing theories, in terms of how the second law of thermodynamics applies to information processing.[3]
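The irreversibility claim can be illustrated with the same kind of toy update used above (again an illustrative sketch, not taken from Cahill's papers): because a fresh noise term ω is drawn at each step and then discarded, the step cannot be undone, and two runs started from an identical matrix drift apart.

    import numpy as np

    def step(B, alpha, rng, scale=1e-3):
        # Same toy update as above: B -> B - alpha*(B + B^-1) + omega,
        # with omega an antisymmetric Gaussian noise term drawn fresh each step.
        noise = rng.normal(0.0, scale, B.shape)
        return B - alpha * (B + np.linalg.inv(B)) + (noise - noise.T) / 2.0

    seed = np.random.default_rng(0).normal(0.0, 1e-3, (8, 8))
    B0 = (seed - seed.T) / 2.0                   # one shared starting "present moment"

    rng_a, rng_b = np.random.default_rng(1), np.random.default_rng(2)
    B_a, B_b = B0.copy(), B0.copy()
    for _ in range(100):
        B_a = step(B_a, 0.1, rng_a)
        B_b = step(B_b, 0.1, rng_b)

    # Identical pasts, different futures: the noise makes the future
    # unpredictable, and because each omega is discarded after use the
    # update cannot be inverted to recover earlier moments.
    print("divergence after 100 steps:", np.abs(B_a - B_b).max())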

Space

In process physics, space has an internal structure, described as a network of nodes. Mathematically, the model process physics uses to describe space is essentially the same as that used to model neural networks. The inspiration for using this neural-network type of model to describe reality came from the discovery that the behaviour of skyrmions (particle-like topological field configurations) can be described by a similar model.

Matter

In process physics, matter is described as topological defects in three-dimensional space that can persist by preserving the pattern of their links over many iterations. Matter is embedded in three-dimensional space but is essentially made of the same stuff as space. It moves by preferentially forming new links in the direction of travel and losing links more often in the opposite direction, so the pattern appears to move relative to the underlying fabric of space and to other matter. Once started, the movement is self-sustaining and requires no further energy to continue; any change in its passage through space is resisted, which manifests itself as inertia.

Gravity

Because matter consists of topological defects, it has more links than normal space and consumes links faster than the space surrounding it, so that space effectively sinks into matter. This is speculated to be the origin of gravity: the space between masses effectively shrinks, bringing the masses closer together.

The masses themselves would not move as such, but the distance between them would get smaller. This also explains why a free-falling body does not seem to experience a force while accelerating under gravity towards a more massive body. However, it contradicts general relativity, as the gravitational effect would be instantaneous rather than propagating at the speed of light. An experiment measuring the speed of gravity would go a long way towards establishing which is closer to reality: general relativity or process physics.

Experimental claims

Cahill claims in a series of papers to have detected "absolute motion".[1][2] Mainstream physicists reject this claim on the grounds that the theory behind his experiment is profoundly flawed: the equations he uses for the speed of light in a moving refractive medium are incorrect.


References

Citations

  1. ^ Reginald T. Cahill, Christopher M. Klinger (2005). "Bootstrap Universe from Self-Referential Noise" (PDF). Retrieved on 2006-12-06.
  2. ^ Reginald T. Cahill, Christopher M. Klinger (1998). "Self-Referential Noise and the Synthesis of Three-Dimensional Space". Retrieved on 2006-12-06.
  3. ^ Stephen Hawking (1990). A Brief History of Time. Bantam, p. 147. ISBN 0-553-34614-8. 
