Pointer machine
In theoretical computer science, a pointer machine is an "atomistic" abstract computational machine model akin to the random access machine.
Depending on the type, a pointer machine may be called a linking automaton, a KU-machine, an SMM, an atomistic LISP machine, a tree-pointer machine, etc. (cf. Ben-Amram 1995). At least three major varieties exist in the literature -- the Kolmogorov-Uspenskii model (KUM, KU-machine), the Knuth linking automaton, and the Schönhage Storage Modification Machine model (SMM). The SMM seems to be the most common.
From its "read-only tape" (or equivalent) a pointer machine receives input -- bounded symbol-sequences ("words") made of at least two symbols e.g. { 0 , 1 } -- and it writes output symbol-sequences on an output "write-only" tape (or equivalent). To transform a symbol-sequence (input word) to an output symbol-sequence the machine is equipped with a "program" -- a finite-state machine (memory and list of instructions). Via its state machine the program reads the input symbols, operates on its storage structure -- a collection of "nodes" (registers) interconnected by "edges" (pointers labelled with the symbols e.g. { 0, 1 }), and writes symbols on the output tape.
Pointer machines cannot do arithmetic. Computation proceeds only by reading input symbols, modifying and performing various tests on the storage structure -- the pattern of nodes and pointers -- and outputting symbols based on those tests. "Information" resides in the storage structure.
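As an illustration of the idea that "information is in the storage structure", the following minimal sketch (an editorial illustration, not a construction from any of the cited papers; the names Node, chain and same_length are assumed for exposition) represents a count purely as a chain of nodes and compares two counts using only pointer tests, never arithmetic:

# A count is encoded only in the shape of the graph: a chain of nodes
# ending in a self-loop. No numeric register contents are used.

class Node:
    def __init__(self, nxt=None):
        # a single labelled out-edge suffices for this illustration;
        # the end of a chain points back to itself
        self.next = nxt if nxt is not None else self

def chain(length):
    """Build a chain of `length` nodes; the value lives only in the chain's shape."""
    end = Node()
    node = end
    for _ in range(length):
        node = Node(node)
    return node

def same_length(a, b):
    """Compare two chains by walking them in lockstep and testing pointers."""
    while a.next is not a and b.next is not b:
        a, b = a.next, b.next
    return (a.next is a) and (b.next is b)

print(same_length(chain(3), chain(3)))  # True
print(same_length(chain(3), chain(4)))  # False

Everything the comparison "knows" is read off the shape of the graph, which is exactly what the pointer-machine models formalize.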
Types of "Pointer Machines"
Both Gurevich and Ben-Amram list a number of very similar "atomistic" models of "abstract machines"; Ben-Amram believes that the six "atomistic models" must be distinguished from "high-level" models. This article discusses the following three atomistic models in particular:
- Schönhage's Storage Modification Machines (SMM),
- Kolmogorov-Uspenskii Machines (KUM or KU-Machines),
- Knuth's "Linking Automaton"
But Ben-Amram adds more:
- Atomistic Pure-LISP Machine (APLM)
- Atomistic Full-LISP machine (AFLM),
- General atomistic Pointer Machines,
- Jones's I Language (two types)
Problems with the pointer machine model
Use of the model in complexity theory: van Emde Boas (1990) expresses concern that this form of abstract model is:
- "an interesting theoretical model, but ... its attractiveness as a fundamental model for complexity theory is questionable. Its time measure is based on uniform time in a context where this measure is known to underestimate the true time complexity. The same observation holds for the space measure for the machine" (van Emde Boas (1990) p. 35)
Gurevich (1988) also expresses concern:
- "Pragmatically speaking, the Schönhage model provides a good measure of time complexity at the current state of the art (though I would prefer something along the lines of the random access computers of Angluin and Valiant)" (Gurevich (1988) p. 6 with reference to Angluin D. and Valiant L. G., Fast Probabilistic Algorithms for Hamiltonian Circuits and Matchings", Journal of Computer and System Sciences 18 (1979) 155-193.)
The fact that, in §3 and §4 (pp. 494-497), Schönhage himself (1980) demonstrates the real-time equivalences of his two Random Access Machine models "RAM0" and "RAM1" leads one to question the necessity of the SMM for complexity studies.
Potential uses for the model: However, Schönhage (1980) demonstrates in his §6 integer multiplication in linear time, and Gurevich wonders whether or not the "parallel KU machine" "resembles somewhat the human brain" (Gurevich (1988) p. 5).
Schönhage's Storage Modification Machine (SMM) model
The SMM seems to be the most common and accepted model. The outline below follows van Emde Boas (1990) rather than Schönhage (1980), which is marred by offering virtually no examples. Unlike the register machine model, the SMM is more difficult to understand, being quite unlike a "computer", abstract or otherwise; the description here therefore builds an understanding from a more basic conceptual level.
- directed "graph" (a drawing that looks virtually identical to a state diagram) of circles called nodes and labelled arrows called edges.
Alphabet: a set of k symbols input to the machine; k = 2 is the minimum number of symbols (and hence of edges per node). Typical is the binary alphabet { 0, 1 }; a ternary alphabet might be { a, b, c }, etc.
Word: a string of symbols, e.g. 101101, input to the machine. A machine "accepts" some subset of U, the set of all possible strings over the alphabet. Over a k-symbol alphabet there are k + k^2 + ... + k^n strings of length at most n, so the number of subsets of U -- the candidate languages -- is 2 raised to that count.
- U = { 0, 1, 00, 01, 10, 11, 000, 001, ..., 111, 0000, ... }
Node: each node is distinguishable and unique and is labelled with a symbol inside it (e.g. a number or a letter). From each node emerge k arrows (e.g. 2 arrows for the binary alphabet { 0, 1 }). A node is created by the new w instruction.
Edge: from each node emerge as many "arrows" as there are symbols in the alphabet. The arrow's head indicates the "next" node, and each arrow is labelled with a symbol. Example: for a binary alphabet { 0, 1 }, two arrows emerge from every node, one labelled "0" and the other "1"; for a ternary alphabet { a, b, c }, three arrows emerge, each bearing the (unique) symbol "a", "b", or "c". The set w to v instruction redirects an edge to a different node. Here w and v represent words; the word v is a former word -- i.e. a previously-created string of symbols -- so that the redirected edge points "backwards" to an old node that culminates in that string.
Path: a path along nodes and arrows represents a word (a string of symbols, e.g. 101101) that "the machine" can accept. A path is determined in part by the history of how it was created.
(1) new w: creates a new node. w represents the new word that creates the new node. The machine reads the word w, following the path represented by w until it comes to the last (extra, additional, concatenated) symbol. The additional symbol forces the last node to "flip" its corresponding arrow from backward-pointing to forward-pointing, creating a new node and pointing to it. The new node in turn points both its edges back to the old last node, where they just "rest" until redirected by another new or set.
- "w" is the (new) word that "leads to" such as the 5-to-6 expansion: 10110[1]
- old edge: now points toward the new node; in the example 10110[1], the 5th node's 1-edge now points to the 6th node
- new edges: both new edges point "backward" to the previous node. In a sense they are "sleeping", waiting for an assignment. In the case of the starting or center node both edges point back to itself (the starting node).
(2) set w to v: redirects (moves) an edge (arrow) from the path represented by word w to a former node that represents word v
(3) if v = w then instruction z: conditional instruction that compares the paths represented by words w and v to see whether they end at the same node; if so, jump to instruction z, else continue.
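The three instructions can be pictured concretely with the following sketch (an editorial illustration rather than Schönhage's or van Emde Boas' formal notation; the names Node, follow, new, set_to and same_node are assumed), in which each node carries one labelled out-edge per alphabet symbol and a new node's edges point back to the previous node, as described above:

ALPHABET = ("0", "1")        # assumed binary alphabet

class Node:
    def __init__(self, target=None):
        # each out-edge initially points to `target`, or to the node itself
        # (as for the distinguished center node)
        self.out = {sym: (target or self) for sym in ALPHABET}

def follow(center, word):
    """Walk the path spelled by `word`, starting from the center node."""
    node = center
    for sym in word:
        node = node.out[sym]
    return node

def new(center, w):
    """new w: create a node at the end of path w; its edges point back to
    the node that path w[:-1] ends at."""
    prev = follow(center, w[:-1])
    node = Node(target=prev)
    prev.out[w[-1]] = node
    return node

def set_to(center, w, v):
    """set w to v: redirect the last edge on path w to the node at the end of path v."""
    prev = follow(center, w[:-1])
    prev.out[w[-1]] = follow(center, v)

def same_node(center, v, w):
    """if v = w: test whether the two paths end at the same node."""
    return follow(center, v) is follow(center, w)

# usage: build a two-node structure and test it
center = Node()
new(center, "1")                      # node reachable by the word "1"
print(same_node(center, "1", "0"))    # False: "0" still loops back to center
print(same_node(center, "10", "11"))  # True: both edges of the new node point to center

Representing the out-edges as a small dictionary keyed by symbol mirrors the requirement that every node carry exactly one out-edge per alphabet symbol.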
Knuth's "Linking Automaton" model
Kolmogorov-Uspenskii Machine (KU-machine) model
KUM differs from SMM in allowing only invertible pointers: for every pointer from a node x to a node y, an inverse pointer from y to x must be present. Since outgoing pointers must be labelled by distinct symbols of the alphabet, both KUM and SMM graphs have O(1) out-degree. However, the invertibility of KUM pointers restricts the in-degree to O(1) as well. This addresses some concerns for physical (as opposed to purely informational) realism, like those in the van Emde Boas quote above.
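A small sketch (again an editorial illustration, not Kolmogorov and Uspenskii's formal definition; set_edge and its free-label check are assumed names) of how invertibility keeps the in-degree bounded: installing a pointer also installs its inverse, and the inverse must occupy one of the target's finitely many labelled out-slots:

ALPHABET = ("a", "b", "c")   # assumed ternary alphabet

class Node:
    def __init__(self, name):
        self.name = name
        self.out = {}        # symbol -> Node; at most one out-edge per alphabet symbol

def set_edge(x, sym_xy, y, sym_yx):
    """Install the pointer x --sym_xy--> y together with its inverse y --sym_yx--> x."""
    assert sym_xy in ALPHABET and sym_yx in ALPHABET
    if sym_yx in y.out and y.out[sym_yx] is not x:
        # every incoming pointer needs a free outgoing (inverse) label on the target,
        # so a node's in-degree can never exceed the alphabet size
        raise ValueError("target node has no free label for the inverse pointer")
    x.out[sym_xy] = y
    y.out[sym_yx] = x

p, q = Node("p"), Node("q")
set_edge(p, "a", q, "a")     # p and q now point at each other under label "a"
print(q.out["a"].name)       # prints "p": the inverse pointer is present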
See also
Register machine -- generic register-based abstract machine computational model
- Counter machine -- most primitive machine, base models' instruction-sets are used throughout the class of register machines
- Random access machine -- RAM: counter machine with added indirect addressing capability
- Random access stored program machine -- RASP: counter-based or RAM-based machine with a "program of instructions" found in the registers themselves, in the manner of a universal Turing machine, i.e. the von Neumann architecture.
Turing machine -- generic tape-based abstract machine computational model
- Post-Turing machine -- minimalist one-tape, two-direction, two-symbol { blank, mark } Turing-like machine, but with default sequential instruction execution in a manner similar to the basic 3-instruction counter machines.
References
Most references and a bibliography are to be found at the article Register machine. The following are particular to this article:
- Amir Ben-Amram (1995), "What is a 'Pointer Machine'?", SIGACT News (ACM Special Interest Group on Automata and Computability Theory), volume 26, 1995. Also: DIKU, Department of Computer Science, University of Copenhagen, amirben@diku.dk. Wherein Ben-Amram describes the types and subtypes: (type 1a) abstract machines, atomistic models, including Kolmogorov-Uspenskii Machines (KUM), Schönhage's Storage Modification Machines (SMM), Knuth's "Linking Automaton", the APLM (Atomistic Pure-LISP Machine) and AFLM (Atomistic Full-LISP Machine), general atomistic pointer machines, and Jones's I Language; (type 1b) abstract machines, high-level models; (type 2) pointer algorithms.
- Andrey Kolmogorov and V. Uspenskii, On the definition of an algorithm, Uspehi Mat. Nauk. 13 (1958), 3-28. English translation in American Mathematical Society Translations, Series II, Volume 29 (1963), pp. 217-245.
- Yuri Gurevich (2000), Sequential Abstract State Machines Capture Sequential Algorithms, ACM Transactions on Computational Logic, vol. 1, no. 1 (July 2000), pages 77-111. In a single sentence Gurevich compares the Schönhage [1980] "storage modification machines" to Knuth's "pointer machines". For more on similar models such as "random access machines", Gurevich references:
- J. E. Savage (1998), Models of Computation: Exploring the Power of Computing. Addison Wesley Longman.
- Yuri Gurevich (1988), On Kolmogorov Machines and Related Issues, the column on "Logic in Computer Science", Bulletin of European Association for Theoretical Computer Science, Number 35, June 1988, 71-82.
- A. Schönhage (1980), Storage Modification Machines, Society for Industrial and Applied Mathematics, SIAM J. Comput. Vol. 9, No. 3, August 1980. Wherein Schönhage shows the equivalence of his SMM with the "successor RAM" (Random Access Machine), etc. He refers to an earlier paper where he introduces the SMM:
- A. Schönhage (1970), Universelle Turing Speicherung, Automatentheorie und Formale Sprachen, Dörr, Hotz, eds., Bibliogr. Institut, Mannheim, 1970, pp. 369-383.
- Peter van Emde Boas, Machine Models and Simulations, pp. 3-66, appearing in:
- Jan van Leeuwen, ed., Handbook of Theoretical Computer Science. Volume A: Algorithms and Complexity, The MIT Press/Elsevier, 1990. ISBN 0-444-88071-2 (volume A). QA 76.H279 1990.
- van Emde Boas' treatment of SMMs appears on pp. 32-35. This treatment clarifies Schönhage 1980 -- it closely follows, but expands slightly on, the Schönhage treatment. Both references may be needed for effective understanding.