Talk:Algorithm characterizations
Holding place -- Markov's (1954) definition of "algorithm"
"Boldface added for emphasis]:
"1. In mathematics, "algorithm" is commonly understood to be an exact prescription, defining a computational process, leading from various initial data to the desired result...."(p.1)
"The following three features are characteristic of algorithms and determine their role in mathemtics:
- "a) the precision of the prescription, leaving no place to arbitrariness, and its universal comprehensibility -- the definiteness of the algorithm;
- "b) the possibility of starting out with initial data, which may vary within given limits -- the generality of the algorithm;
- "c) the orientation of the algorithm toward obtaining some desired result, which is indeed obtained in the end with proper initial data -- the conclusiveness of the algorithm." (p.1)
"2. The description of algorithm just given does not pretend to mathematical precision..."(p.1)
"3. In recent times a number of authors have elaborated theories leading naturally to a sharpening of the notion of an algorithm. We have in view the works of Kleene in the theory of recursive functions /21, 23/, the works of Church in the theory of λ-conversion /14, 15, 17/, the work of Turing in the theory of computable numbers /31/, and the work of Post in the theory of "finite combinatory processes" /25/" (p. 2; numbers between slashes refer to Markov's bibliography]
"4. ... The entire significance for the mathematics of rendering more precise the concept of algorithm emerges, however, in connection with the problem of a constructive foundation for mathematics [etc.]"(p.2)
"5. All the theories listed under point 3 are sufficiently complicated in themselves and lead to a sharpening of the precision of the concept of algorithm in an indirect manner.
"The theory of recursive functions /21, 23, 24/ deals mainly with that special case of an algorithm, when the role of initial data is played by natural numbers, and the result of its application is a number. Passage to the general case requires therefore the arithmetization of the initial data and results, which is obtained through some "Godel numbering" /18/ -- a process, consisting in the application of an algorithm, defined once and for all, but which is rather complicated.
"The establishment of the theory of algorithms on a rigorous basis through the use of the Church theory of λ-conversion requires, besides the Godel numbering, also a cumbersome formal apparatus.
"Turing's theory of "computable numbers", principally oriented toward a constructive approach to the concept of real number, leads also indirectly to a more precise notion of algorithm which is of interest to us. The exposition of its author /31/ contains some inaccuracies, pointed out by Post /28/.
"Finally, Post's theory of finite combinatorial processes, closely related to the theory of Turing, has not been worked out at all and consists in fact of one definition only /25/." (p. 2-3)
In the remainder of the text, in particular Chapter II section 3, Markov defines and defends his definition of "normal algorithm". He states that:
"It would be quite natural to assume that the theory of "natural algorithms" is equivalent to the theory of algorithms based on the concept of a recursive function. it was shown by V.K. Detlove that this is actually the case /1/." (p. 4)
wvbaileyWvbailey 20:56, 26 July 2006 (UTC)
Markov's definition of "normal algorithm"
His idea can be seen from the first three (of six) Chapters in his "Contents". In particular in Chapter II.3 he defines his normal algorithm (see sub-section below) in terms of a set of symbols that form words, and a method using substitution rules to create new but valid words.
Although this sounds exactly like a "recursive language" ("Equivalence of a type 0 grammar to a Turing Machine", cf. Hopcroft and Ullman, Introduction to Automata Theory, Languages, and Computation, Addison-Wesley, 1979, p. 221), nowhere do Hopcroft and Ullman cite A. A. Markov, neither in their index nor in their bibliography. Gurevich 2000 does, however. As does Kleene (1952): "The problem which Post [1947] and Markov [1947] prove unsolvable was proposed by Thue 1914" (Kleene p. 382). And Minsky (1967) re neural nets (p. 65).
Chapter I. LETTERS, ALPHABETS, WORDS
- 1. Letters
- 2. Alphabets
- 3. Words
- 4. Entries
- 5. Links and chains
- 6. Translations
Chapter II. THE NOTION OF ALGORITHM
- 1. Algorithms in alphabets
- 2. Examples of algorithms
- 3. Normal algorithms
- 4. Examples of normal algorithms
- 5. The principle of normalization
Chapter III. CONSTRUCTION OF NORMAL ALGORITHMS
- 1. Extension of an algorithm
- 2. Closure of an algorithm
- 3. Composition of algorithms
- 4. Combination of algorithms
- 5. Branching of algorithms
- 6. Reiteration of algorithms
- 7. Translation of an algorithm
- 8. Some algorithms connected with matrices
wvbaileyWvbailey 21:36, 26 July 2006 (UTC)
Nutshell description of Markov's definition of his "normal algorithm" (pp. 63ff)
[Boldface added for emphasis]
In particular, the notion "2." of "steps of local nature" appears also in Blass and Gurevich with regard to Kolmogorov-Uspensky (KU) machines:
- Gurevich proposed to Uspensky that " 'every computation, performing only one restricted local action at a time, can be viewed as (not only being simulated by, but actually being) the computation of an appropriate KU machine.' Uspensky concurred." (Blass, Gurevich 2003, p. 10)
Markov's definition:
"1. Separate elementary steps, each of which will be performed according to one of these rules..."
"2. ...steps of local nature ..." [as opposed to "integral nature" ... he means that the algorithm won't change more than a certain number of symbols to the left or right of the observed word/symbol ]
"3. Rules for the substitution formulas ..."
- 1) order given is their preferred order of use
- 2) apply substitution to the first entry of the left-hand member
"4. Definite prescription for termination..."
- 1) a means to distinguish a "concluding substitution" [leading to "terminal state"]
- 2) as distinguished from a "simple" -- i.e. typical, usual, most-likely-to-occur -- substitution
"5. [He defines the above as a "normal algorithm"]
"6. [the "scheme" of the algorithm is the list of the substitution rules.]
wvbaileyWvbailey 21:37, 26 July 2006 (UTC)
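As a rough illustration of points 1-6 above, here is a minimal Python sketch of an interpreter for a Markov-style scheme of substitution formulas. The function name, the step bound, and the particular scheme (unary addition over an assumed alphabet of "|" and "+") are illustrative assumptions, not taken from Markov.

```python
# A minimal sketch of a "normal algorithm" interpreter in the sense of
# points 1-6 above.  The example scheme (unary addition) is an assumption
# chosen for brevity, not one of Markov's examples.

def run_normal_algorithm(word, scheme, max_steps=10000):
    """Apply an ordered list of substitution formulas to `word`.

    `scheme` is a list of (left, right, is_concluding) triples.  At each
    elementary step the FIRST formula whose left-hand member occurs in the
    word is applied to the LEFTMOST entry of that member (rules 3.1 and 3.2);
    a "concluding" substitution terminates the process (rule 4), and so does
    the case where no formula is applicable.
    """
    for _ in range(max_steps):
        for left, right, concluding in scheme:      # preferred order of use
            index = word.find(left)                 # first entry of left-hand member
            if index != -1:
                word = word[:index] + right + word[index + len(left):]
                if concluding:
                    return word                     # concluding substitution: halt
                break                               # rescan the new word from the top
        else:
            return word                             # no formula applicable: halt
    raise RuntimeError("step bound exceeded (algorithm may not terminate)")

# Example scheme: erase the '+', so '||+|||' becomes '|||||' (unary addition).
addition_scheme = [("+", "", True)]                 # a single concluding substitution
print(run_normal_algorithm("||+|||", addition_scheme))   # -> '|||||'
```

The for/else scan mirrors rule 3: the formulas are tried in their given order, and only the first entry of the left-hand member in the word is replaced before the scan restarts.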
Holding place for Stone's definition of "algorithm" (pp. 3-13)
"[Informal] Definition: An algorithm is a set of rules that precisely define a sequence of operations." (p. 4)
"For people to follow the rules of an algorithm, the rules must be formulated so that they can be followed in a robot-like manner, that is, withoout the need for thought... however, if the instructions [to solve the quadratic equation] are to be obeyed by someone who knows how to perform arithmetic operations but does not know how to extract a square root, then we must also provide a set of rules for extracting a square root in order to satisfy the definition of algorithm" (p. 4-5)
- [i.e. "the rules" must be complete to perform the expected outcome given the input symbols and the "computer's" self-contained abilities and knowledge-- every symbol in the rules and the input must be (i) distinguishable, every distinguished rule/input (ii) understood, and every distinguished, understood rule actually (iii)executable.]
"Algorithms may specify iterative behavior..." (p. 5)
"...not all instructions are acceptable, because they may require the robot to have abilities beyond those that we consider reasonable. For example ... 'Output the number 0 if Henry VIII was a King of England, otherwise output the number 1' [his example: this is unreasonable if the robot has not been previously instructed, even worse if asked if Aristotle were a king of Bngland but only 5 kings have been taught].... the instruction requires that the robot to have particular information, so that if the information is not given to the robot, it cannot obey the instruction. An intuitive definition of an acceptable sequence of instructions is one in which each instruction is precisely defined so that the robot is guaranteed to be able to obey it....We say that an instruction is effective if there is a procedure that the robot can follow in order to determine precisely how to obey the instruction.... effectiveness ... may be dependent on the finiteness property. ... every algorithm must have the property of finiteness; that is, it must be able to terminate in a finite number of steps. Each instruction must be precisely defined so as to avoid any ambiguity or misinterpretation....
"To summarize ... we define an algorithm to be a set of rules that precisely defines a sequence of operations such that each rule is effective and definite and such that the sequence terminates in a finite time." (p. 8)
"An algorithm for the Turing machine consists of a set of rules that involve head motion, tape operations, and change of state... the state of the machine at a particular instant of time identifies the instrction that the Turing machine is carryong out at the time.
"An algorithm for the Turing machine consists of a set of five-tuples such that there is exactly one five-tupe for each state-symbol pair, excluding the HALT state. Such an algorithm is known as a Turing machine program." (p. 9)
Stone's 5-tuple is classical (qi, sj, qij, sij, dij): qi is the current state, sj is a tape symbol, qij is the next state, sij the symbol to write in currently-observed cell, dij the tape motion. Observe that he has defined "algorithm" as the finite-state machine table for the machine, NOT as what he presents immediately afterwards:
"The computation of a Turing machine is described by stating:
- "1. The tape alphabet
- "2. The form in which the parameters are presented on the tape
- "3. The initial state of the Turing machine
- "4. The form in which answers will be represented on the tape when the Turing machine halts
- "5. The machine program" (p. 10)
Is the machine program truly "the algorithm", or is "the algorithm" the list of 5 elements above, i.e. the definitions of tape ALPHABET, INPUT, OUTPUT, and INITIAL STATE, as well as "the rules"?
wvbaileyWvbailey 19:23, 27 July 2006 (UTC)
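To make the distinction in the question above concrete, here is a hypothetical Turing machine program written out as Stone-style five-tuples, with comments keyed to the five descriptive elements in his list. The particular machine (a unary "append one stroke" machine), the names, and the Python encoding are assumptions chosen for brevity, not examples from Stone.

```python
# A hypothetical illustration of Stone's five-tuple format
# (q_i, s_j, q_ij, s_ij, d_ij): current state, scanned symbol,
# next state, symbol written, head motion.

# 1. tape alphabet
BLANK, STROKE = " ", "|"

# 5. the machine program: exactly one five-tuple per (state, symbol) pair,
#    excluding the HALT state.
PROGRAM = {
    ("q0", STROKE): ("q0", STROKE, +1),    # scan right over the strokes
    ("q0", BLANK):  ("HALT", STROKE, 0),   # write one more stroke and halt
}

def run(tape, state="q0", head=0):         # 3. initial state q0, head at cell 0
    tape = dict(enumerate(tape))           # 2. parameter presented as a packed block of strokes
    while state != "HALT":
        symbol = tape.get(head, BLANK)
        state, written, move = PROGRAM[(state, symbol)]
        tape[head] = written
        head += move
    # 4. the answer is the block of strokes left on the tape at the halt
    return "".join(tape[i] for i in sorted(tape)).strip()

print(run("|||"))   # -> '||||'
```

Note that PROGRAM alone is the "machine program"; the surrounding comments and the run() harness supply the other four elements of Stone's list, which is exactly the gap the question above is pointing at.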
Holding place: Gurevich, and Blass-Gurevich definition of algorithm
[Boldface added for emphasis. More to follow as wvbailey et al. figure out what the gents are saying]
"...Turing's informal argument in favor of his thesis justifies a stronger thesis: every algorithm can be simulated by a Turing machine....In practice, it would be ridiculous...[Nevertheless,] [c]an one generalize Turing machines so that any algorithm, never mind how abstract, can be modeled by a generalized machine?...But suppose such generalized Turing machines exist. What would their states be?...a first-order structure ... a particular small instruction set suffices in all cases ... computation as an evolution of the state ... could be nondeterministic... can interact with their environment ... [could be] parallel and multi-agent ... [could have] dynamic semantics ... [the two underpinings of his work are:] Turing's thesis ...[and] the notion of (first order) structure of [Tarski 1933]" (Gurevich 2000, p. 1-2)
"Gandy [1980] put four principles which any such [e.g. "vastly parallel" computational] machine must satisfy... [Gurevich is quoting Gandy here:] "The most important of these, called 'the principle of local causality' rejects the possibility of instantaneous action at a distance. ... It can be proved that if a device satisfies the principles then its successive states form a computable sequence." (Gurevich 2000 p. 4)
"4.4 Fixed Vocabulary
- "...In our case, by definition, vocabularies are finite. This reflects the informal assumption that the program of an algorithm A can be given by a finite text....In particular, the vocabulary does not change during the computation." (Gurevich 2000 p. 11). He goes on to examine whether this is reasonable and concludes that self-modifying algorithms (i.e. programs) are using their text (the program itself) as part of their data.)(Gurevich 2000, p. 11)
"7.1 Constructivity
- "Traditionally it is required in the foundations of mathematics that inputs to an algorithm be constructive objects (typically strings) and that the states of an algorithm have constructive representations. One champion of that tradition was Markov [1954] ...[but]... these constructivity requirements may be too restrictive in applications, especially in high-level design and specification. We abstract from the constructivity constraints. ASMs are algorithms which treat their states as databases or oracles" [say what??] (Gurevich 2000, p. 19)
> notion of an algorithm containing its own documentation
wvbaileyWvbailey 16:11, 27 July 2006 (UTC)
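A toy sketch of what "computation as an evolution of the state" might look like in Gurevich's style: the state is a finite map standing in for a first-order structure, and one step fires a finite set of guarded updates simultaneously. The particular program (Euclid's algorithm on two dynamic nullary functions a and b), the names, and the halting convention are assumptions for illustration, not Gurevich's own formulation.

```python
# A toy sketch in the spirit of a sequential abstract state machine:
# the state is a finite map (standing in for a first-order structure),
# and each step applies a small set of guarded updates at once.

def step(state):
    """Compute the update set for one step, then apply it atomically."""
    updates = {}
    a, b = state["a"], state["b"]
    if b != 0:                         # guard
        updates["a"] = b               # a := b
        updates["b"] = a % b           # b := a mod b, fired together with a := b
    state.update(updates)              # all updates take effect simultaneously
    return bool(updates)               # no update fired means the run has halted

state = {"a": 12, "b": 18}             # initial state: the "database" the machine reads
while step(state):
    pass
print(state["a"])                      # -> 6, the gcd
```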
Holding place: an elegant description of why algorithms are necessary
From Boolos and Jeffrey (1974, 1999):
- "No human being can write fast enough, or long enough, or small enough [^ footnote] to list all members of an enumerably infinite set by writing out their names, one after another, in some notation. But humans can do something equally useful, in the case of certain enumerably infinite sets: They can give explicit instructions for determining the nth member of the set, for arbitrary finite n. Such instructions are to be given quite explicitly, in a form in which they could be followed by a computing machine, or by a human who is capable of carrying out only very elementary opertions on symbols. The problem will remain, that for all but a finite number of values of n it will be physically impossible for any human or any machine to actually carry out the computation, due to limitations on the time available for computation, and on the speed with which single steps of the computation can be carried out, and on the amount of matter in the universe which is available for forming symbols. But it will make for clarity and simplicity if we ignore these limitations..." (p. 19)
- ^ Footnote: "There is only a finite amount of paper in the world, so you'd have to write smaller and smaller without limit, to get an infinite number of symbols down on paper. Eventually you'd be trying to write on molecules, on atoms, on electrons...." (p. 19)
Algorithms seem to come in a couple of basic forms:
1 Start them with blank tape ({ }), (0?), let them loose, and they print a number or numbers on their tape (e.g. pi, e.g. an up-counter)
- f({ }) = pi;
- f({ }) = |.||.|||.||||.|||||.||||||.|||||||.[etc]
2 Start them with an arbitrary number or arbitrary collection of numbers presented on their tape in a pre-agreed manner, i.e. one that they consider well-formed and well-chosen, and out pops a "number" (e.g. through multiplication) in a form you will say is well-formed (acceptable):
- f(1,10) = 10; f(10,11) = 110
- f(1,2) = 2; f(2,3) = 6
- f(||,|||) = |||; f(|||,||||) = |||||||
- ...|||.||||..... => ...|||||||....
In this second case the algorithm must [be able to] (i) accept the symbols (e.g. |'s only, no *'s allowed) and (ii) accept their form/presentation (i.e. two, and only two, packed strings of |'s separated by a single ".", no *'s in the string, etc.) and also (iii) agree that the strings are well-chosen (e.g. not outside certain boundaries, perhaps. Thus we might define "0" = |).
For example, a fancier multiplying-machine that accepts negative and positive integers might be able to handle strings such as
- ...|.|||.|.||||... (-2,-3)
- ...|.|||.||||... (-2,+3)
But such strings will fail to be well-chosen for the more primitive machine that handles only positive numbers such as ...|||.||||... (2, 3). And this one fails too, because of the misplaced "*":
- ...|.|||.|*||... (-2,??)
This does raise interesting questions about "completely specified". What happens when the parameters are not correctly presented? wvbaileyWvbailey 20:26, 21 August 2006 (UTC)
Moved from Talk: Algorithm by wvbaileyWvbailey 14:29, 9 September 2006 (UTC)
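As a rough sketch of checks (i)-(iii) above, and of one answer to the question about incorrectly presented parameters, here is a small Python routine for the primitive positive-numbers-only multiplier. The regular-expression encoding of "well-formed", the function name, and the rejection behaviour are assumptions for illustration only.

```python
import re

# The tape must contain exactly two packed strings of '|' separated by a
# single '.', with nothing else but the surrounding dots: ...|||.||||...
# Under the stated convention that "0" = |, a block of n strokes denotes n - 1.
WELL_FORMED = re.compile(r"^\.*(\|+)\.(\|+)\.*$")

def multiply_unary(tape):
    match = WELL_FORMED.fullmatch(tape)
    if match is None:                               # symbols or form rejected
        raise ValueError("not well-formed for this machine: %r" % tape)
    x, y = (len(block) - 1 for block in match.groups())   # "0" = |
    return "|" * (x * y + 1)                        # answer in the same convention

print(multiply_unary("...|||.||||....."))           # -> '|||||||' (seven strokes = 6, i.e. 2 x 3)

try:
    multiply_unary("...|.|||.|*||...")              # the misplaced '*' is rejected
except ValueError as err:
    print(err)
```

The signed-integer strings above, e.g. ...|.|||.||||... (-2, +3), are likewise rejected by this check, which is one way of making "well-chosen for the more primitive machine" precise.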
Example 2 under Atomization of the instruction set
This phrase doesn't make much sense:
"The RASP can be reduced to a RAM by moving its instructions off the tape to in finite-state machine"
Is this supposed to be:
"The RASP can be reduced to a RAM by moving its instructions off the tape to a finite-state machine"
or maybe, but less likely (and not very meaningful):
"The RASP can be reduced to a RAM by moving its instructions off the tape to an infinite-state machine" ?
--SilverStar 05:51, 23 October 2006 (UTC)
- Sorry. It should be something more like: "The RASP can be reduced to a RAM by moving its instructions off the tape and (perhaps with translation) into its finite-state machine "table" of instructions." Thanks for picking this up. wvbaileyWvbailey 13:49, 23 October 2006 (UTC)
Does Sipser's assertion apply to any Lambda Calculus?
About the section "2006: Sipser's assertion ...".
I think if it is only a "reference model problem", it does not apply to the "characterization" itself, only to the approach/methodological options... putting it another way: if we use the Lambda Calculus as the reference model in the "algorithm definition", is that OK for Sipser?
-- Krauss nov. 2006
- Good question. I suspect Sipser would say that the model must be a machine. (I suppose we could ask him!) There is a quote from Gödel -- I have looked for it recently but didn't find it, but I know I've read it a couple of places ... He was not convinced that the recursive functions (mu-recursive, partial or total) completely defined all functions that are calculable/definable. Late in life he believed that only the Turing machine and equivalents were the final definition. I don't remember anything about lambda-calculus but it is close enough to mu-recursion (I believe) so it would not meet Gödel's concern. If you assume the Church-Turing thesis then I suppose you have to say that any specification such as mu-recursion/lambda calculus/Turing machine is adequate. My guess is people don't use the first two because they are harder to use. I am far away from my books, so I can't research this more. wvbaileyWvbailey 15:52, 18 November 2006 (UTC)