Talk:Chaitin's constant

From Wikipedia, the free encyclopedia

WikiProject Mathematics
This article is within the scope of WikiProject Mathematics, which collaborates on articles related to mathematics.
Mathematics rating: B Class Mid Priority  Field: Discrete mathematics



Calculation of the Chaitin constant...?

Calculation of the start of a Chaitin Ω: Calude, Dinneen, and Shu have calculated the first 64 bits of a Chaitin ΩU for a particular machine. These are, in binary notation, 0.0000001000000100000110001000011010001111110010111011101000010000... or, in decimal notation, 0.0078749969978123844... Note that the notion of whether a specific digit of Ω can be correctly calculated is different from the notion of computability: any specific digit of any number is computable since for any integer there is a program printing it.

What is this? Who are Calude, Dinneen, and Shu? What particular machine is being talked about? What references are there to back up this statement?

And more importantly, this little bit:

Note that the notion of whether a specific digit of Ω can be correctly calculated is different from the notion of computability: any specific digit of any number is computable since for any integer there is a program printing it.

is wrong. If all of the digits of Ω could be correctly calculated, then the number would be computable, as that function would then compute all of Ω. The reason given, that "for any integer there is a program printing it", doesn't make sense and is unclear. I think we should remove this from the article.

192.88.124.204 (talk) 22:11, 12 April 2008 (UTC)


That particular bit comes from the Calude, Dinneen and Shu paper "Computing a Glimpse of Randomness" and I agree that it is incorrect as worded. A "specific digit of any number" is not something to which computability applies; computability applies to the entire number up through that particular digit. Calude et al. were reinforcing the point that their particular method of calculating a portion of Chaitin's constant doesn't violate the basic tenets of the uncomputability of the constant itself. Regardless, it's incorrect and not necessary to the article. 76.99.236.216 (talk) 04:54, 16 May 2008 (UTC)

comments by exa

It is a normal and transcendental number which can be defined but cannot be computed

I really doubt that this fact has been proven yet. Both properties (normality and transcendence) are quite likely, but I have never seen a proof of either. I would be surprised if even the irrationality of the number has already been proven, which I doubt again due to the uncomputability of the number.

(The constant N heavily depends on the encoding choices and does not reflect the complexity of the axiomatic system in any way.)

Where did you get this terrible idea? Do you know the invariance theorem? Do you know Theorem C in AIT that shows the relation between the axiomatic complexity of a FAS and the number of digits that can be proven? exa 16 Sep, 2004

... then there exists a constant N such that no digit of Ω after the N-th can be proven to be one or zero within that system.

Is this true? It seems like this states that there are at most a finite number of Turing machines that can be proven to either halt or not halt (given some system of enumeration, etc.). Chas zzz brown 02:06 Feb 5, 2003 (UTC)

Ooops! In the words of Emily Litella, "never mind". I misread that the nth digit of Ω was 1 iff Turing machine n halts. Chas zzz brown 09:00 Feb 5, 2003 (UTC)

The proof of the transcendence of Chaitin's number for any particular encoding could rely on the following indirect argument. If Chaitin's number is algebraic, then it is computable. But we can prove it isn't computable as this leads to a contradiction with the undecidability of the halting problem. However, does this argument hold water if Chaitin's number is algebraic, but it is absolutely impossible to determine which one it is? At 3am, I'm not sure... Elroch 02:11, 27 March 2006 (UTC)

I have rewritten the introduction to this article and added some headings to divide it into sections. I am not sure if this belongs to algorithmic information theory so perhaps someone can put it in the right category. MathMartin 17:44, 20 Jul 2004 (UTC)


What does the statement "Ω is uncompressible" mean? Is compressible number defined anywhere in the Wikipedia? - Mike Rosoft 16:55, 8 Dec 2004 (UTC)

Oops! I thought the link I had supplied to Data compression would clear it up. It means that it is impossible to supply a program + infinite "compressed" sequence such that for any n, the program can return the first n bits of Ω using the first f(n) bits of the compressed sequence, and for any k there exists n0 such that for n > n0, f(n) < n - k. I couldn't find this definition anywhere in the data compression category. Lev 20:15, 9 Dec 2004 (UTC)

I have included a summary of the definition for incompressible number. Rattle 09:00, 1 Jun 2005 (UTC)

Certainly there may be an algorithm that computes Chaitin's constant. There is just no way to prove that any algorithm does so.

Nope. If there were such an algorithm, it could be used to solve the halting problem (even though we would not be able to prove it). But there is no algorithm, provable or not, that solves it. – Lev 10:35, 10 August 2005 (UTC)
To clarify this a bit: There is no algorithm that computes ALL bits of Chaitin's constant. Otherwise, it could be used to solve the halting problem: We would try to compute Omega by running all possible programs in parallel (we would have to launch them one by one, of course). Then we would sum up 2^(-length) for every program that terminates. Now one of two things would happen: either the program in question would terminate, or the sum would eventually exceed Omega - 2^(-our program's length), proving by contradiction that the answer is no.
Now if there were an algorithm for computing Omega but we didn't know which one it was, the above meta-algorithm would give us an algorithm for solving the halting problem, though we still wouldn't know which one it was. But there is no solution to the halting problem, not even one we can't find.
Certainly, for any finite N there is an algorithm that computes the first N bits of Omega, but we don't always know which it is. And there might (for all I know) be an algorithm that computes an infinite number of bits, but still leaves an infinite number of bits unknown.
Lev 12:33, 21 December 2005 (UTC)
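Lev's reduction can be sketched concretely. The "machine" below is a hypothetical toy model invented for illustration: a tiny prefix-free set of programs whose lengths and halting times are simply tabulated, so that the toy Omega is known exactly (a real universal machine has no such table, which is exactly the point).

```python
# Toy sketch of "use Omega to decide halting" (all program data is invented).
from fractions import Fraction

PROGRAMS = {
    "00": 3,      # halts after 3 steps
    "01": None,   # never halts
    "10": 7,      # halts after 7 steps
    "11": None,   # never halts
}

# Omega for this toy machine: sum of 2^-len(p) over the halting programs.
OMEGA = sum(Fraction(1, 2 ** len(p)) for p, t in PROGRAMS.items() if t is not None)

def halts(target):
    """Decide halting for `target` by dovetailing all programs against OMEGA."""
    lower = Fraction(0)  # running lower bound on Omega
    step = 0
    while True:
        step += 1
        for p, t in PROGRAMS.items():
            if t == step:                          # program p halts at this step
                lower += Fraction(1, 2 ** len(p))
                if p == target:
                    return True                    # saw the target halt
        # If the lower bound leaves no room for target's weight 2^-len(target),
        # the target can never halt (its weight would push the sum past Omega).
        if lower > OMEGA - Fraction(1, 2 ** len(target)):
            return False
```

The decision procedure always terminates on the toy model: either the target halts, or the lower bound eventually exceeds OMEGA minus the target's weight.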
I think the last remark here is wrong, or at least it contradicts the last sentence of this article: from some N onward the digits cannot be computed algorithmically, i.e. the algorithm does not exist. From my experience with automatic theorem provers, I'd like to add a metaphor: automated theorem proving can prove quite a lot and can help automate showing that certain algorithms do or do not halt, but at some (more complicated) point it needs human assistance. There are even more complicated problems than the halting problem, so the later bits of Chaitin's constant will be even harder to compute.
Lev is right. For any finite N, you can just print out a string which happens to be the relevant bits of omega :). Beyond a certain N, you cannot prove that you are right, as the article states.
One candidate for an algorithm that produces an infinite number of bits is the program which starts simulating every Turing machine (cascaded, so that you're not trying to do an infinite number of things at once), and records those that halt. This machine can produce a sequence that converges to omega from below. IIRC, all of the '1's it produces are correct digits of omega. To prove that such a machine produces infinitely many bits of omega, it only remains to be shown that it keeps on producing 1s (I don't know if that can be done; it might depend on the model of computation/distribution of programs). -- pde 00:16, 14 March 2006 (UTC)
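The dovetailing enumeration can be sketched on a hypothetical toy model (the program strings and step counts below are invented; on a real universal machine the halting times are of course not tabulated). The point of the sketch is the shape of the output: a nondecreasing sequence of lower bounds that converges to the toy Omega from below.

```python
# Sketch of the dovetailing machine on an invented toy program table.
from fractions import Fraction

PROGRAMS = {"0": 2, "10": None, "110": 5, "111": None}  # None = runs forever

def lower_bounds(max_steps):
    """Yield the running lower bound on Omega after each simulation step."""
    total = Fraction(0)
    for step in range(1, max_steps + 1):
        for p, t in PROGRAMS.items():
            if t == step:                       # p halts exactly at this step
                total += Fraction(1, 2 ** len(p))
        yield total

bounds = list(lower_bounds(10))

# The bounds never decrease, and for this finite toy they settle on Omega.
omega = sum(Fraction(1, 2 ** len(p)) for p, t in PROGRAMS.items() if t is not None)
assert all(a <= b for a, b in zip(bounds, bounds[1:]))
assert bounds[-1] == omega
```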

Shaitan's Consonant

I know what it is! Shaitan's consonant is 666! Repent, infidels!!! -ZyXoas (wiki user, not logged in).

Oooookay, then.

On the probability interpretation

The following two sentences don't agree; the first one doesn't make sense.

It can then be shown that Ω represents the probability that a randomly produced bit string will 
encode a halting program. This means that if you start flipping coins, always recording a head 
as a one and a tail as a zero, the probability is Ω that you will eventually reach the encoding of  
a syntactically correct halting program.

It is true that Omega is the probability that a randomly chosen infinite sequence of 0s and 1s begins with a string that is a halting program. This makes sense, because there is a probability measure (Lebesgue measure) on the set of such infinite sequences, and the set of sequences that begin with a string encoding a halting program is a measurable set. But there cannot be a randomly chosen bit string because there is no nontrivial probability measure on the set of finite bit strings. CMummert 04:37, 16 June 2006 (UTC)

Yeah, it should be the probability that a randomly chosen infinite bit string begins with a proper program which stops after finite time.

I'm not sure it will make things clearer... But the way it is now is correct with some explanation. After all, you can assume that you randomly choose bits of a string until (if ever) you reach a valid program string. In such a random experiment, the probability of obtaining a program that stops is exactly \Omega. Macieksk 09:28, 24 June 2006 (UTC)

Merge

The article "Omega (computer science)" is very similar to this one. I suggest that they be merged. P.L.A.R. 21:41, 29 June 2006 (UTC)

See my comments on the other article's discussion page. This page should stay, the other one should go, since this is not the only omega in use in computer science, and perhaps not even the most common. There doesn't seem to be much info on the current Omega (computer science) page that should be brought over here. --Dantheox 23:40, 29 June 2006 (UTC)

Omega (computer science) now redirects to this page. P.L.A.R. 21:00, 30 June 2006 (UTC)

Properties of Chaitin's constant

Shouldn't some reference be made to the fact that one can use Chaitin's constant to determine the answer to any possible phrasable question? Jenigmat429 03:16, 13 February 2007 (UTC)

That "fact" is not true. Chaitin's omega is at level Sigma_1 of the arithmetical hierarchy. Complete knowledge of it cannot even determine the truth of Pi_2 sentences of Peano arithmetic, much less more complicated questions. Chaitin's constant is Turing equivalent to the Halting problem, and nothing more. CMummert · talk 03:24, 13 February 2007 (UTC)
Ok, more accurately, couldn't you use it to determine the provability of a proposition L in a formal system F? Because, if there's a program that's sufficiently small that will halt if L is provable in F, and if you know enough of chaitin's constant, one could use Chaitin's omega to determine L's provability in F. Or maybe I'm interpreting Chaitin's omega wrong. Jenigmat429 14:16, 13 February 2007 (UTC)
Yes, you can use a Chaitin constant to do that. But this is because these constants are equivalent to the Halting problem, not in particular because of their randomness. CMummert · talk 16:56, 13 February 2007 (UTC)

rewrite

I expanded and rewrote the article this morning. Feel free to hack away at it, I'm done for a while. CMummert · talk 17:48, 13 February 2007 (UTC)

more rewrite

Currently the article says: "Each halting probability is a normal and transcendental real number which is definable but not computable, which means that there is no algorithm that enumerates its digits." But there is a non-halting algorithm that does enumerate its digits - Omega can be approximated from below. Eventually the first n digits will be correct. I'll correct this: "which means that there is no halting algorithm that enumerates its digits." Algorithms 20:25, 23 June 2007 (UTC)

Super Omega

One should mention the "Super Omega" of Jürgen Schmidhuber: http://www.idsia.ch/~juergen/kolmogorov.html . The first n bits of Gregory Chaitin's constant Omega are incompressible in the sense that we cannot compute them by a halting algorithm with fewer than n-O(1) bits. However, consider the short but never halting algorithm which lists and runs all possible programs; whenever one of them halts, its probability gets added to the output (initialized to zero). After finite time the first n bits of the output will not change any more (it does not matter that this time itself is not computable by a halting program). So there is a short non-halting algorithm whose output converges (after finite time) on the first n bits of Omega, for any n. In other words, the enumerable first n bits of Omega are highly compressible in the sense that they are limit-computable by a very short algorithm; they are not random with respect to the set of enumerating algorithms. Schmidhuber constructed a Super Omega which is still limit-computable, but more random than Omega, in that one cannot significantly compress it with an enumerating algorithm. Algorithms 15:43, 8 July 2007 (UTC)

---

If there exists a non-halting program that will converge to the first N-bits of Omega in a finite amount of time, it can be written in the form of an infinite loop:

omega = 0
while (TRUE)
    /* LOOP BODY */
end while

It is very easy to convert this infinite loop into a finite loop:

constant ITERATIONS = ___
omega = 0
a = 0
while (a < ITERATIONS)
    /* LOOP BODY */
    a++
end while

The value ITERATIONS can be set to any arbitrarily high value, allowing this loop to execute for an arbitrarily long amount of time. Since the non-halting version of the program can converge to any N bits you want in a finite amount of time, so can this one, assuming ITERATIONS is a large enough integer.

Suppose the non-halting version can compute X digits in Y iterations. The halting version, if ITERATIONS is set equal to Y at the beginning of the program, will also produce the same X digits as the non-halting version. However, ITERATIONS can be arbitrarily large. You could set ITERATIONS to larger and larger values, and compute more and more digits of Omega; you can set it as large as you want and compute as many digits of Omega as you want. But doesn't this seem to violate the property of incompressibility? —Preceding unsigned comment added by 64.162.134.186 (talk) 20:47, 8 October 2007 (UTC)
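The catch can be illustrated with a hypothetical toy model (programs and step counts below are invented): the truncated loop's output is only a lower bound, and a single long-running program that halts late can flip even the leading bit, so no finite ITERATIONS ever *certifies* the digits printed so far.

```python
# Toy sketch (invented program table, not a real universal machine):
# why the truncated loop's digits are not certified.
from fractions import Fraction

PROGRAMS = {"00": 1, "01": 1000}   # both halt; "01" halts very late

def approx_after(iterations):
    """Lower bound on the toy Omega after a bounded number of iterations."""
    total = Fraction(0)
    for step in range(1, iterations + 1):
        for p, t in PROGRAMS.items():
            if t == step:
                total += Fraction(1, 2 ** len(p))
    return total

def first_bits(x, n):
    """First n binary digits of x in [0, 1)."""
    bits = ""
    for _ in range(n):
        x *= 2
        bits += "1" if x >= 1 else "0"
        if x >= 1:
            x -= 1
    return bits

assert first_bits(approx_after(10), 2) == "01"    # looks like Omega = 0.01...
assert first_bits(approx_after(1000), 2) == "10"  # late halter flips the first bit
```

Setting ITERATIONS = 10 in the halting version would print "01" with no way to know, short of solving the halting problem, that the true leading bits are "10".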

It would seem that way at first, but it turns out that the bits you determine via that non-halting algorithm may be inaccurate until it completes (which of course never happens) - see the reference "Computing a Glimpse of Randomness" in the article. —Preceding unsigned comment added by 76.99.236.216 (talk) 04:46, 16 May 2008 (UTC)

Errata

In the Background section, shouldn't it say "f = λx.F(p,x)" instead of "f(x) = λx.F(p,x)"?