Talk:Risch algorithm
From Wikipedia, the free encyclopedia
Constant undecidability
I'm removing the following statement from the page because I'm not sure it's true:
- The Risch decision procedure is not formally an algorithm because it requires an oracle that decides whether a constant expression is zero, a problem shown by Daniel Richardson to be undecidable.
According to MathWorld, Richardson's theorem states:
- Let R be the class of expressions generated by
- The rational numbers and the two real numbers π and ln2,
- The variable x,
- The operations of addition, multiplication, and composition, and
- The sine, exponential, and absolute value functions.
- Then if E in R, the predicate E = 0 is recursively undecidable.
This is based on a simple algebraic system that includes, among other things, an absolute value function. This implies an ordered field, while Risch integration is typically (always?) done over an unordered field. And this seems to make a big difference! My logic goes as follows:
- Any constant expression involving nothing but rational numbers and the standard field operators (+ - * /) should be decidable. In sums and differences, multiply denominators by common multiples to make them equal, then combine fractions, throw away the denominator, and test the numerator for zero. Multiplication is even simpler: just multiply the numerators and denominators separately, and for division just interchange the divisor's numerator and denominator and multiply. (A short sketch of these cases appears after this argument.)
- Any simple algebraic extension over a field can be reduced using Euclidean long division to testing if a remainder in the underlying field is zero.
- Any simple transcendental extension over a field requires all coefficients to be identically zero — each is testable in the underlying field.
Therefore, any algebraic system that consists purely of the rationals plus a finite number of algebraic and/or transcendental extensions should be decidably testable for equality to zero. This is the world of the Risch algorithm.
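A minimal sketch of the two easy cases above (my own illustration, not part of the original discussion; the constants and the helper symbol t are arbitrary). It uses exact rational arithmetic for the field operations and polynomial remaindering for a simple algebraic extension:

# Zero-testing in the easy cases discussed above (illustrative sketch only).
from fractions import Fraction
from sympy import symbols, expand, rem

# Case 1: constants built from rationals with + - * /.
# Exact rational arithmetic keeps one numerator/denominator pair,
# so testing for zero is just testing the numerator.
c = Fraction(1, 3) + Fraction(1, 6) - Fraction(1, 2)
print(c == 0)                          # True

# Case 2: a simple algebraic extension, e.g. Q(sqrt(2)).
# Represent sqrt(2) by the symbol t with minimal polynomial t^2 - 2 and
# reduce modulo that polynomial; a zero remainder means the constant is
# zero in the extension field.
t = symbols('t')
expr = expand((1 + t) * (1 - t) + 1)   # stands for (1 + sqrt(2))(1 - sqrt(2)) + 1
print(rem(expr, t**2 - 2, t))          # 0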
And the presence of the absolute value function makes all the difference for the Richardson theorem, right?
Baccala@freesoft.org 05:49, 14 January 2006 (UTC)
- Now that Baccala@freesoft.org mentions it, it does make sense. I wrote that line in because I remember reading it in one of the introductory articles on computer algebra, but I have never read Richardson's result. These things can be tricky, and given that I am not a professional in computer algebra, I agree with the removal. XaosBits 17:46, 14 January 2006 (UTC)
- I found the reference. It's from the paper by Moses (of Macsyma fame), Symbolic integration: the stormy decade, in Communications of the ACM, volume 14. The quote about deciding whether an expression is zero is on page 557. XaosBits
OK, thanks, I've looked over that reference now. It would seem to me that 1) I was wrong, and 2) the original statement was not entirely correct. My argument above only works if you can figure out if an extension is algebraic or transcendental to begin with, which seems to be the problem. Moses says "There exists no known general algorithm for determining whether a constant involving exponentials and logarithms is 0" (p. 557). But I still don't think Richardson's theorem applies here — there might be such an algorithm; I don't think it's been proven undecidable. I'll go read Richardson's paper in detail before I say more.
Baccala@freesoft.org 21:49, 26 January 2006 (UTC)
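A small illustration of the constant problem Moses describes (my own sketch, not part of the discussion): SymPy's simplify() is a heuristic simplifier, not a decision procedure, so it handles easy constants built from exponentials and logarithms even though no general algorithm is known.

# Heuristic zero test for a constant involving exp and log (illustrative only).
from sympy import exp, log, simplify

c = exp(log(2) + log(3)) - 6   # equal to 0, but not syntactically
print(simplify(c))             # expected to print 0 on current SymPy versions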
Possibly an error?
Shouldn't the first equation be g = ... (instead of f = ...)? Ariel
- Looks that way; I've changed it to g. Michael Hardy 23:39, 8 May 2006 (UTC)
- If we have the equation g′ = f, then the anti-derivative of f is g (up to a constant): ∫f = ∫g′ = g. In the case that [formula in the article], the anti-derivative is [formula in the article].
- Why not write it with the logarithms? Because I went to check Rosenlicht, and he stated the result in terms of f.
- XaosBits 01:33, 10 May 2006 (UTC)
typo of some sort?
This text is in the article: "...For the function f*e^g, where f and g [what?], we have..."
I don't know what's missing... "are elementary functions"? Thought I should point it out. 71.191.51.134 19:04, 29 July 2007 (UTC)
This is incorrect
The article says:
- For example, all known programs (except Axiom) cannot find the antiderivative for
This is simply not true. Both Mathematica and Maple can calculate the antiderivative (even not-so-recent versions). Anyone can try this at http://integrals.wolfram.com/ . (The Root[] objects represent roots of polynomials, i.e. numbers, and can be easily expanded (written in explicit form) using the ToRadicals[] command). I won't edit the article because I am not familiar with the topic (perhaps these programs don't use the Risch algorithm for this specific problem?), but this needs to be cleaned up or rephrased ... —Preceding unsigned comment added by 129.177.44.25 (talk) 08:51, 8 April 2008 (UTC)
- Sorry to tell this, but the claim seems to be correct. Try http://integrals.wolfram.com with the input "x/sqrt(x^4+10*x^2-96*x-71)". You will receive a large answer. But the problem is not the Root[] objects (of course you are right, they are just numbers, so the function is elementary even if it uses such numbers, whether or not they can be converted to radicals). The problem is that the answer contains "F" and "Π", where F is the elliptic integral of the first kind and Π is the elliptic integral of the third kind. Neither F nor Π can be written using elementary functions. And the question is not whether x/sqrt(x^4+10*x^2-96*x-71) has an antiderivative; the question is whether this antiderivative can be written using elementary functions.
- From this point to the end, I am not sure whether what follows is 100% correct.
- Try changing 71 to 72 in the polynomial. Integrals.wolfram.com will give you an answer that looks nearly the same. This is because integrals.wolfram.com does not recognize that there is a very big difference between
- x/sqrt(x^4+10*x^2-96*x-71) and
- x/sqrt(x^4+10*x^2-96*x-72)
- The antiderivative of the first can be written using elementary functions, and the antiderivative of the second cannot. This is because the Galois groups of these polynomials are different:
- x^4+10*x^2-96*x-71: the Galois group is D(4), i.e. the group generated by the permutations (1 2 3 4) and (1 3); it contains 8 elements (the same as for x^4-2).
- x^4+10*x^2-96*x-72: the Galois group is S(4), i.e. the group generated by the permutations (1 2), (1 3) and (1 4); it contains 24 elements.
- So x^4+10*x^2-96*x-71 is a very special quartic polynomial, and this is why the Risch algorithm answers "yes" in the 71 case and "no" in the 72 case (a sketch checking the Galois groups follows below).
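- A sketch that checks the Galois-group claim above (my addition, not part of the original comment; it assumes a recent SymPy, roughly 1.12 or later, where galois_group() is available for quartics, and the import path may differ between versions):

# Compare the Galois groups of the two quartics (illustrative sketch).
from sympy import Poly, symbols
from sympy.polys.numberfields.galoisgroups import galois_group

x = symbols('x')
for c in (71, 72):
    p = Poly(x**4 + 10*x**2 - 96*x - c, x)
    G, _ = galois_group(p)   # returns (permutation group, alternating-group flag)
    print(c, G.order())      # the claim above predicts 8 (D4) for 71, 24 (S4) for 72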
- I have tried Maple v. 11 with input:
- simplify(convert(int(x/sqrt(x^4+10*x^2-96*x-71),x),radical));
- The answer also contains "EllipticF" and "EllipticPi". So Maple also does not recognize that the antiderivative of x/sqrt(x^4+10*x^2-96*x-71) can be written using elementary functions.
- Do you agree with my argumentation?
- Sorry for my bad English.
- Gaz v pol (talk) 18:14, 18 April 2008 (UTC)
- What the original writer of the article was trying to say is probably correct. What the article is really saying (i.e. that no system can find an antiderivative, with no mention of elementary functions) is incorrect (Mathematica does find an antiderivative, but not in terms of elementary functions). This article needs a big cleanup, but I dare not touch it because this is a very complicated mathematical topic that I know nothing about (I came here to learn a little about it :) ). If you are familiar with the topic, it would be great if you cleaned it up a bit (and made it more precise)! —Preceding unsigned comment added by 83.166.219.47 (talk) 15:41, 6 June 2008 (UTC)