Talk:Risch algorithm

From Wikipedia, the free encyclopedia

Constant undecidability

I'm removing the following statement from the page because I'm not sure it's true:

The Risch decision procedure is not formally an algorithm because it requires an oracle that decides whether a constant expression is zero, a problem shown by Daniel Richardson to be undecidable.

According to MathWorld, Richardson's theorem states:

Let R be the class of expressions generated by
  1. The rational numbers and the two real numbers π and ln2,
  2. The variable x,
  3. The operations of addition, multiplication, and composition, and
  4. The sine, exponential, and absolute value functions.
Then if E in R, the predicate E = 0 is recursively undecidable.

This is based on a simple algebraic system that includes, among other things, an absolute value function. This implies an ordered field, while Risch integration is typically (always?) done over an unordered field. And this seems to make a big difference! My logic goes as follows:

  1. Any constant expression involving nothing but rational numbers and the standard field operators (+ - * /) should be decidable. In sums and differences, bring the fractions to a common denominator, combine them, throw away the denominator, and test the numerator for zero. Multiplication is even simpler: just multiply the numerators and denominators separately, and for division just invert the divisor and multiply.
  2. Any simple algebraic extension over a field can be reduced using Euclidean long division to testing if a remainder in the underlying field is zero.
  3. Any simple transcendental extension over a field requires all coefficients to be identically zero — each is testable in the underlying field.
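Step 1 above is easy to see in action. As a minimal sketch (the particular expression is my own toy example), Python's exact Fraction type does precisely the combining described, so the zero test reduces to checking the numerator:

```python
from fractions import Fraction

# 1/3 + 1/6 - 1/2: sums are put over a common denominator and
# reduced exactly, so equality to zero is just a numerator check.
e = Fraction(1, 3) + Fraction(1, 6) - Fraction(1, 2)
print(e.numerator == 0)  # True
```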

Therefore, any algebraic system that consists purely of the rationals plus a finite number of algebraic and/or transcendental extensions should be decidably testable for equality to zero. This is the world of the Risch algorithm.
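Step 2 can be illustrated with a quick sympy check (my own toy example, and it assumes we already know the extension is algebraic with minimal polynomial x^2 - 2): to decide whether (1 + √2)(1 - √2) + 1 = 0, represent √2 by the symbol x and reduce modulo the minimal polynomial; the constant is zero exactly when the remainder is.

```python
from sympy import symbols, rem

x = symbols('x')

# (1 + sqrt(2))*(1 - sqrt(2)) + 1, with sqrt(2) represented by x.
expr = (1 + x) * (1 - x) + 1        # expands to 2 - x**2

# Euclidean division by the minimal polynomial x**2 - 2:
# the constant is zero iff the remainder is zero.
print(rem(expr, x**2 - 2, x))  # 0
```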

And the presence of the absolute value function makes all the difference for the Richardson theorem, right?

Baccala@freesoft.org 05:49, 14 January 2006 (UTC)

Now that Baccala@freesoft.org mentions it, it does make sense. I wrote that line in because I remember reading it in one of the introductory articles on computer algebra, but I have never read Richardson's result. These things can be tricky, and given that I am not a professional in computer algebra, I agree with the removal. XaosBits 17:46, 14 January 2006 (UTC)
I found the reference. It's from the paper by Moses (of Macsyma fame), Symbolic Integration: The Stormy Decade, in Communications of the ACM, volume 14. The quote about deciding whether an expression is zero is on page 557. XaosBits

OK, thanks, I've looked over that reference now. It would seem to me that 1) I was wrong, and 2) the original statement was not entirely correct. My argument above only works if you can figure out if an extension is algebraic or transcendental to begin with, which seems to be the problem. Moses says "There exists no known general algorithm for determining whether a constant involving exponentials and logarithms is 0" (p. 557). But I still don't think Richardson's theorem applies here — there might be such an algorithm; I don't think it's been proven undecidable. I'll go read Richardson's paper in detail before I say more.
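For a flavor of the difficulty Moses describes, here is a small illustration (the particular constant is my own, not from any of the references): a constant built from logarithms and radicals can be identically zero without anything in its syntax saying so, and numerical evaluation suggests zero without proving it.

```python
from sympy import sqrt, log, simplify

# log(sqrt(2) + sqrt(3)) - log(5 + 2*sqrt(6))/2 is identically zero,
# because (sqrt(2) + sqrt(3))**2 == 5 + 2*sqrt(6), but nothing about
# the expression says so syntactically.
c = log(sqrt(2) + sqrt(3)) - log(5 + 2*sqrt(6)) / 2

print(c.evalf(30))  # numerically indistinguishable from zero
print(simplify(c))  # a CAS may or may not manage to prove it is exactly 0
```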

Baccala@freesoft.org 21:49, 26 January 2006 (UTC)

Possibly an error?

Shouldn't the first equation be g = ... (instead of f = ...)? Ariel

Looks that way---I've changed it to g. Michael Hardy 23:39, 8 May 2006 (UTC)
If we have the equation g′ = f, then the anti-derivative of f is g (up to a constant): ∫f = ∫g′ = g. In the case that
f = v^\prime + \sum_{i<n} \alpha_i \frac{u_i^{\prime}}{u_i} \,,
the anti-derivative is
g = v + \sum_{i<n} \alpha_i \ln u_i \,.
Why not write it with the logarithms? Because I went to check Rosenlicht, and he stated the result in terms of f.
XaosBits 01:33, 10 May 2006 (UTC)
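The f/g pairing in the formula above can be checked on a toy instance (my own example, not from Rosenlicht): take v = 0, n = 1, u_1 = x^2 + 1, and α_1 = 1, so the anti-derivative should be g = ln(x^2 + 1).

```python
from sympy import symbols, log, diff, simplify

x = symbols('x')

# v = 0, n = 1, u_1 = x**2 + 1, alpha_1 = 1 in the formula above:
u = x**2 + 1
g = log(u)             # g = v + sum alpha_i * ln u_i
f = diff(u, x) / u     # f = v' + sum alpha_i * u_i'/u_i = 2x/(x**2 + 1)

# g' - f simplifies to 0, confirming g' = f.
print(simplify(diff(g, x) - f))
```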