Talk:Newton's method

WikiProject Mathematics
This article is within the scope of WikiProject Mathematics, which collaborates on articles related to mathematics.
Mathematics rating: B-Class, High Priority. Field: Applied mathematics
One of the 500 most frequently viewed mathematics articles.

Raphson

In all fairness to Raphson, this should be the Newton-Raphson method. Pizza Puzzle

I do agree with the above. Nijdam 16:01, 17 March 2006 (UTC)

pitfalls of the Newton-Raphson method

pitfalls of the Newton-Raphson method:

An inflection point (f''(x) = 0) in the vicinity of a root causes divergence. A local maximum or minimum causes oscillations. The iteration may jump from one location to another. A zero slope causes division by zero.

Right, the method is not perfect. But as long as you start close enough to the root you will always be fine (of course, if the slope at the root is zero, the convergence will be slower). Oleg Alexandrov 20:04, 14 Mar 2005 (UTC)
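To make the pitfalls above concrete, here is a minimal Python sketch; the cubic x^3 - 2x + 2 and both starting points are my own illustrative choices, not taken from the article:

    def newton(f, fprime, x0, tol=1e-12, max_iter=50):
        """Basic Newton iteration; stops when |f(x)| is small."""
        x = x0
        for _ in range(max_iter):
            fx = f(x)
            if abs(fx) < tol:
                return x
            slope = fprime(x)
            if slope == 0:
                raise ZeroDivisionError("zero slope: Newton step undefined")
            x -= fx / slope
        raise RuntimeError("no convergence within max_iter")

    # This cubic has a local maximum and minimum; started at 0 the
    # iterates cycle between 0 and 1 (the oscillation pitfall above).
    f = lambda x: x**3 - 2*x + 2
    fp = lambda x: 3*x**2 - 2
    print(newton(f, fp, -2.0))  # converges to the real root near -1.7693
    # newton(f, fp, 0.0)        # cycles 0 -> 1 -> 0 -> ... and raises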

I am using Newton-Raphson to solve a system of nonlinear equations using Matlab. I am sure my program should work, but it only converges up to a point; then it starts jumping between the same two points and will not converge. It keeps sending my computer into an infinite loop, which is really doing my head in! I need this to work soon! Please help me out. —Preceding unsigned comment added by 84.67.41.134 (talk • contribs) 01:19, 16 March 2007 (UTC)

You may have a numerical problem in the calculation (round-off errors), or you may not be close enough to the root to guarantee success, or there may not be a root. Perhaps you should try using the bisection method beginning with an interval containing the two values and see what happens. JRSpriggs 08:10, 16 March 2007 (UTC)
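For what it's worth, a minimal Python sketch of the bisection fallback (scalar only, so for a three-equation system you would still need something else, e.g. a damped Newton step; the example function is borrowed from the x0 discussion further down this page):

    from math import cos

    def bisect(f, a, b, tol=1e-12):
        """Bisection: guaranteed to converge once f(a) and f(b) differ in sign."""
        fa, fb = f(a), f(b)
        if fa * fb > 0:
            raise ValueError("need a sign change on [a, b]")
        while b - a > tol:
            m = (a + b) / 2
            fm = f(m)
            if fm == 0:
                return m
            if fa * fm < 0:
                b, fb = m, fm
            else:
                a, fa = m, fm
        return (a + b) / 2

    print(bisect(lambda x: cos(x) - x**3, 0.0, 1.0))  # ~0.86547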

I know there is almost a root (my idiot teacher tells me there is a number); plus I'm working with numbers like 10^-10 to get it to converge. Also, it's three equations and three unknowns. Thanks for the tip.

Generalisations

shouldn't this:

One may use Newton's method also to solve systems of n (non-linear) equations...

read like this:

One may use Newton's method also to solve systems of k (non-linear) equations...

I don't understand this section well enough to be confident in this change (unsigned post by anon)

You are right. I just fixed that. Oleg Alexandrov 22:49, 3 August 2005 (UTC)
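For readers landing here, a small Python sketch of the k-equation case, where each step solves the linear system J(x_n) Δ = -F(x_n); the circle-parabola system is an arbitrary illustration, not from the article:

    import numpy as np

    def newton_system(F, J, x0, tol=1e-10, max_iter=50):
        """Newton's method for k equations in k unknowns."""
        x = np.asarray(x0, dtype=float)
        for _ in range(max_iter):
            Fx = F(x)
            if np.linalg.norm(Fx) < tol:
                return x
            # Solve J(x) * delta = -F(x) instead of inverting J explicitly.
            x = x + np.linalg.solve(J(x), -Fx)
        raise RuntimeError("no convergence within max_iter")

    # Intersection of the unit circle with the parabola y = x^2.
    F = lambda v: np.array([v[0]**2 + v[1]**2 - 1, v[1] - v[0]**2])
    J = lambda v: np.array([[2*v[0], 2*v[1]], [-2*v[0], 1.0]])
    print(newton_system(F, J, [1.0, 1.0]))  # ~ (0.78615, 0.61803)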

insufficient hypotheses?

The article currently states:

One can prove that, if f ' is continuous, and if the unknown zero x is isolated, then there exists a neighborhood of x such that for all starting values x0 in that neighborhood, the sequence {xn} will converge towards x.

This doesn't seem to be strong enough. Consider the following example: Let g(x) = x^2 sin^2(1/x), with g(0) = 0. Then g is a continuous non-negative function with infinitely many zeros in each neighborhood of 0, namely at each point x = 1/(n pi) for n an integer. Take f(x) to be the integral of g from 0 to x. Then f is differentiable, f' = g is continuous, and 0 is an isolated zero of f. (This last is true since for any a>0 there are values 0<x<a for which g(x)>0, and g(x) >= 0 for all 0<x<a, so f(a), the integral of g from 0 to a, is >0. Similarly for a<0.) Then for any starting point at one of these zeros 1/(n pi) of f', the Newton method immediately fails with a division by zero. So there is no neighborhood of zero in which all starting values give a sequence converging towards 0.

I'm not sure right now what the right conditions should be. Hmmm... --anon
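For readability, the construction above in display form (just a transcription of the anon's example, nothing new):

    % g vanishes at 0 and at every 1/(n pi), yet 0 is an isolated zero of f:
    g(x) = x^2 \sin^2(1/x), \quad g(0) = 0, \qquad f(x) = \int_0^x g(t)\,dt .
    % f' = g is continuous, and f(a) > 0 for a > 0, f(a) < 0 for a < 0,
    % so 0 is an isolated zero of f; but f'(1/(n\pi)) = 0 for every integer n,
    % so the Newton step divides by zero at points arbitrarily close to 0.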

Good point indeed. And that function can be made as smooth as one would like, even infinitely smooth, if one multiplies sin^2(1/x) by exp(-1/x^2) instead of x^2. So asking for more smoothness will not solve things. So I guess one must ask that either f'(x0)≠0, or, if it does equal zero, that x0 be an isolated root of f'(x) also. That should be sufficient, I guess. Oleg Alexandrov (talk) 08:01, 11 January 2006 (UTC)
I have my doubts about Oleg's guess. Anyway, I removed it from the article for the moment. -- Jitse Niesen (talk) 14:00, 11 January 2006 (UTC)
Come on, it's a guess, man. You should not say you have doubts (that's the same as doubting the algorithm terminates); you should put your better guess in, and eventually we will converge towards the right thing! :) Do you at least agree that smoothness won't help things? Oleg Alexandrov (talk) 17:03, 11 January 2006 (UTC)
Actually, I think smoothness would help, for sufficiently large values of smoothness ... If you want my guess: Newton's method converges locally if f is analytic. Proved by Stephen Smale, perhaps? Anyway, I found one theorem which I put in. -- Jitse Niesen (talk) 18:30, 11 January 2006 (UTC)


numerical methods

I am currently in a numerical methods class. I looked through this article and it seems to be missing this aspect of application; while the current equation form is correct, to fix up this page and make it more robust, I am going to add a section on using Newton's method with functions with many roots to find the roots. I'm not sure what it's called right now, but I'll be back. The preceding unsigned comment was added by Mikefadock (talk • contribs) . (Dude)

If you want to find many roots, you need first to locate where the roots are using the bisection method, or something. I think that kind of material would be more appropriate at root-finding algorithms or something rather than Newton's method. Oleg Alexandrov (talk) 02:35, 3 March 2006 (UTC)

The method I am referring to is the iterative method for pinning down roots which are far from zero. Starting from Newton's method, we have x_1 = x_0 - f(x_0)/f'(x_0), then replace f(x_0) with g(alpha) - alpha and f'(x_0) with g'(alpha) - 1, where alpha is the root to be found. I feel this should be included, or at minimum a note on the general situations where Newton's method fails, with links to the alternatives. From the perspective of a student, I spent half an hour just trying to figure out why Newton's method would not work on sin(x)^2; a quick note stating "some functions will not work with Newton's method: sin(x)^2 and other multiple-root cases, etc. ... please see this section for explanation" would have helped. Dude 02:48, 5 March 2006 (UTC)
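If I read the substitution above correctly (my interpretation, since g is not defined in the post), it amounts to applying Newton's method to f(x) = g(x) - x, whose roots are the fixed points alpha = g(alpha). A sketch, with g = cos as an arbitrary illustrative choice:

    from math import cos, sin

    # Newton's method applied to f(x) = g(x) - x, so f'(x) = g'(x) - 1;
    # its roots are the fixed points alpha = g(alpha).  g = cos is only
    # an illustration, not the textbook's example.
    g, gp = cos, lambda x: -sin(x)
    x = 1.0
    for _ in range(6):
        x = x - (g(x) - x) / (gp(x) - 1)
    print(x)  # ~0.73908513, the fixed point of cos(x)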

Inclusion of the conditions for convergence would also be cool: with M = max_{x in R} |f''(x)| / (2 min_{x in R} |f'(x)|) over a region R containing the root, the condition M |alpha - x_0| < 1 guarantees convergence. A new section could be added. I really don't know the limit of how large a page can get before it becomes unmanageable. (I'll add it, by the way, but I was thinking that doing a quick check for thoughts on the matter would be best.) Dude 02:59, 5 March 2006 (UTC)

I don't understand your alternative method at all (what is g?). Does it have a name? Alternative methods are listed in root-finding algorithm.
I don't think the failure of Newton's method has anything to do with multiple roots. The article does mention that Newton's method may fail ("Newton's method will usually converge provided the initial guess is close enough to the unknown zero.", and also in the section "Practical considerations"). However, the presentation can certainly be improved.
The convergence result you mentioned would indeed be nice to add (of course, a reference should also be included), for instance to the "Analysis" section. I guess the article can grow twice as big before we should consider splitting it. -- Jitse Niesen (talk) 05:46, 5 March 2006 (UTC)
I'll throw something together in TeX and put it up on my user page; you can look at it there. It will explain the substitution better than I have explained it here. I'll drop in the convergence stuff tomorrow or later today.
Convergence fails spectacularly for Newton's method under very specific conditions; I'm plowing through my textbook now to find them.... (The function must be thrice differentiable, continuous in f, f', and f'', and f' cannot equal zero at the root. There exists a region around the root in which convergence is quadratic (accuracy doubles with each new iteration), a larger region of linear convergence, and another in which no convergence occurs.) I feel the article should include the methods for finding the regions in which convergence occurs, not just simply state "it occurs close enough to the zero". Give me a little time to patch this together and it should come out a much better resource. Dude 18:42, 5 March 2006 (UTC)
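The digit-doubling is easy to see numerically; a tiny Python check with f(x) = x^2 - 2 (my own example, not Epperson's):

    from math import sqrt

    # Quadratic convergence: the error roughly squares on every step,
    # until it hits the limits of double precision.
    x = 1.0
    for i in range(6):
        x = x - (x*x - 2) / (2*x)
        print(i, x, abs(x - sqrt(2)))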
Just a question, is that your own research, or is it published material? Oleg Alexandrov (talk) 01:24, 6 March 2006 (UTC)


An Introduction to Numerical Methods and Analysis, James F. Epperson; a text for my numerical methods class. It covers root-finding algorithms, theory, error analysis ... etc. I cannot copy out of it (copyright and everything), so I have to first understand the material and then present it in an understandable way. To answer your question: published. Dude 01:09, 8 March 2006 (UTC)

Change to Newton-Raphson method?

This has been suggested several times. Do you think we should move? I've placed this on WP:RM Borisblue 05:21, 22 April 2006 (UTC)

What are the reasons for moving? I think "Newton's method" is used more frequently than "Newton-Raphson method". For instance, a search on MathSciNet for papers with "Newton method" in the title returns 1558 hits, and a search for papers with "Newton-Raphson method" in the title returns 44 hits. -- Jitse Niesen (talk) 04:04, 24 April 2006 (UTC)
In German, at least, it's only ever known as Newtonsches Näherungsverfahren, not as Newton-Raphsonsches Näherungsverfahren... —Nightstallion (?) Seen this already? 07:30, 27 April 2006 (UTC)
Well, one argument in favor is that Raphson actually invented it first, I believe. That's the whole reason people bother to say something as awkward as Newton-Raphson. On the other hand, they're both dead, and we're alive and we have to say it. So there's that... Birge 02:45, 17 October 2006 (UTC)

Brief Derivation of the Formula

Instead of just presenting the formula

    x_{n+1} = x_n - f(x_n)/f'(x_n)

I think that a very brief derivation of this would be enlightening. I, for one, understand a formula much better when I can see where it came from.

    slope of tangent = (0 - f(x_n))/(x_{n+1} - x_n) = f'(x_n)   {the tangent through (x_n, f(x_n)) meets the x-axis at x_{n+1}}
    By simple algebra we can derive
    x_{n+1} = x_n - f(x_n)/f'(x_n)

--unsigned
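The same derivation in display form, with the tangent line at x_n required to cross the x-axis at x_{n+1}:

    % Tangent line at x_n, set to zero at x_{n+1}:
    0 = f(x_n) + f'(x_n)\,(x_{n+1} - x_n)
    \quad\Longrightarrow\quad
    x_{n+1} = x_n - \frac{f(x_n)}{f'(x_n)} .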

There is already a picture illustrating that. I would support adding the derivation as long as it is well written and indeed rather short. Oleg Alexandrov (talk) 02:44, 14 October 2006 (UTC)


eh, I just went ahead and added the derivation because I had to think about it for a couple seconds. Hope nobody minds. Earlh 01:36, 13 March 2007 (UTC)

THE GENERALIZED MEDIANT (RATIONAL MEAN) AND NEWTON'S METHOD

It is absolutely disturbing and worrying to realize that the extremely simple arithmetical methods shown at New arithmetical root-solving methods, which embrace Newton's, Bernoulli's, Halley's and Householder's methods (among many other new ones), do not appear in any text on numbers from Babylonian times up to now. —Preceding unsigned comment added by Arithmonic (talk • contribs)

That page is almost completely unreadable. Good luck with that...

Values of X0?

If you have absolutely no idea about a potential value of X0 whatsoever, does it make any significant difference? I tried the example equation f(x) = cos(x) − x^3 with the X0 value of 100, and it still came up with the same answer; it just took more steps. So, are there cases where the deviation of X0 from an actual root value will make a significant difference to calculating the root approximation?

Also, can this method be used to find multiple roots of a continuous function? Nonagonal Spider 05:54, 27 March 2007 (UTC)

Consider f(x) = x/(x^2 + 1). If you are too far away from zero, the method will just take you further away. JRSpriggs 10:17, 27 March 2007 (UTC)
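A quick numerical illustration of this example (the starting value 2.0 is an arbitrary choice): for f(x) = x/(x^2 + 1) the Newton step simplifies to 2x^3/(x^2 - 1), so any |x0| > 1 is pushed away from the root at 0.

    # f(x) = x/(x^2 + 1) has its only root at 0, but for |x| > 1 the
    # Newton step x - f(x)/f'(x) = 2x^3/(x^2 - 1) moves away from it.
    f = lambda x: x / (x*x + 1)
    fp = lambda x: (1 - x*x) / (x*x + 1)**2
    x = 2.0
    for _ in range(5):
        x = x - f(x) / fp(x)
        print(x)  # 5.33, 11.06, 22.30, ... diverging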

Counterexamples

In the article, the first counterexample has f(0) = 0, and then says that f'(0) = 1. If f'(x) = d(0)/dx = 0 at x = 0, then f'(0) = 0, not 1. — metaprimer (talk) 13:06, 27 October 2007 (UTC)

You cannot say that f'(x) = d(0)/dx = 0 at x = 0. I know it looks convincing, but it's still not correct. For instance, consider the function
 g(x) = \begin{cases} 0, & \text{if } x=0; \\ x, & \text{if } x \ne 0. \end{cases}
This is just a complicated way to write g(x) = x, so g'(0) = 1. However, your reasoning would show that g'(x) = 0.
The function f in the counterexample is more complicated, so it's not so easy to compute f'(0). You need to use the definition of the derivative as a limit. -- Jitse Niesen (talk) 17:39, 27 October 2007 (UTC)
Ah, I see now. Perhaps this should be briefly mentioned - that the limit of f(x) as x -> c is separate from f(c), and therefore f'(0) is concerned with f(x) at x != 0. — metaprimer (talk) 19:35, 27 October 2007 (UTC)
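Spelling out the limit computation for the g above:

    % The single redefined value g(0) = 0 does not affect the limit,
    % which only ever looks at h \ne 0:
    g'(0) = \lim_{h \to 0} \frac{g(h) - g(0)}{h}
          = \lim_{h \to 0} \frac{h - 0}{h} = 1 .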