I thought FindRoot would usually give the root closest to the starting point. But see the example below, where I try to find a root of $\cos x=0$. Starting from $x=0.1$ I get $10.9956$, but starting from $1$ I get $1.5708$. What's wrong?

FindRoot[Cos[x], {x, 0.1}]
(*  {x -> 10.9956}  *)

FindRoot[Cos[x], {x, 1}]
(*  {x -> 1.5708}  *)

Draw a graph of the function and its tangent at x == 0.1. FindRoot is using Newton's method. – Michael E2 14 hours ago
I believe it's because FindRoot[] uses Newton's method. The tangent line hits further out. In general, Newton's method requires a good initial guess, or you can get a root quite far away. – Michael McCain 14 hours ago
FindRoot gives the root that it finds. :-) – Brett Champion 13 hours ago
To have more control over which root is obtained, give FindRoot two initial guesses, which prompts it to use the secant method. Then FindRoot usually returns a root bracketed by the two guesses. For instance, FindRoot[Cos[x], {x, -1, 4}]. – bbgodfrey 13 hours ago

Technically, FindRoot uses a damped Newton's method. In an undamped Newton's method, the first step would look like this, the tangent striking the x-axis just above 10:

Plot[{Cos[x], Cos[0.1] - Sin[0.1] (x - 0.1)}, {x, 0, 12}]

[plot of Cos[x] and its tangent at x == 0.1, which crosses the x-axis just above 10]

You can keep large steps from occurring by decreasing the DampingFactor:

FindRoot[Cos[x], {x, 0.1}, DampingFactor -> 0.2]
(*  {x -> 1.5708}  *)

The damping factor multiplies the change in x. The undamped step would be x == 10.066644423259238, so the damped first step is

0.1 + (10.066644423259238` - 0.1) 0.2
(*  2.09333  *)

That lands x close enough to the root nearest 0.1 that FindRoot will converge on it.
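For readers outside Mathematica, the same damped-Newton idea can be sketched in plain Python. Note that this constant-factor damping is only an illustration of the principle; FindRoot's actual DampingFactor handling is more sophisticated internally, and the function name `damped_newton` here is just an invented helper:

```python
import math

def damped_newton(f, df, x0, damping=1.0, tol=1e-10, max_iter=200):
    """Newton iteration where each update is scaled by a damping factor."""
    x = x0
    for _ in range(max_iter):
        step = -f(x) / df(x)           # full Newton step
        x_new = x + damping * step     # take only a fraction of it
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Undamped: the huge first step lands near 10, so we land on a far root
print(damped_newton(math.cos, lambda x: -math.sin(x), 0.1))               # ~10.9956
# Damped by 0.2: the steps stay small and we reach the nearest root, pi/2
print(damped_newton(math.cos, lambda x: -math.sin(x), 0.1, damping=0.2))  # ~1.5708
```

With damping 0.2, the remaining error shrinks by roughly a factor of 0.8 per iteration, which is why convergence takes many more steps than the undamped method.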

Of course, damping slows down convergence. In the following, it slows it down too much:

FindRoot[1/x - 1/1000, {x, 0.1}, DampingFactor -> 0.2]

FindRoot::cvmit: Failed to converge to the requested accuracy or precision within 100 iterations.

(*  {x -> 999.984}  *)

The regular method has no problem with it, though:

FindRoot[1/x - 1/1000, {x, 0.1}]
(*  {x -> 1000.}  *)
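The same trade-off can be reproduced with a bare-bones constant-damping Newton loop in Python (again only a sketch of the principle, not FindRoot's actual algorithm): the full Newton iteration on 1/x - 1/1000 converges rapidly, while a 0.2 damping factor is still noticeably short of the root after 100 iterations.

```python
import math

def newton(f, df, x0, damping=1.0, n=100):
    """Run a fixed number of (optionally damped) Newton iterations."""
    x = x0
    for _ in range(n):
        x = x + damping * (-f(x) / df(x))
    return x

f  = lambda x: 1/x - 1/1000
df = lambda x: -1/x**2

print(newton(f, df, 0.1))               # converges to 1000 well within 100 steps
print(newton(f, df, 0.1, damping=0.2))  # still short of 1000 after 100 steps
```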

This question has been asked in different contexts here and here. As mentioned in the comments, FindRoot is based on Newton's method, which works well when a good initial guess is provided, but it returns only a single root. To find multiple roots in an interval, you can use NSolve with a domain constraint:

f[x_] = Cos[x];
Module[{sol}, 
 Column[{sol = NSolve[{f[x] == 0, -10 <= x <= 10}, x], 
   Plot[f[x], {x, -10, 10}, 
    Epilog -> {Red, AbsolutePointSize[6], Point[{x, f[x]} /. sol]}, 
    ImageSize -> 360]}]]

[plot of Cos[x] on [-10, 10] with the roots found by NSolve marked in red]

I adopted this idea from Bob Hanlon's answer to a similar question.

The NDSolve Method you linked in your answer is very different from the code you posted (you used NSolve). Otherwise this is a good answer :) – Sascha 9 hours ago

To see what is happening, implement a quick Newton iteration. For instance:

NewtonsMethodList[f_, {x_, x0_}, n_] := 
 NestList[# - Function[x, f][#]/Derivative[1][Function[x, f]][#] &, 
  x0, n]

Now see what happens with starting values 0.1 and 1., using, say, 5 iterations:

In[2]:= NewtonsMethodList[Cos[x], {x, 0.1}, 5]

Out[2]= {0.1, 10.0666, 11.4045, 10.9711, 10.9956, 10.9956}

In[5]:= NewtonsMethodList[Cos[x], {x, 1.}, 5]

Out[5]= {1., 1.64209, 1.57068, 1.5708, 1.5708, 1.5708}
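The same experiment translates directly to Python, for anyone who wants to reproduce it outside Mathematica (the helper name `newtons_method_list` simply mirrors the definition above):

```python
import math

def newtons_method_list(f, df, x0, n):
    """Return the list of Newton iterates [x0, x1, ..., xn]."""
    xs = [x0]
    for _ in range(n):
        x = xs[-1]
        xs.append(x - f(x) / df(x))
    return xs

print(newtons_method_list(math.cos, lambda x: -math.sin(x), 0.1, 5))
# starting at 0.1 the first tangent overshoots to ~10.07, then settles at ~10.9956
print(newtons_method_list(math.cos, lambda x: -math.sin(x), 1.0, 5))
# starting at 1.0 the iterates settle quickly at ~1.5708
```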

I hope this helps.


When in doubt as to whether FindRoot[] is functioning as expected for a given nonlinear problem, use the diagnostic capabilities of the options EvaluationMonitor and StepMonitor. Other answers have already explained why you should have expected this result, given that you started the iteration from a seed uncomfortably near an extremum. Thus, let me demonstrate the use of EvaluationMonitor:

Reap[FindRoot[Cos[x], {x, 0.1}, EvaluationMonitor :> Sow[x]]]
   {{x -> 10.9956}, {{0.1, 10.0666, 11.4045, 10.9711, 10.9956, 10.9956}}}

where we use Sow[]/Reap[] to get the values where Cos[x] was evaluated. We can also demonstrate the effect of damping, as shown by Michael:

Reap[FindRoot[Cos[x], {x, 0.1}, "DampingFactor" -> 1/5, EvaluationMonitor :> Sow[x]]]
// Short
   {{x -> 1.5708}, {{0.1, 2.09333, 1.97814, <<76>>, 1.5708, 1.5708, 1.5708}}}

where I have mercifully truncated the output, showing that damping gives better results at the cost of an increased number of iterations.

One could choose to use Brent's method instead by specifying explicit brackets. The convergence is not as fast as Newton-Raphson, but it is certainly much safer:

Reap[FindRoot[Cos[x], {x, 1., 2.}, EvaluationMonitor :> Sow[x]]]
   {{x -> 1.5708}, {{1., 2., 1.5649, 1.57098, 1.5708, 1.5708, 1.5708}}}
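Why bracketing is safe can be seen with even the plainest bracketing method, bisection, sketched here in Python. (Brent's method combines bisection with faster interpolation steps; this simplified version is only an illustration, not what FindRoot actually runs.)

```python
import math

def bisect(f, a, b, tol=1e-12):
    """Bisection: the iterate can never escape the initial bracket [a, b]."""
    fa, fb = f(a), f(b)
    assert fa * fb < 0, "endpoints must bracket a sign change"
    while b - a > tol:
        m = (a + b) / 2
        if fa * f(m) <= 0:
            b = m           # sign change in the left half
        else:
            a, fa = m, f(m)  # sign change in the right half
    return (a + b) / 2

print(bisect(math.cos, 1.0, 2.0))  # ~1.5708, i.e. pi/2
```

Because each step halves the bracket, the answer is guaranteed to lie between the two starting guesses, at the cost of slower (linear) convergence.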

If, like me, you like pictures to help with diagnostics, there is a function called FindRootPlot[] (more information here) that can be used:

Needs["Optimization`UnconstrainedProblems`"]
FindRootPlot[Cos[x], {x, 0.1}] // Last

FindRootPlot[] result

FindRootPlot[Cos[x], {x, 0.1}, "DampingFactor" -> 0.2] // Last

FindRootPlot[] with "DampingFactor" result

