Articles on Newton-Raphson

Computing square roots with the arithmetic-harmonic mean

We know that if we iterate arithmetic and harmonic means of two numbers, we get their geometric mean. So, basically if we need to compute the square root of $x$: $$\sqrt{x}=\sqrt{1 \cdot x}=AHM(1,x)$$ $$a_0=1,~~~~b_0=x$$ $$a_{n+1}=\frac{a_n+b_n}{2},~~~~~b_{n+1}=\frac{2a_nb_n}{a_n+b_n}=\frac{a_nb_n}{a_{n+1}}$$ As far as I know, this expression will converge for any real positive $x$. See this and this question for […]
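The iteration above takes only a few lines of Python; a minimal sketch (the function name, tolerance, and floating-point representation are my own choices):

```python
def ahm_sqrt(x, tol=1e-12):
    """Approximate sqrt(x) by iterating the arithmetic and harmonic means of 1 and x."""
    a, b = 1.0, float(x)
    while abs(a - b) > tol:
        # a_{n+1} = (a_n + b_n)/2,  b_{n+1} = 2 a_n b_n / (a_n + b_n)
        a, b = (a + b) / 2, 2 * a * b / (a + b)
    return a

print(ahm_sqrt(2))  # ~1.4142135623730951
```

Note that the product $a_n b_n$ is invariant under this update, which is why both sequences converge to $\sqrt{a_0 b_0} = \sqrt{x}$.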

Newton's method vs. gradient descent with exact line search

tl;dr: When is gradient descent with exact line search preferred over Newton’s method? I simply don’t understand why exact line search is ever useful, and here’s my reasoning. Let’s say I have a function $f$ that I want to minimize. Gradient descent performs the updates: $$\vec{x} \gets \vec{x} - t \vec{\nabla}f(\vec{x})$$ The optimal step size […]
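On a strictly convex quadratic the comparison can be made concrete: exact line search has a closed-form step $t = \vec{g}^{\,T}\vec{g} / (\vec{g}^{\,T}A\vec{g})$, while Newton's method reaches the minimizer in one step. A sketch (the matrix `A` and vector `b` are arbitrary illustrative choices):

```python
import numpy as np

# Illustrative quadratic f(x) = 0.5 x^T A x - b^T x with A symmetric positive definite
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])

def grad(x):
    return A @ x - b

x = np.zeros(2)
for _ in range(50):
    g = grad(x)
    if g @ g == 0:
        break
    t = (g @ g) / (g @ A @ g)   # exact line search step for a quadratic
    x = x - t * g

# Newton's method on a quadratic: a single solve from any starting point
x_newton = np.linalg.solve(A, b)
print(np.allclose(x, x_newton))  # True
```

Gradient descent with exact line search still converges only linearly here (at a rate governed by the condition number of `A`), which is the usual argument for Newton's method when second derivatives are affordable.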

When does Newton-Raphson Converge/Diverge?

Is there an analytical way to find an interval on which Newton-Raphson is guaranteed to converge (or diverge) for every starting point? I am aware that Newton-Raphson is a special case of fixed-point iteration, where: $$ g(x) = x - \frac{f(x)}{f'(x)} $$ Also I’ve read that if $|f(x)\cdot f''(x)|/|f'(x)^2| \lt 1$, then convergence is assured. I am just […]
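The quoted sufficient condition can be checked numerically over candidate starting points; a sketch for the illustrative choice $f(x) = x^2 - 2$ (the grid of test points is mine, and note the condition is sufficient for convergence, not necessary):

```python
# Check the sufficient condition |f(x) f''(x)| / f'(x)^2 < 1 at candidate points,
# here for f(x) = x^2 - 2 (root sqrt(2) ~ 1.414).
f   = lambda x: x * x - 2
fp  = lambda x: 2 * x
fpp = lambda x: 2.0

def condition(x):
    return abs(f(x) * fpp(x)) / fp(x) ** 2

for x0 in (0.5, 1.0, 1.5, 2.0):
    print(x0, condition(x0), condition(x0) < 1)
```

For this $f$, the condition holds at $x_0 = 1$ but fails at $x_0 = 0.5$, even though Newton's method actually converges from both; failing the test does not prove divergence.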

How does Newton's Method work?

Before I am told, I want to clarify that I searched first, and I don’t believe this to be a repost. I understand the formula in terms of how to apply it, and I’ve seen graphical representations and everything. I get that we are finding where the tangent line has a root, then choosing a […]
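The tangent-line picture described above translates directly into code: each step solves the linear (tangent) model $f(x_n) + f'(x_n)(x - x_n) = 0$ for $x$. A minimal sketch (the function names and the example $f(x) = x^2 - 2$ are my own choices):

```python
def newton(f, fprime, x0, steps=6):
    """Repeatedly jump to the root of the tangent line: x <- x - f(x)/f'(x)."""
    x = x0
    for _ in range(steps):
        x = x - f(x) / fprime(x)
    return x

# Example: the positive root of f(x) = x^2 - 2, starting from x0 = 1
root = newton(lambda x: x * x - 2, lambda x: 2 * x, 1.0)
print(root)  # ~1.4142135623730951
```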

How to calculate APR using Newton Raphson

I have a computer program to calculate APR using Newton-Raphson. I imagine most mathletes can code, so I don’t imagine the coding being an issue. The solution is based on this initial formula $$\text{PMT}_{\text{month}}= \text{loan} \times \text{rate}\times \frac{(1+\text{rate})^{\text{#PMTs}}}{(1+ \text{rate})^{\text{#PMTs}}-1}$$ The formula and its derivative for the solution are as below, with $P$, the principal/loan […]
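Since the excerpt's analytic derivative is truncated, here is a hedged sketch that solves the payment formula for the monthly rate with Newton-Raphson using a central-difference derivative instead; the function name, starting guess, and example figures are all illustrative:

```python
def monthly_rate(loan, pmt, n, r0=0.01, tol=1e-12):
    """Solve PMT = loan * r * (1+r)^n / ((1+r)^n - 1) for the monthly rate r."""
    f = lambda r: loan * r * (1 + r) ** n / ((1 + r) ** n - 1) - pmt
    r = r0
    for _ in range(100):
        h = 1e-7
        fp = (f(r + h) - f(r - h)) / (2 * h)  # central-difference derivative
        step = f(r) / fp
        r -= step
        if abs(step) < tol:
            break
    return r

# Example: a 10,000 loan repaid in 36 monthly payments of 322.67
r = monthly_rate(10000, 322.67, 36)
print(r, (1 + r) ** 12 - 1)  # monthly rate, then a compounded annual rate
```

How the monthly rate maps to the quoted APR depends on jurisdiction (simple multiplication by 12 versus compounding), so that last conversion is only one possibility.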

Newton's method for square roots 'jumps' through the continued fraction convergents

I know that Newton’s method approximately doubles the number of the correct digits on each step, but I noticed that it also doubles the number of terms in the continued fraction, at least for square roots. Explanation. If we start Newton’s iterations with some partial convergent of the simple continued fraction for the square root, […]
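The doubling of continued-fraction terms can be observed directly with exact rational arithmetic; a sketch for $\sqrt{2}$, starting from the convergent $3/2 = [1;2]$ (the helper name `cf_terms` is mine):

```python
from fractions import Fraction

def cf_terms(x):
    """Simple continued fraction expansion of a positive rational number."""
    terms = []
    while True:
        a = x.numerator // x.denominator
        terms.append(a)
        x = x - a
        if x == 0:
            return terms
        x = 1 / x

# Newton step for f(x) = x^2 - 2:  x -> (x + 2/x) / 2
x = Fraction(3, 2)  # the convergent [1; 2] of sqrt(2)
for _ in range(3):
    print(cf_terms(x))
    x = (x + 2 / x) / 2
```

Starting from $3/2$, the iterates are $17/12$ and $577/408$, whose expansions $[1;2,2,2]$ and $[1;2,2,2,2,2,2,2]$ have 4 and 8 terms: each Newton step lands exactly on a later convergent with twice as many terms.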

How to solve $x^2 = e^x$

The question is to find $x$ in: \begin{equation*} x^2=e^x \end{equation*} I know Newton’s method and hence could find the approximation $x\approx -0.7034674225$ from \begin{equation*} x_{n+1} = x_n - \dfrac{x_n^2-e^{x_n}}{2x_n-e^{x_n}} \end{equation*} WolframAlpha also says that $x=-2W\left(\dfrac{1}{2}\right)$, which shows that it can be solved using the Lambert-W function… Can anyone tell me how to […]
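Both claims can be checked numerically: run the question's Newton iteration for $x^2 = e^x$, then independently solve $w e^w = \tfrac{1}{2}$ (the defining equation of $W(\tfrac12)$) by Newton and compare against $-2w$. The starting points below are my own choices:

```python
import math

# The question's Newton iteration for f(x) = x^2 - e^x
x = 0.0
for _ in range(50):
    x = x - (x * x - math.exp(x)) / (2 * x - math.exp(x))

# Independent check: solve w * e^w = 1/2 by Newton, so w = W(1/2)
w = 0.5
for _ in range(50):
    w = w - (w * math.exp(w) - 0.5) / ((w + 1) * math.exp(w))

print(x, -2 * w)  # both ~ -0.7034674225
```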

Does this Newton-like iterative root-finding method based on the hyperbolic tangent function have a name?

I’ve recently discovered that modifying the standard Newton-Raphson iteration by “squashing” $\frac{f (t)}{\dot{f} (t)}$ with the hyperbolic tangent function so that the iteration function is $$N_f (t) = t – \tanh \left( \frac{f (t)}{\dot{f} (t)} \right)$$ results in much larger (Fatou) convergence domains, also known as (immediate) basins of attraction. The convergence theorems of the […]
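A sketch of the damped iteration, tried on $f(x) = \arctan x$, a standard example where plain Newton diverges from $x_0 = 2$ but the tanh-squashed step, which is bounded by 1 in magnitude, still converges (the function name and step count are my own choices):

```python
import math

def tanh_newton(f, fprime, x0, steps=100):
    """Newton-like iteration with the step f(x)/f'(x) squashed through tanh."""
    x = x0
    for _ in range(steps):
        x = x - math.tanh(f(x) / fprime(x))
    return x

# Plain Newton on atan diverges from x0 = 2 (the full step overshoots badly);
# the damped step never exceeds 1, so the iterate cannot escape to infinity.
root = tanh_newton(math.atan, lambda x: 1 / (1 + x * x), 2.0)
print(root)  # ~0.0
```

Near the root, $\tanh(u) \approx u$, so the damped iteration behaves like plain Newton once the step is small; the squashing only matters far from the root, which is consistent with the enlarged basins of attraction the question describes.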