We know that if we iterate arithmetic and harmonic means of two numbers, we get their geometric mean. So, basically if we need to compute the square root of $x$: $$\sqrt{x}=\sqrt{1 \cdot x}=AHM(1,x)$$ $$a_0=1,~~~~b_0=x$$ $$a_{n+1}=\frac{a_n+b_n}{2},~~~~~b_{n+1}=\frac{2a_nb_n}{a_n+b_n}=\frac{a_nb_n}{a_{n+1}}$$ As far as I know, this expression will converge for any real positive $x$. See this and this question for […]
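The iteration above is easy to sketch in a few lines; this is a minimal illustration (the function name and tolerance are my own choices), not the asker's code. Note that $a_{n+1}b_{n+1} = \frac{a_n+b_n}{2}\cdot\frac{2a_nb_n}{a_n+b_n} = a_nb_n = x$ is invariant, so $b_n = x/a_n$ and the $a$-update reduces to the Babylonian/Newton iteration $a_{n+1} = \frac12\left(a_n + \frac{x}{a_n}\right)$, which explains the quadratic convergence.

```python
def sqrt_ahm(x, tol=1e-15):
    """Approximate sqrt(x) by iterating the arithmetic and harmonic
    means of a_0 = 1 and b_0 = x; both sequences converge to sqrt(x)."""
    a, b = 1.0, float(x)
    while abs(a - b) > tol * max(a, b):
        # a_{n+1} and b_{n+1} both use the previous pair (a_n, b_n).
        a, b = (a + b) / 2, 2 * a * b / (a + b)
    return (a + b) / 2
```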

tl;dr: When is gradient descent with exact line search preferred over Newton’s method? I simply don’t understand why exact line search is ever useful, and here’s my reasoning. Let’s say I have a function $f$ that I want to minimize. Gradient descent performs the updates: $$\vec{x} \gets \vec{x} - t \vec{\nabla}f(\vec{x})$$ The optimal step size […]
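For a quadratic $f(\vec{x})=\tfrac12 \vec{x}^\top A \vec{x} - \vec{b}^\top \vec{x}$ the exact line search has a closed form $t = \frac{\vec{g}^\top \vec{g}}{\vec{g}^\top A \vec{g}}$ for gradient $\vec{g} = A\vec{x}-\vec{b}$, which makes the comparison concrete. A hedged sketch (the function name and test matrix are my own, not from the question):

```python
import numpy as np

def gd_exact_line_search(A, b, x0, iters=100):
    """Gradient descent on f(x) = 0.5 x^T A x - b^T x with exact line
    search; for this quadratic the optimal step is t = (g.g)/(g.A g)."""
    x = np.array(x0, dtype=float)
    for _ in range(iters):
        g = A @ x - b          # gradient of the quadratic
        if g @ g == 0:
            break              # already at the minimizer A x = b
        t = (g @ g) / (g @ A @ g)
        x = x - t * g
    return x
```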

Is there an analytical way to determine an interval on which every starting point for Newton-Raphson converges (or diverges)? I am aware that Newton-Raphson is a special case of fixed-point iteration, where: $$ g(x) = x - \frac{f(x)}{f'(x)} $$ Also I’ve read that if $|f(x)\cdot f''(x)|/|f'(x)^2| \lt 1$, then convergence is assured. I am just […]
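One way to use that sufficient condition in practice is to sample it over an interval; a minimal sketch (the helper name and the example $f(x)=x^2-2$ are my own choices):

```python
def newton_contraction_ratio(f, df, d2f, x):
    """The quantity |f(x) f''(x)| / f'(x)^2; where it stays below 1 on
    an interval around the root, the Newton map g(x) = x - f(x)/f'(x)
    is a contraction and the iteration converges."""
    return abs(f(x) * d2f(x)) / df(x) ** 2

# Example: f(x) = x^2 - 2, so f'(x) = 2x and f''(x) = 2.
ratios = [newton_contraction_ratio(lambda x: x * x - 2,
                                   lambda x: 2 * x,
                                   lambda x: 2.0, x)
          for x in (1.0, 1.2, 1.4, 1.6)]
```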

Before I am told, I want to clarify that I searched first, and I don’t believe this to be a repost. I understand the formula in terms of how to apply it, and I’ve seen graphical representations and everything. I get that we are finding where the tangent line has a root, then choosing a […]

I have a computer program to calculate APR using Newton-Raphson. I imagine most mathletes can code, so I don’t imagine the coding will be an issue. The solution is based on this initial formula $$\text{PMT}_{\text{month}}= \text{loan} \times \text{rate}\times \frac{(1+\text{rate})^{\text{#PMTs}}}{(1+ \text{rate})^{\text{#PMTs}}-1}$$ The formula and its derivative for the solution are as below with $P$, the principal/loan […]
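For what it's worth, the Newton-Raphson solve for the monthly rate can be sketched directly from that formula; the function name, starting guess, and tolerance below are my own assumptions, and the derivative is just $\frac{d}{dr}\big[L\,r\,q/(q-1)\big] = L\,\big(q(q-1) - r\,q'\big)/(q-1)^2$ with $q=(1+r)^n$:

```python
def monthly_rate(loan, pmt, n, r0=0.05, tol=1e-12, max_iter=100):
    """Solve PMT = loan * r * (1+r)^n / ((1+r)^n - 1) for the monthly
    rate r by Newton-Raphson, starting from the guess r0."""
    r = r0
    for _ in range(max_iter):
        q = (1 + r) ** n
        f = loan * r * q / (q - 1) - pmt       # residual
        if abs(f) < tol:
            break
        dq = n * (1 + r) ** (n - 1)            # q'(r)
        df = loan * (q * (q - 1) - r * dq) / (q - 1) ** 2
        r -= f / df
    return r
```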

I know that Newton’s method approximately doubles the number of the correct digits on each step, but I noticed that it also doubles the number of terms in the continued fraction, at least for square roots. Explanation. If we start Newton’s iterations with some partial convergent of the simple continued fraction for the square root, […]
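The doubling is easy to check with exact rational arithmetic; a small experiment (the helper names are mine) for $\sqrt 2$, whose continued fraction is $[1; 2, 2, 2, \dots]$:

```python
from fractions import Fraction

def cf_terms(x):
    """Continued-fraction expansion of a positive rational x."""
    terms = []
    while True:
        a = x.numerator // x.denominator
        terms.append(a)
        if x == a:
            return terms
        x = 1 / (x - a)

def newton_step(x):
    # One Newton iteration for f(x) = x^2 - 2.
    return x - (x * x - 2) / (2 * x)

x = Fraction(3, 2)    # the convergent [1; 2] of sqrt(2): 2 terms
y = newton_step(x)    # 17/12  = [1; 2, 2, 2]: 4 terms
z = newton_step(y)    # 577/408 = [1; 2, 2, 2, 2, 2, 2, 2]: 8 terms
```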

The question is to find $x$ in: \begin{equation*} x^2=e^x \end{equation*} I know Newton’s method and hence could find the approximation $x\approx -0.7034674225$ from \begin{equation*} x_{n+1} = x_n - \dfrac{x_n^2-e^{x_n}}{2x_n-e^{x_n}} \end{equation*} According to WolframAlpha, $x=-2W(\dfrac{1}{2})$, which shows that it can be solved using the Lambert W function… Can anyone tell me how to […]
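On the Lambert W connection: on the negative root, $x^2 = e^x$ gives $e^{x/2} = -x$, and substituting $u = -x/2$ yields $u e^u = \tfrac12$, i.e. $u = W(\tfrac12)$ and $x = -2W(\tfrac12)$. A quick numeric check (helper names are mine; $W(\tfrac12)$ is itself computed by Newton rather than a library call):

```python
import math

def newton(f, df, x0, iters=50):
    """Plain Newton iteration x <- x - f(x)/f'(x)."""
    x = x0
    for _ in range(iters):
        x = x - f(x) / df(x)
    return x

# Root of x^2 = e^x via the iteration from the question.
root = newton(lambda x: x * x - math.exp(x),
              lambda x: 2 * x - math.exp(x), x0=-1.0)

# W(1/2) solves w * e^w = 1/2; then x = -2 W(1/2).
w = newton(lambda w: w * math.exp(w) - 0.5,
           lambda w: (w + 1) * math.exp(w), x0=0.5)
```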

I’ve recently discovered that modifying the standard Newton-Raphson iteration by “squashing” $\frac{f (t)}{\dot{f} (t)}$ with the hyperbolic tangent function so that the iteration function is $$N_f (t) = t – \tanh \left( \frac{f (t)}{\dot{f} (t)} \right)$$ results in much larger (Fatou) convergence domains, also known as (immediate) basins of attraction. The convergence theorems of the […]
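A minimal sketch of the squashed iteration (the function name and the arctan test case are my own, not the poster's): for $f=\arctan$, undamped Newton diverges from $t_0 = 5$, while the $\tanh$-damped map, whose step length is capped at 1, walks back into the basin and then converges rapidly.

```python
import math

def newton_tanh(f, df, t0, iters=100):
    """Iterate N_f(t) = t - tanh(f(t)/f'(t)). Near a simple root the
    ratio f/f' is small and tanh(u) ~ u, so the map behaves like plain
    Newton; far from the root the step is bounded by 1."""
    t = t0
    for _ in range(iters):
        t -= math.tanh(f(t) / df(t))
    return t

# Plain Newton for arctan diverges from 5; the damped map reaches 0.
root = newton_tanh(math.atan, lambda t: 1 / (1 + t * t), 5.0)
```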
