Can anyone tell me why L’Hospital’s rule works for evaluating limits of the form $\frac{0}{0}$ and $\frac{\infty}{\infty}$?

What I understand about limits is that when you divide a *really* small quantity (one that is $\to 0$) by another really small quantity, you can get a finite value that is not necessarily small.

So how does differentiating the numerator and denominator help us find the limit of the function?


The answer given by Jackson Walters gives a proof of why the rule works, but if you are looking for an answer that gives intuition, consider this:

Consider the curve in the plane whose $x$-coordinate is given by $g(t)$ and whose $y$-coordinate is given by $f(t)$, i.e. $$\large t\mapsto [g(t),f(t)]. $$ Suppose $f(c) = g(c) = 0$. The limit of the ratio $\large \frac {f(t)}{g(t)}$ as $t \to c$ is the slope of the tangent to the curve at the point $[0, 0]$. The tangent vector to the curve at parameter $t$ is $[g'(t), f'(t)]$. L’Hôpital’s rule then states that the slope of the tangent at $[0,0]$ is the limit of the slopes of the tangents at points approaching it.
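As a quick numerical sanity check of this picture, here is a short sketch with a hypothetical choice of functions (my own example, not from the answer): $f(t) = \sin t$ and $g(t) = e^t - 1$, both zero at $c = 0$, where the slope of the tangent is $f'(0)/g'(0) = 1/1 = 1$.

```python
import math

# Hypothetical example: f(t) = sin(t), g(t) = e^t - 1, both zero at t = 0.
f = math.sin
g = lambda t: math.exp(t) - 1

# As t approaches 0, the ratio f(t)/g(t) approaches f'(0)/g'(0) = 1,
# the slope of the tangent to the curve t -> [g(t), f(t)] at the origin.
ratios = [f(t) / g(t) for t in (0.1, 0.01, 0.001)]
print(ratios)  # each value is closer to 1 than the one before it
```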

Points to assume (credits: thanks to Hans Lundmark for pointing out what I missed, and to Srivatsan for improving my formatting).

Assume that the functions $f$ and $g$ have well-defined Taylor expansions at $a$.

*Proof:*

Another way you can think of this is to use the idea of the derivative: a function $f(x)$ is differentiable at $x=a$ if $f(x)$ is very close to its tangent line $y = f'(a) \cdot (x-a) + f(a)$ near $x = a$. Specifically,

$$f(x) = f(a) + f'(a) \cdot (x-a) + E_{1}(x)$$

where $E_{1}(x)$ is an error term which goes to $0$ as $x$ goes to $a$. In fact, $E_{1}(x)$ must approach $0$ so fast that

$$\lim_{x\to a}\frac{E_1(x)}{x-a}=0,$$

because $\dfrac{E_{1}(x)}{x-a} = \dfrac{ f(x)-f(a) }{x-a} - f'(a)$, and we know from the definition of the derivative that this quantity has the limit $0$ at $a$.

Similarly, if $g$ is differentiable at $x = a$,

$$g(x) = g(a) + g'(a) \cdot (x-a) + E_{2}(x)$$

where $E_{2}(x)$ is another error term which goes to $0$ as $x \to a$. If you’re computing the limit of $f(x)/g(x)$ as $x \to a$ and $g(a)$ is not equal to $0$, then as $x \to a$ the numerator becomes indistinguishable from $f(a)$ and the denominator from $g(a)$, so the limit is

$$\lim_{x \to a} \frac{f(x)}{g(x)}=\frac{f(a)}{g(a)} .$$

If both $f(a)$ and $g(a)$ are $0$, then we must use the tangent approximations to say that

$$\frac{f(x)}{g(x)} = \frac{f(a) + f'(a) \cdot (x-a) + E_{1}(x) }{ g(a) + g'(a) \cdot (x-a) + E_{2}(x) }$$

$$=\frac{f'(a) \cdot (x-a) + E_{1}(x)}{g'(a) \cdot (x-a) + E_{2}(x) }$$

$$ =\frac{f'(a) + [E_{1}(x)/(x-a)] }{g'(a) + [E_{2}(x)/(x-a)]}$$

and we have seen that the error terms become negligible as $x\to a$. In other words, when both function values approach $0$ as $x\to a$, the ratio of the function values reduces to the ratio of the slopes of the tangents, because both functions are very close to their tangent lines.
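The error-term argument can be checked numerically. Here is a short sketch with a hypothetical choice of functions (my own example): $f(x) = \sin x$ and $g(x) = x^2 + x$ with $a = 0$, so $f(a) = g(a) = 0$ and $f'(a) = g'(a) = 1$.

```python
import math

# Hypothetical check of the error-term argument: f(x) = sin(x),
# g(x) = x**2 + x, a = 0, so f(a) = g(a) = 0 and f'(a) = g'(a) = 1.
a = 0.0
f, df = math.sin, math.cos(a)        # f'(0) = 1
g, dg = (lambda x: x**2 + x), 1.0    # g'(0) = 1

for x in (0.1, 0.01, 0.001):
    # Error terms from the tangent-line approximations at a.
    E1 = f(x) - (f(a) + df * (x - a))
    E2 = g(x) - (g(a) + dg * (x - a))
    # E1/(x-a) and E2/(x-a) shrink, so f/g approaches f'(a)/g'(a) = 1.
    print(x, E1 / (x - a), E2 / (x - a), f(x) / g(x))
```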

I hope this helps. Thanks a lot. Iyengar.

This is far from rigorous, but the way I like to *think about* L’Hospital’s Rule is this:

If I have a fraction whose numerator and denominator are both going to, say, infinity, then I can’t say much about the limit of the fraction. The limit could be anything.

It’s possible, though, that the numerator goes *slowly* to infinity and the denominator goes *quickly* to infinity. That would be good information to know, because then I would know that the denominator’s behavior is the one that really swings the limit of the fraction overall.

So, how can I get information about the *rate of change* of a function? This is precisely the kind of thing a derivative can tell you. Thus, instead of comparing the numerator and denominator directly, I can compare the *rate of change* (i.e. the derivative) of the numerator to the *rate of change* (i.e. the derivative) of the denominator to determine the limit of the fraction overall. This is L’Hospital’s Rule.
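This rates-of-change intuition is easy to see numerically. A hypothetical example of my own: in the $\frac{\infty}{\infty}$ case $x/e^x$, the numerator grows slowly and the denominator quickly, and the ratio of derivatives $1/e^x$ tells the same story.

```python
import math

# Hypothetical example for the infinity/infinity case: in x / e^x the
# numerator grows slowly while the denominator grows quickly, so the
# denominator dominates and the ratio tends to 0. The ratio of the
# rates of change, 1 / e^x, shrinks to 0 as well.
for x in (10, 20, 40):
    print(x, x / math.exp(x), 1 / math.exp(x))
```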

For the case $\frac{0}{0}$, we need only use the definition of the derivative in terms of the difference quotient. Suppose $f,g:\mathbb{R} \rightarrow \mathbb{R}$, $f(a)=g(a)=0$, $f$ and $g$ are continuously differentiable, and $g'(a) \ne 0$. Then

$$\begin{eqnarray*} \lim_{x \rightarrow a}\frac{f(x)}{g(x)} &=& \lim_{x \rightarrow a}\frac{f(x)-0}{g(x)-0} \\
&=&\lim_{x \rightarrow a}\frac{f(x)-f(a)}{g(x)-g(a)}\\
&=&\lim_{x \rightarrow a}\frac{\frac{f(x)-f(a)}{x-a}}{\frac{g(x)-g(a)}{x-a}}\\
&=&\frac{\lim_{x \rightarrow a}\frac{f(x)-f(a)}{x-a}}{\lim_{x \rightarrow a}\frac{g(x)-g(a)}{x-a}}\\
&=&\frac{f'(a)}{g'(a)} \end{eqnarray*}$$
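The chain of equalities can be verified numerically; here is a short sketch with a hypothetical choice of $f$ and $g$ (not from the answer): $f(x) = x^2 - 1$ and $g(x) = x - 1$ with $a = 1$, so $f(a) = g(a) = 0$ and $f'(a)/g'(a) = 2/1 = 2$.

```python
# Hypothetical numerical check of the derivation above, with
# f(x) = x**2 - 1, g(x) = x - 1, a = 1, so f(a) = g(a) = 0
# and f'(a)/g'(a) = 2/1 = 2.
f = lambda x: x**2 - 1
g = lambda x: x - 1
a = 1.0

for h in (0.1, 0.01, 0.001):
    x = a + h
    # Ratio of difference quotients, as in the third line of the derivation.
    quotient = ((f(x) - f(a)) / (x - a)) / ((g(x) - g(a)) / (x - a))
    print(x, f(x) / g(x), quotient)  # both columns approach 2
```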

[This blog post](http://csclub.uwaterloo.ca/~jy2wong/jenerated/blog/2012-10-07.lhopitals_rule.html) is interesting.

At the heart of it, though, L’Hopital’s rule is a marriage of two ideas: that differentiable functions are very close to their linear approximations at a point, as long as you don’t stray too far from that point, and that for a continuous function, a small movement in the domain means a small movement in the value of the function.
