Are polynomials infinitely many times differentiable?

If so, does it only mean that at some point we reach 0 and then we keep on getting 0?

Thank you!



$$\frac{d}{dx} 0 = 0,$$
so once you’ve taken $\deg(p) + 1$ derivatives, where $p$ is the polynomial you’re considering, you will just keep getting zero.
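As a quick symbolic check of this (a sketch using sympy, with a sample polynomial of my own choosing), differentiating a degree-3 polynomial four times yields the zero polynomial, and differentiating again keeps giving zero:

```python
import sympy as sp

x = sp.symbols('x')
p = 5*x**3 - 2*x**2 + 7*x - 1   # an example polynomial, deg(p) = 3

d = p
for k in range(1, 5):
    d = sp.diff(d, x)
    print(k, d)                 # degrees 2, 1, 0, then the zero polynomial
print(sp.diff(d, x))            # differentiating 0 just gives 0 again
```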

The other answers have probably answered your question, but in case you want to learn more, the key term to look up is smoothness of a function.

So another, fancier way of asking your question is whether or not polynomials are smooth functions.

To be a little more formal, I would consider the standard definitions of “derivative” and “differentiable”.

The derivative is typically defined for a function on an interval, at a point of that interval.

Let $I \subseteq \Bbb R$ be an interval, let $f:I\to \Bbb R$, and let $c \in I$. We say $L \in \Bbb R$ is the derivative of $f$ at $c$ if the following conditions hold.

In compact form, using limit notation, it is given by

$L=\lim_{x\to c}\frac{f(x)-f(c)}{x-c}$

For every $\epsilon \gt 0$ there exists $\delta(\epsilon) \gt 0$ with the following property.

If $x \in I$ and $0<|x-c|< \delta(\epsilon)$ then $|\frac{f(x)-f(c)}{x-c}-L|<\epsilon$.

Differentiable simply means that the derivative exists at a point $c$.
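As a quick numerical illustration of the definition (my own example, not from the text): for $f(x)=x^2$ at $c=3$ the difference quotient simplifies to $x+3$, so $L=6$ and $\delta(\epsilon)=\epsilon$ is a valid choice, since $|(x+3)-6| = |x-3| < \delta = \epsilon$.

```python
# Numeric check of the epsilon-delta definition, assuming f(x) = x**2,
# c = 3, L = 6 (a hypothetical example). Here (f(x)-f(c))/(x-c) = x + 3,
# so |quotient - L| = |x - 3|, and delta(eps) = eps works.
def f(x):
    return x * x

c, L = 3.0, 6.0

def delta(eps):
    return eps  # valid for this particular f and c

for eps in (1.0, 0.1, 0.001):
    d = delta(eps)
    # sample some x with 0 < |x - c| < delta(eps)
    for x in (c - 0.9 * d, c + 0.5 * d, c + 0.99 * d):
        q = (f(x) - f(c)) / (x - c)
        assert abs(q - L) < eps, (eps, x, q)
print("epsilon-delta check passed for f(x) = x**2 at c = 3")
```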

So for the case of polynomials from $\Bbb R$ to $\Bbb R$, we take the interval $I$ to be all of $\Bbb R$. What I typically do is consider the two inequalities

$1)$ $|x-c|<\delta(\epsilon)$

$2)$ $|\frac{f(x)-f(c)}{x-c}-L|<\epsilon$

as a system of inequalities. However, this is stronger than necessary: solving for the variables simultaneously gives $1 \iff 2$, while the definition only requires $1 \Rightarrow 2$. You can investigate that on your own. First consider $f(x) = x$; then inequalities $1$ and $2$ become

$1)$ $|x-c|<\delta(\epsilon)$

$2)$ $\left|\frac{x-c}{x-c}-L\right| = |1-L|<\epsilon$
In this case $\delta$ does not genuinely depend on $\epsilon$: with $L = 1$, inequality $2$ reads $|1-1| = 0 < \epsilon$, which holds for every $x \neq c$ and every $\epsilon$, so any $\delta(\epsilon) > 0$ satisfies $1 \Rightarrow 2$. Normally inequality $2$ constrains the admissible $x$ values, and a more carefully chosen $\delta$ enforces that constraint so that $1 \Rightarrow 2$, which is what is actually needed. So we take $L=1$.

axiom 1: $\frac{d}{dx}x|_{x=c}=1$, as you may recall.
Next consider $f(x)=a$ for any $a \in \Bbb R$. Inequalities $1$, $2$ become

$1)$ $|x-c|<\delta(\epsilon)$

$2)$ $\left|\frac{a-a}{x-c}-L\right| = |0-L| = |L|<\epsilon$

Again any $\delta(\epsilon)>0$ works, and we take $L=0$.

axiom 2: $\frac{d}{dx}a|_{x=c}=0$ for every constant $a \in \Bbb R$.

Next consider $f(x)$ and $g(x)$, similarly defined as functions from the reals to the reals, and let $h(x)=f(x)+g(x)$. With $L=L_f+L_g$, inequality $2$ becomes

$\left|\frac{h(x)-h(c)}{x-c}-L\right|=\left|\frac{f(x)+g(x)-f(c)-g(c)}{x-c}-L\right|=\left|\frac{f(x)-f(c)}{x-c}-L_f+\frac{g(x)-g(c)}{x-c}-L_g\right|<\epsilon$

By the triangle inequality $|x+y|\le|x|+|y|$, the left-hand side satisfies

$\left|\frac{f(x)-f(c)}{x-c}-L_f+\frac{g(x)-g(c)}{x-c}-L_g\right|\le\left|\frac{f(x)-f(c)}{x-c}-L_f\right|+\left|\frac{g(x)-g(c)}{x-c}-L_g\right|$
But this is just a sum of two terms coming from differentiable functions: if $f$ and $g$ are differentiable at $c$, then $L_f$ and $L_g$ exist, and for any $\epsilon_f, \epsilon_g > 0$ there exist corresponding $\delta$'s such that

$|\frac{f(x)-f(c)}{x-c}-L_f|<\epsilon_f$, $|\frac{g(x)-g(c)}{x-c}-L_g|<\epsilon_g$

and we can choose $\epsilon_f = \epsilon_g = \epsilon/2$ so that $\epsilon_f+\epsilon_g=\epsilon$. So finally

axiom 3: $\frac{d}{dx}h(x)|_{x=c}=\frac{d}{dx}f(x)|_{x=c}+\frac{d}{dx}g(x)|_{x=c}$ if $h(x)=f(x)+g(x)$, where $f(x)$ and $g(x)$ are differentiable on the interval considered.

Finally, the product rule needs to be derived; however, as that is typically left as an exercise, I shall do the same.

product rule: if $f(x)$ and $g(x)$ are differentiable on some interval $I$ and $h(x)=f(x)\,g(x)$, then its derivative at any point $c \in I$ is $\frac{d}{dx}[f(x)]_{x=c}\,g(c)+f(c)\,\frac{d}{dx}[g(x)]_{x=c}$.
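The rule is easy to check symbolically (a sketch with sympy, using two sample functions of my own choosing rather than anything from the text):

```python
import sympy as sp

x = sp.symbols('x')
f = x**2 + 1   # example differentiable functions
g = 3*x - 5

lhs = sp.diff(f * g, x)                          # derivative of the product
rhs = sp.diff(f, x) * g + f * sp.diff(g, x)      # product rule expansion
assert sp.simplify(lhs - rhs) == 0
print(sp.expand(lhs))                            # 9*x**2 - 10*x + 3
```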

Now we can start to move toward the normal tools of calculus. For example, if we take $f(x)=g(x)=x$ (axiom 1) and apply the product rule, we get $1\cdot x+x\cdot 1=2x$. After a few iterations you can see the pattern $\frac{d}{dx}x^n|_{x=c}=nx^{n-1}$. Assume it is true for some $n \in \Bbb N$ and consider

$\frac{d}{dx}x^{n+1}|_{x=c}=\frac{d}{dx}\left[x\cdot x^n\right]_{x=c}$
By the product rule and our induction hypothesis, we have

$x^n+x\cdot nx^{n-1}=x^n+nx^n=(n+1)x^n=(n+1)x^{(n+1)-1}$

which is what we set out to show. All that is left is to evaluate the base case $n=0$, which you can do at this point; we have actually shown it true for $n=2$ already. I will also leave it to you to check that the coefficients can be viewed as constant functions, and to use the product rule again to handle any term of a polynomial.
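The induction proves the power rule for all $n$; the first several cases can be spot-checked symbolically (a sketch with sympy):

```python
import sympy as sp

x = sp.symbols('x')
# Verify d/dx x**n == n * x**(n-1) for the first several exponents.
# (n = 0 works too: the derivative of the constant 1 is 0.)
for n in range(0, 8):
    assert sp.diff(x**n, x) == n * x**(n - 1)
print("power rule verified for n = 0..7")
```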

I will not prove it, but any polynomial can be written in the form $\sum_{i=0}^n a_i x^i$. You can show, again by induction on $m \in \Bbb N$ with $m \le n$, that

$\frac{d}{dx}\sum_{i=0}^m a_ix^i|_{x=c}=\sum_{i=0}^m a_i\frac{d}{dx}x^i|_{x=c}=\sum_{i=0}^m ia_ix^{i-1}$

then proceed to show it holds for $m+1$.

Finally, to show that all derivatives of a polynomial beyond some finite order vanish, note that each application of $\frac{d}{dx}$ lowers the degree by one: the index $i$ moves into the coefficient as the factor in $ia_ix^{i-1}$, and the constant term ($i=0$) is killed outright. So for a polynomial $\sum_{i=0}^l a_ix^i$ there exists an $n$ (namely $l+1$) such that $\frac{d^n}{dx^n}\sum_{i=0}^l a_ix^i|_{x=c}=0$, and since we have already seen that the derivative of a constant, in particular $0$, is zero, every higher derivative is zero as well.
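This degree-lowering argument can be checked directly (a sketch with sympy, using a sample degree-4 polynomial of my own choosing):

```python
import sympy as sp

x = sp.symbols('x')
p = x**4 + 2*x**3 - 3*x**2 + 4   # an example polynomial
n = sp.Poly(p, x).degree()       # n = 4

d = p
for k in range(n + 1):
    d = sp.diff(d, x)            # each step lowers the degree by one
assert d == 0                    # the (n+1)-th derivative vanishes identically
print("derivative number", n + 1, "of a degree", n, "polynomial is 0")
```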