Are polynomials infinitely many times differentiable?

If so, does it only mean that at some point we reach 0 and then we keep on getting 0?

Thank you!


Yes.

$$\frac{d}{dx} 0 = 0,$$

so once you’ve taken $\deg(p) + 1$ derivatives, where $p$ is the polynomial you’re considering, you will just keep getting zero.
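This is easy to check mechanically. Here is a minimal sketch in Python (the `deriv` helper and the sample polynomial are my own illustration, not part of the answer): each differentiation drops one entry from the coefficient list, so after $\deg(p)+1$ steps only the zero polynomial remains.

```python
def deriv(coeffs):
    """Differentiate a polynomial given as [a0, a1, ..., an], where a_i multiplies x^i."""
    return [i * a for i, a in enumerate(coeffs)][1:]

# p(x) = 5 + 3x + 2x^3, degree 3
p = [5, 3, 0, 2]
for _ in range(4):  # deg(p) + 1 = 4 derivatives
    p = deriv(p)
print(p)            # [] : the zero polynomial
```

From then on, differentiating the zero polynomial just returns the zero polynomial again.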

The other answers have probably addressed your question, but in case you want to learn more, the key word to look up is *smoothness* (see, for instance, https://en.wikipedia.org/wiki/Smoothness).

So a fancier way of asking your question is whether polynomials are smooth functions.

To be only a little more formal, let us consider the standard definitions of “derivative” and “differentiable”.

The derivative is typically defined for a function on an interval, at a point of that interval.

**Derivative**

Let $I \subseteq \Bbb R$ be an interval, let $f:I\to \Bbb R$, and let $c \in I$. A number $L \in \Bbb R$ is the **derivative** of $f$ at $c$ if the following condition holds.

**Compact definition**: when the limit exists, $L$ is given by

$L=\lim_{x\to c}\frac{f(x)-f(c)}{x-c}$

**alternatively**

For every $\epsilon \gt 0$ there exists a $\delta(\epsilon) \gt 0$ such that the following holds.

If $x \in I$ and $0<|x-c|< \delta(\epsilon)$ then $|\frac{f(x)-f(c)}{x-c}-L|<\epsilon$.

**Differentiable** simply means that the derivative exists at a point $c$.

So for the case of polynomials from $\Bbb R$ to $\Bbb R$ we will take the interval $I$ to be all of $\Bbb R$. What I typically do is consider the two inequalities

$1)$ $|x-c|<\delta(\epsilon)$

$2)$ $|\frac{f(x)-f(c)}{x-c}-L|<\epsilon$

as a system of inequalities. However, this is stronger than necessary, because solving for the variables simultaneously establishes $1 \iff 2$, while the definition only requires $1 \Rightarrow 2$. You can investigate that on your own. First consider $f(x) = x$; then

$|\frac{x-c}{x-c}-L|=|1-L|<\epsilon$

$|x-c|<\delta(\epsilon)$

In this case inequality $2$ does not constrain $x$ at all: the difference quotient is identically $1$, so taking $L=1$ gives $|1-L|=0<\epsilon$ for every $\epsilon \gt 0$, and any choice of $\delta(\epsilon)\gt 0$ (say $\delta=1$) makes $1 \Rightarrow 2$ hold. Normally $2$ would constrain the admissible $x$ values, and a more carefully chosen $\delta$ would enforce that constraint so that $1 \Rightarrow 2$, which is what is actually needed. So we can take $L=1$.

**axiom 1**: $\frac{d}{dx}x|_{x=c}=1$, as you may recall.
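To illustrate numerically why inequality $2$ holds regardless of $\delta$ for $f(x)=x$ (a small sketch; the evaluation points below are arbitrary choices of mine):

```python
f = lambda x: x
c = 2.0
# for f(x) = x the difference quotient (f(x) - f(c)) / (x - c) is identically 1,
# so |quotient - L| = 0 < epsilon holds for L = 1 no matter which delta we pick
quotients = [(f(x) - f(c)) / (x - c) for x in (1.9, 2.5, -7.0, 100.0)]
print(quotients)  # [1.0, 1.0, 1.0, 1.0]
```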

Next consider $f(x)=a $ for any $a \in \Bbb R$. $1$, $2$ become

$|x-c|<\delta(\epsilon)$

$|\frac{a-a}{x-c}-L|=|L|<\epsilon$

Any $\delta(\epsilon) \gt 0$ works here, and $L=0$ gives $|L|=0<\epsilon$ for every $\epsilon \gt 0$.

**axiom 2**: $\frac{d}{dx}a|_{x=c}=0$ for every constant $a \in \Bbb R$.

Next consider $f(x)$, $g(x)$ defined similarly as functions from the reals to the reals, and $h(x)=f(x)+g(x)$. Inequalities $1$ and $2$ become

$|x-c|<\delta(\epsilon)$

$|\frac{h(x)-h(c)}{x-c}-L|=|\frac{f(x)+g(x)-f(c)-g(c)}{x-c}-L|=|\frac{f(x)-f(c)}{x-c}-L_f+\frac{g(x)-g(c)}{x-c}-L_g|<\epsilon$, where $L=L_f+L_g$

By the triangle inequality $|x+y|\le|x|+|y|$ which then implies

$|\frac{f(x)-f(c)}{x-c}-L_f+\frac{g(x)-g(c)}{x-c}-L_g|\le|\frac{f(x)-f(c)}{x-c}-L_f|+|\frac{g(x)-g(c)}{x-c}-L_g|$

But this is just the sum of two difference quotients. If $f$ and $g$ are differentiable at $c$, then $L_f$ and $L_g$ exist, and for any $\epsilon_f, \epsilon_g \gt 0$ there exist corresponding $\delta$'s such that

$|\frac{f(x)-f(c)}{x-c}-L_f|<\epsilon_f$, $|\frac{g(x)-g(c)}{x-c}-L_g|<\epsilon_g$

Choosing $\epsilon_f=\epsilon_g=\epsilon/2$, so that $\epsilon_f+\epsilon_g=\epsilon$, and taking the smaller of the two $\delta$'s, we finally get

**axiom 3** $\frac{d}{dx}h(x)|_{x=c}=\frac{d}{dx}f(x)|_{x=c}+\frac{d}{dx}g(x)|_{x=c}$ if $h(x)=f(x)+g(x)$ where $f(x)$ and $g(x)$ are two differentiable functions on the interval considered.
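Axiom 3 can be sanity-checked on polynomials represented as coefficient lists (the `deriv` and `padd` helpers are my own illustration, assuming $a_i$ multiplies $x^i$):

```python
def deriv(coeffs):
    """Derivative of a polynomial [a0, a1, ..., an], where a_i multiplies x^i."""
    return [i * a for i, a in enumerate(coeffs)][1:]

def padd(p, q):
    """Coefficient-wise sum of two polynomials."""
    n = max(len(p), len(q))
    p, q = p + [0] * (n - len(p)), q + [0] * (n - len(q))
    return [a + b for a, b in zip(p, q)]

f = [1, 0, 4]     # f(x) = 1 + 4x^2
g = [0, 2, 0, 1]  # g(x) = 2x + x^3
# derivative of the sum equals the sum of the derivatives
print(deriv(padd(f, g)))         # [2, 8, 3]
print(padd(deriv(f), deriv(g)))  # [2, 8, 3]
```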

Finally, the product rule needs to be derived; however, as that is typically left as an exercise, I shall do the same.

**product rule**: if $f(x)$ and $g(x)$ are two differentiable functions on some interval $I$ and $h(x)=f(x)\,g(x)$, then its derivative at any point $c \in I$ is $\frac{d}{dx}[f(x)]|_{x=c}\,g(c)+f(c)\,\frac{d}{dx}[g(x)]|_{x=c}$.
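A quick sanity check of the product rule on coefficient lists (all helpers here are my own illustration, with $a_i$ multiplying $x^i$):

```python
def deriv(coeffs):
    """Derivative of a polynomial [a0, a1, ..., an]."""
    return [i * a for i, a in enumerate(coeffs)][1:]

def padd(p, q):
    """Coefficient-wise sum of two polynomials."""
    n = max(len(p), len(q))
    p, q = p + [0] * (n - len(p)), q + [0] * (n - len(q))
    return [a + b for a, b in zip(p, q)]

def pmul(p, q):
    """Product of two polynomials (convolution of coefficient lists)."""
    out = [0] * (len(p) + len(q) - 1) if p and q else []
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            out[i + j] += a * b
    return out

f = [1, 1]     # f(x) = 1 + x
g = [0, 0, 3]  # g(x) = 3x^2
lhs = deriv(pmul(f, g))                           # (fg)'
rhs = padd(pmul(deriv(f), g), pmul(f, deriv(g)))  # f'g + fg'
print(lhs, rhs)  # both [0, 6, 9], i.e. 6x + 9x^2
```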

Now we can start to move to the normal tools of calculus. For example, if we take $f(x)=x$ and $g(x)=x$ as in axiom 1 and apply the product rule, we get $\frac{d}{dx}x^2|_{x=c}=1\cdot x+x\cdot 1=2x$. After a few iterations you can see the pattern $\frac{d}{dx}x^n|_{x=c}=nx^{n-1}$. Assume it is true for some $n \in \Bbb N$ and consider

$\frac{d}{dx}x^{n+1}|_{x=c}=x^n\frac{d}{dx}x|_{x=c}+x\frac{d}{dx}x^n|_{x=c}=x^n+x\frac{d}{dx}x^n|_{x=c}$

By our product rule then using our induction hypothesis we have

$x^n+x\cdot nx^{n-1}=x^n+nx^n=(n+1)x^n=(n+1)x^{(n+1)-1}$

which is what we set out to show. All that is left is to evaluate the base case $n=0$, which you can do at this point; we have actually shown the claim for $n=2$ already. I will also leave it to you to see how the coefficients can be viewed as constant functions and to use the product rule again to handle any term of a polynomial.
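The pattern $\frac{d}{dx}x^n=nx^{n-1}$ derived above can be spot-checked on coefficient lists (helpers are my own illustration):

```python
def deriv(coeffs):
    """Derivative of a polynomial [a0, a1, ..., an], where a_i multiplies x^i."""
    return [i * a for i, a in enumerate(coeffs)][1:]

def monomial(n):
    """x^n as a coefficient list."""
    return [0] * n + [1]

# d/dx x^n = n * x^(n-1): the list should hold the coefficient n at position n - 1
for n in range(1, 6):
    assert deriv(monomial(n)) == [0] * (n - 1) + [n]
print("power rule verified for n = 1..5")
```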

I will not prove it, but any polynomial can be written in the form $\sum_{i=0}^n a_i x^i$. You can show, again by induction, that for given $n$, $m \in \Bbb N$ with $m \le n$,

$\frac{d}{dx}\sum_{i=0}^m a_ix^i|_{x=c}=\sum_{i=0}^m a_i\frac{d}{dx}x^i|_{x=c}=\sum_{i=0}^m i\,a_i\,x^{i-1}$

and then proceed to show it holds for $m+1$.

Finally, to show that only finitely many derivatives of any polynomial are nonzero, note that each differentiation multiplies the coefficient $a_i$ by its index $i$ and lowers the exponent by one, so a degree-$l$ term disappears after $l+1$ differentiations. Hence for every $l$ there exists an $n$ (namely $n=l+1$) such that $\frac{d^n}{dx^n}\sum_{i=0}^l a_ix^i|_{x=c}=0$. We have already seen that the derivative of a constant is zero.
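The finiteness claim can also be checked mechanically (again with my own coefficient-list helpers): a degree-$l$ polynomial is annihilated by exactly $l+1$ differentiations.

```python
def deriv(coeffs):
    """Derivative of a polynomial [a0, a1, ..., an], where a_i multiplies x^i."""
    return [i * a for i, a in enumerate(coeffs)][1:]

def nth_deriv(coeffs, k):
    """Apply deriv k times."""
    for _ in range(k):
        coeffs = deriv(coeffs)
    return coeffs

# a degree-l polynomial loses one degree per differentiation,
# so l + 1 derivatives leave the zero polynomial (the empty list here)
for l in range(6):
    p = list(range(1, l + 2))      # sample coefficients a_0, ..., a_l, all nonzero
    assert nth_deriv(p, l + 1) == []
    assert nth_deriv(p, l) != []   # but l derivatives are not yet zero
print("degree-l polynomials vanish after exactly l + 1 derivatives, l = 0..5")
```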
