I was wondering about the following, and I probably already know the answer: no.

Is there another number with the same property as $e$, namely that the derivative of $\exp(x)$ is the function itself?

I would guess not, because otherwise $e$ wouldn't be so special, but is there a proof of this?


Of course $C e^x$ has the same property for any $C$ (including $C = 0$). But these are the only ones.

**Proposition:** Let $f : \mathbb{R} \to \mathbb{R}$ be a differentiable function such that $f(0) = 1$ and $f'(x) = f(x)$. Then it must be the case that $f(x) = e^x$ for all $x$.

*Proof.* Let $g(x) = f(x) e^{-x}$. Then

$$g'(x) = -f(x) e^{-x} + f'(x) e^{-x} = (f'(x) - f(x)) e^{-x} = 0$$

by assumption, so $g$ is constant. But $g(0) = 1$, so $g(x) = 1$ identically.
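As a quick symbolic sanity check of the proposition (my own sketch using sympy, not part of the original proof), we can solve the initial-value problem and verify that the auxiliary function $g$ is constant:

```python
import sympy as sp

x = sp.symbols('x')
f = sp.Function('f')

# Solve f'(x) = f(x) with the initial condition f(0) = 1.
sol = sp.dsolve(sp.Eq(f(x).diff(x), f(x)), f(x), ics={f(0): 1})
print(sol)  # Eq(f(x), exp(x))

# The auxiliary function g(x) = f(x) * e^{-x} from the proof has zero derivative:
g = sol.rhs * sp.exp(-x)
print(sp.simplify(g.diff(x)))  # 0
```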

**N.B.** Note that it is also true that $e^{x+c}$ has the same property for any $c$. Thus there exists a function $g(c)$ such that $e^{x+c} = g(c) e^x = e^c g(x)$, and setting $c = 0$, then $x = 0$, we conclude that $g(c) = e^c$, hence $e^{x+c} = e^x e^c$.

This observation generalizes to any differential equation with translation symmetry. Apply it to the differential equation $f''(x) + f(x) = 0$ and you get the angle addition formulas for sine and cosine.
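As a small symbolic check (not in the original post), sympy's `expand_trig` confirms the angle-addition formulas that the translation symmetry of $f''+f=0$ predicts:

```python
import sympy as sp

x, c = sp.symbols('x c')

# Angle-addition formulas: the analogue for f'' + f = 0 of e^{x+c} = e^x e^c.
sin_sum = sp.expand_trig(sp.sin(x + c))
cos_sum = sp.expand_trig(sp.cos(x + c))
assert sp.simplify(sin_sum - (sp.sin(x) * sp.cos(c) + sp.cos(x) * sp.sin(c))) == 0
assert sp.simplify(cos_sum - (sp.cos(x) * sp.cos(c) - sp.sin(x) * sp.sin(c))) == 0
```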

Let $f(x)$ be a differentiable function such that $f'(x)=f(x)$. This implies that the $k$-th derivative, $f^{(k)}(x)$, is also equal to $f(x)$. In particular, $f(x)$ is $C^\infty$ and we can write a Taylor expansion for $f$:

$$T_f(x) = \sum_{k=0}^\infty c_k x^k.$$

Notice that the fact that $f(x)=f^{(k)}(x)$, for all $k\geq 0$, implies that the Taylor series $T_f(x_0)$ converges to $f(x_0)$ for every $x_0\in \mathbb{R}$ (more on this later), so we may write $f(x)=T_f(x)$. Since $f'(x) = \sum_{k=0}^\infty (k+1)c_{k+1}x^k = f(x) = \sum_{k=0}^\infty c_k x^k$, comparing coefficients gives $c_{k+1} = c_k/(k+1)$. Since $c_0 = f(0)$, it follows that $c_k = f(0)/k!$ for all $k\geq 0$. Hence:

$$f(x) = f(0) \sum_{k=0}^\infty \frac{x^k}{k!} = f(0) e^x,$$

as desired.
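As a quick numeric illustration (a sketch of mine, not part of the argument), the partial sums of $\sum_k x^k/k!$ do converge to $e^x$:

```python
import math

def exp_series(x, terms=30):
    """Partial sum of sum_{k < terms} x^k / k!  (the case f(0) = 1 above)."""
    return sum(x**k / math.factorial(k) for k in range(terms))

# The partial sums agree with exp to high accuracy at several sample points.
for x in (0.0, 1.0, -2.5, 5.0):
    assert abs(exp_series(x) - math.exp(x)) < 1e-9
print(exp_series(1.0))  # ~ 2.718281828...
```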

**Addendum: About the convergence of the Taylor series**. Let us use Taylor’s remainder theorem to show that the Taylor series for $f(x)$ centered at $x=0$, denoted by $T_f(x)$, converges to $f(x)$ for all $x\in\mathbb{R}$. Let $T_{f,n}(x)$ be the $n$th Taylor polynomial for $f(x)$, also centered at $x=0$. By Taylor’s theorem, we know that

$$|R_n(x_0)|\leq |f^{(n+1)}(\xi)|\frac{ |x_0 - 0|^{n+1}}{(n+1)!},$$

where $R_n(x_0)=f(x_0) - T_{f,n}(x_0)$ and $\xi$ is a number between $0$ and $x_0$. Let $M=M(x_0)$ be the maximum value of $|f(x)|$ on the interval $I=[-|x_0|,|x_0|]$, which exists because $f$ is differentiable (hence continuous) on $I$. Since $f(x)=f^{(n+1)}(x)$, for all $n\geq 0$, we have:

$$|R_n(x_0)|\leq |f^{(n+1)}(\xi)|\frac{ |x_0|^{n+1}}{(n+1)!}\leq |f(\xi)|\frac{ |x_0|^{n+1}}{(n+1)!}\leq M \frac{|x_0|^{n+1}}{(n+1)!} \longrightarrow 0 \ \text{ as } \ n\to \infty.$$

The limit goes to $0$ because $M$ is a constant (once $x_0$ is fixed) and $A^n/n! \to 0$ for all $A\geq 0$. Therefore, $T_{f,n}(x_0) \to f(x_0)$ as $n\to \infty$ and, by definition, this means that $T_f(x_0)$ converges to $f(x_0)$.
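To see how fast the bound collapses, here is a small numeric sketch (my own, with the hypothetical choice $f = \exp$ and $x_0 = 3$, so $M = e^{3}$):

```python
import math

# Taylor remainder bound M * |x0|^(n+1) / (n+1)! for f = exp at x0 = 3.
x0 = 3.0
M = math.exp(abs(x0))  # max of |e^x| on [-3, 3]

bounds = [M * abs(x0) ** (n + 1) / math.factorial(n + 1) for n in range(40)]
# The factorial dominates the power: the bound rushes to 0.
print(bounds[5], bounds[20], bounds[39])
```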

Yet another way: by the chain rule, ${\displaystyle {d \over dx} \ln|f(x)| = {f'(x) \over f(x)} = 1}$ wherever $f(x) \neq 0$. Integrating, you get $\ln |f(x)| = x + C$. Exponentiating both sides, you obtain $|f(x)| = e^{x + C} = C'e^x$, where $C' > 0$.

As a result, $f(x) = C''e^x$, where $C''$ is an arbitrary constant.

If you are worried about $f(x)$ being zero, the above shows $f(x)$ is of the form $C''e^x$ on any interval for which $f(x)$ is nonzero. Since $f(x)$ is continuous, this implies $f(x)$ is always of that form, unless $f(x)$ is identically zero (in which case we can just take $C'' = 0$ anyhow).

**Hint** $\rm\displaystyle\:\ \begin{align} f{\:'}\!\! &=\ \rm a\ f \\ \rm \:\ g'\!\! &=\ \rm a\ g \end{align} \iff \dfrac{f{\:'}}f = \dfrac{g'}g \iff \bigg(\!\!\dfrac{f}g\bigg)' =\ 0\ \iff W(f,g) = 0\:,\ \ W = $ Wronskian

This is a very special case of the *uniqueness* theorem for linear differential equations, esp. how the Wronskian serves to measure linear independence of solutions. See here for a proof of the less trivial second-order case (which generalizes to $n$th order). See also the classical result below on Wronskians and linear dependence from one of my old sci.math posts.

**Theorem** $\ \ $ Suppose $\rm\:f_1,\ldots,f_n\:$ are $\rm\:n-1\:$ times differentiable on an interval $\rm\:I\subset \mathbb R\:$ and suppose they have Wronskian $\rm\: W(f_1,\ldots,f_n)\:$ vanishing at all points in $\rm\:I\:.\:$ Then $\rm\:f_1,\ldots,f_n\:$ are linearly dependent on some *subinterval* of $\rm\:I\:.$

**Proof** $\ $ We employ the following easily proved Wronskian identity:

$\rm\qquad\ W(g\ f_1,\ldots,\:g\ f_n)\ =\ g^n\ W(f_1,\ldots,f_n)\:.\ $ This immediately implies

$\rm\qquad\quad\ \ \ W(f_1,\ldots,\: f_n)\ =\ f_1^{\:n}\ W((f_2/f_1)',\ldots,\:(f_n/f_1)'\:)\quad $ if $\rm\:\ f_1 \ne 0 $

Proceed by induction on $\rm\:n\:.\:$ The Theorem is clearly true if $\rm\:n = 1\:.\:$ Suppose that $\rm\: n > 1\:$ and $\rm\:W(f_1,\ldots,f_n) = 0\:$ for all $\rm\:x\in I.\:$

If $\rm\:f_1 = 0\:$ throughout $\rm\:I\:$ then $\rm\: f_1,\ldots,f_n\:$ are dependent on $\rm\:I.\:$ Else $\rm\:f_1\:$ is nonzero at some point of $\rm\:I\:$ so also throughout some subinterval $\rm\:J \subset I\:,\:$ since $\rm\:f_1\:$ is continuous (being differentiable by hypothesis). By the above, $\rm\:W((f_2/f_1)',\ldots,(f_n/f_1)'\:)\: =\: 0\:$ throughout $\rm\:J,\:$ so by induction there exists a subinterval $\rm\:K \subset J\:$ where the arguments of the Wronskian are linearly dependent, i.e.

on $\rm\ K:\quad\ \ \ c_2\ (f_2/f_1)' +\:\cdots\:+ c_n\ (f_n/f_1)'\: =\ 0,\ \ $ all $\rm\:c_i'\:=\ 0\:,\ $ some $\rm\:c_j\ne 0 $

$\rm\qquad\qquad\: \Rightarrow\:\ \ ((c_2\ f_2 +\:\cdots\: + c_n\ f_n)/f_1)'\: =\ 0\ \ $ via $({\phantom m})'\:$ linear

$\rm\qquad\qquad\: \Rightarrow\quad\ \ c_2\ f_2 +\:\cdots\: + c_n\ f_n\ =\ c_1 f_1\ \ $ for some $\rm\:c_1,\ c_1'\: =\: 0 $

Therefore $\rm\ f_1,\ldots,f_n\:$ are linearly dependent on $\rm\:K \subset I\:.\qquad$ **QED**
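The Wronskian identity used in the proof can be spot-checked symbolically; here is a sketch for $n = 2$ with arbitrary (hypothetical) choices of $f_1$, $f_2$, $g$, using sympy's built-in `wronskian`:

```python
import sympy as sp

x = sp.symbols('x')

# Check W(g*f1, g*f2) = g^2 * W(f1, f2) for sample functions.
f1, f2, g = sp.sin(x), sp.exp(x), x**2 + 1
lhs = sp.wronskian([g * f1, g * f2], x)
rhs = g**2 * sp.wronskian([f1, f2], x)
assert sp.simplify(lhs - rhs) == 0
```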

This theorem has as immediate corollaries the well-known results that the vanishing of the Wronskian on an interval $\rm\: I\:$ is a necessary and sufficient condition for linear dependence of

$\rm\quad (1)\ $ functions analytic on $\rm\: I\:$

$\rm\quad (2)\ $ functions satisfying a monic homogeneous linear differential equation whose coefficients are continuous throughout $\rm\: I\:.\:$

Here is a different take on the question. There is a whole spectrum of discrete “calculi” which converge to the continuous case, each of which has its own special “$e$”.

Pick some $t>0$. Consider the equation $$f(x)=\frac{f(x+t)-f(x)}{t}$$

It is not hard to show by induction that there is a function $C_t:[0,t)\to \mathbb{R}$ so that $$f(x)=C_t(\{\frac{x}{t}\})(1+t)^{\lfloor\frac{x}{t}\rfloor}$$

where $\{\cdot\}$ and $\lfloor\cdot\rfloor$ denote fractional and integer part, respectively. If I take Qiaochu’s answer for comparison, then $C_t$ plays the role of the constant $C$ and $(1+t)^{\lfloor\frac{x}{t}\rfloor}$ the role of $e^x$. Therefore for such a discrete calculus the right value of “$e$” is $(1+t)^{1/t}$. Now it is clear that as $t\to 0$ the equation becomes $f(x)=f'(x)$, and $(1+t)^{1/t}\to e$.
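A quick numeric sketch of both claims (my own, for the hypothetical step $t = 0.25$): the discrete solution satisfies the difference equation exactly, and the discrete “$e$” converges to $e$:

```python
import math

# f(x) = (1+t)^(x/t) satisfies (f(x+t) - f(x))/t = f(x) exactly:
t = 0.25
f = lambda x: (1 + t) ** (x / t)
for x in (0.0, 0.3, 1.7):
    assert abs((f(x + t) - f(x)) / t - f(x)) < 1e-9

# And the discrete "e" = (1+t)^(1/t) approaches e as t -> 0:
for s in (1.0, 0.1, 1e-3, 1e-6):
    print(s, (1 + s) ** (1 / s))
print(math.e)
```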

The solutions of $f(x) = f'(x)$ are exactly $f(x) = f(0) e^x$. But you can also write the solution as $b\,a^x$, if you want a different base. Then $f'(x) = b \log(a)\, a^x$, and so if you want $f'=f$ you need $\log(a)=1$, i.e. $a=e$ (except for the trivial case $b=0$).
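This base computation can be checked symbolically (a sketch of mine, not part of the original answer):

```python
import sympy as sp

x = sp.symbols('x')
a, b = sp.symbols('a b', positive=True)

# f(x) = b * a^x differentiates to b * log(a) * a^x ...
fx = b * a**x
assert sp.simplify(fx.diff(x) - b * sp.log(a) * a**x) == 0

# ... so f' = f forces log(a) = 1, i.e. a = e:
print(sp.solve(sp.Eq(sp.log(a), 1), a))  # [E]
```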

Here is the proof they use at high school: not as deep or instructive, but it doesn’t require as much background.

$$\begin{eqnarray*}
\frac{dy}{dx} &=& y\\
\frac{dx}{dy} &=& \frac 1 y\\
x &=& \log |y| + C\\
y &=& A\exp(x)
\end{eqnarray*}$$

Let $x \in C^1$ on the whole line be a solution to $\dot{x}(t) = x(t)$, $x(0) = 1$. Using the Taylor expansion with remainder, show that necessarily $x(t) = e^t$.

We have that $\dot{x} = x$ implies $x^{(n)} = x^{(n-1)}$ for all $n \ge 1$, and by induction on $n$, we have that $x(t)$ is $C^\infty$ with $x^{(n)} = x$ for all $n$. Thus, if $x(0) = 1$ and $\dot{x} = x$, Taylor’s Theorem gives$$x(t) = \left( \sum_{k=0}^{N-1} {{t^k}\over{k!}}\right) + {{x^{(N)}(t_1)}\over{N!}}t^N,$$for some $t_1$ between $0$ and $t$. But $x^{(N)} = x$, so if$$M = \max_{|t_1| \le |t|} |x(t_1)|,$$which we know exists by compactness of $[-|t|, |t|]$, then$$\left| x(t) - \sum_{k=0}^{N-1} {{t^k}\over{k!}}\right| \le {{M|t|^N}\over{N!}}.$$The right-hand side tends to $0$ as $N \to \infty$, so the series for $e^t$ converges to $x(t)$.

Note that $e$ is defined by the limit $e=\lim_{n \rightarrow \infty}(1+ \frac{1}{n})^n$, so $e^x=\lim_{n \rightarrow \infty}(1+ \frac{1}{n})^{nx}$. Applying the definition of the derivative $f'(x) = \lim_{h \rightarrow 0} \frac{f(x+h)-f(x)}{h}$, one obtains: $(e^x)'=\lim_{h \rightarrow 0} \frac{ \lim_{n \rightarrow \infty}((1+ \frac{1}{n})^{n(x+h)}-(1+ \frac{1}{n})^{nx})}{h} = \lim_{h \rightarrow 0}\left( \lim_{n \rightarrow \infty}(1+\frac{1}{n})^{nx} \lim_{n \rightarrow \infty}\frac{(1+\frac{1}{n})^{nh}-1}{h}\right)$

$= e^x \lim_{h \rightarrow 0} \lim_{n \rightarrow \infty}\frac{(1+\frac{1}{n})^{nh}-1}{h}$.

Now one can trade $h$ for $n$ via the relation $h= \frac{C}{n}$ with a finite constant $C$, because if $n \rightarrow \infty$ then $h$ tends to zero; note that then $(1+\frac{1}{n})^{nh} = (1+\frac{h}{C})^{C}$. Hence:

$\lim_{h \rightarrow 0} \lim_{n \rightarrow \infty}\frac{(1+\frac{1}{n})^{nh}-1}{h} = \lim_{h \rightarrow 0} \frac{(1+\frac{h}{C})^{C}-1}{h} = \lim_{h \rightarrow 0} \frac{1+C \frac{h}{C} + \frac{C(C-1)}{2}(\frac{h}{C})^2+O(h^3)-1}{h} = \lim_{h \rightarrow 0} \left(1 + \frac{C(C-1)}{2}\frac{h}{C^2}+O(h^2)\right) = 1$

Therefore $(Ce^x)’=C(e^x)’=Ce^x$.

**q.e.d.**
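The conclusion can be sanity-checked numerically (a sketch of mine, not part of the proof): the difference quotient of $e^x$ returns $e^x$ itself, and $(1+\frac{1}{n})^{nx}$ approximates $e^x$ for large $n$:

```python
import math

# Central difference quotient of exp matches exp, consistent with (e^x)' = e^x.
h = 1e-6
for x in (0.0, 1.0, 2.0):
    deriv = (math.exp(x + h) - math.exp(x - h)) / (2 * h)
    assert abs(deriv - math.exp(x)) < 1e-7

# (1 + 1/n)^(n x) approximates e^x for large n:
n = 10**6
assert abs((1 + 1 / n) ** (n * 2.0) - math.exp(2.0)) < 1e-4
```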
