Why is $ \sum_{n=0}^{\infty}\frac{x^n}{n!} = e^x$?

I am trying to see where this relationship comes from: $\displaystyle \sum_{n=0}^{\infty}\frac{x^n}{n!} = e^x$. Does anyone have any special knowledge that my summer math teacher and I don't know about? He went over generating functions a few days ago, and I did not know exactly what he was talking about. In the middle of his lecture he wrote this generating function and explained that in this case $a_{n} = \frac{1}{n!}$ for $\displaystyle \sum_{n=0}^{\infty} a_{n} x^n$.

Is the relationship a definition or can it be derived using $\displaystyle \sum_{n=0}^{\infty} x^n $?

Answers:

The exponential function has the property that it is its own derivative. Differentiate the series term by term and see what you get.
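As a quick numerical sanity check of term-by-term differentiation (a Python sketch; the evaluation point, step size, and 40-term cutoff are arbitrary choices):

```python
import math

def exp_series(x, terms=40):
    """Partial sum of the series sum_{n=0}^{terms-1} x^n / n!."""
    return sum(x**n / math.factorial(n) for n in range(terms))

# Central-difference estimate of the derivative of the truncated
# series at an arbitrary point x = 1.3.
x, h = 1.3, 1e-6
derivative = (exp_series(x + h) - exp_series(x - h)) / (2 * h)

# Differentiating the series gives back the series itself.
print(abs(derivative - exp_series(x)))  # close to zero
```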

For your first question, I would recommend Rudin's approach as described in his book, Principles of Mathematical Analysis. Let $\exp(y)=\sum_{n=0}^\infty \frac{y^n}{n!}$. Define the real number $e:=\exp(1)=\sum_{n=0}^\infty \frac{1}{n!}$. Now the question is how to bridge the gap between high school mathematics and first-year mathematics, where $e^y$ is understood in high school as $e$ raised to the power $y$, but without a rigorous definition.

First, we need to formalise the meaning of $x^y$. When $x\ge1$ and $y=\frac nm\ge0$ is a rational number, we define $x^y=\sqrt[m]{x^n}$ as usual. Yet, for irrational $y>0$, there is a need to generalise. In this case, Rudin defines
$$x^y=\sup\{x^p: p\in\mathbb{Q},\,0\le p<y\}.\tag{1}$$
The definition of $x^y$ for $0\le x<1$ and/or $y\le0$ can be built on $(1)$ (e.g. define $x^{-y}=1/x^y$ when $y$ is negative).

Now, if we multiply the two power series $\exp(x)$ and $\exp(y)$ together and rearrange the summands (if you know absolute convergence and Cauchy product, you know what I am talking about), it can be shown that $\exp(x)\exp(y)=\exp(x+y)$ for any two $x,y\in\mathbb{R}$. It follows that $\exp(y)^n=\exp(ny)$. In particular $e^n=\exp(1)^n=\exp(n)$ and $\exp\left(\frac nm\right)^m = \exp(n) = e^n$. Taking $m$-th root on both sides, we get $\exp(p)=e^p$ for all rational $p=\frac nm>0$. Therefore, by definition $(1)$,
$$e^y=\sup\{e^p: p\in\mathbb{Q},\,0\le p<y\}=\sup\{\exp(p): p\in\mathbb{Q},\,0\le p<y\}.\tag{2}$$
Since $\exp(x)$ is a monotonic and continuous function in $x$ (again, we need knowledge in power series to justify that), the RHS of $(2)$ is equal to $\exp(y)$. Hence the definitions of $e^y$ and $\exp(y)$ are consistent.
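The key identity $\exp(x)\exp(y)=\exp(x+y)$ from the Cauchy-product argument can be sanity-checked numerically with truncated series (a Python sketch; the sample points and the 40-term cutoff are arbitrary choices):

```python
import math

def exp_series(x, terms=40):
    """Partial sum of sum_{n=0}^{terms-1} x^n / n!."""
    return sum(x**n / math.factorial(n) for n in range(terms))

# Check exp(x) * exp(y) = exp(x + y) at arbitrary sample points.
x, y = 0.7, 1.9
lhs = exp_series(x) * exp_series(y)
rhs = exp_series(x + y)
print(abs(lhs - rhs))  # close to zero
```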

Besides the fact that the exponential function is most often defined this way, we can see that the identity holds if we take another definition: the exponential function is the unique function with $f(0)=1$
and $f'(x)=f(x)$. This is an ordinary differential equation whose right-hand side is locally Lipschitz, hence it has a unique solution; since the power series solves it (just differentiate the sum term by term), we are done. Term-by-term differentiation is allowed here because the power series converges everywhere (in general you cannot differentiate a series by summing the derivatives of its terms).

If you want a less sophisticated proof that the exponential function is the unique function with this property, you can show directly that there can be only one such function. Let $f,g$ be two functions with $f'=f$ and $g'=g$ (and $g$ nowhere zero).
$$ \frac{d}{dx} \frac{f}{g}= \frac{f' \cdot g - f\cdot g'}{g^2}$$
Since $f'=f$ and $g'=g$, this equals
$$\frac{f\cdot g-f\cdot g}{g^2}=0,$$
hence $\frac{f}{g}$ is constant and $f=a\cdot g$ for some $a$.
In particular,
$$1=f(0)=a\cdot g(0)=a.$$
Hence $a=1$ and $f=g$.
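To illustrate numerically that the conditions $f(0)=1$, $f'(x)=f(x)$ pin down a single function, one can integrate the ODE with Euler's method and watch the result converge to $e^x$ as the step size shrinks (a Python sketch; the step counts are arbitrary choices):

```python
import math

def euler_exp(x, steps):
    """Euler's method for f' = f with f(0) = 1, integrated up to x."""
    h = x / steps
    f = 1.0
    for _ in range(steps):
        f += h * f  # f(t + h) ~ f(t) + h * f'(t) = f(t) + h * f(t)
    return f

# The error shrinks roughly in proportion to the step size,
# and the numerical solution approaches e^x.
x = 1.0
for steps in (10, 100, 1000, 10000):
    print(steps, abs(euler_exp(x, steps) - math.exp(x)))
```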

One could also define the exponential function as the unique holomorphic function, normalised by $\exp'(0)=1$, satisfying
$$\exp(u+v)=\exp(u)\cdot \exp(v).$$
For this one you can use the Cauchy product and see that the series satisfies the functional equation.

Let’s define $g(x)=b^x$

If we find the derivative of $g(x)=b^x$,

$$\frac{d}{dx} g(x) = g'(x)=\lim_{h \to 0} \frac{g(x+h)-g(x)}{h}=\lim_{h \to 0} \frac{b^{x+h}-b^x}{h}=b^x \lim_{h \to 0} \frac{b^{h}-1}{h}$$

If you examine the factor $\lim_{h \to 0} \frac{b^{h}-1}{h}$, it does not depend on $x$, so it must be a constant:


$$g'(x)= c\cdot b^x = c\cdot g(x) \tag{1}$$

where $c$ is the constant

$$c=\lim_{h \to 0} \frac{b^{h}-1}{h}$$
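Numerically, this constant can be estimated with a small $h$; it turns out to equal $\ln b$, which foreshadows why $c=1$ picks out a special base (a Python sketch; the bases and $h=10^{-8}$ are arbitrary choices):

```python
import math

def c_of(b, h=1e-8):
    """Forward-difference estimate of lim_{h->0} (b^h - 1)/h."""
    return (b**h - 1) / h

# The estimate matches ln(b) for each base, and equals 1 when b = e.
for b in (2.0, 3.0, math.e):
    print(b, c_of(b), math.log(b))
```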

Now let's focus on another function $f(x)$ that is defined by a series:

$$ f(x)=1 + x + \frac{x^2}{2!}+ \frac{x^3}{3!}+ \ldots= \sum_{k=0}^{\infty} \frac{x^k}{k!} $$

If we find the derivative of $f(x)$,

$$\frac{d}{dx} f(x) =f'(x)=1 + \frac{2x}{2!}+ \frac{3x^2}{3!}+ \ldots= \sum_{k=1}^{\infty} \frac{kx^{k-1}}{k!}$$

As you can see, differentiating the series term by term gives back the same series:
$$\frac{d}{dx} f(x) =1 + x + \frac{x^2}{2!}+ \frac{x^3}{3!}+ \ldots= \sum_{k=0}^{\infty} \frac{x^{k}}{k!}=f(x)$$

$$f'(x)= f(x)$$

If you compare this with the exponential-function property in result $(1)$ ($g'(x)= c\cdot g(x)$), you see that $f(x)$ is a special exponential function with $c=1$:

$$ f(x)=1 + x + \frac{x^2}{2!}+ \frac{x^3}{3!}+ \ldots= \sum_{k=0}^{\infty} \frac{x^{k}}{k!}=b^x$$

To find $b$ for the special case $c=1$, put $x=1$.

We name this special $b$ as $e$ in mathematics. See http://en.wikipedia.org/wiki/E_%28mathematical_constant%29 for more info.

$$ f(1)=e=1 + 1 + \frac{1^2}{2!}+ \frac{1^3}{3!}+ \ldots=\sum_{k=0}^{\infty} \frac{1}{k!}= 2.71828182845904523536028747135266249775724709369995…$$

$$c=1=\lim_{h \to 0} \frac{e^{h}-1}{h}$$

$$\frac{d}{dx} f(x) =f(x)=e^x=\sum_{k=0}^{\infty} \frac{x^{k}}{k!}$$
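The partial sums of $\sum 1/k!$ converge to $e$ very quickly, which is easy to confirm numerically (a Python sketch; the 15-term cutoff is an arbitrary choice):

```python
import math

# Accumulate the partial sums of sum_{k} 1/k!; each new term is
# smaller than the last by a growing factor, so convergence is fast.
partial = 0.0
for k in range(15):
    partial += 1 / math.factorial(k)

print(partial)                 # agrees with e to many digits
print(abs(partial - math.e))   # already tiny after 15 terms
```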

Another approach: I imagine you know that
$$\lim_{n \to \infty} (1+x/n)^n = e^x.$$
Expanding by the binomial theorem,
$$(1+x/n)^n = 1+n \frac{x}{n} + \frac{n(n-1)}{2!} \frac{x^2}{n^2} + \frac{n(n-1)(n-2)}{3!} \frac{x^3}{n^3} + \cdots$$
$$=1+x+(1-1/n) \frac{x^2}{2!}+(1-1/n)(1-2/n) \frac{x^3}{3!} + \cdots$$

As $n \to \infty$, the $k$th term goes to $x^k/k!$. One still has to do some work to turn this into a proof, as you can’t just assume willy-nilly that the limit of an infinite sum will be the sum of the limits, but it is at least very suggestive, and can be made into a completely correct argument.
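The convergence of $(1+x/n)^n$ to $e^x$ is easy to observe numerically (a Python sketch; $x=2$ and the values of $n$ are arbitrary choices):

```python
import math

# Watch (1 + x/n)^n close in on e^x as n grows.
x = 2.0
for n in (10, 1000, 100000):
    approx = (1 + x / n) ** n
    print(n, approx, abs(approx - math.exp(x)))
```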

Maybe my proof is a bit different, but the approach seems the same. Let
$$\displaystyle \sum_{n=0}^{\infty}\frac{x^n}{n!} = f(x)$$ and $g(x)=e^x$.
Now I will cite this lemma:

if a function has derivative zero throughout a certain interval, then it is constant throughout that interval.

Differentiating the series term by term gives $f'(x)=f(x)$, so
$$\frac{d}{dx}\left(\frac{f(x)}{g(x)}\right)=\frac{d}{dx}\left(f(x)e^{-x}\right)=\left(f'(x)-f(x)\right)e^{-x}=0,$$
and by the lemma $f(x)e^{-x}=c$ for some constant $c$. Now put $x=0$ and see that $c=1$. The conclusion $f(x)=e^x$ follows immediately.
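The conclusion that $f$ and $g$ agree can be sampled numerically across an interval (a Python sketch; the interval $[-3,3]$ and the 50-term cutoff are arbitrary choices):

```python
import math

def exp_series(x, terms=50):
    """Partial sum of sum_{n=0}^{terms-1} x^n / n!."""
    return sum(x**n / math.factorial(n) for n in range(terms))

# f(x) - g(x) should be the constant 0 on the whole interval;
# sample it at many points of [-3, 3].
diffs = [abs(exp_series(k / 10) - math.exp(k / 10)) for k in range(-30, 31)]
print(max(diffs))  # close to zero everywhere sampled
```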