How are the Taylor Series derived?

I know the Taylor series are infinite sums that represent certain functions, such as $\sin(x)$, but I have always wondered how they were derived. How is something like $$\sin(x)=\sum\limits_{n=0}^\infty \dfrac{x^{2n+1}}{(2n+1)!}\cdot(-1)^n = x-\dfrac{x^3}{3!}+\dfrac{x^5}{5!}-\dfrac{x^7}{7!}\pm\dots$$ derived, and how is it used? Thanks in advance for your answer.

This is the general formula for the Taylor series:

$$f(x) = f(a) + f'(a)(x-a) + \frac{f''(a)}{2!}(x-a)^2 + \frac{f^{(3)}(a)}{3!}(x-a)^3 + \dots + \frac{f^{(n)}(a)}{n!}(x-a)^n + \cdots$$

The series you mentioned for $\sin(x)$ is a special form of the Taylor series, called the Maclaurin series, centered at $a=0$.

The Taylor series is extremely powerful because it shows that a wide class of functions can be represented as infinite polynomials (with a few disclaimers, such as the interval of convergence)! This means that we can differentiate such a function as easily as we can differentiate a polynomial, and we can compare functions by comparing their series expansions.

For instance, we know that the Maclaurin series expansion of $\cos(x)$ is $1-\frac{x^2}{2!}+\frac{x^4}{4!}-\dots$ and we know that the expansion of $\sin(x)$ is $x-\dfrac{x^3}{3!}+\dfrac{x^5}{5!}-\dfrac{x^7}{7!}+\dots$. Differentiating the sine series term by term, we can confirm that the derivative of $\sin(x)$ is $\cos(x)$.
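The term-by-term differentiation described above can be checked numerically. This sketch (my own illustration, not from the answer) builds the first dozen Maclaurin coefficients of $\sin(x)$, differentiates the polynomial coefficient by coefficient, and compares the result with the coefficients of $\cos(x)$:

```python
import math

# Maclaurin coefficients of sin(x): the coefficient of x^k is
# (-1)^n / (2n+1)! when k = 2n+1, and 0 when k is even.
N = 12
sin_coeffs = [0.0] * N
for n in range(N // 2):
    sin_coeffs[2 * n + 1] = (-1) ** n / math.factorial(2 * n + 1)

# Term-by-term differentiation: d/dx (c_k x^k) = k * c_k * x^(k-1)
deriv_coeffs = [(k + 1) * sin_coeffs[k + 1] for k in range(N - 1)]

# Maclaurin coefficients of cos(x): (-1)^n / (2n)! at k = 2n
cos_coeffs = [0.0] * (N - 1)
for n in range(6):
    cos_coeffs[2 * n] = (-1) ** n / math.factorial(2 * n)

assert all(abs(a - b) < 1e-15 for a, b in zip(deriv_coeffs, cos_coeffs))
print("d/dx of the sin series matches the cos series, coefficient by coefficient")
```

The choice of twelve coefficients is arbitrary; any truncation length shows the same pattern.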

We can also use the Maclaurin series to prove that $e^{i\theta}=\cos{\theta}+i\sin{\theta}$ and thus $e^{\pi i}+1=0$ by comparing their series:

$$\begin{align}
e^{ix} &= 1 + ix + \frac{(ix)^2}{2!} + \frac{(ix)^3}{3!} + \frac{(ix)^4}{4!} + \frac{(ix)^5}{5!} + \frac{(ix)^6}{6!} + \frac{(ix)^7}{7!} + \frac{(ix)^8}{8!} + \cdots \\[8pt]
&= 1 + ix - \frac{x^2}{2!} - \frac{ix^3}{3!} + \frac{x^4}{4!} + \frac{ix^5}{5!} - \frac{x^6}{6!} - \frac{ix^7}{7!} + \frac{x^8}{8!} + \cdots \\[8pt]
&= \left( 1 - \frac{x^2}{2!} + \frac{x^4}{4!} - \frac{x^6}{6!} + \frac{x^8}{8!} - \cdots \right) + i\left( x - \frac{x^3}{3!} + \frac{x^5}{5!} - \frac{x^7}{7!} + \cdots \right) \\[8pt]
&= \cos x + i\sin x.
\end{align}$$

Also, you can use the first few terms of the Taylor series expansion to approximate a function if the function is close to the value on which you centered your series. For instance, we use the approximation $\sin(\theta)\approx \theta$ often in differential equations for very small values of $\theta$ by taking the first term of the Maclaurin series for $\sin(x).$
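The small-angle approximation mentioned above is easy to check numerically: the error of truncating the sine series after its first term is bounded by the next term, $\theta^3/6$. A quick sketch (the sample angles are my own choices):

```python
import math

# The small-angle approximation sin(theta) ~ theta: the error of this
# one-term truncation is bounded by the next series term, theta^3 / 6.
for theta in (0.1, 0.01, 0.001):
    err = abs(math.sin(theta) - theta)
    print(f"theta={theta}: error = {err:.3e}, bound theta^3/6 = {theta**3/6:.3e}")
    assert err <= theta ** 3 / 6
```

Note how the error shrinks by three orders of magnitude each time $\theta$ shrinks by one, exactly as a cubic error term predicts.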

Note that
$$\int_{0}^{x} f'(x - t)\,\mathrm{d}t = f(x) - f(0)
\quad\mbox{such that}\quad
f(x) = f(0) + \int_{0}^{x} f'(x - t)\,\mathrm{d}t.$$

Integrating by parts repeatedly:
$$\begin{align}
f(x) &= f(0) + f'(0)x + \int_{0}^{x} t\,f''(x - t)\,\mathrm{d}t
\\[5mm] &= f(0) + f'(0)x + \frac{1}{2}\,f''(0)x^{2}
+ \frac{1}{2}\int_{0}^{x} t^{2} f'''(x - t)\,\mathrm{d}t
\\[5mm] &= \cdots =
\color{#00f}{f(0) + f'(0)x + \frac{1}{2}\,f''(0)x^{2}
+ \cdots + \frac{f^{(n)}(0)}{n!}\,x^{n}}
+ \color{#f00}{\frac{1}{n!}\int_{0}^{x} t^{n} f^{(n+1)}(x - t)\,\mathrm{d}t}
\end{align}$$
The blue part is the degree-$n$ Maclaurin polynomial, and the red part is an explicit integral form of the remainder.
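The repeated integration by parts above ends with an explicit integral remainder, so the partial sum plus the remainder integral should reproduce $f(x)$ exactly. Here is a numerical sketch using $f = \exp$ (every derivative of which is again $\exp$); the midpoint-rule integrator, step count, and the values $x = 1$, $n = 4$ are my own choices:

```python
import math

def remainder(n, x, steps=100_000):
    # Midpoint-rule estimate of (1/n!) * integral_0^x t^n f^(n+1)(x - t) dt,
    # specialized to f = exp, so f^(n+1) = exp.
    h = x / steps
    total = 0.0
    for i in range(steps):
        t = (i + 0.5) * h
        total += t ** n * math.exp(x - t)
    return total * h / math.factorial(n)

x, n = 1.0, 4
partial = sum(x ** k / math.factorial(k) for k in range(n + 1))
print(partial + remainder(n, x), "vs", math.exp(x))
assert abs(partial + remainder(n, x) - math.exp(x)) < 1e-8
```

The degree-4 partial sum alone misses $e$ by about $0.01$; adding the remainder integral closes the gap to within the quadrature error.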

Well, what we really want to do is approximate a function $f(x)$ around a value $a$.

We will call our Taylor series $T(x)$. Naturally, we want our series to have the exact value of $f(x)$ when $x = a$. For this, we start our Taylor approximation with the constant term $f(a)$. We have $$T(x) = f(a)$$ as our first approximation, and it is good as long as the function doesn’t change much near $a$.

We can obtain a much better approximation if our polynomial has the same slope (or derivative) as $f(x)$ at $x = a$: we want $T'(a) = f'(a)$. The way to accomplish this is to add the term $f'(a)(x-a)$ to our approximation. We now have $T(x) = f(a) + f'(a)(x-a)$. You can verify that $T(a) = f(a)$ and that $T'(a) = f'(a)$.

If we were to continue this process, we would derive the complete Taylor series, where $T^{(n)}(a) = f^{(n)}(a)$ for every positive integer $n$.

This is where the series comes from. If you write it in summation notation you reach what Juan Sebastian Lozano Munoz posted.
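The construction above can be watched converging numerically. A small sketch with $f(x) = e^x$ centered at $a = 0$ (my own choice of function and evaluation point): each added term $f^{(n)}(0)x^n/n!$ should shrink the error at a nearby point.

```python
import math

# Successive Taylor approximations of f(x) = e^x centered at a = 0,
# evaluated at x = 0.5: each added term should shrink the error.
x = 0.5
errors = []
T = 0.0
for n in range(6):
    T += x ** n / math.factorial(n)  # the degree-n term f^(n)(0)/n! * x^n, f = exp
    errors.append(abs(T - math.exp(x)))
print(errors)
assert all(later < earlier for earlier, later in zip(errors, errors[1:]))
```

Six terms already bring the error near $10^{-5}$, because the factorial in the denominator grows much faster than $x^n$.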

Another way you can use Taylor series that I’ve always liked — using the definition of a derivative to show that $$\frac{d}{dx} e^x = e^x.$$

The definition gives

$$\lim \limits_{h \to 0} \frac{e^{x+h} - e^x}{h},$$

which is equal to

$$\lim \limits_{h \to 0} \frac{e^x(e^h - 1)}{h}.$$

If we can show that $\lim \limits_{h \to 0} \frac{e^h - 1}{h} = 1$, we’ll be home free. This is where Taylor/Maclaurin series come in. We know that $e^x = 1 + x + \frac{x^2}{2!} + \frac{x^3}{3!} + \dots$, so we can substitute:

$$\lim \limits_{h \to 0} \frac{-1 + 1 + h + \frac{h^2}{2!} + \frac{h^3}{3!} + \dots}{h}$$

$$\lim \limits_{h \to 0} \frac{h + \frac{h^2}{2!} + \frac{h^3}{3!} + \dots}{h}$$

$$\lim \limits_{h \to 0} 1 + \frac{h}{2!} + \frac{h^2}{3!} + \dots$$

$$ = 1$$
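The limit above can also be confirmed numerically: the series shows the leading deviation of $(e^h - 1)/h$ from $1$ is $h/2$, so the error should stay below $h$ for small $h$. A quick check (the sample values of $h$ are my own choices):

```python
import math

# Numerically confirm that (e^h - 1)/h -> 1 as h -> 0; per the series,
# the leading deviation is h/2, so the error should be below h.
for h in (1e-1, 1e-3, 1e-5):
    ratio = (math.exp(h) - 1) / h
    print(f"h={h}: (e^h - 1)/h = {ratio}")
    assert abs(ratio - 1) < h
```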

Like you said, Taylor series are meant to represent some function, let’s call it $f(x)$. We often have functions, like $\sin(x)$ or $\log(x)$, that have a few easy-to-compute points near where we want to evaluate them, and it is often useful to approximate such functions, so we can come up with an approximation method for $f(x)$.

Let some point $a$ be near our desired $x$ value. If $f(a)$ is easy to compute, then an easy approximation for $f(x)$ is simply $f(a)$. However, we might want to know $f(x)$ with a little more accuracy, so we take the first derivative at $a$, $f'(a)$, and use it as the coefficient of a first-order term in an approximation polynomial: $$f(a) + f'(a)(x-a)$$ where instead of $x$, we use the difference between $x$ and $a$.

The general formula for the Taylor series expansion of $f(x)$, if $f$ is infinitely differentiable at $a$, is the following:
$$f(x) = \sum\limits^{\infty}_{n = 0} \frac{f^{(n)}(a)}{n!} (x-a)^n$$
where $a$ is the point of approximation.
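The general formula can be put to work whenever the derivatives $f^{(n)}(a)$ are known. For $f = \sin$ they cycle through $\sin, \cos, -\sin, -\cos$, so a sketch is easy (the center $a = \pi/6$, truncation at ten terms, and evaluation point are my own choices):

```python
import math

# Taylor expansion of sin about a = pi/6, using the derivative cycle
# sin, cos, -sin, -cos, sin, ...
a = math.pi / 6
derivs = [math.sin(a), math.cos(a), -math.sin(a), -math.cos(a)]

def taylor_sin(x, terms=10):
    # sum_{n} f^(n)(a)/n! * (x - a)^n
    return sum(derivs[n % 4] / math.factorial(n) * (x - a) ** n
               for n in range(terms))

x = 1.0
print(taylor_sin(x), "vs", math.sin(x))
assert abs(taylor_sin(x) - math.sin(x)) < 1e-9
```

With $|x - a| \approx 0.48$, ten terms already agree with $\sin(x)$ to about ten decimal places.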

The reason for this has to do with power series: the Taylor series is a power series, as are our successive approximations. If we were to carry out our approximation over and over (an infinite number of times), we would get closer and closer to the actual function, until (in the limit) we reach it. The Taylor series is extremely important in both mathematics and applied fields, as it both captures some fundamental properties of functions and provides an amazing approximation tool (polynomials being easier to compute than nearly any other function).

If you want to find out more, here are some resources:

  • MIT covers power series and Taylor series in this module of their single variable calculus course
  • Khan Academy has a series (pun intended) on Taylor series.
  • These Math.SE questions talk more about the applications of Taylor series.

If you want to kill 2 birds with one stone, Kenneth Iverson’s Elementary Functions builds up to the Taylor series approximation of sine by way of the polynomial and simple concepts like slope and area (slyly avoiding the dreaded buzzwords differential and integral and bizarrely, avoiding even the word calculus). The style is always to show you the concept in action, and then tell you the name.

All of this while teaching you APL from scratch.

Disclaimer: I just read this book a week ago, and I’m gushing about it to everyone.

Taylor series can often be derived by doing arithmetic with known Taylor series.

Do you want the Taylor series for $\operatorname{sinc}(x) = \sin(x) / x$? Don’t try to take the derivatives of $\operatorname{sinc}(x)$! Instead, compute

$$\operatorname{sinc}(x) = \sin(x) / x
= x^{-1} \sum_{n=0}^{+\infty} \frac{(-1)^n x^{2n+1}}{(2n+1)!}
= \sum_{n=0}^{+\infty} \frac{(-1)^n x^{2n}}{(2n+1)!} $$

In fact, if your goal was to compute the values of the derivative of $\operatorname{sinc}(x)$ at $0$, the easiest way is to first compute its Taylor series by the means above, and then read the values of the derivatives off from the coefficients of the Taylor series.
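Reading derivatives off the coefficients works because the coefficient of $x^k$ in a Maclaurin series is $f^{(k)}(0)/k!$, so $f^{(k)}(0) = k! \cdot c_k$. A sketch for $\operatorname{sinc}$ (the helper name is my own):

```python
import math

# Maclaurin coefficients of sinc(x) = sin(x)/x: c_{2n} = (-1)^n / (2n+1)!,
# odd coefficients are zero. The k-th derivative at 0 is k! * c_k.
def sinc_deriv_at_0(k):
    if k % 2 == 1:
        return 0.0
    n = k // 2
    return math.factorial(k) * (-1) ** n / math.factorial(2 * n + 1)

print([sinc_deriv_at_0(k) for k in range(5)])
# e.g. sinc''(0) = 2! * (-1/3!) = -1/3
assert abs(sinc_deriv_at_0(2) + 1 / 3) < 1e-15
```

Getting $\operatorname{sinc}''(0) = -1/3$ by differentiating $\sin(x)/x$ twice with the quotient rule and taking a limit is far more painful.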

More complicated arithmetic is harder, but sometimes you only need a few terms and can just multiply things out.

Do you want the fourth order Taylor series for $\sin \sin x$? Compute

$$ \begin{align}\sin \sin x &= \sin\left( x - \frac{x^3}{6} + O(x^5) \right)
\\&= \left( x - \frac{x^3}{6} + O(x^5) \right)
- \frac{1}{6}\left( x - \frac{x^3}{6} + O(x^5) \right)^3 + O(x^5)
\\&= \left(x - \frac{x^3}{6} + O(x^5)\right) - \frac{1}{6}\left(x^3 + O(x^5) \right) + O(x^5)
\\&= x - \frac{x^3}{3} + O(x^5)
\end{align}$$

where $O(x^5)$ just means that there are (possibly) more terms, all with an exponent on $x$ of 5 or greater. (This $O$ notation can be given a more general meaning, but that is all that is needed here.)
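The truncated expansion can be tested numerically: carrying the arithmetic one order further gives a next term of $x^5/10$, so the error of $x - x^3/3$ should stay below $x^5$ as $x \to 0$. A quick check (the sample points are my own choices):

```python
import math

# Check the fourth-order expansion sin(sin x) = x - x^3/3 + O(x^5):
# the next term in the series is x^5/10, so the error stays below x^5.
for x in (0.2, 0.1, 0.05):
    err = abs(math.sin(math.sin(x)) - (x - x ** 3 / 3))
    print(f"x={x}: error = {err:.3e}, x^5 = {x**5:.3e}")
    assert err < x ** 5
```

Halving $x$ shrinks the error by roughly a factor of $32 = 2^5$, confirming the fifth-order behavior.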