I recently learnt Taylor series in my class. I would like to know how it is possible to distinguish whether a function is real-analytic or not. The first thing to check is whether it is smooth. But how can I know whether the Taylor series converges to the function?

For example: $f(x)=\frac{1}{1-x}$, $x\in(0,1)$, has $n$th-degree Taylor polynomial $\sum_{k=0}^n x^k$. In this case, I understand that $f$ is analytic in its domain, since the geometric series $\sum_{k=0}^\infty x^k$ for $x\in(0,1)$ converges to $\frac{1}{1-x}$.
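As a quick numerical sanity check (a Python sketch; the point $x$ and the number of terms are arbitrary choices), the partial sums of the geometric series approach $\frac{1}{1-x}$, and the remainder after $n$ terms is exactly $\frac{x^n}{1-x}$:

```python
x = 0.5                       # any point in (0, 1)
f = 1 / (1 - x)               # the function being expanded

n = 20
partial = sum(x**k for k in range(n))   # degree-(n-1) Taylor polynomial at 0

# the tail of the geometric series after n terms sums to x^n / (1 - x)
error = abs(f - partial)
print(error, x**n / (1 - x))
```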

In general, what is the trick? For example, how do I know whether $\sin(x)$ and $\cos(x)$ are analytic?

This is a difficult question in general. Ideally, to show that $f$ is analytic at the origin, you show that in a suitable neighborhood of $0$, the error of the $n$-th Taylor polynomial approaches $0$ as $n\to\infty$.

For example, for $f(x)=\sin(x)$, any derivative of $f(x)$ is one of $\sin(x)$, $\cos(x)$, $-\sin(x)$, or $-\cos(x)$, and the error of the $n$-th Taylor polynomial takes the form $\displaystyle \frac{f^{(n+1)}(\alpha)}{(n+1)!}x^{n+1}$ for some $\alpha$ between $0$ and $x$ (depending on $n$). In absolute value, this is bounded by $\displaystyle \frac{|x|^{n+1}}{(n+1)!}$, which (on any bounded set) approaches $0$ uniformly as $n\to\infty$. This shows that the Taylor series for $f(x)=\sin(x)$ converges to $\sin(x)$ in any neighborhood of $0$ (and therefore everywhere). The same applies to $f(x)=\cos(x)$. A similar argument works for a variety of functions, including $f(x)=e^x$.
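This remainder bound is easy to check numerically. Here is a small Python sketch (the degrees $n$ and the point $x$ are arbitrary choices) comparing the actual error of the degree-$n$ Taylor polynomial of $\sin$ against the Lagrange bound $\frac{|x|^{n+1}}{(n+1)!}$:

```python
import math

x = 2.0
for n in (5, 9, 15):
    # degree-n Taylor polynomial of sin at 0:
    # sum of (-1)^k x^(2k+1)/(2k+1)! over all odd powers 2k+1 <= n
    taylor = sum((-1)**k * x**(2*k + 1) / math.factorial(2*k + 1)
                 for k in range((n + 1) // 2))
    error = abs(taylor - math.sin(x))
    bound = abs(x)**(n + 1) / math.factorial(n + 1)
    assert error <= bound          # Lagrange remainder bound holds
    print(n, error, bound)
```

As $n$ grows the bound collapses rapidly, which is exactly why the series converges to $\sin(x)$ everywhere.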

And there are general theorems; for instance, any solution of a linear homogeneous ordinary differential equation with analytic coefficients is analytic (in a small neighborhood), as the differential equation can be used to establish bounds on the error term. The case of sine is an example, as $\sin(x)$ is a solution of $y''=-y$.
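To see how the differential equation drives the series, here is a sketch in Python (with the initial conditions of $\sin$ chosen for illustration): plugging $y=\sum c_n x^n$ into $y''=-y$ forces the recurrence $c_{n+2} = -c_n/((n+1)(n+2))$, and the resulting series reproduces $\sin$:

```python
import math

# power-series solution of y'' = -y with y(0) = 0, y'(0) = 1 (i.e. sine);
# the ODE forces the coefficient recurrence c_{n+2} = -c_n / ((n+1)(n+2))
N = 30
c = [0.0] * (N + 1)
c[0], c[1] = 0.0, 1.0
for n in range(N - 1):
    c[n + 2] = -c[n] / ((n + 1) * (n + 2))

x = 1.0
approx = sum(c[n] * x**n for n in range(N + 1))
print(approx, math.sin(x))   # the two agree to machine precision
```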

But the question is difficult in general. For example, a uniformly convergent series of analytic functions need not be analytic. For instance, consider the Weierstrass function, which is in fact nowhere differentiable.

Given a smooth function $f$ and a point $a$ in its domain, it may be that the formal Taylor series associated to $f$ at $a$ does not converge anywhere. Clearly in that case $f$ is not analytic at $a$. But it may also be that the formal Taylor series of $f$ at $a$ converges in an interval, yet does not converge to $f$ (identically) in any such interval. Then, again, $f$ is not analytic at $a$, but this may be harder to establish. For a short survey of $C^\infty$ nowhere analytic functions, by Dave L Renfro, see here.
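The standard example of the second phenomenon (the Taylor series converges, but not to $f$) is

```latex
f(x) =
\begin{cases}
e^{-1/x^{2}}, & x \neq 0,\\
0, & x = 0.
\end{cases}
```

Every derivative of $f$ at $0$ equals $0$, so the Taylor series of $f$ at $0$ is identically zero: it converges everywhere, but it agrees with $f$ only at $x = 0$. Hence $f$ is smooth but not analytic at $0$.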

In practice, for many analytic functions $f$, analyticity is established not by studying the rate of decay of the error terms, but by “inheritance”. For example, $f$ could be the series of term-by-term derivatives of an analytic function, or its term-by-term antiderivative, or the result of composing two analytic functions, etc.

Figuring out whether a function is real analytic is a pain; figuring out whether a complex function is analytic is much easier. First, understand that a real function can be analytic on an interval without being analytic on the entire real line.

So what I try to do is consider $f$ as a function of a complex variable in a neighborhood of the point in question, say $x = a$. If $f(z)$ (with $z = x + iy$ complex) equals $f(x)$ when $y = 0$, then $f(z)$ is an extension of $f$ to a neighborhood of $a$. To show that $f(z)$ is analytic, you need only show that it has a derivative as a function of a complex variable at $a$. If so, its Taylor series will converge to $f(z)$ in some neighborhood of $a$, and $f$ as a real function is analytic on the interval that this neighborhood covers.

This approach shows that $\sin$ and $\cos$ are analytic on the entire plane, and thus on the $x$-axis.
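A minimal sanity check of this extension idea in Python (using the standard-library `cmath`; the sample points are arbitrary): the complex sine restricted to the real axis agrees with the real sine, so $\sin(z)$ really is an extension of $\sin(x)$:

```python
import cmath
import math

# sin(z) restricted to the real axis (y = 0) agrees with the real sin(x)
for x in (-1.5, 0.0, 0.7, 3.1):
    z = complex(x, 0.0)
    assert abs(cmath.sin(z) - math.sin(x)) < 1e-12
print("sin(z) extends sin(x) on the real axis")
```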

I agree with Betty Mock’s thesis that complex analytic functions are usually easier to deal with than real analytic functions, but I don’t think that entire functions (e.g. $e^z$, $\cos z$, $\sin z$) are a good example of that: the real-analytic story is the same as the complex-analytic story.

Looking around, I found this previous math.SE question, which asks why, if a real function $f$ is analytic at $a \in \mathbb{R}$ and $f(a) \neq 0$, then $\frac{1}{f}$ is analytic at $a$. As the answers indicate, if you use complex analysis then it’s just a matter of adapting the differentiability of the reciprocal to functions of a complex variable (no problem). However, if you insist on showing directly that the Taylor series of $\frac{1}{f}$ has a positive radius of convergence at $a$… then this really is a pain, as several of the answers (including one due to me, where I reference a book on real analytic function theory but don’t have the energy to reproduce the details) attest.

Even understanding why $f(x) = \frac{1}{x^2+1}$ is real analytic, and why the radius of convergence of its Taylor expansion at $a \in \mathbb{R}$ is precisely $\sqrt{1+a^2}$, is a quick pure-thought argument if you know the rudiments of complex analysis. But trying to show this formula for the radius of convergence directly from the Taylor series expansion… there will be some actual work there, it seems to me.
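The “pure thought” argument is that the radius of convergence at $a$ equals the distance from $a$ to the nearest singularity of $\frac{1}{1+z^2}$ in the complex plane, namely the poles $z = \pm i$. A quick numerical sketch (the sample points are arbitrary):

```python
import math

# the poles of 1/(1 + z^2) are at z = ±i; the radius of convergence of the
# Taylor series at a real point a is the distance from a to the nearest pole
for a in (0.0, 1.0, 2.5):
    radius = min(abs(complex(a, 0) - 1j), abs(complex(a, 0) + 1j))
    assert abs(radius - math.sqrt(1 + a*a)) < 1e-12
print("radius of convergence at a is sqrt(1 + a^2)")
```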
