With regard to this answer to an inverse Laplace transform question, I derived the following result:

$$\frac1{i 2 \pi} \int_{c-i \infty}^{c+i \infty} ds \, e^{s t} \Gamma(s)^2 = 2 K_0 \left ( 2 e^{-t/2} \right ) - t I_0 \left ( 2 e^{-t/2} \right ) $$

This was not a result I can claim to have seen before, and I have had a hard time finding anything related. While I am confident that it is correct, I wanted to perform a simple check by taking the Laplace transform of the right-hand side and hopefully recovering the original $\Gamma(s)^2$. When I did this, I got the following:

$$\int_0^{\infty} dt \, e^{-s t} \left [ 2 K_0 \left ( 2 e^{-t/2} \right ) - t I_0 \left ( 2 e^{-t/2} \right ) \right ] = \sum_{n=0}^{\infty} \frac{\displaystyle \operatorname*{Res}_{s=-n} \Gamma(s)^2}{s+n} $$

Now, I know about partial fraction expansions for rational functions with a finite number of poles.

**My question is as follows**: Is the above “partial fraction” expansion for the function $F(s) = \Gamma(s)^2$ a valid representation of $\Gamma(s)^2$?

**EDIT**

The above result is incorrect. Rather, the result is

$$\frac1{i 2 \pi} \int_{c-i \infty}^{c+i \infty} ds \, e^{s t} \Gamma(s)^2 = 2 K_0 \left ( 2 e^{-t/2} \right ) $$

This result holds up. The inverse transform states that

$$\int_{-\infty}^{\infty} dt \, e^{-s t} 2 K_0 \left ( 2 e^{-t/2} \right ) = \Gamma(s)^2 $$

Substituting $u=e^{-t}$, we get

$$2 \int_0^{\infty} du \, u^{s-1} K_0 \left ( 2 \sqrt{u} \right ) = \Gamma(s)^2 $$

Note that this is a two-sided transform. See the link for why it converges for all real values of $t$.
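As a quick numerical sanity check (a sketch using SciPy, not part of the derivation), one can verify the Mellin-transform form $2 \int_0^{\infty} u^{s-1} K_0(2\sqrt{u})\,du = \Gamma(s)^2$ at a few real values of $s$:

```python
# Numerically verify 2 * int_0^inf u^(s-1) K_0(2 sqrt(u)) du = Gamma(s)^2
# for a few real s > 0.
import numpy as np
from scipy.integrate import quad
from scipy.special import kv, gamma

def mellin_k0(s):
    # Split at u = 1: quad handles the integrable logarithmic blow-up of
    # K_0 at 0 and the exponential decay at infinity separately.
    head, _ = quad(lambda u: u**(s - 1) * kv(0, 2 * np.sqrt(u)), 0, 1)
    tail, _ = quad(lambda u: u**(s - 1) * kv(0, 2 * np.sqrt(u)), 1, np.inf)
    return 2 * (head + tail)

for s in (0.5, 1.0, 2.5):
    print(f"s = {s}: integral = {mellin_k0(s):.10f}, "
          f"Gamma(s)^2 = {gamma(s)**2:.10f}")
```

The two columns agree to quadrature accuracy; e.g. at $s = 1/2$ both sides equal $\Gamma(1/2)^2 = \pi$.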

I felt the answer to the original question posed had to be “No”: since $\Gamma(s)$ has simple poles at the nonpositive integers, $\Gamma(s)^2$ has double poles, yet your series was suspiciously convergent, suggesting that the right-hand side had only simple poles. Thankfully, this proved to be correct.

So, onto answering the modified question. I’ll run through a simpler example first to work out what’s going on, then tackle $\Gamma(s)$. The answer to the broader question, “Is there a generalisation of partial fractions to analytic functions”, is *yes*: this is provided by Mittag-Leffler’s Theorem:

Let $D$ be an open set in $\mathbb{C}$ and $E \subset D$ a closed discrete subset. For each $a$ in $E$, let $p_a(z)$ be a polynomial in $1/(z-a)$. There is a meromorphic function $f$ on $D$ such that for each $a \in E$, the function $f(z)-p_a(z)$ is holomorphic at $a$. In particular, the principal part of $f$ at $a$ is $p_a(z)$.

This is an existence theorem, but we can in principle extend it to build functions we are actually interested in. For example, the Gamma function is given by the Mellin-type integral

$$ \Gamma(s) = \int_0^{\infty} x^{s-1} e^{-x} \, dx, \quad (\Re(s)>0). $$

We can produce a partial fractions expansion for $\Gamma$ in the following way: notice that the problems with convergence of this integral all stem from the possible singularity at zero, so we expect the part of the integral near zero to tell us about the poles of $\Gamma$. Therefore, write

$$ \Gamma(s) = \int_0^1 x^{s-1} e^{-x} \, dx + \Gamma_1(s), \quad (\Re(s)>0). $$

$\Gamma_1(s)$ is the upper incomplete Gamma function evaluated at $1$; by trivial bounding arguments, the integral converges for all $s$ and hence defines an entire function (with a convergent power series […]), so we can safely ignore it from the point of view of partial fractions, residues and so on.

The lower integral’s integrand is integrable for $\Re(s)>0$; we expand $e^{-x}$ in its power series and can then interchange the order of integration and summation as follows:

$$ \begin{align*}
\int_0^1 x^{s-1} e^{-x} \, dx &= \int_0^1 x^{s-1} \left( \sum_{k=0}^{\infty} (-1)^k \frac{x^k}{k!} \right) \, dx \\
&= \sum_{k=0}^{\infty} \frac{(-1)^k}{k!} \int_0^1 x^{s+k-1} \, dx \\
&= \sum_{k=0}^{\infty} \frac{(-1)^k}{k!} \frac{1}{s+k}
\end{align*}$$

Ah-ha! This is a convergent sum for any complex $s$ that is not a nonpositive integer (just lop off the finite number of terms with $\Re(s)+k<0$ and compare with the exponential sum, for example). It has exactly the same poles as $\Gamma(s)$ (as we hoped for, given that it is $\Gamma(s)$ with an analytic bit subtracted off), and, even better, we got all of $\Gamma(s)$’s residues for free, too (and inspection tells us we were right about what they were all along).
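This decomposition is easy to test numerically (a sketch, using SciPy's `quad` for the entire piece $\Gamma_1(s)$, so that we can also probe a negative non-integer $s$, where the series supplies the analytic continuation of the original Mellin integral):

```python
# Check: Gamma(s) = sum_{k>=0} (-1)^k / (k! (s+k)) + int_1^inf x^(s-1) e^(-x) dx.
import numpy as np
from scipy.integrate import quad
from scipy.special import gamma
from math import factorial

def gamma_via_expansion(s, terms=60):
    # Singular part: the partial-fraction series (simple poles at s = -k).
    series = sum((-1)**k / (factorial(k) * (s + k)) for k in range(terms))
    # Entire part: Gamma_1(s), the upper incomplete Gamma function at 1.
    tail, _ = quad(lambda x: x**(s - 1) * np.exp(-x), 1, np.inf)
    return series + tail

for s in (0.5, 3.2, -0.5):
    print(f"s = {s}: expansion = {gamma_via_expansion(s):.10f}, "
          f"Gamma(s) = {gamma(s):.10f}")
```

At $s=-1/2$ both sides give $\Gamma(-1/2) = -2\sqrt{\pi}$, even though the original integral representation diverges there.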

Okay, so we know about $\Gamma(s)$. What about $\Gamma(s)^2$? I could do this by squaring the expansion I just found, but that looks like a seriously unpleasant idea. Thankfully, you have provided us with a better way via the integral representation

$$ \int_0^{\infty} x^{s-1} \left( 2K_0 ( 2 \sqrt{x} ) \right) \, dx = \Gamma(s)^2. $$

First we write $K_0$ in a more useful form using the identity (mentioned in your original answer, and also in DLMF)

$$ 2K_0(2\sqrt{x}) = \sum_{k=0}^{\infty} \frac{2H_k}{(k!)^2} x^k - \left( 2\gamma+\log{x} \right) I_0(2\sqrt{x}) = \sum_{k=0}^{\infty} \frac{1}{(k!)^2} \left( 2H_k-2\gamma- \log{x} \right) x^k . $$
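(A quick numerical sketch of this series identity, summing the right-hand side directly with the convention $H_0 = 0$ and $\gamma$ the Euler–Mascheroni constant:)

```python
# Verify 2 K_0(2 sqrt(x)) = sum_k (2 H_k - 2 gamma - log x) x^k / (k!)^2.
import numpy as np
from scipy.special import kv  # modified Bessel function K_nu

def two_k0_series(x, terms=30):
    total, Hk, fact = 0.0, 0.0, 1.0
    for k in range(terms):
        if k > 0:
            Hk += 1.0 / k   # harmonic number H_k
            fact *= k       # k!
        total += (2 * Hk - 2 * np.euler_gamma - np.log(x)) * x**k / fact**2
    return total

for x in (0.1, 0.5, 1.0):
    print(f"x = {x}: series = {two_k0_series(x):.10f}, "
          f"2 K_0(2 sqrt(x)) = {2 * kv(0, 2 * np.sqrt(x)):.10f}")
```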

Now we do as we did before, splitting at $x=1$ and ignoring the analytic part of the integral; the singular part is

$$ \begin{align*}
\int_0^1 x^{s-1} 2K_0(2\sqrt{x}) \, dx &= \int_0^1 x^{s-1} \left( \sum_{k=0}^{\infty} \frac{1}{(k!)^2} \left( 2H_k-2\gamma- \log{x} \right) x^k \right) \, dx \\
&=\sum_{k=0}^{\infty} \frac{1}{(k!)^2} \left( 2(H_k-\gamma) \int_0^1 x^{s+k-1} \, dx -\int_0^1 x^{s+k-1}\log{x} \, dx \right)
\end{align*}$$

As every schoolchild knows,

$$ \int_0^{1} x^{n}\log{x} \, dx = \left. x^{n+1} \left( \frac{\log{x}}{n+1} - \frac{1}{(n+1)^2} \right) \right|_0^1 = -\frac{1}{(n+1)^2}, $$

and hence we obtain the expansion

$$ \Gamma(s)^2 = \sum_{k=0}^{\infty} \left( \frac{1}{(k!)^2} \frac{1}{(s+k)^2} + 2\frac{H_k-\gamma}{(k!)^2} \frac{1}{s+k} \right) + \int_1^{\infty} x^{s-1} 2K_0(2\sqrt{x}) \, dx $$

This has all the properties we could hope for: double poles, the right residues, and the coefficients of the double poles are what we’d expect naïvely.
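The full expansion can also be sanity-checked numerically (a sketch: the double-pole series plus the entire remainder $\int_1^\infty x^{s-1}\, 2K_0(2\sqrt{x})\,dx$, evaluated with SciPy):

```python
# Check: Gamma(s)^2 = sum_k [ 1/(s+k)^2 + 2(H_k - gamma)/(s+k) ] / (k!)^2
#                     + int_1^inf x^(s-1) * 2 K_0(2 sqrt(x)) dx.
import numpy as np
from scipy.integrate import quad
from scipy.special import kv, gamma
from math import factorial

def gamma_sq_expansion(s, terms=40):
    series, Hk = 0.0, 0.0
    for k in range(terms):
        if k > 0:
            Hk += 1.0 / k  # harmonic number H_k
        series += (1 / (s + k)**2
                   + 2 * (Hk - np.euler_gamma) / (s + k)) / factorial(k)**2
    # Entire remainder: the integral over [1, inf) decays like exp(-2 sqrt(x)).
    tail, _ = quad(lambda x: x**(s - 1) * 2 * kv(0, 2 * np.sqrt(x)), 1, np.inf)
    return series + tail

for s in (0.5, 2.0, -1.5):
    print(f"s = {s}: expansion = {gamma_sq_expansion(s):.10f}, "
          f"Gamma(s)^2 = {gamma(s)**2:.10f}")
```

Again the expansion continues to reproduce $\Gamma(s)^2$ at a negative non-integer $s$, where the defining integral diverges.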

**Final important remark:** the alert reader may have noticed that I swept something under the rug: I chose to split the integral at $x=1$. Won’t this choice cause problems for the values of the residues? In particular, why are the residues well-defined when I produced them out of a hat with this procedure?

Suppose instead I chopped the Gamma integral at $x=a$ for some $a>0$. Then the $k$th term in the expansion of the finite interval’s integral is proportional to

$$ \int_0^a x^{s+k-1} \, dx = \frac{a^{s+k}}{s+k}, $$

and since $a^{s+k}=1$ at $s=-k$, the residue at $s=-k$ is the same for every choice of $a$; indeed, $\frac{a^{s+k}}{s+k}-\frac{1}{s+k} = \frac{a^{s+k}-1}{s+k}$ is entire, so changing $a$ only shuffles analytic bits between the series and the remainder integral. Hence the choice is no choice at all, and the residues are what I stated they would be.
