An asymptotic term for a finite sum involving Stirling numbers

This question arose as a by-product at the end of this post.

The following asymptotic estimate would ensure the convergence of certain series.

$$\frac{1}{n!} \sum_{k = 1 }^{n } \frac{{n \brack k}}{k+1} = \mathcal{O} \left(\frac{1}{\ln n}\right), \quad \text{as} \quad n \rightarrow \infty.$$

Here $\displaystyle {n \brack k}$ are the unsigned Stirling numbers of the first kind.
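As a numerical sanity check (not part of any proof), the left-hand side can be computed from the standard recurrence ${m+1\brack k}=m{m\brack k}+{m\brack k-1}$, normalizing each row by the factorial so the floating-point values stay small. The function name `u` is mine; the printed values of $u_n\ln n$ should stay bounded, consistent with the claim:

```python
from math import log

def u(n):
    """u_n = (1/n!) * sum_{k=1}^n [n brack k] / (k+1), computed via the
    recurrence [m+1 brack k] = m*[m brack k] + [m brack k-1], with each
    row divided by (m+1) so that a[k] = [m brack k] / m! throughout."""
    a = [0.0, 1.0]                      # row for m = 1: [1 brack 1]/1! = 1
    for m in range(1, n):
        b = [0.0] * (m + 2)
        for k in range(1, m + 2):
            b[k] = (m * a[k] if k <= m else 0.0) + a[k - 1]
            b[k] /= m + 1               # normalize m! -> (m+1)!
        a = b
    return sum(a[k] / (k + 1) for k in range(1, n + 1))

for n in (10, 100, 1000):
    print(n, u(n) * log(n))             # stays bounded, consistent with O(1/ln n)
```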

Can you find a proof for it?

Update: A related interesting paper may be found here.

Answers:

We have:

$$\sum_{k=0}^n {n\brack k}x^k=x(x+1)\cdots(x+n-1)$$
(See Comtet, Advanced combinatorics)
$$u_n=\frac{1}{n!} \sum_{k = 1 }^{n } \frac{{n \brack k}}{k+1}=\int_0^1\frac{t(t+1)\cdots (t+n-1)}{n!} dt$$

Now we have $\exp(-x)\geq 1-x$ for $x\geq 0$. Hence for $t\in [0,1]$ we have
$$\frac{t(t+1)\cdots (t+n-1)}{n!}=t\prod_{k=1}^{n-1}\left(1-\frac{1-t}{k+1}\right)\leq t\exp(-v_n (1-t)),$$ where $v_n=\frac{1}{2}+\frac{1}{3}+\cdots+\frac{1}{n}$.

$$u_n\leq \int_0^1 t\exp(-v_n (1-t))\,dt=\frac{1}{v_n}-\frac{1-\exp(-v_n)}{v_n^2}\leq \frac{1}{v_n}$$

As $v_n\sim \log n$, we are done.
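The bound $u_n\le 1/v_n$ can be verified exactly for small $n$ with rational arithmetic. This is only a sketch of a check; the helper names `stirling_row`, `u`, and `v` are mine:

```python
from fractions import Fraction

def stirling_row(n):
    # unsigned Stirling numbers of the first kind [n brack k], k = 0..n,
    # via the recurrence [m+1 brack k] = m*[m brack k] + [m brack k-1]
    row = [0, 1]                        # row for n = 1
    for m in range(1, n):
        new = [0] * (m + 2)
        for k in range(1, m + 2):
            new[k] = m * (row[k] if k <= m else 0) + row[k - 1]
        row = new
    return row

def u(n):
    # u_n = (1/n!) * sum_{k=1}^n [n brack k] / (k+1), exactly
    fact = 1
    for j in range(2, n + 1):
        fact *= j
    row = stirling_row(n)
    return sum(Fraction(row[k], k + 1) for k in range(1, n + 1)) / fact

def v(n):
    # v_n = 1/2 + 1/3 + ... + 1/n
    return sum(Fraction(1, k) for k in range(2, n + 1))

for n in (2, 5, 10, 20):
    assert u(n) <= 1 / v(n)             # the bound proved above
```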

Consider the following:

$$\begin{aligned}
\frac{1}{n!}\sum_{k=1}^{n}\frac{{n\brack k}}{k+1}
&\stackrel{\color{red}{[1]}}{=}\frac{1}{n!}\sum_{k=0}^{n}\frac{{n\brack k}}{k+1}\\
&=\frac{1}{n!}\sum_{k=0}^{n}{n\brack k}\int_{0}^{1}x^k\,\mathrm{d}x\\
&=\frac{1}{n!}\int_{0}^{1}\left(\sum_{k=0}^{n}{n\brack k}x^k\right)\mathrm{d}x\\
&\stackrel{\color{red}{[2]}}{=}\frac{1}{n!}\int_{0}^{1}x(x+1)\cdots(x+n-1)\,\mathrm{d}x\\
&=\int_{0}^{1}\frac{\mathrm{d}x}{n\operatorname{B}(x,n)}\\
&\stackrel{\color{red}{[3]}}{\sim}\int_{0}^{1}\frac{n^{x-1}}{\Gamma(x)}\,\mathrm{d}x
\end{aligned}$$

and a short computation with this last integral yields the desired $\mathcal{O}\!\left(\frac{1}{\ln n}\right)$ estimate.

$\color{red}{[1]}\;\;\;$ For all natural numbers $n>0$, the unsigned Stirling numbers of the first kind satisfy the condition ${n\brack 0}=0$.

$\color{red}{[2]}\;\;\;$ The unsigned Stirling numbers of the first kind arise as coefficients of the rising factorial:

$$(x)^{(n)}=x(x+1)\cdots(x+n-1)=\sum_{k=0}^{n}{n\brack k}x^k.$$

$\color{red}{[3]}\;\;\;$ For large $n$ (with $x$ fixed), Stirling’s approximation gives the asymptotic formula

$$\operatorname{B}{(x,n)}\sim \Gamma{(x)}\,n^{-x}.$$
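This Beta-function asymptotic is easy to check numerically; using log-gamma avoids overflow for moderate $n$ (the helper name `log_beta` is mine):

```python
from math import lgamma, exp, log

def log_beta(x, n):
    # log B(x, n) = log Γ(x) + log Γ(n) - log Γ(x + n)
    return lgamma(x) + lgamma(n) - lgamma(x + n)

# B(x, n) * n^x / Γ(x) should tend to 1 for fixed x in (0, 1)
x = 0.5
for n in (10, 100, 1000):
    ratio = exp(log_beta(x, n) + x * log(n) - lgamma(x))
    print(n, ratio)
```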

For me, the demonstration by David H has some problems. How can one justify the approximation $$\int_0^1\frac{n^{x-1}}{\Gamma(x)} \, dx \approx \int_0^1 x\, n^{x-1} \, dx\,?$$ What is the reason to take only the first term in the expansion of $\Gamma^{-1}(x)$? To illustrate the point, I remark that $$\frac{1}{\Gamma(x)} = x+ \gamma x^2 + O(x^3),$$ and hence, if we take only the first term, we arrive at David’s result. But if we take the second term into account as well, the result is different: $$\int_0^1 (x+ \gamma x^2)\, n^{x-1} \, dx = \frac{\ln n-2\gamma-n\ln n +2\gamma n +n\ln^2 n-2n\gamma\ln n +n\gamma \ln^2 n}{n\ln^3 n}\sim\frac{1+\gamma}{\ln n}$$ for large $n$. Taking more terms of the expansion of $\Gamma^{-1}(x)$ into account makes the discrepancy even more visible.

So, I would like to propose another demonstration, which does not provide the exact result, but gives a good upper bound. In fact $$\int_0^1\frac{n^{x-1}}{\Gamma(x)} \, dx \leqslant \max\limits_{[0,1]}\frac{1}{\Gamma(x)}\cdot\int_0^1 n^{x-1} \, dx = \int_0^1 n^{x-1} \, dx = \frac{n-1}{n\ln n}\sim\frac{1}{\ln n}$$ as $n\to\infty$. Numerical simulations show that the ratio between $\ln^{-1} n$ and the first integral on the left tends to $1^{+}$ as $n\to\infty$.
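A crude midpoint-rule evaluation of the integral (a sketch; the function name `I` and the grid size `m` are mine) illustrates both the upper bound and the ratio tending to $1^{+}$:

```python
from math import gamma, log

def I(n, m=20000):
    # midpoint rule for the integral of n^(x-1) / Γ(x) over [0, 1]
    h = 1.0 / m
    return h * sum(n ** ((i + 0.5) * h - 1) / gamma((i + 0.5) * h)
                   for i in range(m))

for n in (10, 10**3, 10**6):
    val = I(n)
    # the upper bound (n-1)/(n ln n), with a tiny slack for quadrature error
    assert val <= (n - 1) / (n * log(n)) * 1.0001
    print(n, 1 / (log(n) * val))        # ratio of 1/ln(n) to the integral
```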

I’ve just found a rigorous proof. All the ingredients were already there. I consider again the expansion $$\frac{1}{\Gamma(x)}=x+\gamma x^2+\ldots=\sum_{k=1}^\infty a_k x^k,$$ whose coefficients are denoted by $a_k$ for brevity. Now, it can be remarked that for each fixed positive integer $k$ and for large $n$ $$\int_0^1 n^{x-1} x^k \, dx\sim\frac{1}{\ln n}.$$ Thus, by virtue of the uniform convergence, we have $$\int_0^1\frac{n^{x-1}}{\Gamma(x)} \, dx = \sum_{k=1}^\infty a_k \int_0^1 n^{x-1}x^k\,dx \sim \frac{1}{\ln n} \underbrace{\sum_{k=1}^\infty a_k}_{\Gamma^{-1}(1)}=\frac{1}{\ln n}\,,\qquad \text{as }\; n\to\infty$$
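The key claim here, $\int_0^1 n^{x-1}x^k\,dx \sim 1/\ln n$ for each fixed $k\ge 1$, can be spot-checked numerically; convergence is slow, roughly like $1-k/\ln n$. The function name `J` is mine, and this is only an illustration:

```python
from math import log

def J(n, k, m=20000):
    # midpoint rule for the integral of n^(x-1) * x^k over [0, 1]
    h = 1.0 / m
    return h * sum(n ** ((i + 0.5) * h - 1) * ((i + 0.5) * h) ** k
                   for i in range(m))

n = 10 ** 8
for k in (1, 2, 3):
    print(k, J(n, k) * log(n))          # approaches 1, more slowly for larger k
```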
Moreover, by extending the above method to higher-order terms, one may obtain the more accurate asymptotic expansion
$$\frac{1}{n!}\sum_{k=1}^n\frac{{n \brack k}}{k+1}\,=\,\frac{1}{\ln n} - \frac{\gamma}{\ln^2 n} +\frac{6\gamma^2-\pi^2}{6\ln^3 n} + O\!\left(\frac{1}{\ln^4 n}\right)\,,\qquad \text{as }\; n\to\infty$$ The details of the derivation, as well as similar asymptotics for the general terms of series with rational coefficients for $\ln^{-1} 2$, $\ln^{-2} 2$ and for Euler’s constant $\gamma$, may be found in my recent work (this particular asymptotic expansion is treated in the Appendix).
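Since the sum agrees with $\int_0^1 n^{x-1}/\Gamma(x)\,dx$ to the leading orders in $1/\ln n$, the three-term expansion can be checked against a numerical evaluation of that integral. The quadrature setup below is mine and only a sketch:

```python
from math import gamma, log, pi

g = 0.5772156649015329                  # Euler–Mascheroni constant

def I(n, m=40000):
    # midpoint rule for the integral of n^(x-1) / Γ(x) over [0, 1]
    h = 1.0 / m
    return h * sum(n ** ((i + 0.5) * h - 1) / gamma((i + 0.5) * h)
                   for i in range(m))

n = 10 ** 8
L = log(n)
three_terms = 1 / L - g / L**2 + (6 * g**2 - pi**2) / (6 * L**3)
print(I(n), three_terms)                # agree to roughly O(1/ln^4 n)
```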

I computed this asymptotic expansion for this answer. To relate the two, we will use
$$\sum_{k=0}^n{n\brack k}x^k=x(x+1)\cdots(x+n-1)=\frac{\Gamma(n+x)}{\Gamma(x)}\tag{1}$$
Integrating $(1)$ on $[0,1]$ and dividing by $n!=\Gamma(n+1)$ gives
$$\frac1{n!}\sum_{k=1}^n\frac{{n\brack k}}{k+1}=\int_0^1\frac{\Gamma(n+x)}{\Gamma(x)\,\Gamma(n+1)}\,\mathrm{d}x\tag{2}$$
For $x\in[0,1]$, Gautschi’s Inequality says
$$(n+1)^{x-1}\le\frac{\Gamma(n+x)}{\Gamma(n+1)}\le n^{x-1}\tag{3}$$
Dividing $(3)$ by $\Gamma(x)$ and integrating on $[0,1]$ gives
$$\int_0^1\frac{(n+1)^{x-1}}{\Gamma(x)}\,\mathrm{d}x\le\int_0^1\frac{\Gamma(n+x)}{\Gamma(x)\,\Gamma(n+1)}\,\mathrm{d}x\le\int_0^1\frac{n^{x-1}}{\Gamma(x)}\,\mathrm{d}x\tag{4}$$
Substituting $x\mapsto1-x$, we get the asymptotic expansion
$$\begin{aligned}
\int_0^1\frac{n^{x-1}}{\Gamma(x)}\,\mathrm{d}x
&=\int_0^1\frac{e^{-x\log(n)}}{\Gamma(1-x)}\,\mathrm{d}x\\
&=\int_0^1\left(1-\gamma x-\left(\tfrac{\pi^2}{12}-\tfrac{\gamma^2}2\right)x^2+O\left(x^3\right)\right)e^{-x\log(n)}\,\mathrm{d}x\\
&=\frac1{\log(n)}-\frac{\gamma}{\log(n)^2}-\frac{\frac{\pi^2}6-\gamma^2}{\log(n)^3}+O\left(\frac1{\log(n)^4}\right)
\end{aligned}\tag{5}$$
Since the difference between $(5)$ for $n$ and $n+1$ is $O\left(\frac1n\right)$, $(4)$ says we have
$$\frac1{n!}\sum_{k=1}^n\frac{{n\brack k}}{k+1}=\frac1{\log(n)}-\frac{\gamma}{\log(n)^2}-\frac{\frac{\pi^2}6-\gamma^2}{\log(n)^3}+O\left(\frac1{\log(n)^4}\right)$$