Is there a slowest rate of divergence of a series?

Some divergent series grow more slowly than others. Writing $f(n)=\sum_{i=1}^n\frac{1}{i}$ for the partial sums of the harmonic series, I will say that $f(n)$ diverges slower than $g(n)$ if $\lim_{n\rightarrow \infty}(g(n)-f(n))=\infty$. Similarly, $\ln(n)$ diverges as fast as $f(n)$, as $\lim_{n \rightarrow \infty}(f(n)-\ln(n))=\gamma$, so they ‘diverge at the same speed’.

I think there are an infinite number of ‘speeds of divergence’ (for example, the partial sums $\sum_{i=1}^n\frac{1}{i^k}$ diverge at different rates for different values of $k<1$). However, is there a slowest speed of divergence?

That is, does there exist a divergent series with partial sums $s(n)$ such that, for any other divergent series with partial sums $S(n)$, either $\lim_{n \rightarrow \infty}(S(n)-s(n))=\infty$ or the limit equals some finite constant $k$? If so, are there infinitely many such slowest series?


The proof in the paper “Neither a worst convergent series nor a best divergent series exists” by J. Marshall Ash is so nice that I wanted to reproduce it below before it gets lost on the Internet.

$\bf{Theorem: }$ Let $\sum_{n=1}^{\infty} c_n$ be any convergent series with positive terms. Then, there exists a convergent series $\sum_{n=1}^{\infty} C_n$ with much bigger terms in the sense that $\lim_{n\rightarrow\infty} C_n/c_n = \infty$. Similarly, for any divergent series $\sum_{n=1}^{\infty} D_n$ with positive terms, there exists a divergent series $\sum_{n=2}^{\infty} d_n$ with much smaller terms in the sense that $\lim_{n\rightarrow\infty} \frac{d_n}{D_n} = 0$.

$\bf{Proof: }$ For each $n$, let $r_n = c_n + c_{n+1}+\cdots$ and $s_n = D_1 + \cdots + D_n$. Letting $C_n = \frac{c_n}{\sqrt{r_n}}$ and $d_n = \frac{D_n}{s_{n-1}}$, we have $\lim_{n\rightarrow\infty} \frac{C_n}{c_n} = \lim_{n\rightarrow\infty} \frac{1}{\sqrt{r_n}}=\infty$ and $\lim_{n\rightarrow\infty} \frac{d_n}{D_n} = \lim_{n\rightarrow\infty} \frac{1}{s_{n-1}} = 0$, so it only remains to check that $\sum C_n$ converges and that $\sum d_n$ diverges. To see that this is indeed the case, simply write $C_n = \frac{1}{\sqrt{r_n}}(r_n-r_{n+1})$ and $d_n = \frac{1}{s_{n-1}}(s_n-s_{n-1})$; observe that $\int_0^{r_1} \frac{dx}{\sqrt{x}}<\infty$ and $\int_{s_1}^{\infty} \frac{dx}{x} = \infty$; and note that the $n$th term of the series $\sum C_n$ is the area of the gray rectangle in Figure 1a, while the $n$th term of the series $\sum d_n$ is the area of the gray rectangle in Figure 1b.

[Figure 1: the terms $C_n$ (a) and $d_n$ (b) drawn as gray rectangles lying under the graphs of $1/\sqrt{x}$ and $1/x$, respectively.]
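Both halves of the construction are easy to watch numerically; the choices $c_n = 2^{-n}$ and $D_n = 1$ below are my own illustrative examples, not from the paper.

```python
import math

# Convergent half: with c_n = 2**-n the tail is r_n = 2**(1-n), and
# C_n = c_n / sqrt(r_n) satisfies C_n/c_n = 2**((n-1)/2) -> infinity,
# yet sum C_n still converges (it is geometric with ratio 1/sqrt(2)).
sum_C = 0.0
for n in range(1, 60):
    c = 2.0 ** -n
    r = 2.0 ** (1 - n)            # r_n = c_n + c_{n+1} + ...
    sum_C += c / math.sqrt(r)     # C_n
print(sum_C)                      # converges to 1/(2 - sqrt(2)) ≈ 1.7071

# Divergent half: with D_n = 1 we get s_n = n, so d_n = 1/(n-1);
# d_n/D_n -> 0, yet sum d_n is the harmonic series and diverges.
sum_d = 0.0
for n in range(2, 10**6):
    sum_d += 1.0 / (n - 1)        # d_n = D_n / s_{n-1}
print(sum_d)                      # ≈ 14.39 and still growing like ln N
```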

Just to expand on Raymond Manzoni’s answer, and to partially answer your question with one of the coolest things I ever learned in my early childhood (as a math undergrad),

consider the family of series,

$$\begin{aligned}
\sum \frac{1}{n}&=\infty \\
\sum \frac{1}{n\ln(n)}&=\infty \\
\sum \frac{1}{n\ln(n)\ln(\ln(n))}&=\infty
\end{aligned}$$

and so on. They all diverge, and you can easily prove this with the integral test. But here is the kicker: the third series requires a googolplex of terms before its partial sums exceed 10. It is only natural that if the natural log is a slowly increasing function, then the log of the log increases even slower.
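The scale of that claim can be checked with a back-of-the-envelope integral-test estimate. This is a crude sketch: it ignores the additive constant coming from the lower limit of integration, which shifts the answer substantially, but by nowhere near enough to bring it below googolplex scale.

```python
import math

# By the integral test, the partial sums of sum 1/(n ln n ln ln n)
# grow roughly like ln(ln(ln(N))), so the sum first exceeds S when
# N is on the order of exp(exp(exp(S))).  N itself overflows a float,
# so we work with log10(log10(N)) instead.

def log10_log10_N(target):
    """log10(log10(N)) for N = exp(exp(exp(target)))."""
    # log10(N) = exp(exp(target)) / ln(10), hence:
    return math.exp(target) * math.log10(math.e) - math.log10(math.log(10))

ll = log10_log10_N(10)
print(ll)   # ≈ 9565.6, i.e. N ≈ 10**(10**9566)
# A googolplex is 10**(10**100), whose log10(log10(.)) is exactly 100;
# ll >> 100, so vastly more than a googolplex of terms is needed.
```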

On the other hand, consider the family,

$$\begin{aligned}
\sum \frac{1}{n^2}&<\infty \\
\sum \frac{1}{n(\ln(n))^2}&<\infty \\
\sum \frac{1}{n\ln(n)(\ln(\ln(n)))^2}&=38.43\ldots<\infty
\end{aligned}$$

all of which converge, as can again be verified using the integral test. But the third series converges so slowly that it requires $10^{3.14\times10^{86}}$ terms before the partial sums reach the two-digit accuracy shown above. Talk about getting close to the “boundary” between convergence and divergence! Using this idea you can make up a series that converges or diverges as slowly as you want. So to answer your question: no, there is no such thing as “the slowest diverging series”. Whatever slowly diverging series you pick, we can come up with one that diverges even slower.

Zwillinger, D. (Ed.). CRC Standard Mathematical Tables and Formulae, 30th
ed. Boca Raton, FL: CRC Press, 1996.
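For what it’s worth, the $10^{3.14\times10^{86}}$ figure can be recovered from the same integral-test reasoning. A rough sketch, in which “two digit accuracy” is read as a tail smaller than $0.005$:

```python
import math

# Tail estimate via the integral test: the remainder of
# sum 1/(n ln n (ln ln n)^2) beyond the Nth term is about 1/ln(ln(N)).
# Two-decimal accuracy needs 1/ln(ln(N)) < 0.005, i.e. N > exp(exp(200)).
log10_N = math.exp(200) / math.log(10)   # log10 of exp(exp(200))
print(f"{log10_N:.2e}")                  # ≈ 3.14e+86, matching the exponent above
```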

The Cauchy condensation test lets us construct infinitely many ever more slowly divergent series:

  • $\displaystyle \sum \frac 1n\;$ behaves like $\;\displaystyle \sum 2^n \frac 1{2^n}=\sum 1\;$ which diverges
  • $\displaystyle \sum \frac 1{n\ln(n)}\;$ behaves like $\;\displaystyle \sum 2^n \frac 1{2^n\;n}=\sum \frac 1n\;$ which diverges
  • $\displaystyle \sum \frac 1{n\ln(n)\ln(\ln(n))}\;$ behaves like $\;\displaystyle \sum 2^n \frac 1{2^n\;n\ln(n)}=\sum \frac 1{n\ln(n)}\;$ which diverges… QED.
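The second bullet can be verified numerically in a few lines (a throwaway sketch, nothing more):

```python
import math

# Condensing f(n) = 1/(n ln n): the condensed terms 2**n * f(2**n)
# simplify to 1/(n ln 2), a constant multiple of the harmonic terms,
# so the condensed series diverges exactly when sum 1/n does.
agree = all(
    math.isclose(2**n * (1.0 / (2**n * math.log(2.0**n))),
                 1.0 / (n * math.log(2)))
    for n in range(2, 20)
)
print(agree)   # True: condensation turns sum 1/(n ln n) into (1/ln 2) * sum 1/n
```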

(from Chaitin’s book ‘Algorithmic Information Theory’)

If $(a_n)_{n\in\mathbb{N}}$ is a sequence of positive terms whose series diverges:
$$\sum_{n=0}^N a_n \xrightarrow[N\to\infty]{}\infty,$$ consider the series with general term $$b_n\stackrel{\rm{}def}{=}\frac{a_n}{\sum_{k=0}^n a_k}.$$
One can show that the series $\sum b_n$ diverges.
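The construction is easy to try out numerically; the choice $a_n = 1$ below is my own example, picked because it makes $b_n$ explicit.

```python
# Numerical sketch of the b_n construction with a_n = 1: then the
# running sum is n+1, so b_n = 1/(n+1), and the partial sums of
# sum b_n grow like ln N -- the new series still diverges, only
# much more slowly than sum a_n.

def slow_partial_sum(a, N):
    """Partial sum of b_n = a_n / (a_0 + ... + a_n) over n = 0..N-1."""
    s = 0.0        # running partial sum of the a_k
    total = 0.0    # running partial sum of the b_k
    for n in range(N):
        s += a(n)
        total += a(n) / s
    return total

print(slow_partial_sum(lambda n: 1.0, 10**6))   # ≈ 14.39 ≈ ln(10**6) + gamma
```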

Let $A(n)$ be an Ackermann function and let $1_{A}$
be its indicator function (i.e. $1_{A}(x)=1$ if $A(n)=x$ for some $n$, and $0$ otherwise). Then $\sum_x 1_{A}(x)$ diverges. One that diverges faster but is more interesting: $s(0)=1$, $s(n+1)= s(n)+ \frac{1}{p_{n}s(n)}$, where $p_{n}$ is the $n$th prime.
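A small-scale sketch of that recursion (the trial-division prime generator is my own helper, adequate only at this size):

```python
# Sketch of the recursion s(0)=1, s(n+1) = s(n) + 1/(p_n * s(n)).
# Squaring both sides shows s(n)**2 grows roughly like 1 + 2 * sum 1/p_k,
# and since sum 1/p diverges (very slowly), s(n) -> infinity at a
# glacial pace.

def first_primes(count):
    """First `count` primes by trial division (fine at this scale)."""
    found = []
    cand = 2
    while len(found) < count:
        is_prime = True
        for p in found:
            if p * p > cand:
                break
            if cand % p == 0:
                is_prime = False
                break
        if is_prime:
            found.append(cand)
        cand += 1
    return found

s = 1.0
for p in first_primes(10_000):
    s += 1.0 / (p * s)
print(s)   # still below 3 after ten thousand steps
```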

Suppose there is a slowest-diverging series. Call it $(a_n)$.

Define $A(n)=\sum_{k=1}^n a_k$. Thus, $A(n)$ is a slowly-diverging function.

Define $B(n)=\ln A(n)$. Thus, $B(n)$ diverges even slower than $A(n)$.

Define $b_n=B(n)-B(n-1)$. The partial sums of $\sum b_n$ telescope to $B(n)-B(1)$, so $(b_n)$ is an even slower diverging series. Contradiction.
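The telescoping argument can be watched in action; taking $a_n = 1/n$ as a concrete starting series (my choice, purely for illustration):

```python
import math

# With a_n = 1/n: A(N) = H_N ~ ln N diverges, B(N) = ln A(N) ~ ln ln N
# diverges far slower, and the partial sums of b_n = B(n) - B(n-1)
# telescope to exactly B(N) - B(1), so sum b_n diverges as well.

N = 10**6
A = 0.0
prev_B = None
partial_b = 0.0
for n in range(1, N + 1):
    A += 1.0 / n                   # A(n), the partial sums of a_n
    B = math.log(A)                # B(n) = ln A(n)
    if prev_B is not None:
        partial_b += B - prev_B    # b_n = B(n) - B(n-1)
    prev_B = B

print(A, partial_b)   # A ≈ 14.39, while partial_b = ln(A) - ln(1) ≈ 2.67
```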