Convergence of $\sum \frac{a_n}{S_n ^{1 + \epsilon}}$ where $S_n = \sum_{i = 1} ^ n a_i$

Let $a_n$ be a sequence of positive reals, such that the partial sums $S_n = \sum_{i = 1} ^ n a_i$ diverge to $\infty$. For given $\epsilon > 0$ do we have $$\sum_{n = 1} ^ \infty \frac{a_n}{S_n^{1 + \epsilon}} < \infty?$$

For $\epsilon \ge 1$ we can resolve this quickly by noting $$\frac{a_n}{S_n ^ 2} \le \frac 1 {S_{n - 1}} - \frac 1 {S_n},$$ so for sufficiently large $n$ we can bound $\frac{a_n}{S_n^{1 + \epsilon}}$ by $\frac 1 {S_{n - 1}} - \frac 1 {S_n}$ as well. I'm wondering if this is true for arbitrary $\epsilon > 0$. I know that the series in question diverges for $\epsilon = 0$, so all that is missing is what happens for $\epsilon \in (0, 1)$.
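As a quick numerical sanity check (my addition, not part of the original question), take $a_n = 1$, so $S_n = n$ and the series becomes $\sum n^{-(1+\epsilon)}$: the partial sums level off for $\epsilon = 0.5$ but keep growing for $\epsilon = 0$.

```python
# Numerical sketch: with a_n = 1 we have S_n = n, so the series is
# sum_{n >= 1} 1 / n^(1 + eps).

def partial_sum(eps, N):
    """Partial sum of a_n / S_n^(1+eps) up to N, for a_n = 1 (S_n = n)."""
    return sum(1.0 / n ** (1.0 + eps) for n in range(1, N + 1))

for eps in (0.0, 0.5):
    print(eps, [round(partial_sum(eps, N), 4) for N in (100, 1000, 10000)])
```

For $\epsilon = 0$ this is the harmonic series, growing like $\ln N$; for $\epsilon = 0.5$ the partial sums approach $\zeta(3/2) \approx 2.612$.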


I proved something like this a while ago, where I showed that if $0<a_{n-1}\le a_n$ and $\epsilon>0$, then
$$
\sum_{n=1}^\infty\frac{a_n-a_{n-1}}{a_n^{1+\epsilon}}
$$
converges. I believe this is the same situation, where my $a_n$ is the $S_n$ in this problem. However, there is no requirement that $S_n$ (my $a_n$) diverges. Here is the proof I gave, with my $a_n$ replaced by $S_n$.

By the Mean Value Theorem, for some $z_n$ between $S_{n-1}$ and $S_n$, we have
$$
\frac{1}{S_{n-1}^\epsilon}-\frac{1}{S_n^\epsilon}=\epsilon\frac{S_n-S_{n-1}}{z_n^{1+\epsilon}}
$$
Let us use this in the following telescoping series
$$
\begin{align}
\frac{1}{S_{k-1}^\epsilon}-\frac{1}{S_N^\epsilon}
&=\sum_{n=k}^N\left(\frac{1}{S_{n-1}^\epsilon}-\frac{1}{S_n^\epsilon}\right)\\
&=\sum_{n=k}^N\;\epsilon\frac{S_n-S_{n-1}}{z_n^{1+\epsilon}}\\
&=\sum_{n=k}^N\;\epsilon\left(\frac{S_n}{z_n}\right)^{1+\epsilon}\;\frac{S_n-S_{n-1}}{S_n^{1+\epsilon}}\\
&\ge\epsilon\sum_{n=k}^N\frac{S_n-S_{n-1}}{S_n^{1+\epsilon}}\\
&=\epsilon\sum_{n=k}^N\frac{a_n}{S_n^{1+\epsilon}}
\end{align}
$$
This last inequality, along with the fact that $\frac{1}{S_n^\epsilon}$ is a non-increasing sequence bounded below by $0$, implies that the summation converges.
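A quick numerical check of the final inequality (my addition; the positive sequence here is chosen at random purely for illustration): the chain above gives $\epsilon\sum_{n=k}^N \frac{a_n}{S_n^{1+\epsilon}} \le \frac{1}{S_{k-1}^\epsilon} - \frac{1}{S_N^\epsilon}$.

```python
import itertools
import random

# Check  eps * sum_{n=k}^N a_n / S_n^(1+eps)  <=  1/S_{k-1}^eps - 1/S_N^eps
# for an arbitrary positive sequence.
random.seed(0)
eps = 0.3
a = [random.uniform(0.1, 2.0) for _ in range(1000)]  # positive terms
S = list(itertools.accumulate(a))                    # S[n] = a[0] + ... + a[n]

k, N = 5, 999
lhs = eps * sum(a[n] / S[n] ** (1 + eps) for n in range(k, N + 1))
rhs = 1.0 / S[k - 1] ** eps - 1.0 / S[N] ** eps
print(lhs, "<=", rhs, ":", lhs <= rhs)
```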

I believe using $\displaystyle \frac{1}{S_{n-1}^{\varepsilon}} - \frac{1}{S_n^{\varepsilon}}$ will work for $\varepsilon \gt 0$.

(For $\varepsilon = 1$ this is the same bound as yours.)

$\displaystyle \frac{1}{S_{n-1}^{\varepsilon}} - \frac{1}{S_n^{\varepsilon}} = \frac{1}{(S_n - a_n)^{\varepsilon}} - \frac{1}{S_n^{\varepsilon}}$

If $\displaystyle t = \frac{a_n}{S_n}$, then this is the same as

$\displaystyle \frac{1}{S_n^\varepsilon}((1-t)^{-\varepsilon} - 1) \ge \frac{\varepsilon t}{S_n^{\varepsilon}} = \frac{\varepsilon a_n}{S_n^{1 + \varepsilon}}$

(We used Bernoulli's inequality $(1+x)^r \ge 1 + rx$ for $x \gt -1$, $r \le 0$, here with $x = -t$ and $r = -\varepsilon$. Even the binomial theorem will work.)
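As a small spot-check (my own addition, not part of the answer), the per-term bound that falls out of the Bernoulli step is $\frac{1}{S_{n-1}^\varepsilon} - \frac{1}{S_n^\varepsilon} \ge \frac{\varepsilon a_n}{S_n^{1+\varepsilon}}$, which we can verify term by term:

```python
import itertools

# Verify  1/S_{n-1}^eps - 1/S_n^eps  >=  eps * a_n / S_n^(1+eps)  for all n.
eps = 0.25
a = [1.0 / n for n in range(1, 500)]   # any positive sequence will do
S = list(itertools.accumulate(a))      # S[n] = a[0] + ... + a[n]

ok = all(
    1.0 / S[n - 1] ** eps - 1.0 / S[n] ** eps >= eps * a[n] / S[n] ** (1 + eps)
    for n in range(1, len(a))
)
print(ok)
```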

I’d like to offer an alternative proof of this, even though it’s an old question and an answer has been accepted, since I think it’s simpler and more geometric, and this is a fact I needed to use in an assignment recently.

Let $p>1$. Then $\displaystyle \int_{S_{n-1}}^{S_n}\frac{dx}{x^p}\ge \int_{S_{n-1}}^{S_n}\frac{dx}{S_n^p}=\frac{a_n}{S_n^p}$ $\displaystyle\implies\frac{1}{p-1}\frac{1}{S_1^{p-1}}\ge\frac{1}{p-1}\bigg(\frac{1}{S_1^{p-1}}-\frac{1}{S_n^{p-1}}\bigg)=\int_{S_{1}}^{S_n}\frac{dx}{x^p}\ge \sum\limits_{k=2}^{n}\frac{a_k}{S_k^p}$, and so the sum converges.
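A numerical sketch (my addition) of the bound just derived: for any positive sequence, the partial sums of $\sum_{k\ge 2} a_k/S_k^p$ stay below $\frac{1}{(p-1)S_1^{p-1}}$.

```python
import itertools

# Check  sum_{k=2}^n a_k / S_k^p  <=  1 / ((p - 1) * S_1^(p - 1))  for p > 1.
p = 1.5
a = [0.5 + (k % 3) for k in range(1, 2001)]  # arbitrary positive terms
S = list(itertools.accumulate(a))            # S[k] = a[0] + ... + a[k]

total = sum(a[k] / S[k] ** p for k in range(1, len(a)))  # terms k = 2 .. n
bound = 1.0 / ((p - 1) * S[0] ** (p - 1))
print(total, "<=", bound, ":", total <= bound)
```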

For completeness I will also prove a case I haven't seen addressed here: for $p=1$ the series diverges whenever the partial sums diverge and the sequence $(a_n)$ is bounded. We have $\displaystyle \ln(S_n)-\ln(S_1)=\int_{S_1}^{S_n}\frac{dx}{x}=\sum\limits_{k=2}^n\int_{S_{k-1}}^{S_k}\frac{dx}{x}\le \sum\limits_{k=2}^n\frac{a_k}{S_{k-1}}$, and since $S_n\to\infty$ as $n\to\infty$, the sum $\sum_k\frac{a_k}{S_{k-1}}$ diverges. Since the sequence is bounded, $\frac{S_n}{S_{n-1}}=\frac{S_{n-1}+a_n}{S_{n-1}}\to 1$ as $n\to\infty$, so by the limit comparison test the sums of $\frac{a_n}{S_n}$ and $\frac{a_n}{S_{n-1}}$ converge or diverge together, and hence $\displaystyle\sum\limits_n\frac{a_n}{S_n}$ diverges.