# application of strong vs weak law of large numbers

By definition, the weak law states that for a specified large $n$, the average is likely to be near $\mu$. Thus, it leaves open the possibility that $|\bar{X}_n-\mu| > \eta$ happens an infinite number of times, although at infrequent intervals.

The strong law shows that this almost surely will not occur. In particular, it implies that with probability 1, for any $\eta > 0$ the inequality $|\bar{X}_n-\mu| < \eta$ holds for all large enough $n$.

Now my question is about applying these laws. How do I know whether a given distribution satisfies the strong law or only the weak law? For example, let $X_1, X_2, \ldots$ be i.i.d. random variables with finite variances and zero means. Does the sample mean $\frac{\sum_{k=1}^{n} X_k}{n}$ converge to $0$ almost surely (strong law of large numbers) or only in probability (weak law of large numbers)?

#### Answers

If $X_1,X_2,\ldots$ is a sequence of i.i.d. random variables with finite mean $\mu$ (in your example, $\mu = 0$),
then by the strong law of large numbers, $\frac{{\sum\nolimits_{i = 1}^n {X_i } }}{n}$ converges to $\mu$ almost surely. In particular, $\frac{{\sum\nolimits_{i = 1}^n {X_i } }}{n}$ converges to $\mu$ in probability. So, you actually don’t have to assume finite variance.
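A quick simulation illustrates this almost-sure convergence. The sketch below (my own illustration, not part of the quoted answer) uses standard normal draws, which satisfy the hypotheses with $\mu = 0$, and tracks the running average $S_n/n$:

```python
import random

def running_means(samples):
    """Yield the sequence of partial averages S_n / n."""
    total = 0.0
    for n, x in enumerate(samples, start=1):
        total += x
        yield total / n

random.seed(0)
# i.i.d. draws with mean 0 and finite variance (standard normal here;
# by the SLLN any i.i.d. sequence with finite mean would do).
xs = [random.gauss(0.0, 1.0) for _ in range(100_000)]
means = list(running_means(xs))

# The running mean settles near mu = 0 as n grows.
print(means[99], means[9_999], means[99_999])
```

Rerunning with different seeds, the final running mean stays close to $0$, consistent with the path-wise convergence the strong law guarantees.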

From section 7.4 of Grimmett and Stirzaker’s Probability and Random Processes (3rd edition).

The independent and identically distributed sequence $(X_n)$, with common distribution function $F$, satisfies $${1\over n} \sum_{i=1}^n X_i\to \mu$$ in probability for some constant $\mu$ if and only if the characteristic function $\phi$ of $X_n$ is differentiable at $t=0$ and $\phi^\prime(0)=i \mu$.

For instance, the weak law holds but the strong law fails for $\mu=0$ and symmetric random variables with $1-F(x)\sim 1/(x\log(x))$ as $x\to\infty$.
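To see why the strong law fails in that example (a standard argument, added here for completeness): the tail condition forces an infinite mean, since
$$E|X_1| = \int_0^\infty P(|X_1| > x)\,dx \ \ge\ \int_c^\infty \frac{dx}{x\log x} = \infty$$
for some $c > 1$. Then $\sum_n P(|X_n| > n) = \infty$, so by the second Borel–Cantelli lemma $|X_n| > n$ infinitely often almost surely, which prevents $\frac{1}{n}\sum_{i=1}^n X_i$ from converging. The weak law still holds because the symmetry makes $\phi$ differentiable at $0$ with $\phi'(0) = 0$, satisfying the criterion above with $\mu = 0$.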