If $\mu(X)=1$, then $\lim_{p\rightarrow 0}\|f\|_p=\exp\left(\int_X\ln(|f|)d\mu\right)$

I am trying to prove the following result:

Let $(X,\mu)$ be a measure space of measure 1 and $f$ a complex-valued function on $X$ such that there exists $r>0$ satisfying $\|f\|_r<\infty$. Prove that $\lim_{p\rightarrow 0}\|f\|_p=\exp\left(\int_X\ln(|f|)d\mu\right)$.

I have already proved that the right-hand side is at most the left-hand side, using Jensen's inequality. However, I find the reverse inequality much harder. I thought about defining $X_-=\{x\in X:|f(x)|<1\}$ and $X_+=\{x\in X:|f(x)|\geq1\}$, and proving that

$$\forall 0<a<1, \forall b>1, \exists p>0: a\int_{X_-}\ln(|f|)d\mu + b\int_{X_+}\ln(|f|)d\mu\geq \frac{1}{p}\ln\left(\int_X |f|^p d\mu\right),$$

which would imply the result by exponentiating, letting $a,b\rightarrow 1$, and recalling that $\|f\|_p$ decreases as $p$ decreases (since $\mu(X)=1$). But I haven't found a way to prove that this last inequality holds for some $p$. Is there a way to do it? (Or maybe a simpler way to solve the problem?)
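As a quick numerical sanity check of the statement (the concrete example $f(x)=x$ on $[0,1]$ with Lebesgue measure, and the helper `p_norm`, are my own illustration, not part of the problem — chosen because both sides have closed forms):

```python
import math

# f(x) = x on X = [0, 1] with Lebesgue measure, so mu(X) = 1.
# Closed forms:
#   ||f||_p = (int_0^1 x^p dx)^(1/p) = (1/(1+p))^(1/p),
#   exp(int_0^1 ln(x) dx) = exp(-1).

def p_norm(p):
    """||f||_p for f(x) = x on [0, 1]."""
    return (1.0 / (1.0 + p)) ** (1.0 / p)

limit = math.exp(-1.0)

for p in (1e-1, 1e-3, 1e-6):
    print(f"p = {p:g}:  ||f||_p = {p_norm(p):.8f}  (limit = {limit:.8f})")
```

The computed values decrease toward $e^{-1}$ as $p\to 0$, consistent with the monotonicity of $p\mapsto\|f\|_p$ when $\mu(X)=1$.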

Answers

The proof can be found in Section 6.8 of the classic book Inequalities by Hardy, Littlewood and Pólya. As Stephen Montgomery-Smith remarks in his answer, the proof is rather easy if $\log |f|$ stays bounded. In the general case, Hardy suggests writing
$$
\int_X \log |f| \, d\mu \leq \log \|f\|_q \leq \int_X \frac{|f|^q-1}{q} \, d\mu
$$
(the left inequality is Jensen's, the right one follows from $\log t \leq t-1$) and remarking that, for $t>0$, the map $q \mapsto \frac{t^q-1}{q}$ decreases to $\log t$ as $q$ decreases to zero. Since $\frac{|f|^r-1}{r}$ is integrable (because $\|f\|_r<\infty$), monotone convergence applies to this decreasing family: let $q \to 0^+$ and conclude.
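Spelling out how the two bounds combine (my own sketch of the endgame of this squeeze):
$$
\int_X \log |f|\,d\mu \;\leq\; \liminf_{q\to 0^+} \log\|f\|_q \;\leq\; \limsup_{q\to 0^+} \log\|f\|_q \;\leq\; \lim_{q\to 0^+} \int_X \frac{|f|^q-1}{q}\,d\mu \;=\; \int_X \log |f|\,d\mu,
$$
so $\log\|f\|_q \to \int_X \log|f|\,d\mu$, and exponentiating gives $\|f\|_q \to \exp\left(\int_X \log|f|\,d\mu\right)$. This also covers the case $\int_X \log|f|\,d\mu = -\infty$, where the limit of $\|f\|_q$ is $0$.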

First do the case where $\ln|f|$ is bounded. Use $|f|^p = \exp(p\,\ln|f|) = 1 + p \ln|f| + O(p^2)$ as $p \to 0$, uniformly in $x$. Then
$$ \left(\int |f|^p \, d\mu\right)^{1/p} = \left(1 + p\int \ln|f| \, d\mu + O(p^2)\right)^{1/p} $$
and use $(1+px + O(p^2))^{1/p} \to e^x$ as $p\to 0$. To get it for unbounded functions, you probably have to use some kind of dominated or monotone convergence theorem, but I’ll let you figure out those details.
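For completeness, here is the bounded-case computation in one chain (my own write-up, assuming $|\ln|f|| \leq M$ so that the $O$-terms are uniform):
$$
\left(\int_X |f|^p \, d\mu\right)^{1/p}
= \exp\left(\frac{1}{p}\ln\left(1 + p\int_X \ln|f| \, d\mu + O(p^2)\right)\right)
= \exp\left(\int_X \ln|f| \, d\mu + O(p)\right)
\xrightarrow[p \to 0]{} \exp\left(\int_X \ln|f| \, d\mu\right).
$$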