PDF of a sum of exponential random variables

Let $X_i$ for $i=1,2,\ldots$ be a sequence of i.i.d. exponential random variables with common parameter $\lambda$. Let $N$ be a geometric random variable with parameter $p$ that is independent of the sequence $X_i$. What is the pdf of the random variable $Y=\sum_{i=1}^N X_i$?


Let’s start by observing that the conditional random variable $Y \mid N$ follows a $\Gamma$-distribution with shape parameter $N$ and rate $\lambda$, so its mean is $\mathsf{E}(Y \mid N) = N \, \mathsf{E}(X) = N/\lambda$.

Then, for $y>0$
$$
f_Y(y) = \mathsf{E}\left(f_{Y|N}(y)\right) = \mathsf{E}\left( \frac{\lambda^N y^{N-1}}{(N-1)!} \mathrm{e}^{-\lambda y} \right) = \sum_{n=1}^\infty \frac{\lambda^n y^{n-1}}{(n-1)!} \mathrm{e}^{-\lambda y} (1-p)^{n-1} p = \lambda p \mathrm{e}^{-\lambda y} \mathrm{e}^{\lambda (1-p) y} = \lambda p \mathrm{e}^{-\lambda p y}
$$
Hence $Y$ is also an exponential random variable, with parameter $\lambda p$.
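As a sanity check of this result, here is a small Monte Carlo sketch (the parameter values $\lambda = 2$, $p = 0.3$ are chosen for illustration, not from the original post): draw $N \sim \text{Geometric}(p)$ on $\{1, 2, \ldots\}$, sum $N$ i.i.d. $\text{Exponential}(\lambda)$ draws, and compare the sample mean of $Y$ with $1/(\lambda p)$, the mean of an $\text{Exponential}(\lambda p)$ variable.

```python
import random

random.seed(0)
lam, p, trials = 2.0, 0.3, 200_000   # illustrative values, not from the post

def geometric_sum(lam, p):
    # N ~ Geometric(p) on {1, 2, ...}: count trials up to the first success
    n = 1
    while random.random() >= p:
        n += 1
    # Sum N i.i.d. Exponential(lam) variables
    return sum(random.expovariate(lam) for _ in range(n))

samples = [geometric_sum(lam, p) for _ in range(trials)]
mean_y = sum(samples) / trials
print(mean_y)   # close to 1 / (lam * p) = 1.666...
```

A full check would compare the empirical distribution (not just the mean) against $\text{Exponential}(\lambda p)$, but the mean already agrees with the density derived above.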

Another way of seeing this is via the characteristic function:
$$
\phi_Y(t) = \mathsf{E}\left(\exp\left(i t Y\right)\right) = \mathsf{E}\left(\mathsf{E}\left(\exp\left(i t Y\right)\mid N\right)\right) = \mathsf{E}\left( \left(\phi_{X}(t)\right)^N \right) = \frac{p \phi_X(t)}{1-(1-p) \phi_X(t)}
$$
Using $\phi_X(t) = \frac{\lambda}{\lambda - i t}$ and rearranging terms, we get
$$
\phi_Y(t) = \frac{\lambda p}{\lambda p - i t}
$$
confirming that $Y$ is exponentially distributed with parameter $\lambda p$.
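The algebraic step from $\frac{p \phi_X(t)}{1-(1-p)\phi_X(t)}$ to $\frac{\lambda p}{\lambda p - it}$ can be spot-checked numerically with plain complex arithmetic (a sketch with illustrative values of $\lambda$ and $p$, not from the original post):

```python
# Evaluate p*phi_X(t) / (1 - (1-p)*phi_X(t)) at several t and compare with
# lam*p / (lam*p - 1j*t), the characteristic function of Exponential(lam*p).
lam, p = 2.0, 0.3   # illustrative values

def phi_x(t):
    return lam / (lam - 1j * t)

def phi_y(t):
    return p * phi_x(t) / (1 - (1 - p) * phi_x(t))

def phi_exp_lp(t):
    return lam * p / (lam * p - 1j * t)

for t in (-3.0, -0.5, 0.0, 1.0, 4.0):
    print(t, abs(phi_y(t) - phi_exp_lp(t)))   # all differences ≈ 0
```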

Another way to understand Sasha’s answer:

Consider a Poisson process with rate $\lambda$. It is known that the waiting time between events is distributed according to an Exponential$(\lambda)$. Now consider a new process in which you keep each event independently with probability $p$ (and remove it with probability $1-p$). This thinned process is again a Poisson process, with rate $p \lambda$, so its waiting times are Exponential($p\lambda$). Observe that the waiting time between events in the new process has exactly the distribution you are looking for: between two kept events there are $N-1$ removed events, where $N$ is Geometric($p$), and the waiting time is the sum of the $N$ corresponding inter-event times $X_i$ of the original process.
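The thinning argument can be sketched in simulation (parameter values are assumed for illustration, not from the original post): generate the arrival times of a rate-$\lambda$ Poisson process, keep each event with probability $p$, and check that the waiting times of the thinned process have mean $1/(p\lambda)$.

```python
import random

random.seed(1)
lam, p, n_events = 2.0, 0.3, 300_000   # illustrative values

arrival, kept = 0.0, []
for _ in range(n_events):
    arrival += random.expovariate(lam)   # next event of the rate-lam process
    if random.random() < p:              # keep the event with probability p
        kept.append(arrival)

# Waiting times between consecutive kept events
waits = [b - a for a, b in zip(kept, kept[1:])]
mean_wait = sum(waits) / len(waits)
print(mean_wait)   # close to 1 / (p * lam) = 1.666...
```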

We can also answer this with the following consideration:

The expected value of $Y$ is
$$E\left(\sum_{i=1}^N X_i\right) = E_{geom}\left(E_{exp}\left(\sum_{i=1}^N X_i \,\Big|\, N\right)\right) = E_{geom}\left(\frac{N}{\lambda}\right) = \frac{1}{p\lambda}.$$

So if $Y$ is exponentially distributed at all, its parameter must be $p\lambda$; it remains to show that it is indeed exponentially distributed. We get help from the following theorem:

A continuous random variable $Y : \Omega \to (0, \infty]$ has an exponential distribution if and only if it has the memoryless property
$$P(Y>s+t|Y>s) = P(Y>t) \text{ for all } s,t \ge 0.$$

We know that the geometric distribution is the only discrete distribution with the analogous memoryless property.

Here is where I struggle to give a formal proof. Imagine, however, a “clock” that ticks in exponentially distributed time intervals (i.e., a Poisson process). At any time point, independent of ticks in the past, there is no added information: the clock does not know how often it will still tick, because the geometric distribution is memoryless, and it does not know when the next tick will come, because the exponential distribution is memoryless. So the whole process is memoryless, and $Y$ is exponentially distributed with parameter $p\lambda$.
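The memorylessness claim can at least be checked empirically (a sketch with assumed parameter values and a particular choice of $s$ and $t$, not from the original post): estimate $P(Y>s+t \mid Y>s)$ and $P(Y>t)$ from simulated geometric sums of exponentials; for a memoryless $Y$ the two should agree, and both should equal $e^{-\lambda p t}$.

```python
import random

random.seed(2)
lam, p, trials = 2.0, 0.3, 300_000   # illustrative values
s, t = 1.0, 0.8                      # arbitrary choice for the check

def geometric_sum():
    # Y = sum of N i.i.d. Exponential(lam), N ~ Geometric(p) on {1, 2, ...}
    total = random.expovariate(lam)
    while random.random() >= p:
        total += random.expovariate(lam)
    return total

ys = [geometric_sum() for _ in range(trials)]
p_gt_t = sum(y > t for y in ys) / trials
survivors = [y for y in ys if y > s]
p_cond = sum(y > s + t for y in survivors) / len(survivors)
print(p_gt_t, p_cond)   # both ≈ exp(-lam * p * t)
```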