Let $X_i$ for $i=1,2,\dots$ be a sequence of i.i.d. exponential random variables with common parameter $\lambda$. Let $N$ be a geometric random variable with parameter $p$ that is independent of the sequence $X_i$. What is the pdf of the random variable $Y=\sum_{i=1}^N X_i$?

Let’s start by observing that, conditional on $N$, the random variable $Y \mid N$ follows a $\Gamma$-distribution with shape parameter $N$ and rate $\lambda$, so that $\mathsf{E}(Y\mid N) = N \, \mathsf{E}(X) = N/\lambda$.

Then, for $y>0$

$$

f_Y(y) = \mathsf{E}\left(f_{Y|N}(y)\right) = \mathsf{E}\left( \frac{\lambda^N y^{N-1}}{(N-1)!} \mathrm{e}^{-\lambda y} \right) = \sum_{n=1}^\infty \frac{\lambda^n y^{n-1}}{(n-1)!} \mathrm{e}^{-\lambda y} (1-p)^{n-1} p = \lambda p \mathrm{e}^{-\lambda y} \mathrm{e}^{\lambda (1-p) y} = \lambda p \mathrm{e}^{-\lambda p y}

$$

Hence $Y$ is also an exponential random variable, with parameter $\lambda p$.
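As a sanity check, here is a minimal Monte Carlo sketch of this result (the parameter values `lam = 2.0`, `p = 0.3` and the helper `sample_Y` are assumptions of this illustration, not part of the question): draw $N \sim \text{Geometric}(p)$ on $\{1,2,\dots\}$, sum $N$ i.i.d. Exponential$(\lambda)$ draws, and compare the sample mean of $Y$ with $1/(\lambda p)$.

```python
import random

# Monte Carlo sketch: Y = X_1 + ... + X_N with N ~ Geometric(p) on {1, 2, ...}
# and X_i ~ Exponential(lam); per the derivation, Y ~ Exponential(lam * p).
random.seed(0)

lam, p = 2.0, 0.3            # example values (an assumption of this sketch)
n_samples = 200_000

def sample_Y():
    # Draw N ~ Geometric(p): count trials up to and including the first success.
    n = 1
    while random.random() > p:
        n += 1
    # Sum N i.i.d. Exponential(lam) variables.
    return sum(random.expovariate(lam) for _ in range(n))

ys = [sample_Y() for _ in range(n_samples)]
mean_y = sum(ys) / n_samples
print(mean_y)                # should be close to 1 / (lam * p) ≈ 1.667
```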

Another way of seeing this is via the characteristic function:

$$

\phi_Y(t) = \mathsf{E}\left(\exp\left(i t Y\right)\right) = \mathsf{E}\left(\mathsf{E}\left(\exp\left(i t Y\right)\mid N\right)\right) = \mathsf{E}\left( \left(\phi_{X}(t)\right)^N \right) = \frac{p \phi_X(t)}{1-(1-p) \phi_X(t)}

$$

Using $\phi_X(t) = \frac{\lambda}{\lambda - i t}$ and rearranging terms, we get

$$

\phi_Y(t) = \frac{\lambda p}{\lambda p - i t}

$$

confirming that $Y$ is exponentially distributed with parameter $\lambda p$.
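The rearrangement can also be checked numerically. In the sketch below (plain Python; the values $\lambda = 2$, $p = 0.3$ and the sample points are assumptions chosen for illustration), the compound form $\frac{p\,\phi_X(t)}{1-(1-p)\,\phi_X(t)}$ and the closed form $\frac{\lambda p}{\lambda p - it}$ agree at several values of $t$.

```python
# Numerical check of the characteristic-function algebra: substitute
# phi_X(t) = lam / (lam - i t) into p*phi_X / (1 - (1-p)*phi_X) and compare
# with lam*p / (lam*p - i t). Parameter values are example assumptions.
lam, p = 2.0, 0.3

def phi_X(t):
    return lam / (lam - 1j * t)

def phi_Y_compound(t):
    # E(phi_X(t)^N) for N ~ Geometric(p) on {1, 2, ...}
    return p * phi_X(t) / (1 - (1 - p) * phi_X(t))

def phi_Y_exponential(t):
    # Characteristic function of Exponential(lam * p)
    return lam * p / (lam * p - 1j * t)

for t in [-3.0, -1.0, 0.0, 0.5, 2.0]:
    assert abs(phi_Y_compound(t) - phi_Y_exponential(t)) < 1e-12
print("characteristic functions agree")
```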

Another way to understand Sasha’s answer:

Consider a Poisson process with rate $\lambda$. It is known that the waiting time between events is distributed as Exponential$(\lambda)$. Now consider a new process in which you keep each event independently with probability $p$ (and remove it with probability $1-p$). This thinned process is again a Poisson process, with rate $p \lambda$, so its waiting times are Exponential$(p\lambda)$. Observe that the waiting time between events in the new process follows exactly the distribution you are looking for (where the $X_{i}$ are the waiting times in the original process).
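The thinning argument can be illustrated with a short simulation (the parameter values and variable names below are assumptions of this sketch): build a rate-$\lambda$ Poisson process from Exponential$(\lambda)$ gaps, keep each event independently with probability $p$, and check that the gaps between kept events average $1/(p\lambda)$, as for a rate-$p\lambda$ Poisson process.

```python
import random

# Thinning sketch: events of the original rate-lam process are generated via
# Exponential(lam) inter-arrival times; each event is kept with probability p.
random.seed(1)

lam, p = 2.0, 0.3                    # example values (an assumption)
n_events = 300_000

now, last_kept, gaps = 0.0, 0.0, []
for _ in range(n_events):
    now += random.expovariate(lam)   # next event of the original process
    if random.random() < p:          # thin: keep with probability p
        gaps.append(now - last_kept)
        last_kept = now

mean_gap = sum(gaps) / len(gaps)
print(mean_gap)                      # close to 1 / (p * lam) ≈ 1.667
```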

We can also answer this with the following consideration:

The expected value of $Y$ is

$$\mathsf{E}\left(\sum_{i=1}^N X_i\right) = \mathsf{E}_{geom}\left(\mathsf{E}_{exp}\left(\sum_{i=1}^N X_i \,\Big|\, N\right)\right) = \mathsf{E}(N) \cdot \frac{1}{\lambda} = \frac{1}{p\lambda}.$$

So if $Y$ is exponentially distributed, it is so with parameter $p\lambda$; it remains to prove that $Y$ is indeed exponentially distributed. We get help from the following theorem:

A continuous random variable $Y : \Omega \to (0, \infty]$ has an exponential distribution if and only if it has the memoryless property

$$P(Y>s+t|Y>s) = P(Y>t) \text{ for all } s,t \ge 0.$$

We know that the geometric distribution is the only discrete distribution with the analogous property.

Here is where I struggle to give a formal proof. Imagine, however, a “clock” that ticks at exponentially distributed time intervals (i.e., a Poisson process). At any time point, independent of the ticks in the past, there is no added information: the clock does not know how many more times it will tick, because the geometric distribution is memoryless, and it does not know when the next tick will occur, because the exponential distribution is memoryless. So the whole process is memoryless, and $Y$ is exponentially distributed with parameter $p\lambda$.
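While not a formal proof, the memoryless property of $Y$ can be checked empirically. The sketch below (parameter values $\lambda = 2$, $p = 0.3$, $s = 0.5$, $t = 1$ are assumptions of this illustration) estimates $P(Y > s + t \mid Y > s)$ and $P(Y > t)$ from simulated samples of $Y$ and compares them.

```python
import random

# Empirical memoryless check: for Y = X_1 + ... + X_N, estimate
# P(Y > s+t | Y > s) and P(Y > t) and verify they are approximately equal.
random.seed(2)

lam, p = 2.0, 0.3            # example values (assumptions of this sketch)
s, t = 0.5, 1.0
n_samples = 200_000

def sample_Y():
    # N ~ Geometric(p) on {1, 2, ...}, then sum N Exponential(lam) draws.
    n = 1
    while random.random() > p:
        n += 1
    return sum(random.expovariate(lam) for _ in range(n))

ys = [sample_Y() for _ in range(n_samples)]
n_gt_s = sum(y > s for y in ys)
cond = sum(y > s + t for y in ys) / n_gt_s    # P(Y > s + t | Y > s)
uncond = sum(y > t for y in ys) / n_samples   # P(Y > t)
print(cond, uncond)          # both close to exp(-lam * p * t) ≈ 0.549
```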
