Articles on random variables

Difference between two independent binomial random variables with equal success probability

Let $X \sim \mathrm{Bin}(n,p)$ and $Y \sim \mathrm{Bin}(m,p)$ be two independent random variables. Find the distribution of $Z=X-Y$. See also Difference of two binomial random variables. I figured this out: $$ P(Z=z)=\cases{\sum_{i=0}^{\min(m,n)} \mathrm{Bin}(z+i;n,p)\cdot \mathrm{Bin}(i;m,p), &if $z\ge0$;\cr \sum_{i=0}^{\min(m,n)} \mathrm{Bin}(i;n,p) \cdot \mathrm{Bin}(i-z;m,p),&otherwise. \cr}$$ I also validated it by Monte Carlo simulation. For $n=30$, $m=20$ and $p=0.5$, […]
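A quick numerical sanity check of the piecewise formula above can be sketched with NumPy/SciPy; the parameter values $n=30$, $m=20$, $p=0.5$ are the ones quoted in the question, and the sample size and seed are arbitrary choices:

```python
import numpy as np
from scipy.stats import binom

def pmf_diff(z, n, m, p):
    """P(Z = z) for Z = X - Y with X ~ Bin(n, p), Y ~ Bin(m, p) independent.

    Convolution over the index i; the range is clipped so that every
    binomial pmf argument stays inside its support.
    """
    if z >= 0:
        i = np.arange(0, min(n - z, m) + 1)
        return float(np.sum(binom.pmf(z + i, n, p) * binom.pmf(i, m, p)))
    i = np.arange(0, min(n, m + z) + 1)
    return float(np.sum(binom.pmf(i, n, p) * binom.pmf(i - z, m, p)))

# Monte Carlo validation with the question's values
rng = np.random.default_rng(0)
n, m, p = 30, 20, 0.5
samples = rng.binomial(n, p, 200_000) - rng.binomial(m, p, 200_000)
empirical = np.mean(samples == 5)   # empirical P(Z = 5)
```

Summing `pmf_diff` over the full support $z \in \{-m,\dots,n\}$ should give 1, and each value should match the Monte Carlo frequency to within sampling error.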

Show that $ \mathbb{E}[X] < \infty \Longleftrightarrow \sum_{n=1}^\infty \mathbb{P}[X>n] < \infty $ for random variable $X$

Let $X$ be a discrete random variable taking values in $\mathbb{N}$. Show that $ \mathbb{E}[X] < \infty \Longleftrightarrow \sum_{n=1}^\infty \mathbb{P}[X>n] < \infty $. I am not that good at math, but could you please explain to me why this does not work the way I think it does? I am going to think out loud […]
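For reference, the standard route is the tail-sum (layer-cake) identity, obtained by writing $k$ as a sum of $k$ ones and swapping the order of summation; a sketch:

```latex
\mathbb{E}[X] \;=\; \sum_{k=1}^{\infty} k\,\mathbb{P}[X=k]
\;=\; \sum_{k=1}^{\infty} \sum_{n=0}^{k-1} \mathbb{P}[X=k]
\;=\; \sum_{n=0}^{\infty} \mathbb{P}[X>n].
```

Since $\mathbb{P}[X>0]\le 1$, the $n=0$ term never affects finiteness, so $\mathbb{E}[X]<\infty$ if and only if $\sum_{n=1}^\infty \mathbb{P}[X>n]<\infty$.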

Show that the collection $C_1$ of intervals $[\frac{k}{2^n},\frac{k+1}{2^n})$ generates the Borel σ-algebra on $\mathbb{R}$.

Let $C_1$ be the collection of intervals of the form $[\frac{k}{2^n},\frac{k+1}{2^n})$, where $n = 1, 2, 3, \ldots$ and $k \in \mathbb{Z}$, together with the empty set. (a) Show that $C_1$ generates the Borel σ-algebra on $\mathbb{R}$. (b) Show clearly and explicitly that $C_1$ is a π-system. I think for first […]
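One possible sketch for part (a): every dyadic interval is Borel, and conversely every open interval is a countable union of dyadic intervals,

```latex
(a,b) \;=\; \bigcup \Bigl\{ \bigl[\tfrac{k}{2^n}, \tfrac{k+1}{2^n}\bigr) \;:\; n \ge 1,\; k \in \mathbb{Z},\; a < \tfrac{k}{2^n},\; \tfrac{k+1}{2^n} \le b \Bigr\},
```

so $\sigma(C_1)$ contains all open intervals and hence all Borel sets. For (b), two dyadic intervals are either disjoint or nested, so their intersection is either $\emptyset$ or the shorter of the two, both of which lie in $C_1$.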

Kolmogorov-Smirnov two-sample test

I want to test whether two samples are drawn from the same distribution. I generated two random arrays and used a Python function to derive the KS statistic $D$ and the two-tailed p-value $P$: >>> import numpy as np >>> from scipy import stats >>> a=np.random.randint(1,10,4) >>> a array([3, 7, 4, 3]) >>> b=np.random.randint(1,10,5) >>> […]
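A minimal sketch of the two-sample test with `scipy.stats.ks_2samp`, using larger continuous samples than the question's tiny integer arrays (sample sizes, distributions, and seed here are illustrative choices, not from the question):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
a = rng.normal(0, 1, 500)   # sample 1
b = rng.normal(0, 1, 500)   # drawn from the same distribution as a
c = rng.normal(1, 1, 500)   # shifted by one standard deviation

# D is the maximum distance between the two empirical CDFs;
# a small p-value rejects "same distribution"
d_same, p_same = stats.ks_2samp(a, b)
d_diff, p_diff = stats.ks_2samp(a, c)
```

With only 4 and 5 observations, as in the question, the KS test has very little power, so a large p-value there says almost nothing.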

Characteristic function of a product of random variables

I am facing the following problem. Let $X,Y$ be independent random variables with standard normal distribution. Find the characteristic function of a variable $ XY $. I have found some information, especially the fact that if $ \phi_X,\phi_Y $ denote characteristic functions, then $$ \phi_{XY}(t) = \mathbb{E}\phi_X(tY).$$ The only problem is that the proof required […]
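If helpful, the computation can be finished by conditioning on $Y$ and then integrating against the standard normal density; a sketch:

```latex
\varphi_{XY}(t) = \mathbb{E}\bigl[e^{itXY}\bigr]
= \mathbb{E}\bigl[\mathbb{E}[e^{itXY}\mid Y]\bigr]
= \mathbb{E}\bigl[\varphi_X(tY)\bigr]
= \mathbb{E}\bigl[e^{-t^2Y^2/2}\bigr]
= \int_{-\infty}^{\infty} e^{-t^2y^2/2}\,\frac{e^{-y^2/2}}{\sqrt{2\pi}}\,dy
= \frac{1}{\sqrt{1+t^2}}.
```

The last step is the Gaussian integral $\mathbb{E}[e^{-aY^2}] = (1+2a)^{-1/2}$ with $a = t^2/2$.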

Variance of sample mean (problems with proof)

Assume that I have $\{x_1,\ldots, x_N\}$, an iid (independent identically distributed) sample of size $N$ of observations of a random variable $\xi$ with unknown mean $m_1$, variance (second central moment) $m_{c_2}$, and second raw moment $m_2$. I try to use the sample mean $\overline{x}=\frac{1}{N}\sum_{i=1}^N x_i$ as an estimator of the true mean. So I want to find its […]
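The target identity, $\operatorname{Var}(\overline{x}) = m_{c_2}/N$ for iid observations, can be checked numerically; a sketch with arbitrary illustrative parameters (normal data, $N=50$, variance 4):

```python
import numpy as np

rng = np.random.default_rng(1)
N, sigma2, reps = 50, 4.0, 20_000

# `reps` independent samples of size N from a distribution with variance sigma2
data = rng.normal(0.0, np.sqrt(sigma2), size=(reps, N))
sample_means = data.mean(axis=1)

# For iid data, Var(sample mean) = sigma2 / N
var_of_mean = sample_means.var()
```

The empirical variance of the sample means should be close to $\sigma^2/N = 0.08$.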

Find the probability density function of $Y=3X+2$.

A continuous random variable $X$ has probability density function $$f(x)=\begin{cases}\frac{3}{5}e^{-3x/5}, & x>0 \\ 0, & x\leq0\end{cases}$$ Then find the probability density function of $Y=3X+2$. I have difficulty solving this problem: I am confused about how to relate the density function of $X$ to that of $Y$. So I looked up hints, so […]
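The change-of-variables formula gives $f_Y(y) = f_X\bigl(\frac{y-2}{3}\bigr)\cdot\frac{1}{3} = \frac{1}{5}e^{-(y-2)/5}$ for $y>2$, i.e. a shifted exponential. A sketch of a numerical check (sample size and seed are arbitrary):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
# X has density (3/5) e^{-3x/5}, i.e. exponential with scale 5/3
x = rng.exponential(scale=5/3, size=100_000)
y = 3 * x + 2

# Candidate from change of variables:
# f_Y(y) = f_X((y-2)/3) * 1/3 = (1/5) e^{-(y-2)/5} for y > 2,
# i.e. exponential with loc=2 and scale=5
d, p = stats.kstest(y, stats.expon(loc=2, scale=5).cdf)
```

A tiny KS distance `d` indicates the sample of $Y$ matches the proposed density.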

Distribution of the lifetime of a system consisting of two exponentially distributed components, one being backup

I have a system consisting of components $S_1$ and $S_2$ whose lifetimes $T_1$ and $T_2$ follow the exponential distribution with parameter $\lambda$. At time $t=0$ the component $S_1$ is switched on and $S_2$ is kept off until $S_1$ fails, at which point $S_2$ is immediately switched on. What is the distribution of the lifetime of the system? To […]
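Since the system lifetime is $T_1 + T_2$, the sum of two independent $\mathrm{Exp}(\lambda)$ variables, it follows a Gamma (Erlang-2) distribution with shape 2 and rate $\lambda$. A numerical sketch (the value $\lambda = 0.5$, the sample size, and the seed are arbitrary illustrative choices):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
lam = 0.5
t1 = rng.exponential(1/lam, 100_000)   # lifetime of S1
t2 = rng.exponential(1/lam, 100_000)   # lifetime of S2 (used after S1 fails)
lifetime = t1 + t2

# Sum of two iid Exp(lam) is Gamma(shape=2, scale=1/lam), i.e. Erlang-2
d, p = stats.kstest(lifetime, stats.gamma(a=2, scale=1/lam).cdf)
```

The mean lifetime is $2/\lambda$, twice that of a single component, as expected for a cold-standby backup.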

Density of Gaussian Random variable conditioned on sum

I am struggling with this simple problem. I have two independent Gaussian random variables $X \sim \mathcal{N}(0,\sigma_x^2)$, $Y \sim \mathcal{N}(0,\sigma_y^2)$. I have to find the density of $X$ given that $X + Y > 0$. I know that $X, X+Y$ are jointly normally distributed, and I also know the forms of conditional distribution of […]
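One candidate answer, obtained from $f_{X \mid X+Y>0}(x) = f_X(x)\,\mathbb{P}[Y > -x]/\mathbb{P}[X+Y>0]$ with $\mathbb{P}[X+Y>0]=\tfrac12$ by symmetry, is the skew-normal-type density $2\,\varphi_{\sigma_x}(x)\,\Phi(x/\sigma_y)$. A sketch of a Monte Carlo check (the variances, grid points, and bin width below are arbitrary illustrative choices):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
sx, sy = 1.0, 2.0
x = rng.normal(0, sx, 400_000)
y = rng.normal(0, sy, 400_000)
cond = x[x + y > 0]   # samples of X conditioned on X + Y > 0

# Candidate density: f(x | X+Y>0) = 2 * phi(x/sx)/sx * Phi(x/sy)
grid = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
dens = 2 * stats.norm.pdf(grid, scale=sx) * stats.norm.cdf(grid / sy)

# Empirical density estimate: fraction of samples in a small bin around
# each grid point, divided by the bin width
h = 0.05
emp = np.array([np.mean(np.abs(cond - g) < h) / (2 * h) for g in grid])
```

The empirical estimates should match the candidate density pointwise up to Monte Carlo and binning error.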

Discrete random variable with infinite expectation

Consider a discrete random variable $X$ taking only positive integers as values, with $$\mathbb{P}[X=n]=\frac{1}{n(n+1)}.$$ (a) Show that $\mathbb{E}[X]=\infty$. (b) Show that $\mathbb{P}[X \geq n]= \frac{1}{n}$. What does this imply for Markov's inequality?
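Both claims can be checked in exact rational arithmetic: the tail telescopes, $\sum_{k\ge n} \frac{1}{k(k+1)} = \sum_{k\ge n}\bigl(\frac1k - \frac1{k+1}\bigr) = \frac1n$, while the partial sums of $\mathbb{E}[X]$ are harmonic-like and diverge. A sketch (the truncation lengths are arbitrary):

```python
from fractions import Fraction

def tail(n, terms=10_000):
    """Partial sum of P[X >= n] = sum_{k >= n} 1/(k(k+1)), exact arithmetic.

    Telescoping gives the exact value 1/n - 1/(n + terms), so the
    truncation error is exactly 1/(n + terms).
    """
    return sum(Fraction(1, k * (k + 1)) for k in range(n, n + terms))

def partial_mean(N):
    """sum_{n <= N} n * P[X = n] = sum_{n <= N} 1/(n+1): grows like log N."""
    return sum(Fraction(n, n * (n + 1)) for n in range(1, N + 1))
```

For this distribution Markov's inequality $\mathbb{P}[X \ge n] \le \mathbb{E}[X]/n$ is vacuous (the right side is infinite), even though the tail $\mathbb{P}[X \ge n] = 1/n$ decays at exactly the $1/n$ rate Markov would predict for a finite mean.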