Articles on expectation

Density of a Gaussian random variable conditioned on a sum

I am struggling with this simple problem. I have two independent Gaussian random variables $X \sim \mathcal{N}(0,\sigma_x^2), Y \sim \mathcal{N}(0,\sigma_y^2)$. I have to find the density of $X$ given that $X + Y > 0$. I know that $X, X+Y$ are jointly normally distributed, and I also know the forms of the conditional distribution of […]
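A quick Monte Carlo sketch of the setup (with illustrative values $\sigma_x = 1$, $\sigma_y = 2$, not from the question): since $P(X+Y>0)=\tfrac12$ by symmetry, the conditional density is $f(x \mid X+Y>0) = 2\,\varphi(x;\sigma_x)\,\Phi(x/\sigma_y)$, a skew-normal, and the simulation can be checked against its known mean.

```python
import numpy as np

# Monte Carlo check of X | X + Y > 0 (a sketch; sigma values are
# illustrative assumptions, not taken from the question).
rng = np.random.default_rng(0)
sx, sy = 1.0, 2.0
n = 1_000_000
x = rng.normal(0, sx, n)
y = rng.normal(0, sy, n)
cond = x[x + y > 0]          # samples of X given the event X + Y > 0

# f(x | X+Y>0) = 2 * phi(x; sx) * Phi(x / sy) is skew-normal with
# shape sx/sy and scale sx; its mean is sx^2/sqrt(sx^2+sy^2) * sqrt(2/pi).
theory_mean = sx**2 / np.hypot(sx, sy) * np.sqrt(2 / np.pi)
print(cond.mean(), theory_mean)   # both ≈ 0.357
```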

Should you ever stop rolling, THE SEQUEL

Inspired by this question, I want to know if there is a version of the scenario that actually fits Newb’s intuition about the problem. Scenario template: you roll a 6-sided die and keep a cumulative sum of your rolls. The game ends under the following conditions, with the associated payouts: You choose to stop […]

Since $\mathbb{E}[X]$ is defined as the integral of $S(x)$ from $0$ to $\infty$, what do you do when $-1<x<1$?

I have the PDF $f(x)=\frac{3}{4}(1-x^2)\mathbf 1_{-1<x<1}$ and accordingly the CDF $$F(x)=\begin{cases}0, &\phantom{-}x\le -1\\\frac{3}{4}x-\frac{1}{4}x^3+\frac12, & -1<x<1 \\1, & \phantom{-}1\le x\end{cases}$$ Since the formula for $\mathbb{E}[X]$ is the integral of $S(x)=1-F(x)$ from $0$ to $\infty$, how do I account for the fact that the density is nonzero only for $-1<x<1$ when I need to calculate […]
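For a real-valued (not necessarily nonnegative) $X$, the survival-function formula extends to $\mathbb{E}[X] = \int_0^\infty (1-F(x))\,dx - \int_{-\infty}^0 F(x)\,dx$, and outside $(-1,1)$ both integrands vanish. A small numeric check of this with the CDF above:

```python
# Numeric check (a sketch): E[X] = ∫_0^∞ (1-F) dx - ∫_{-∞}^0 F dx,
# where F = 0 below -1 and F = 1 above 1, so the integrals truncate
# to [0, 1] and [-1, 0].
def F(x):
    if x <= -1:
        return 0.0
    if x >= 1:
        return 1.0
    return 0.75 * x - 0.25 * x**3 + 0.5

N = 100_000
h = 1.0 / N
pos = sum((1 - F((i + 0.5) * h)) * h for i in range(N))  # ∫_0^1 (1 - F)
neg = sum(F(-(i + 0.5) * h) * h for i in range(N))       # ∫_{-1}^0 F
print(pos - neg)   # ≈ 0, matching E[X] = 0 by symmetry of f
```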

Find a sequence of r.v.’s satisfying the following conditions

I think part a) can be solved by using $X_n=\frac{1}{n}\chi_{[0,n^2]}$. Not sure about part b).

Is $E\left[\frac{1}{\sum_{i=1}^{n}X_i}\right] = \frac{1}{\sum_{i=1}^{n} E\left[X_i\right]}$?

If the $X_i$’s are i.i.d. random variables, then is this statement true? $$E\left[\frac{1}{\sum_{i=1}^{n}X_i}\right] = \frac{1}{\sum_{i=1}^{n} E\left[X_i \right]}$$ Here $E\left[X\right]$ is the expected value of a random variable $X$. Edit – I was thinking that if each $X_i$ corresponds to the result of an independent random experiment, is the given equation true or false? I […]
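A quick counterexample sketch (with $X_i \sim \text{Exponential}(1)$ and $n=5$, an illustrative choice): then $S = \sum X_i$ is $\text{Gamma}(5,1)$, so $E[1/S] = \frac{1}{5-1} = 0.25$ exactly, while $1/E[S] = 0.2$. By Jensen's inequality ($1/x$ is convex on $x>0$), $E[1/S] \ge 1/E[S]$ in general, with equality only in degenerate cases.

```python
import random

# Simulation: E[1/S] vs 1/E[S] for S = sum of 5 i.i.d. Exponential(1)
# variables (illustrative assumption). Exact values: 0.25 vs 0.2.
random.seed(0)
n, trials = 5, 200_000
est = sum(1 / sum(random.expovariate(1) for _ in range(n))
          for _ in range(trials)) / trials
print(est)   # ≈ 0.25, not 0.2 — the proposed identity fails
```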

How to calculate expected number of trials of this geometric distribution

I understand why the expected number of trials until there is a success is given by $$ \sum_{i=1}^{\infty} i p q^{i-1} \ = \ E[\text{number of trials until} \ X=1] = \frac{1}{p} $$ where $p$ is the probability of success, $q=1-p$, and $X=1$ denotes the first success. However, I have a different problem. In my setting, after […]
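The standard identity above is easy to sanity-check by simulation (a sketch with an illustrative $p=0.3$; the asker's modified setting is cut off, so only the baseline is checked here):

```python
import random

# Sanity check of E[number of trials to first success] = 1/p,
# with p = 0.3 chosen for illustration.
random.seed(0)
p, trials = 0.3, 200_000

def trials_to_success():
    t = 1
    while random.random() >= p:   # each trial succeeds with probability p
        t += 1
    return t

est = sum(trials_to_success() for _ in range(trials)) / trials
print(est)   # ≈ 1/0.3 ≈ 3.33
```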

Swapping the $i$th largest card between $2$ hands of cards

Introduction to “game” There are $2$ players and $2n$ cards, labelled $1, 1, 2, 2, 3, 3, 4, 4, \dots, n, n$ ($2$ of each card from $1$ to $n$). Firstly, the $2$ players are each given $n$ cards randomly. More specifically, a random ordering of the $2n$ cards is achieved and the first player […]

Convergence in mean square implies convergence of variance

I need some hints for the following question: Suppose $X,X_1,X_2, \cdots \in L^2(\Omega)$ are random variables such that $X_n \to X$ in mean square. Show that $Var[X_n] \rightarrow Var[X]$. Convergence in mean square means that as $n \rightarrow \infty$ we have $\mathbb{E}[(X_n-X)^2] \rightarrow 0$. I tried to use the definition of variance $Var[X]=\mathbb{E}[X^2]-\mathbb{E}[X]^2$ and tried to prove […]
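For the record, one standard route (a sketch): mean-square convergence controls both moments through the $L^2$ norm. First, $$\bigl|\mathbb{E}[X_n]-\mathbb{E}[X]\bigr| \le \mathbb{E}|X_n-X| \le \sqrt{\mathbb{E}[(X_n-X)^2]} \longrightarrow 0,$$ and by the triangle inequality in $L^2$, $$\bigl|\,\|X_n\|_2 - \|X\|_2\,\bigr| \le \|X_n - X\|_2 \longrightarrow 0,$$ so $\mathbb{E}[X_n^2] \to \mathbb{E}[X^2]$. Combining, $Var[X_n] = \mathbb{E}[X_n^2]-\mathbb{E}[X_n]^2 \to \mathbb{E}[X^2]-\mathbb{E}[X]^2 = Var[X]$.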

Expectation of maximum of arithmetic means of i.i.d. exponential random variables

Given the sequence $(X_n), n=1,2,… $, of i.i.d. exponential random variables with parameter $1$, define: $$ M_n := \max \left\{ X_1, \frac{X_1+X_2}{2}, …,\frac{X_1+\dots+X_n}{n} \right\} $$ I want to calculate $\mathbb{E}(M_n)$. Running a simulation leads me to believe that $$ \mathbb{E}(M_n)=1+\frac{1}{2^2}+\cdots+\frac{1}{n^2} = H_n^{(2)}.$$ Is this correct? If yes, how would one go about proving it? I tried […]
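A simulation along the lines the asker describes (a sketch, with $n=6$ chosen for illustration), comparing the empirical mean of $M_n$ against $H_n^{(2)}$:

```python
import random

# Empirical check of the conjecture E[M_n] = H_n^(2), with n = 6.
random.seed(0)
n, trials = 6, 200_000
total = 0.0
for _ in range(trials):
    s, m = 0.0, 0.0
    for k in range(1, n + 1):
        s += random.expovariate(1)   # X_1 + ... + X_k
        m = max(m, s / k)            # running max of the arithmetic means
    total += m

print(total / trials)                           # empirical E[M_6]
print(sum(1 / k**2 for k in range(1, n + 1)))   # H_6^(2) ≈ 1.4914
```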

Coupon collector problem doubts

The Coupon Collector problem from Wikipedia: Suppose that there is an urn of $n$ different coupons, from which coupons are being collected, equally likely, with replacement. How many coupons do you expect you need to draw with replacement before having drawn each coupon at least once? The standard method to solve it divides the time […]
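The standard answer is $n H_n = n\left(1 + \frac12 + \cdots + \frac1n\right)$, obtained by splitting the collection into stages and summing the geometric waiting times $1/p_i$ with $p_i = (n-i+1)/n$. A quick simulation check (a sketch, with $n=10$ chosen for illustration):

```python
import random

# Coupon collector sanity check: empirical mean draws vs n * H_n, n = 10.
random.seed(0)
n, trials = 10, 50_000
total = 0
for _ in range(trials):
    seen, draws = set(), 0
    while len(seen) < n:             # draw with replacement until all seen
        seen.add(random.randrange(n))
        draws += 1
    total += draws

print(total / trials)                            # empirical mean
print(n * sum(1 / k for k in range(1, n + 1)))   # n * H_10 ≈ 29.29
```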