Articles on conditional expectation

Version of Conditional Expectation

I would like to prove the following theorem. Let $(X,Y)$ be a random variable with values in $\mathbb{R}^2$. Suppose that $\mathcal{L}(X,Y)$ has density $f(\cdot,\cdot)$ with respect to the Lebesgue measure $\lambda^2$ on $\mathbb{R}^2$. If $E[|X|]<\infty$ holds, then $$E[X|Y]=\frac{\int_\mathbb{R}xf(x,Y)\,dx}{\int_\mathbb{R}f(x,Y)\,dx}.$$ I started like this: let $A=Y^{-1}(B)$ for $B\in\mathcal{B}(\mathbb{R})$. Then $$\begin{aligned}\int_A\frac{\int_\mathbb{R}xf(x,Y)\,dx}{\int_\mathbb{R}f(x,Y)\,dx}\,dP&=\int_B\frac{\int_\mathbb{R}xf(x,y)\,dx}{\int_\mathbb{R}f(x,y)\,dx}\,P\circ Y^{-1}(dy)\\&=\int_B\left(\frac{\int_\mathbb{R}xf(x,y)\,dx}{\int_\mathbb{R}f(x,y)\,dx}\int_\mathbb{R}f(x,y)\,dx\right)dy\\&=\int_B\int_\mathbb{R}xf(x,y)\,dx\,dy\end{aligned}$$ […]
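The second equality in the chain uses that $P\circ Y^{-1}$ has Lebesgue density $f_Y(y)=\int_\mathbb{R}f(x,y)\,dx$. As a numerical sanity check of the claimed formula, here is a minimal sketch for a standard bivariate normal with correlation $\rho$ (an illustrative choice of density, not from the question), where the closed form $E[X\mid Y=y]=\rho y$ is known:

```python
# Check E[X|Y=y] = (int x f(x,y) dx) / (int f(x,y) dx) against the
# closed form rho*y for a standard bivariate normal with correlation rho.
import numpy as np

rho = 0.6
x = np.linspace(-10.0, 10.0, 200_001)   # fine grid; density negligible beyond
dx = x[1] - x[0]

def f(xv, y):
    # joint density of a standard bivariate normal with correlation rho
    q = (xv**2 - 2 * rho * xv * y + y**2) / (1 - rho**2)
    return np.exp(-q / 2) / (2 * np.pi * np.sqrt(1 - rho**2))

for y in (-1.5, 0.0, 2.0):
    num = np.sum(x * f(x, y)) * dx      # int x f(x, y) dx
    den = np.sum(f(x, y)) * dx          # int f(x, y) dx = f_Y(y)
    print(y, num / den, rho * y)        # the two values should agree
```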

Properties Least Mean Fourth Error

I am interested in whether the quantity \begin{align*} E[(X-E[X|Y])^4] \end{align*} has been studied in the literature before. I am not even sure whether “least mean fourth error” is a correct name, since $g(Y)=E[X|Y]$ might not attain $\inf_{g} E[(X-g(Y))^4]$. However, I am interested in $E[(X-E[X|Y])^4]$ rather than $\inf_{g} E[(X-g(Y))^4]$. Specifically, I […]
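On the naming caveat: the conditional mean indeed need not attain $\inf_{g} E[(X-g(Y))^4]$. A minimal Monte Carlo sketch with skewed noise (an illustrative setup, not from the question):

```python
# Here E[X|Y] = Y, yet a constant shift of the predictor lowers the
# fourth-moment risk because the noise is skewed.
import numpy as np

rng = np.random.default_rng(0)
y = rng.normal(size=500_000)
noise = rng.exponential(1.0, size=y.size) - 1.0   # mean-zero, skewed
x = y + noise                                      # so E[X|Y] = Y

risk_cond_mean = np.mean((x - y) ** 4)             # risk of g(Y) = E[X|Y]
shifts = np.linspace(-1.0, 1.0, 201)
risks = np.array([np.mean((x - y - c) ** 4) for c in shifts])
print(risk_cond_mean, risks.min(), shifts[risks.argmin()])
# risks.min() comes out strictly below risk_cond_mean
```

The optimal constant shift solves $E[(\varepsilon-c)^3]=0$; for centered $\mathrm{Exp}(1)$ noise this is $c^3+3c=2$, i.e. $c\approx 0.6\neq 0$.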

Independence and conditional expectation

So, it’s pretty clear that for independent $X,Y\in L_1(P)$ (with $E(X|Y):=E(X|\sigma(Y))$), we have $E(X|Y)=E(X)$. It is also quite easy to construct an example (for instance, $X$ uniform on $\{-1,0,1\}$ and $Y=|X|$) which shows that $E(X|Y)=E(X)$ does not imply independence of $X$ and $Y$. However, since $X,Y$ are independent iff $E(f(X)g(Y))=E(f(X))E(g(Y))$ for all bounded Borel functions $f,g:\mathbb{R}\to\mathbb{R}$, does it not make […]
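For the example above, a direct computation confirms both claims:
$$E(X\mid Y=1)=\frac{(-1)+1}{2}=0,\qquad E(X\mid Y=0)=0,\qquad E(X)=0,$$
yet
$$P(X=1,\,Y=0)=0\neq\frac19=P(X=1)\,P(Y=0).$$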

Conditional expectation w.r.t. random variable and w.r.t. $\sigma$-algebra, equivalence

Let $\Omega = \{ \omega_i \}$ be a countable set, and consider some probability space $(\Omega, \mathcal F, P)$ with $p_i := P(\{ \omega_i \})$. Let $X : \Omega \to \mathbb R$ be a random variable; then its expected value is $$ E(X) := \sum_i X(\omega_i) \cdot p_i. $$ Now it is possible to define […]
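For reference, when $\mathcal G\subseteq\mathcal F$ is generated by a countable partition $\{A_j\}$ of $\Omega$ with $P(A_j)>0$, the conditional expectation has the explicit pointwise form
$$E(X\mid\mathcal G)(\omega)=\frac{1}{P(A_j)}\sum_{\omega_i\in A_j}X(\omega_i)\,p_i\qquad\text{for }\omega\in A_j,$$
which is a convenient anchor when comparing the random-variable and $\sigma$-algebra definitions.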

Conditional distribution of maximum of i.i.d. normal sample given its sum

I’m trying to find the conditional probability $P(Y_n>a\mid\bar{X}=x)$, where $Y_n=\max(X_1,\dots,X_n)$ and $X_1,\dots,X_n$ is a sample from $N(\mu,1)$. My attempt: $$P(Y_n>a\mid\bar{X}=x)=\frac{P(Y_n>a , \bar{X}=x)}{f_{\bar{X}}(x)},$$ $$P(Y_n>a, \bar{X}=x) = \int_a^\infty f_{Y_n, \bar{X}}(y,x)\,dy,$$ and I know that $n\bar{X}=\sum_{i=1}^n X_i$. But I cannot find the joint density of $Y_n$ and $\bar{X}$. How can I get further from here?
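One route that avoids the joint density: by joint normality, the residuals $X_i-\bar X$ are independent of $\bar X$, so conditionally on $\bar X=x$ the sample has the same law as $x+(Z_i-\bar Z)$ with $Z_i$ i.i.d. $N(0,1)$ (the mean $\mu$ drops out). A minimal Monte Carlo sketch, with illustrative values for $n$, $x$, $a$:

```python
# Conditionally on Xbar = x, the sample has the law of x + (Z_i - Zbar),
# Z_i iid N(0,1), since the residuals are independent of the sample mean.
import numpy as np

rng = np.random.default_rng(1)
n, x, a, reps = 5, 0.3, 1.0, 200_000           # illustrative values
z = rng.normal(size=(reps, n))
resid = z - z.mean(axis=1, keepdims=True)      # Z_i - Zbar per replicate
estimate = np.mean(x + resid.max(axis=1) > a)  # P(Y_n > a | Xbar = x)
print(estimate)
```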

Why is $\mathbb{E}(\phi(X,Y) \mid X) = \mathbb{E}\phi(x,Y)|_{x=X}$ if $X$ and $Y$ are independent?

Let $X$ and $Y$ be two independent (real-valued) random variables, both defined on the probability space $(\Omega, \mathcal A, P)$. (a) Suppose $E|X| < \infty$ and $E|Y| < \infty$. Let $g(\cdot):\mathbb{R} \to \mathbb{R}$ ($\mathbb{R}$ being the set of real numbers) be given by $g(x) = x + E[Y]$. Show that $E[X + Y \mid X] = g(X)$. […]
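The computation behind this part is short: by linearity, $\sigma(X)$-measurability of $X$, and independence of $Y$ from $\sigma(X)$,
$$E[X+Y\mid X]=E[X\mid X]+E[Y\mid X]=X+E[Y]=g(X),$$
which is exactly the title identity $\mathbb{E}(\phi(X,Y)\mid X)=\mathbb{E}\phi(x,Y)|_{x=X}$ in the special case $\phi(x,y)=x+y$.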

Understanding the measurability of conditional expectations

My question is about the conditional expectation of random variables with respect to a $\sigma$-algebra. I am having trouble building intuition for the definitions, among other things. I know that if $X$ is $\mathcal{G}$-measurable then $\mathbb{E} [X| \mathcal{G}] = X$, but what if $X$ is not $\mathcal{G}$-measurable? Is this expression just not defined? Furthermore, […]
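For what it is worth, $\mathbb{E}[X|\mathcal G]$ is defined for every integrable $X$, whether or not $X$ is $\mathcal{G}$-measurable; it is the $\mathcal G$-measurable random variable that matches $X$ in average over every $G\in\mathcal G$. Two small cases that build intuition: $\mathbb E[X\mid\{\emptyset,\Omega\}]=\mathbb E[X]$, and for $\mathcal G=\sigma(A)$ with $0<P(A)<1$,
$$\mathbb{E}[X\mid\mathcal G]=\frac{\mathbb E[X\mathbf 1_A]}{P(A)}\,\mathbf 1_A+\frac{\mathbb E[X\mathbf 1_{A^c}]}{P(A^c)}\,\mathbf 1_{A^c}.$$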

$X = E(Y | \sigma(X)) $ and $Y = E(X | \sigma(Y))$

Suppose $X, Y$ are random variables in $L^2$ such that $$X = E(Y | \sigma(X)),$$ $$Y = E(X | \sigma(Y)).$$ Then I want to show that $X=Y$ almost everywhere. What I’ve done: by conditional Jensen, $$E(X^2|\sigma(Y)) \geq E(X|\sigma(Y))^2 = Y^2$$ a.e., and thus, taking expectations, $\|X\|_2 \geq \|Y\|_2$. Analogously $\|Y\|_2 \geq \|X\|_2$ and […]
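If it helps, one standard way to finish (it in fact yields the conclusion directly): by the tower property and the two hypotheses,
$$E[XY]=E\big[E(XY\mid\sigma(X))\big]=E\big[X\,E(Y\mid\sigma(X))\big]=E[X^2],$$
and symmetrically $E[XY]=E[Y^2]$, so
$$E\big[(X-Y)^2\big]=E[X^2]-2E[XY]+E[Y^2]=0,$$
hence $X=Y$ a.e.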

Is it true that $E[X|X^2] = X$?

Let $X$ be an integrable random variable. I want to know if it is true that $E[X|X^2] = X$. At first I thought it was true, because $X$ is measurable with respect to the $\sigma$-algebra generated by $X^2$, since $f(x) = x^2$ is a Borel measurable function. But after a while I noticed that: $$\sigma(f(X))\subset […]
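A quick way to see that the identity can fail: if the law of $X$ is symmetric (say $X\sim N(0,1)$), then $(X,X^2)$ and $(-X,X^2)$ have the same joint law, so
$$E[X\mid X^2]=E[-X\mid X^2]=-E[X\mid X^2],$$
forcing $E[X\mid X^2]=0$, which is not $X$ unless $X=0$ a.s.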

Expectation of maximum of arithmetic means of i.i.d. exponential random variables

Given the sequence $(X_n)$, $n=1,2,\dots$, of i.i.d. exponential random variables with parameter $1$, define: $$ M_n := \max \left\{ X_1, \frac{X_1+X_2}{2}, \dots,\frac{X_1+\dots+X_n}{n} \right\}. $$ I want to calculate $\mathbb{E}(M_n)$. Running a simulation leads me to believe that $$ \mathbb{E}(M_n)=1+\frac{1}{2^2}+\cdots+\frac{1}{n^2} = H_n^{(2)}.$$ Is this correct? If so, how would one go about proving it? I tried […]
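For anyone wanting to reproduce the simulation, a minimal sketch (the values of $n$ and the number of replications are arbitrary choices):

```python
# Monte Carlo check of the conjecture E(M_n) = H_n^(2) = sum_{k=1}^n 1/k^2.
import numpy as np

rng = np.random.default_rng(0)
n, reps = 6, 500_000
x = rng.exponential(1.0, size=(reps, n))
running_means = np.cumsum(x, axis=1) / np.arange(1, n + 1)   # (X_1+...+X_k)/k
m_n = running_means.max(axis=1)                              # M_n per replicate
print(m_n.mean(), sum(1.0 / k**2 for k in range(1, n + 1)))  # should be close
```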