Properties of the Least Mean Fourth Error

I am interested in whether a quantity
\begin{align*}
E[(X-E[X|Y])^4]
\end{align*}
has been studied in the literature before. I am not even sure whether “least mean fourth error” is the correct name, since $g(Y)=E[X|Y]$ might not achieve $\inf_{g(y)} E[(X-g(Y))^4]$. However, I am interested in $E[(X-E[X|Y])^4]$ rather than in $\inf_{g(y)} E[(X-g(Y))^4]$.

Specifically, I am interested in the case when
\begin{align*}
Y=X+Z
\end{align*}
where $X$ has zero mean and unit variance, and $Z \sim \mathcal{N}(0,1)$ is independent of $X$.
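For intuition, in the special case where $X$ is additionally standard Gaussian we have $E[X|Y]=Y/2$ and $X-E[X|Y]\sim\mathcal{N}(0,1/2)$, so $E[(X-E[X|Y])^4]=3\cdot(1/2)^2=3/4$. Here is a minimal Monte Carlo sketch of that special case (the script and its variable names are my own illustration, not part of the question):

    import numpy as np

    rng = np.random.default_rng(0)
    n = 10**6
    x = rng.standard_normal(n)           # X ~ N(0,1) (Gaussian special case)
    z = rng.standard_normal(n)           # Z ~ N(0,1), independent of X
    y = x + z                            # Y = X + Z
    cond_mean = y / 2                    # E[X|Y] = Y/2 holds in this Gaussian case
    print(np.mean((x - cond_mean)**4))   # Monte Carlo estimate; should be close to 3/4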

Here are some bounds that I was able to come up with:

Lower Bound: We can relate it to the MMSE via Jensen's inequality:
\begin{align*}
E[(X-E[X|Y])^4] \ge E^2[(X-E[X|Y])^2]=\mathrm{MMSE}^2
\end{align*}
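To spell out the Jensen step: with $W=(X-E[X|Y])^2$ and the convex map $t\mapsto t^2$,
\begin{align*}
E[W^2] \ge \big(E[W]\big)^2 = \mathrm{MMSE}^2.
\end{align*}
In the Gaussian special case above, $\mathrm{MMSE}=1/2$, so this lower bound gives $1/4$, while the exact value is $3/4$.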

Upper Bound 1: Using Minkowski's inequality, and assuming that $E[X^4]$ is finite, we have
\begin{align*}
E[(X-E[X|Y])^4]& \le (E[X^4]^{1/4}+E[E^4[X|Y]]^{1/4})^4 \\
&\le 2^4 E[X^4]
\end{align*}
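The second inequality holds because conditional Jensen with the convex map $t\mapsto t^4$ gives
\begin{align*}
E\big[E^4[X|Y]\big] \le E\big[E[X^4|Y]\big] = E[X^4],
\end{align*}
so both terms in the parentheses are at most $E[X^4]^{1/4}$.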

Upper Bound 2:
Observe that
\begin{align*}
(X-E[X|Y])=-(Z-E[Z|Y])
\end{align*}
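Indeed, since $Y=X+Z$ and $E[Y|Y]=Y$,
\begin{align*}
X-E[X|Y] = (Y-Z)-E[Y-Z|Y] = (Y-Z)-\big(Y-E[Z|Y]\big) = -(Z-E[Z|Y]).
\end{align*}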
So,
\begin{align*}
E[(X-E[X|Y])^4]=E[(Z-E[Z|Y])^4] \le 2^4 E[Z^4].
\end{align*}

The nice thing about Upper Bound 2 is that it does not require any assumptions on $X$. Moreover, since $Z$ is Gaussian, $E[Z^4]=3$ is finite, so the bound evaluates to $48$.
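As an illustration of Upper Bound 2 for a non-Gaussian $X$ (my example, not from the question): if $X$ is uniform on $\{-1,+1\}$, then $E[X|Y]=\tanh(Y)$, and the bound states $E[(X-\tanh(Y))^4] \le 2^4 E[Z^4] = 48$. A quick Monte Carlo sketch of the check:

    import numpy as np

    rng = np.random.default_rng(1)
    n = 10**6
    x = rng.choice([-1.0, 1.0], size=n)   # X uniform on {-1, +1}: zero mean, unit variance
    z = rng.standard_normal(n)            # Z ~ N(0,1), independent of X
    y = x + z
    cond_mean = np.tanh(y)                # E[X|Y] = tanh(Y) for this binary X
    print(np.mean((x - cond_mean)**4), "<=", 2**4 * 3)   # Upper Bound 2 with E[Z^4] = 3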

Need Help with

  1. Improve the upper bounds I have. For example, can we get rid of the factor $2^4$?
  2. Can we say that, over all random variables with $E[X^2]\le 1$, a Gaussian $X$ maximizes $E[(X-E[X|Y])^4]$? This is true for $E[(X-E[X|Y])^2]$, and I don't feel that $E[(X-E[X|Y])^4]$ is very different.
  3. Any references on the quantities $E[(X-E[X|Y])^4]$ or $\inf_{g(y)} E[(X-g(Y))^4]$?

Thanks a lot for any help.

Answer

Well, for finding a deterministic function $g(y)$ to minimize $E[(X-g(Y))^4]$, we can do this:

\begin{align}
E[(X-g(Y))^4] &= \int_{y \in \mathbb{R}} E[(X-g(Y))^4|Y=y]f_Y(y)dy \\
&\geq \int_{y\in\mathbb{R}} \left(\inf_{\theta}E[(X-\theta)^4|Y=y]\right)f_Y(y)dy
\end{align}
where equality is achieved by the function $g^*(y) = \arg\inf_{\theta} E[(X-\theta)^4|Y=y]$. To compute this function in more detail, first consider a more general problem: For a general random variable $W$, find a constant $\theta \in \mathbb{R}$ to minimize $E[(W-\theta)^4]$. So, we minimize the expression:
$$ E[(W-\theta)^4] = E[W^4] - 4E[W^3]\theta + 6E[W^2]\theta^2 - 4E[W]\theta^3 + \theta^4 $$
Setting the derivative with respect to $\theta$ to zero gives:
$$ 0 = -4E[W^3] + 12E[W^2]\theta - 12E[W]\theta^2 + 4\theta^3 $$
This cubic has at most three real roots; we find them all and select the one that minimizes the objective.

Thus, using for $W$ the conditional distribution of $X$ given $Y=y$, we can find $g^*(y)$ for each given $y$ by selecting the best of the (at most three) real roots of:
$$ \boxed{0 = -4E[X^3|Y=y] + 12E[X^2|Y=y]\theta - 12E[X|Y=y]\theta^2 + 4\theta^3} $$
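As a sketch of how one might evaluate $g^*(y)$ numerically, take the illustrative binary case above, $X$ uniform on $\{-1,+1\}$, where $E[X|Y=y]=E[X^3|Y=y]=\tanh(y)$ and $E[X^2|Y=y]=E[X^4|Y=y]=1$ (again my example, not the answer's). One can find the real roots of the boxed cubic with, e.g., numpy.roots and keep the root that minimizes the conditional fourth moment:

    import numpy as np

    def g_star(y):
        # Best constant estimate of X at Y=y, for X uniform on {-1,+1} and Y = X + Z.
        t = np.tanh(y)   # E[X|Y=y] = E[X^3|Y=y] = tanh(y); E[X^2|Y=y] = E[X^4|Y=y] = 1
        # Boxed cubic with these moments: 4*th^3 - 12*t*th^2 + 12*th - 4*t = 0
        roots = np.roots([4.0, -12.0 * t, 12.0, -4.0 * t])
        real_roots = roots[np.abs(roots.imag) < 1e-9].real
        # Conditional fourth moment E[(X - th)^4 | Y=y] at each candidate root
        obj = 1 - 4 * t * real_roots + 6 * real_roots**2 - 4 * t * real_roots**3 + real_roots**4
        return real_roots[np.argmin(obj)]

    print(g_star(0.7), np.tanh(0.7))      # compare with E[X|Y=y] = tanh(y)

This is only a sketch under the stated binary-$X$ assumption; for other distributions one would plug in the corresponding conditional moments.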


One way to see that the minimizer $g^*(y)$ of $E[(X-g(Y))^4]$ may differ from the minimizer of $E[(X-g(Y))^2]$ is to consider the example where $Z$ and $X$ are zero mean with unit variance. In this case the best linear mean-square-error estimator is $g(Y)=Y/2$ (and it coincides with $E[X|Y]$ when $X$ is Gaussian).

On the other hand, you can find the best linear estimator for minimizing $E[(X-g(Y))^4]$ by defining $g(Y)=aY$ for some constant $a$, and then minimizing $E[(X-aY)^4]$:
$$ E[(X-aY)^4] = E[X^4] - 4aE[X^3Y] + \cdots $$
I believe the answer will be different from $g(Y)=Y/2$, and the optimal (possibly nonlinear) estimator is no worse than the best linear one, so $E[X|Y]$ would be strictly suboptimal in this case.
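A quick Monte Carlo sketch of the comparison suggested here, again in the illustrative binary-$X$ case (this does not settle the question; it only shows how one might run the check):

    import numpy as np

    rng = np.random.default_rng(2)
    n = 10**6
    x = rng.choice([-1.0, 1.0], size=n)         # illustrative non-Gaussian X
    y = x + rng.standard_normal(n)              # Y = X + Z

    a_grid = np.linspace(0.0, 1.0, 1001)
    fourth = [np.mean((x - a * y)**4) for a in a_grid]   # E[(X - aY)^4] on a grid of a
    print("fourth-moment-optimal linear coefficient:", a_grid[np.argmin(fourth)])
    print("mean-square-optimal linear coefficient:   0.5")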