Articles of covariance

How to find the variance of $U = X-2Y+4Z$, and the covariance of $U = X-2Y+4Z$ and $V = 3X-Y-Z$?

EDIT: If the random variables $X, Y, Z$ have $$\text{means: }\mu_{x}=2 \qquad \qquad \mu_{y}=-3 \qquad \qquad \mu_{z} = 4$$ $$ \text{variances: }\sigma_{x}^{2}=3 \qquad \qquad \sigma_{y}^{2}=2 \qquad \qquad \sigma^{2}_{z}=8$$ $$\text{covariances: }\text{cov}(X,Y) =1 \quad \quad \text{cov}(X,Z) = -2 \quad \quad \text{cov}(Y,Z) = 3,$$ find the variance of $U = X-2Y+4Z$ and the covariance of $U$ and […]
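Writing $U = a^T W$ and $V = b^T W$ with $W = (X, Y, Z)^T$, the standard identities $\operatorname{Var}(U) = a^T \Sigma a$ and $\operatorname{Cov}(U, V) = a^T \Sigma b$ reduce both parts to matrix arithmetic (the means are not needed). A minimal numpy sketch, assuming $\Sigma$ is assembled from the stated variances and covariances:

```python
import numpy as np

# Covariance matrix of (X, Y, Z) built from the stated variances and covariances.
Sigma = np.array([[ 3.0, 1.0, -2.0],
                  [ 1.0, 2.0,  3.0],
                  [-2.0, 3.0,  8.0]])

a = np.array([1.0, -2.0, 4.0])   # coefficients of U = X - 2Y + 4Z
b = np.array([3.0, -1.0, -1.0])  # coefficients of V = 3X - Y - Z

var_U = a @ Sigma @ a            # Var(U)   = a' Sigma a
cov_UV = a @ Sigma @ b           # Cov(U,V) = a' Sigma b
print(var_U, cov_UV)             # 71.0 and -54.0 with the numbers above
```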

How to optimize a singular covariance-weighted residual?

Definitions: $$v(x)\equiv\{g_1(x),g_2(x),\ldots,g_n(x)\}^T$$ $$C\equiv \operatorname{cov}(v)=\langle vv^T \rangle -\langle v\rangle \langle v^T \rangle =\int f(x)v(x)v(x)^T \, dx-\int f(x)v(x) \, dx \int f(x')v(x')^T \, dx'$$ $$R(x)\equiv v(x)^T C^\dagger v(x)$$ Here $z$ is an implicit parameter of $f(x)$ and of all the $g_i(x)$. How can one go about optimizing $R$ with respect to $z$ if $C$ is singular? For a non-singular $D$, […]
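When $C$ is singular, the usual workaround is to read $C^\dagger$ as the Moore–Penrose pseudoinverse, which restricts the quadratic form to the row space of $C$; the optimization over $z$ can then proceed by any ordinary method. A rough sketch under that reading (the basis functions $g_k$, the weight grid standing in for $f(x)\,dx$, the evaluation point `x0`, and the grid search over $z$ are all hypothetical choices of mine):

```python
import numpy as np

def R(v, C, rcond=1e-10):
    # v^T C^+ v with the Moore-Penrose pseudoinverse, well defined even for singular C.
    return v @ np.linalg.pinv(C, rcond=rcond) @ v

def v_of_x(x, z):
    # Hypothetical g_k(x; z); the third is a linear combination of the first two,
    # so the resulting covariance matrix is exactly singular.
    return np.array([np.sin(z * x), np.cos(z * x), np.sin(z * x) + np.cos(z * x)])

def covariance_of_v(z, xs, weights):
    V = np.array([v_of_x(x, z) for x in xs])   # rows are samples of v(x)
    mean = weights @ V                         # <v>
    second = V.T @ (weights[:, None] * V)      # <v v^T>
    return second - np.outer(mean, mean)

xs = np.linspace(-1.0, 1.0, 201)
weights = np.full(xs.size, 1.0 / xs.size)      # stand-in for f(x) dx on the grid

# Crude scan over the implicit parameter z at a fixed evaluation point x0;
# any gradient-free optimizer could replace this loop.
x0, zs = 0.3, np.linspace(0.1, 3.0, 30)
Rs = [R(v_of_x(x0, z), covariance_of_v(z, xs, weights)) for z in zs]
print(zs[int(np.argmin(Rs))])
```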

Calculation of the covariances $\operatorname{cov}(x_i^{2},x_j)$ and $\operatorname{cov}(x_i^{2},x_j^{2})$ for the multinomial distribution

We know that $\operatorname{cov}(x_i,x_j)=-n \, p_i \, p_j$. It can be proven in this manner: we know $\operatorname{Var}(x_i+x_j)=\operatorname{cov}((x_i+x_j),(x_i+x_j))$. Now, $\operatorname{cov}((x_i+x_j),(x_i+x_j))=\operatorname{cov}(x_i,x_i)+2 \operatorname{cov}(x_i,x_j) + \operatorname{cov}(x_j,x_j)=\operatorname{Var}(x_i)+\operatorname{Var}(x_j)+2 \operatorname{cov}(x_i,x_j)$. Since $\operatorname{Var}(x_i+x_j)=n(p_i+p_j)(1-p_i-p_j)$, $\operatorname{Var}(x_i)=np_i(1-p_i)$, and $\operatorname{Var}(x_j)=np_j(1-p_j)$, we get $\operatorname{cov}(x_i,x_j)=\tfrac{1}{2}[\operatorname{Var}(x_i+x_j)-\operatorname{Var}(x_i)-\operatorname{Var}(x_j)]=\tfrac{1}{2}[n(p_i+p_j)(1-p_i-p_j)-np_i(1-p_i)-np_j(1-p_j)]=\tfrac{1}{2}[-2 n p_i p_j]=-n p_i p_j$. Hence $\operatorname{cov}(x_i,x_j)=-n \, p_i \, p_j$. Now I am interested […]
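Before deriving $\operatorname{cov}(x_i^2, x_j)$ and $\operatorname{cov}(x_i^2, x_j^2)$ analytically, a simulation is a useful sanity check on any candidate formula; the sketch below (with arbitrary $n$ and $p$ of my own choosing) reproduces $\operatorname{cov}(x_i, x_j) = -n p_i p_j$ and prints empirical values for the two covariances being asked about:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 50, np.array([0.2, 0.3, 0.5])            # arbitrary multinomial parameters
X = rng.multinomial(n, p, size=200_000)         # each row is a draw (x_1, x_2, x_3)

xi, xj = X[:, 0], X[:, 1]
print(np.cov(xi, xj)[0, 1], -n * p[0] * p[1])   # empirical vs. exact -n p_i p_j
print(np.cov(xi**2, xj)[0, 1])                  # empirical cov(x_i^2, x_j)
print(np.cov(xi**2, xj**2)[0, 1])               # empirical cov(x_i^2, x_j^2)
```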

Covariance between squared and exponential of Gaussian random variables

Assuming the random vector $[X \ \ Y]'$ follows a bivariate Gaussian distribution with mean $[\mu_X \ \ \mu_Y]'$ and covariance matrix $ \left[ \begin{array}{cc} \sigma_X ^2 & \rho \, \sigma_X \sigma_Y \\ \rho \, \sigma_X \sigma_Y & \sigma_Y^2 \end{array} \right]$, I am looking for an expression for $\text{Cov}(X^2, \exp(Y))$. I know there […]
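A closed form can be reached by conditioning on $Y$ (the conditional mean of $X$ given $Y$ is linear in $Y$, so $E[X^2 \mid Y]$ is quadratic in $Y$) and then using the exponential moments of a Gaussian; whatever expression one arrives at, a Monte Carlo check is cheap. A sketch with hypothetical parameter values of my own:

```python
import numpy as np

rng = np.random.default_rng(1)
mu_X, mu_Y, s_X, s_Y, rho = 0.5, -0.2, 1.0, 0.7, 0.6   # hypothetical parameters

cov = [[s_X**2,          rho * s_X * s_Y],
       [rho * s_X * s_Y, s_Y**2         ]]
X, Y = rng.multivariate_normal([mu_X, mu_Y], cov, size=1_000_000).T

print(np.cov(X**2, np.exp(Y))[0, 1])   # Monte Carlo estimate of Cov(X^2, exp(Y))
```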

Minimum / Maximum and other Advanced Properties of the Covariance of Two Random Variables

Are there any established advanced results on the behavior of the covariance of two random variables, beyond the bounds on the correlation, the relationship between independence and zero covariance, and the other properties usually summarized in introductory notes on this topic, such as at this link? http://www.stat.yale.edu/~pollard/Courses/241.fall97/Variance.pdf Some possibilities for advanced properties: 1) For example, whether […]
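For reference, the basic bound mentioned above, $|\operatorname{Cov}(X,Y)| \le \sigma_X \sigma_Y$ from Cauchy–Schwarz, is easy to check numerically; the dependent pair below is an arbitrary example of mine, not taken from the linked notes:

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.gamma(2.0, size=500_000)            # arbitrary non-Gaussian variable
Y = 0.4 * X + rng.normal(size=X.size)       # a variable correlated with X

cov_XY = np.cov(X, Y)[0, 1]
bound = X.std(ddof=1) * Y.std(ddof=1)
print(abs(cov_XY) <= bound, cov_XY, bound)  # Cauchy-Schwarz: |Cov| <= sigma_X * sigma_Y
```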

A question on conditional expectation leading to zero covariance and vice versa

In my probability class I was given this seemingly weird question involving conditional expectation. Let $X, Y$ be two random variables (it is not specified whether they are discrete or continuous), and we are asked the following: for all constants $ \beta $ we have $ E[X \mid Y = \beta] = E[X] […]
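The forward direction follows from the tower property: if $E[X \mid Y] = E[X]$ almost surely, then $\operatorname{Cov}(X,Y) = E[Y\,E[X \mid Y]] - E[X]E[Y] = 0$. The converse fails, and a quick simulated counterexample (with $X$ and $Y$ of my own choosing, $Y$ symmetric and $X = Y^2$) is sketched below:

```python
import numpy as np

rng = np.random.default_rng(3)
Y = rng.choice([-1.0, 0.0, 1.0], size=1_000_000)  # symmetric about 0
X = Y**2                                          # Cov(X, Y) = E[Y^3] = 0

print(np.cov(X, Y)[0, 1])                         # approximately 0
for b in (-1.0, 0.0, 1.0):
    print(b, X[Y == b].mean(), X.mean())          # E[X | Y = b] = b^2 varies, E[X] = 2/3
```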

How to prove the converse direction for the correlation coefficient?

To show: if $|\operatorname{Cor}(X,Y)| = 1$, then there exist $a, b \in \mathbb{R}$ such that $Y = bX + a$. Any ideas or hints on how to proceed? Basically, I have to prove that if the absolute value of the correlation between two random variables is 1, then they must be linearly related. So far, $$ |\operatorname{cor}(X, Y)| = 1 […]
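The usual route is to set $b = \operatorname{Cov}(X,Y)/\operatorname{Var}(X)$ and note that $\operatorname{Var}(Y - bX) = \operatorname{Var}(Y)\,(1 - \operatorname{Cor}(X,Y)^2)$, which vanishes exactly when $|\operatorname{Cor}(X,Y)| = 1$, forcing $Y - bX$ to equal an almost-sure constant $a$. A numerical illustration with a linear pair of my own choosing:

```python
import numpy as np

rng = np.random.default_rng(4)
X = rng.normal(size=100_000)
Y = -2.0 * X + 3.0                          # exactly linear, so |Cor(X, Y)| = 1

r = np.corrcoef(X, Y)[0, 1]
b = np.cov(X, Y)[0, 1] / X.var(ddof=1)      # slope from the proof: b = Cov(X,Y)/Var(X)
a = Y.mean() - b * X.mean()                 # intercept: a = E[Y] - b E[X]
print(r, b, a)                              # approximately -1.0, -2.0, 3.0
```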

Weak/strong law of large numbers for dependent variables with bounded covariance

Let $(X_i)_{i\in\mathbb{N}}$ be a sequence of $L^2$ random variables, each with expected value $m$. Let $S_n=\sum_{i=1}^n X_i$ and suppose $|\mathrm{Cov}(X_i,X_j)|\leq\epsilon_{|i-j|}$ for finite, non-negative constants $\epsilon_k$. Show that: (1) If $\lim_{n\to\infty} \epsilon_n=0$, then $S_n/n\to m$ in $L^2$ and in probability. (2) If $\sum_{k=1}^\infty \epsilon_k<\infty$, then $\mathrm{Var}(S_n/n)$ is of order $O(1/n)$ and $S_n/n\to m$ almost surely. (1) […]
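A simulation of one process covered by (2) can make the statement concrete: an AR(1)-type sequence around $m$ has covariances decaying geometrically in $|i-j|$, so the $\epsilon_k$ are summable, and $S_n/n$ should settle near $m$. The process and parameters below are my own choice, not part of the exercise:

```python
import numpy as np

rng = np.random.default_rng(5)
m, phi, N = 2.0, 0.8, 100_000

# AR(1) noise: Cov(X_i, X_j) decays like phi^{|i-j|}, so sum_k epsilon_k < infinity.
e = np.empty(N)
e[0] = rng.normal()
for i in range(1, N):
    e[i] = phi * e[i - 1] + rng.normal()
X = m + e

S_over_n = np.cumsum(X) / np.arange(1, N + 1)
print(S_over_n[[99, 9_999, N - 1]])       # running means drifting toward m = 2.0
```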

Decorrelating random variables

I was reading this answer, and its first sentence seemed more intuitive at first glance than after thinking it through: If $\pmatrix{X\\ Y}$ is bivariate normal with mean $\pmatrix{0\\0}$ and covariance matrix $\Sigma=\pmatrix{1&\rho\\\rho&1}$, then $\pmatrix{U\\V}=\Sigma^{-1/2} \pmatrix{X\\Y}$ is bivariate normal with mean $\pmatrix{0\\0}$ and covariance matrix $\pmatrix{1&0\\ 0&1}.$ That is, $U$ and $V$ are independent, standard normal […]
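One way to make that first sentence concrete is to compute $\Sigma^{-1/2}$ explicitly (via the eigendecomposition of $\Sigma$) and check that the transformed samples have identity covariance; the value of $\rho$ below is just an example:

```python
import numpy as np

rho = 0.7                                       # example correlation
Sigma = np.array([[1.0, rho], [rho, 1.0]])

# Symmetric inverse square root from Sigma = Q diag(w) Q^T.
w, Q = np.linalg.eigh(Sigma)
Sigma_inv_sqrt = Q @ np.diag(w**-0.5) @ Q.T

rng = np.random.default_rng(6)
XY = rng.multivariate_normal([0.0, 0.0], Sigma, size=500_000)
UV = XY @ Sigma_inv_sqrt.T                      # (U, V) = Sigma^{-1/2} (X, Y)

print(np.cov(UV.T))                             # approximately the identity matrix
```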

What does rotational invariance mean in statistics?

What does rotational invariance mean in statistics? The property of the normal distribution that, for independent normally distributed $X_i$, the sum $\sum_i X_i$ is also normal with variance $\sum_i \operatorname{Var}(X_i)$, is referred to as rotational invariance, and I want to know why.
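For a vector of iid standard normal components (general variances require standardizing first), the joint density depends on the coordinates only through the squared length, so applying any orthogonal matrix leaves the distribution unchanged; the sum $\sum_i X_i$ is, up to a factor $\sqrt{d}$, one coordinate after rotating onto the direction $(1,\dots,1)/\sqrt{d}$, which is one way to read the variance-addition property. A small numerical sketch (the orthogonal matrix below is randomly generated, purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(7)
d, N = 3, 500_000
X = rng.normal(size=(N, d))             # iid standard normal components

# A random orthogonal matrix via QR decomposition of a Gaussian matrix.
Q, _ = np.linalg.qr(rng.normal(size=(d, d)))
Y = X @ Q.T                             # rotated samples

print(np.cov(X.T))                      # approximately the identity
print(np.cov(Y.T))                      # still approximately the identity: law unchanged

# Sum of coordinates: variance is d, the sum of the individual variances.
print(X.sum(axis=1).var(ddof=1), float(d))
```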