Simplifying covariance matrices in distributions

In the multivariate Gaussian distribution, the covariance matrix is required to be positive semidefinite. I have read that a positive semidefinite matrix $\Sigma$ can be written as $LL^{T}$. I have also seen that $\Sigma=(C^{T}C)^{-1}$ where $C$ is positive definite (is this true?). However, I'm not entirely sure why this is true.
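As a quick numerical sanity check of the Cholesky route (a sketch using NumPy; the matrix values are made up, and $C$ is taken here to be the inverse Cholesky factor, which is triangular rather than positive definite but satisfies the same identity $\Sigma=(C^{T}C)^{-1}$):

```python
import numpy as np

# A small positive definite covariance matrix (values chosen for illustration).
Sigma = np.array([[2.0, 0.6],
                  [0.6, 1.0]])

# Cholesky factorization: Sigma = L @ L.T with L lower triangular.
L = np.linalg.cholesky(Sigma)
print(np.allclose(L @ L.T, Sigma))                 # True

# With C = L^{-1}: Sigma^{-1} = (L L^T)^{-1} = L^{-T} L^{-1} = C.T @ C,
# which is the same as Sigma = (C^T C)^{-1}.
C = np.linalg.inv(L)
print(np.allclose(C.T @ C, np.linalg.inv(Sigma)))  # True
```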

I want to know which facts about positive semidefinite matrices can be used to manipulate a multivariate Gaussian or similar distributions. For example, in a Gaussian distribution:

$$f(x) = \frac{1}{(2\pi)^{D/2}|\Sigma|^{1/2}}\exp\left(-\frac{1}{2}(x-\mu)^{T}\Sigma^{-1}(x-\mu)\right)$$

we can use $\Sigma^{-1}=C^{T}C$ (although I'm not entirely certain this is correct). Then the change of variables $y = C(x-\mu)$ lets us write $f(x)$ as:

$$f(x) = \frac{1}{(2\pi)^{D/2}|\Sigma|^{1/2}}\exp\left(-\frac{1}{2}y^{T}y\right)$$

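This change of variables can be sketched numerically (assuming NumPy; $\mu$, $\Sigma$, and the sample size are made-up example values): whitening samples by $y = C(x-\mu)$ should yield approximately identity covariance, matching the $y^{T}y$ form of the exponent.

```python
import numpy as np

rng = np.random.default_rng(0)
mu = np.array([1.0, -2.0])
Sigma = np.array([[2.0, 0.6],
                  [0.6, 1.0]])

# Take C with Sigma^{-1} = C.T @ C, e.g. the inverse Cholesky factor.
C = np.linalg.inv(np.linalg.cholesky(Sigma))

# Draw samples and whiten each one: y = C (x - mu).
x = rng.multivariate_normal(mu, Sigma, size=200_000)
y = (x - mu) @ C.T

# The whitened samples have approximately the identity covariance matrix.
print(np.cov(y.T).round(2))
```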
Another example would be $\Sigma^{-1} = \sum_{i=1}^{D} \frac{1}{\lambda_{i}}u_{i}u_{i}^{T}$, where the $u_{i}$ are the eigenvectors of $\Sigma$ and the $\lambda_{i}$ its eigenvalues. This expression can also be used to simplify $f(x)$.
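This rank-one expansion of $\Sigma^{-1}$ can be checked numerically with NumPy's `eigh` (the example matrix is made up):

```python
import numpy as np

Sigma = np.array([[2.0, 0.6],
                  [0.6, 1.0]])

# eigh returns eigenvalues lam and orthonormal eigenvectors as columns of U.
lam, U = np.linalg.eigh(Sigma)

# Rebuild Sigma^{-1} as the sum of rank-one terms (1/lam_i) u_i u_i^T.
Sigma_inv = sum(np.outer(U[:, i], U[:, i]) / lam[i] for i in range(len(lam)))
print(np.allclose(Sigma_inv, np.linalg.inv(Sigma)))  # True
```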

What other representations are there for $\Sigma^{-1}$? Or what is the correct procedure in the examples I described above?


A third example could be simply taking $\Sigma^{-1} = \Sigma^{-\frac{1}{2}T}\Sigma^{-\frac{1}{2}}$ (because $\Sigma$ is symmetric, so is $\Sigma^{-1/2}$) and changing variables with $y = \Sigma^{-1/2}(x-\mu)$.
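A sketch of one common way to compute the symmetric matrix $\Sigma^{-1/2}$, via the eigendecomposition (NumPy assumed; the example matrix is again made up):

```python
import numpy as np

Sigma = np.array([[2.0, 0.6],
                  [0.6, 1.0]])

lam, U = np.linalg.eigh(Sigma)

# Symmetric inverse square root: Sigma^{-1/2} = U diag(lam^{-1/2}) U^T.
Sigma_inv_sqrt = U @ np.diag(lam ** -0.5) @ U.T

# It is symmetric, so Sigma^{-1/2 T} Sigma^{-1/2} = Sigma^{-1/2} Sigma^{-1/2},
# and that product recovers Sigma^{-1}.
print(np.allclose(Sigma_inv_sqrt, Sigma_inv_sqrt.T))                       # True
print(np.allclose(Sigma_inv_sqrt @ Sigma_inv_sqrt, np.linalg.inv(Sigma)))  # True
```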

Initially I had written a fourth option with $\Sigma = G^{T}DG$, where $G$ is an orthogonal matrix and $D$ is a diagonal matrix containing $\Sigma$'s eigenvalues. I think that's wrong and the correct decomposition should be $\Sigma = G\Lambda G^{T}$; however, this still produces issues because the change of variables $y = G^{T}(x-\mu)$ leads to a differential volume of the following form:

$$dy_{1}\,dy_{2}\cdots dy_{n} = |G^{T}|\,dx_{1}\,dx_{2}\cdots dx_{n}$$

but the determinant of $G$ might be $1$ or $-1$, which is not good.
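For what it's worth, the change-of-variables formula uses the absolute value of the Jacobian determinant, so $\det G = \pm 1$ still gives $|\det G| = 1$ and the volume element is preserved. A quick check with NumPy (example matrix made up):

```python
import numpy as np

Sigma = np.array([[2.0, 0.6],
                  [0.6, 1.0]])

# Eigendecomposition Sigma = G @ diag(lam) @ G.T, with G orthogonal.
lam, G = np.linalg.eigh(Sigma)

# G orthogonal implies |det G| = 1, so the volume element is unchanged:
# dy_1...dy_n = |det G^T| dx_1...dx_n = dx_1...dx_n.
print(abs(np.linalg.det(G)))  # ≈ 1 (up to floating point)
```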
