In the multivariate Gaussian distribution, the covariance matrix is required to be positive semidefinite. I have read that a positive semidefinite matrix $\Sigma$ can be written as $LL^{T}$. I have also seen that $\Sigma=(C^{T}C)^{-1}$ where $C$ is positive definite (is this true?). However, I'm not entirely sure why this is true.
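As a quick numerical sanity check of the $\Sigma = LL^{T}$ claim (a sketch, assuming a strictly positive definite $\Sigma$ so the Cholesky factorization exists; the matrix here is a random example, not anything specific):

```python
import numpy as np

# Build a random symmetric positive definite matrix Sigma
# (A A^T is PSD; adding a multiple of I makes it strictly PD).
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
Sigma = A @ A.T + 4 * np.eye(4)

# Cholesky gives a lower-triangular L with Sigma = L L^T.
L = np.linalg.cholesky(Sigma)
print(np.allclose(L @ L.T, Sigma))  # True
```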

I want to know which facts about positive semidefinite matrices can be used to manipulate a multivariate Gaussian or similar distributions. For example, in a Gaussian distribution:

$$f(x)=\exp{\{-\frac{1}{2}(x-\mu)^{T}\Sigma^{-1}(x-\mu)\}}$$

we can use $\Sigma^{-1}=C^{T}C$ (although I’m not entirely certain this is correct). Therefore, a change of variable $y = C (x-\mu)$ can be used to write $f(x)$ as:

$$f(x)=\exp{\{-\frac{1}{2}y^{T}y\}}$$
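One concrete choice for $C$ that I can check numerically is $C = L^{-1}$ with $L$ the Cholesky factor of $\Sigma$, since then $C^{T}C = L^{-T}L^{-1} = \Sigma^{-1}$ (a sketch with a random positive definite $\Sigma$; this is just one possible factorization, not necessarily the intended one):

```python
import numpy as np

# Random positive definite Sigma plus a random mean and point.
rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))
Sigma = A @ A.T + 3 * np.eye(3)
mu = rng.standard_normal(3)
x = rng.standard_normal(3)

L = np.linalg.cholesky(Sigma)   # Sigma = L L^T
C = np.linalg.inv(L)            # so C^T C = L^{-T} L^{-1} = Sigma^{-1}
y = C @ (x - mu)                # the change of variable y = C (x - mu)

# The quadratic form in the exponent is preserved:
quad = (x - mu) @ np.linalg.solve(Sigma, x - mu)
print(np.allclose(y @ y, quad))  # True
```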

Another example could be using $\Sigma^{-1} = \sum_{i=1}^{D} \frac{1}{\lambda_{i}}u_{i}u_{i}^{T}$, where the $u_{i}$ are the eigenvectors of $\Sigma$ and the $\lambda_{i}$ its eigenvalues. This expression can also be used to simplify $f(x)$.
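This spectral expression for $\Sigma^{-1}$ is also easy to verify numerically (again a sketch with a random positive definite $\Sigma$, so all $\lambda_i > 0$):

```python
import numpy as np

# Random symmetric positive definite Sigma.
rng = np.random.default_rng(2)
A = rng.standard_normal((4, 4))
Sigma = A @ A.T + 4 * np.eye(4)

# eigh returns eigenvalues lam[i] and orthonormal eigenvectors U[:, i].
lam, U = np.linalg.eigh(Sigma)

# Sigma^{-1} as the sum of rank-one terms (1/lambda_i) u_i u_i^T.
Sigma_inv = sum((1.0 / lam[i]) * np.outer(U[:, i], U[:, i]) for i in range(4))
print(np.allclose(Sigma_inv, np.linalg.inv(Sigma)))  # True
```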

What other representations are there for $\Sigma^{-1}$? Or what is the correct procedure in the examples I described above?

**UPDATE:**

A third example could be simply taking $\Sigma^{-1} = (\Sigma^{-1/2})^{T}\Sigma^{-1/2}$ (because $\Sigma$ is symmetric, and hence so is $\Sigma^{-1/2}$) and changing variables with $y = \Sigma^{-1/2}(x-\mu)$.
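The symmetric square root in this third example can be built from the eigendecomposition, $\Sigma^{-1/2} = U\Lambda^{-1/2}U^{T}$ (a sketch, again on a random positive definite $\Sigma$):

```python
import numpy as np

# Random symmetric positive definite Sigma.
rng = np.random.default_rng(3)
A = rng.standard_normal((3, 3))
Sigma = A @ A.T + 3 * np.eye(3)

# Symmetric Sigma^{-1/2} = U diag(lam^{-1/2}) U^T.
lam, U = np.linalg.eigh(Sigma)
Sigma_inv_half = U @ np.diag(lam ** -0.5) @ U.T

# Since Sigma^{-1/2} is symmetric, (Sigma^{-1/2})^T Sigma^{-1/2} = Sigma^{-1}.
print(np.allclose(Sigma_inv_half.T @ Sigma_inv_half, np.linalg.inv(Sigma)))  # True
```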

Initially I had written a fourth option with $\Sigma = G^{T}DG$, where $G$ is an orthogonal matrix and $D$ is a diagonal matrix containing $\Sigma$'s eigenvalues. I think that's wrong and the correct decomposition should be $\Sigma=G\Lambda G^{T}$; however, this still produces issues, because the change of variable $y=G^{T}(x-\mu)$ leads to a differential volume of the form:

$$dy_{1}dy_{2}\cdots dy_{n} = |G^{T}|\,dx_{1}dx_{2}\cdots dx_{n}$$

but the determinant of $G$ might be $1$ or $-1$, which seems problematic.
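For what it's worth, the change-of-variables formula uses the absolute value of the Jacobian determinant, and for an orthogonal $G$ we always have $|\det G| = 1$, so the volume element is unchanged even when $\det G = -1$. A quick check (the orthogonal matrix here is a random example built via QR):

```python
import numpy as np

# A random orthogonal matrix G from the QR decomposition of a random matrix.
rng = np.random.default_rng(4)
G, _ = np.linalg.qr(rng.standard_normal((4, 4)))

# det G may be +1 or -1, but its absolute value is always 1,
# so |det G^T| = 1 and the differential volume is preserved.
print(np.isclose(abs(np.linalg.det(G)), 1.0))  # True
```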

Thanks!
