I am trying to understand singular value decomposition. I understand the general definition and how to compute the singular values that form the SVD of a given matrix; however, I came across the following problem and realized that I did not fully understand how the SVD works: Let $0\ne u\in \mathbb{R}^{m}$. Determine an SVD for […]

Suppose $\Omega$ is a Gaussian matrix with entries distributed i.i.d. according to the standard normal distribution $\mathcal{N}(0,1)$. Let $U \Sigma V^{\mathsf T}$ be its singular value decomposition. What would be the distribution of the column (or row) vectors of $U$ and $V$? Would it be Gaussian or something closely related?
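A quick numerical sketch of what the question is probing (matrix sizes and the random seed are arbitrary choices): the columns of $U$ and $V$ are unit vectors, so they cannot themselves be Gaussian; by rotational invariance of the Gaussian ensemble they are uniformly distributed on the unit sphere.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 50, 30
omega = rng.standard_normal((m, n))  # i.i.d. N(0,1) entries

# Full SVD: U is m x m orthogonal, Vt is n x n orthogonal.
U, s, Vt = np.linalg.svd(omega)

# Columns of U are unit vectors (so not Gaussian); by rotational
# invariance of the Gaussian ensemble, each is uniform on the sphere.
assert np.allclose(np.linalg.norm(U, axis=0), 1.0)
assert np.allclose(U.T @ U, np.eye(m), atol=1e-10)
```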

Define $S^n_{++}$ to be the set of all $n \times n$ positive definite matrices; that is, if $A \in S^n_{++}$, then $A$ is a positive definite matrix. Now suppose that $A,B \in S^n_{++}$ are two positive definite matrices. How can one prove that $$B - A \in S^n_{++}$$ if and only if $$I - A^{1/2}B^{-1}A^{1/2} \in S^n_{++}?$$
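One direction of the equivalence can be sanity-checked numerically (a sketch; the dimension, seed, and the `random_spd` helper are illustrative choices, not from the question): construct $A$ and $B$ with $B - A$ positive definite and verify both spectra are positive.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4

def random_spd(rng, n):
    # M M^T + I is symmetric positive definite.
    M = rng.standard_normal((n, n))
    return M @ M.T + np.eye(n)

A = random_spd(rng, n)
B = A + random_spd(rng, n)          # guarantees B - A is positive definite

# Symmetric square root of A via its eigendecomposition.
w, Q = np.linalg.eigh(A)
A_half = Q @ np.diag(np.sqrt(w)) @ Q.T

C = np.eye(n) - A_half @ np.linalg.inv(B) @ A_half
# Both matrices should have strictly positive spectra.
assert np.linalg.eigvalsh(B - A).min() > 0
assert np.linalg.eigvalsh(C).min() > 0
```

The underlying reason is a congruence: $B - A \succ 0$ iff $A^{-1/2}BA^{-1/2} \succ I$, and inverting a positive definite matrix reverses the ordering against $I$, giving $A^{1/2}B^{-1}A^{1/2} \prec I$.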

In this post J.M. has mentioned that … In fact, using the SVD to perform PCA makes much better sense numerically than forming the covariance matrix to begin with, since the formation of $XX^\top$ can cause loss of precision. This is detailed in books on numerical linear algebra, but I’ll leave you with an example […]
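The precision loss can be seen with a small Läuchli-style example (a sketch; the particular matrix and $\varepsilon$ are illustrative): forming the Gram matrix squares the singular values, and $\varepsilon^2$ underflows relative to $1$ in double precision, so the small singular value is destroyed.

```python
import numpy as np

eps = 1e-8
# Lauchli-style matrix: the Gram matrix X^T X squares the condition number.
X = np.array([[1.0, 1.0],
              [eps, 0.0],
              [0.0, eps]])

# Singular values computed directly from X: [~sqrt(2), ~eps].
s = np.linalg.svd(X, compute_uv=False)

# "Singular values" via eigenvalues of X^T X: 1 + eps^2 rounds to 1
# in double precision, so the small singular value comes out as 0.
w = np.linalg.eigvalsh(X.T @ X)[::-1]
s_gram = np.sqrt(np.clip(w, 0.0, None))

print(s)        # direct SVD recovers the small singular value ~ eps
print(s_gram)   # Gram-matrix route loses it entirely
```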

I am giving a presentation in two days about a search engine I have been building over the past summer, and my research involved the use of singular value decompositions, i.e., $A=U\Sigma V^T$. I took a high school course on Linear Algebra last year, but the course was not very thorough, and though […]

I am dealing with a problem similar to principal component analysis. That is, I have a matrix and I want to recover the 'most efficient basis' to explain the matrix's variability. For a square matrix these are the eigenvectors, weighted by the eigenvalues. Originally, I was dealing with square matrices, and I used eigendecomposition to recover […]
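For a rectangular data matrix the SVD plays this role directly (a minimal sketch, assuming rows are observations; the data and seed are arbitrary): the right singular vectors of the centered matrix are the principal directions, and they agree with the eigenvectors of the covariance matrix.

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.standard_normal((100, 5)) @ rng.standard_normal((5, 5))

# Center the columns, then SVD: the rows of Vt are the "most efficient
# basis" (principal directions); s**2/(n-1) is the variance each explains.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

# Same directions from the eigendecomposition of the covariance matrix.
w, Q = np.linalg.eigh(Xc.T @ Xc / (len(X) - 1))
w, Q = w[::-1], Q[:, ::-1]          # sort by decreasing eigenvalue

assert np.allclose(s**2 / (len(X) - 1), w)
# Top principal direction matches up to sign.
assert np.isclose(abs(Q[:, 0] @ Vt[0]), 1.0, atol=1e-6)
```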

Given two square matrices $A$ and $B$, is the following inequality true: $$\operatorname{cond}(AB) \leq \operatorname{cond}(A)\operatorname{cond}(B),$$ where $\operatorname{cond}$ is the condition number? Is this still true for rectangular matrices? I know this is true: $$||AB|| \leq ||A|| \cdot ||B||$$ The condition number of a matrix is defined as $$\operatorname{cond}(A)=||A|| \cdot ||A^{-1}||$$
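For square invertible matrices the inequality follows by applying norm submultiplicativity to both $AB$ and $(AB)^{-1} = B^{-1}A^{-1}$; a numerical spot check (a sketch with an arbitrary seed and size, using the 2-norm condition number):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 5
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))

# cond(AB) = ||AB|| ||(AB)^{-1}|| <= ||A|| ||B|| ||B^{-1}|| ||A^{-1}||
#          = cond(A) cond(B), by submultiplicativity of the norm.
lhs = np.linalg.cond(A @ B)
rhs = np.linalg.cond(A) * np.linalg.cond(B)
assert lhs <= rhs * (1 + 1e-10)     # slack for floating-point rounding
```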

Say I have the following maximization: $\max_R \operatorname{trace}(RZ)$ subject to $R^T R = I_n$, where $R$ is an $n \times n$ orthogonal matrix. Also, the SVD of $Z$ is $Z = USV^T$. I'm trying to find the optimal $R^*$, which intuitively I know is equal to $VU^T$, since $\operatorname{trace}(RZ) = \operatorname{trace}(VU^T USV^T) = \operatorname{trace}(S)$. I […]
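This is the orthogonal Procrustes-style argument, and the claimed optimum can be checked numerically (a sketch; the dimension, seed, and the random-rotation spot check are illustrative): $R^* = VU^T$ attains $\operatorname{trace}(S)$, and random orthogonal matrices never exceed it.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 4
Z = rng.standard_normal((n, n))

U, s, Vt = np.linalg.svd(Z)
R_star = Vt.T @ U.T                 # candidate maximizer R* = V U^T

# trace(R* Z) = trace(V U^T U S V^T) = trace(S) = sum of singular values.
assert np.isclose(np.trace(R_star @ Z), s.sum())

# Other orthogonal R do no better (spot check with random orthogonal Q).
for _ in range(100):
    Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
    assert np.trace(Q @ Z) <= s.sum() + 1e-10
```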

Here's an attempt to motivate the SVD. Let $A \in \mathbb R^{m \times n}$. It's natural to ask: in what direction does $A$ have the most "impact"? In other words, for which unit vector $v$ is $\| A v \|_2$ the largest? Denote this unit vector by $v_1$. Let $\sigma_1 = \| A v_1 \|_2$, […]
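The direction of greatest "impact" is exactly the top right singular vector, which can be illustrated numerically (a sketch; the matrix size, seed, and random search are arbitrary): no unit vector beats $v_1$, for which $\|Av_1\|_2 = \sigma_1$.

```python
import numpy as np

rng = np.random.default_rng(5)
A = rng.standard_normal((6, 4))

# The top right singular vector v1 maximizes ||A v||_2 over unit vectors,
# and attains the top singular value sigma_1.
U, s, Vt = np.linalg.svd(A)
v1, sigma1 = Vt[0], s[0]

assert np.isclose(np.linalg.norm(A @ v1), sigma1)
# Random unit vectors never exceed sigma_1.
for _ in range(1000):
    v = rng.standard_normal(4)
    v /= np.linalg.norm(v)
    assert np.linalg.norm(A @ v) <= sigma1 + 1e-10
```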

This may be a trivial question, yet I was unable to find an answer: $$\left \| A \right \| _2=\sqrt{\lambda_{\text{max}}(A^{*}A)}=\sigma_{\text{max}}(A)$$ where the spectral norm $\left \| A \right \| _2$ of a complex matrix $A$ is defined as $$\max \left\{ \|Ax\|_2 : \|x\|_2 = 1 \right\}$$ How does one prove the first and the second […]
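The chain of equalities can be verified numerically for a complex matrix (a sketch; the size and seed are arbitrary): the spectral norm, the largest singular value, and the square root of the largest eigenvalue of $A^*A$ all coincide.

```python
import numpy as np

rng = np.random.default_rng(6)
A = rng.standard_normal((4, 3)) + 1j * rng.standard_normal((4, 3))

sigma_max = np.linalg.svd(A, compute_uv=False)[0]
lam_max = np.linalg.eigvalsh(A.conj().T @ A)[-1]   # A^* A is Hermitian PSD

# ||A||_2 (spectral norm) = sigma_max(A) = sqrt(lambda_max(A^* A)).
assert np.isclose(np.linalg.norm(A, 2), sigma_max)
assert np.isclose(sigma_max, np.sqrt(lam_max))
```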
