
Suppose $\mathbf{X} = (X_1, X_2, \ldots, X_n)$ is the vector of original components (random variables) and $\mathbf{w}_j = (\omega_{j1}, \omega_{j2}, \ldots, \omega_{jn})$ is the loading vector for the $j$th principal component, satisfying $\mathbf{w}_j^{\mathrm{T}}\mathbf{w}_j = 1$ and $\mathbf{w}_i^{\mathrm{T}}\mathbf{w}_j = 0$ for $i \neq j$. Thus $z_j = \mathbf{w}_j^{\mathrm{T}}\mathbf{X}$ is the $j$th principal component.

To find the first principal component, we maximize the variance of $z_1$, which is $\operatorname{var}(z_1) = \operatorname{var}(\mathbf{w}_1^{\mathrm{T}}\mathbf{X}) = \mathbf{w}_1^{\mathrm{T}}\operatorname{var}(\mathbf{X})\,\mathbf{w}_1$. Estimating $\operatorname{var}(\mathbf{X})$ by the sample covariance matrix $\mathbf{S}$, we maximize the Lagrangian $L = \mathbf{w}_1^{\mathrm{T}}\mathbf{S}\mathbf{w}_1 - \lambda(\mathbf{w}_1^{\mathrm{T}}\mathbf{w}_1 - 1)$, where $\lambda$ is the Lagrange multiplier. Setting the derivative with respect to $\mathbf{w}_1$ to zero gives $(\mathbf{S} - \lambda\mathbf{I})\mathbf{w}_1 = 0$, so $\mathbf{w}_1$ must be an eigenvector of the sample covariance matrix $\mathbf{S}$.
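The stationarity condition $(\mathbf{S} - \lambda\mathbf{I})\mathbf{w}_1 = 0$ can be checked numerically. Below is a minimal sketch (with made-up data; only NumPy is assumed) showing that the eigenvectors of the sample covariance matrix are exactly the stationary points of the constrained problem:

```python
import numpy as np

# Illustrative data: 500 samples of a 3-dimensional random vector X
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3)) @ np.array([[3.0, 0.0, 0.0],
                                          [1.0, 1.0, 0.0],
                                          [0.5, 0.2, 0.3]])

S = np.cov(X, rowvar=False)           # sample covariance matrix S

# Eigendecomposition of the symmetric matrix S (eigenvalues ascending)
eigvals, eigvecs = np.linalg.eigh(S)

w1 = eigvecs[:, -1]                   # eigenvector of the largest eigenvalue
# w1 is a unit vector and satisfies (S - lambda I) w1 = 0
assert np.isclose(np.linalg.norm(w1), 1.0)
assert np.allclose(S @ w1, eigvals[-1] * w1)
```

Here `np.linalg.eigh` is used because $\mathbf{S}$ is symmetric; it returns orthonormal eigenvectors, which automatically satisfy the constraints $\mathbf{w}_j^{\mathrm{T}}\mathbf{w}_j = 1$ and $\mathbf{w}_i^{\mathrm{T}}\mathbf{w}_j = 0$.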

**Now the problem comes.** Solving this equation gives you all the eigenvalues and eigenvectors. Every source I found online simply tells you to rank the eigenvalues: the eigenvector of the largest eigenvalue is the first principal component, the eigenvector of the second largest eigenvalue is the second principal component, and so on.


**My question is how to show or prove** that the largest eigenvalue corresponds to the largest variance, the second largest eigenvalue to the second largest variance, and so on. Thank you.


OK, I now feel this question is a little dumb, because I finally see why. I hope my answer will be helpful for someone else.

The reason is **so simple**! Since we are maximizing $\mathbf{w}^{\mathrm{T}}\mathbf{S}\mathbf{w}$ and we now know the candidates are unit eigenvectors, just **plug them back in**: $\mathbf{w}^{\mathrm{T}}\mathbf{S}\mathbf{w} = \mathbf{w}^{\mathrm{T}}\lambda\mathbf{w} = \lambda\,\mathbf{w}^{\mathrm{T}}\mathbf{w} = \lambda$, using the constraint $\mathbf{w}^{\mathrm{T}}\mathbf{w} = 1$. That means the eigenvalues are nothing but the variances of the components. That's why we rank by eigenvalue!

It is so simple, but it really took me an entire day to figure out…
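The plug-in step can also be verified numerically. A minimal sketch (illustrative data; only NumPy assumed): for each unit eigenvector $\mathbf{w}$ of $\mathbf{S}$, the projected variance $\mathbf{w}^{\mathrm{T}}\mathbf{S}\mathbf{w}$, and equivalently the sample variance of the scores $z = \mathbf{w}^{\mathrm{T}}\mathbf{X}$, equals the corresponding eigenvalue $\lambda$:

```python
import numpy as np

# Illustrative data: 1000 samples of a 4-dimensional random vector
rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 4)) * np.array([5.0, 2.0, 1.0, 0.5])

S = np.cov(X, rowvar=False)           # sample covariance matrix
eigvals, eigvecs = np.linalg.eigh(S)  # ascending eigenvalues, unit eigenvectors

for lam, w in zip(eigvals, eigvecs.T):
    # quadratic form w^T S w collapses to the eigenvalue lambda
    assert np.isclose(w @ S @ w, lam)
    # ...which is exactly the sample variance of the projected scores z = X w
    assert np.isclose(np.var(X @ w, ddof=1), lam)
```

Sorting the eigenvalues in descending order therefore sorts the components by variance, which is the ranking every textbook prescribes.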
