Articles on linear algebra

Why isn't the derivative of a rotation matrix skew symmetric?

Consider the rotation matrix $$R(\theta) = \begin{pmatrix} \cos \theta & -\sin \theta \\ \sin \theta & \cos \theta \end{pmatrix}$$ Differentiating $R(\theta)$ with respect to $\theta$ gives a matrix that is not skew symmetric. Aren’t infinitesimal rotations heuristically supposed to be skew-symmetric?
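
As a quick check with the $2\times 2$ matrix above (a sketch rather than a full answer), one can compare the derivative with the combination $R'(\theta)R(\theta)^{-1}$: $$R'(\theta) = \begin{pmatrix} -\sin \theta & -\cos \theta \\ \cos \theta & -\sin \theta \end{pmatrix}, \qquad R'(\theta)\,R(\theta)^{-1} = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix},$$ so $R'(\theta)$ itself need not be skew-symmetric, while $R'(\theta)R(\theta)^{-1}$ (and in particular $R'(0)$) is.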

Let $T$ be a self-adjoint operator with $\langle T(w),w\rangle>0$ for all nonzero $w\in W$. If $\operatorname{dim}(W) = k$, then $T$ has at least $k$ positive eigenvalues

Qn: Let $V$ be a finite-dimensional complex inner product space and $T$ a self-adjoint linear operator on $V$. Suppose there exists a subspace $W$ of $V$ such that $\langle T(w),w \rangle$ is positive for all nonzero $w \in W$. If $\operatorname{dim}(W) = k$, prove that $T$ has at least $k$ positive eigenvalues (counting algebraic multiplicities). Below is my […]
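
One standard route, sketched under the assumption that the Courant–Fischer min–max theorem may be used: writing $\lambda_1 \ge \cdots \ge \lambda_n$ for the eigenvalues of $T$ (with multiplicity), $$\lambda_k \;=\; \max_{\dim U = k}\; \min_{0 \neq u \in U} \frac{\langle T(u),u\rangle}{\langle u,u\rangle} \;\ge\; \min_{0 \neq w \in W} \frac{\langle T(w),w\rangle}{\langle w,w\rangle} \;>\; 0,$$ where the last minimum is attained (and positive) by compactness of the unit sphere of $W$; hence $\lambda_1,\ldots,\lambda_k$ are all positive.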

Show that the linear transformation $T:V\to V$ defined by $T(B)=ABA^{-1}$ is diagonalizable

Notation: If $\Bbb F=\Bbb R$ or $\Bbb C$, denote by $M_n(\Bbb F)$ the $n\times n$ matrices with entries in $\Bbb F$. Let $V=M_3(\Bbb C)$ be a $9$-dimensional vector space over $\Bbb C$ and let $$A=\begin{pmatrix} 0 & 0 & 2 \\ 1 & 0 & 1 \\ 0 & 1 & -2 \\ \end{pmatrix}.$$ Define the linear […]
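
One possible approach, sketched under the assumption that diagonalizing $A$ itself is allowed: $$\det(xI-A) = x^3 + 2x^2 - x - 2 = (x-1)(x+1)(x+2),$$ so $A$ has three distinct eigenvalues and $A = PDP^{-1}$ with $D=\operatorname{diag}(\lambda_1,\lambda_2,\lambda_3)$. If $T(B)=ABA^{-1}$ as in the title, the nine matrices $B_{ij} = P E_{ij} P^{-1}$ (with $E_{ij}$ the standard basis matrices) satisfy $T(B_{ij}) = (\lambda_i/\lambda_j)\,B_{ij}$, giving an eigenbasis of $V$.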

Symplectic basis $(A_i,B_i)$ such that $S=\operatorname{span}(A_1,B_1,\ldots,A_k,B_k)$ for some $k$ when $S$ is symplectic

Let $(V,\omega)$ be a symplectic vector space of dimension $2n$. How can I show that for a symplectic subspace $S \subset V$, there exists a symplectic basis $(A_i,B_i)$ such that $S=\operatorname{span}(A_1,B_1,\ldots,A_k,B_k)$ for some $k$? I know that since $S$ is symplectic, $S \cap S^\perp = \{0\}$, and this is true iff $\omega|_S$ is […]
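
A sketch of one standard argument, assuming $S^\perp$ above denotes the symplectic complement $S^{\omega}$: since $\omega|_S$ is nondegenerate, $$V = S \oplus S^{\omega},$$ and $\omega$ restricts to a symplectic form on both summands. Choosing a symplectic basis $(A_1,B_1,\ldots,A_k,B_k)$ of $S$ (so $\dim S = 2k$) and a symplectic basis of $S^{\omega}$, their union is a symplectic basis of $V$ because $\omega(s,s')=0$ whenever $s \in S$ and $s' \in S^{\omega}$, and by construction $S=\operatorname{span}(A_1,B_1,\ldots,A_k,B_k)$.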

Is there a polynomial $f\in \mathbb Q[x]$ such that $f(x)^2=g(x)^2(x^2+1)$?

I was asked the following question: $g\in \mathbb Q[x]$ is a polynomial (not the zero polynomial). Find $f \in \mathbb Q[x]$ such that $f(x)^2=g(x)^2(x^2+1)$ or show that such an $f$ does not exist. I really have no idea where to begin and would appreciate all help I can get to solve this.
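
One way in, sketched using the standard facts that $\mathbb Q[x]$ has unique factorization and that $x^2+1$ is irreducible over $\mathbb Q$: let $v(h)$ denote the multiplicity of the factor $x^2+1$ in a nonzero $h\in\mathbb Q[x]$. Then $$v\!\left(f^2\right) = 2\,v(f) \ \text{ is even}, \qquad v\!\left(g^2(x^2+1)\right) = 2\,v(g) + 1 \ \text{ is odd},$$ so the two sides cannot have the same factorization and no such $f$ exists.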

Proof that the characteristic polynomial of a $2 \times 2$ matrix is $x^2 - \text{tr}(A) x + \det (A)$

Let $$ A=\begin{bmatrix} a_{11} & a_{12}\\ a_{21} & a_{22}\\ \end{bmatrix}$$ and let $C_{A}(x) := \det(xI-A)$ be the characteristic polynomial of $A$. Show that $$C_{A}(x)=x^2-\text{tr}(A)x+\det(A).$$ I know that $\text{tr}(A)=a_{11}+a_{22}$ and $\det(A)=a_{11}a_{22}-a_{21}a_{12}$. Plugging this into the above equation I get $$C_{A}(x)=x^2-(a_{11}+a_{22})x+a_{11}a_{22}-a_{21}a_{12}.$$ I’m not sure how to get past this. As you can tell, I’m not too good at […]
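
One way to finish is simply to expand the determinant from the definition and compare: $$C_{A}(x)=\det\begin{bmatrix} x-a_{11} & -a_{12}\\ -a_{21} & x-a_{22}\\ \end{bmatrix}=(x-a_{11})(x-a_{22})-a_{12}a_{21}=x^2-(a_{11}+a_{22})x+a_{11}a_{22}-a_{21}a_{12},$$ which matches the expression obtained above by plugging in $\text{tr}(A)$ and $\det(A)$.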

Norm and inner product

Is the square of the norm of a vector always the same as the inner product of that vector with itself? In probability theory we frequently use the $L^p$ norm: $\|X\|_p=\left(E|X|^p\right)^{1/p}$. But don’t we still use $E(XY)$ as the inner product? In that case, $\langle X,X\rangle=E(X^2)$, which is not equal to the square of the norm when […]
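
A short clarifying computation, assuming the inner product meant here is $\langle X,Y\rangle = E(XY)$: the identity $\|X\|^2 = \langle X,X\rangle$ holds for the norm induced by that inner product, namely the $L^2$ norm, $$\|X\|_2^2 = \left(\left(E\,X^2\right)^{1/2}\right)^2 = E\,X^2 = \langle X,X\rangle,$$ but for $p \neq 2$ the quantity $\left(E|X|^p\right)^{2/p}$ generally differs from $E\,X^2$, and the $L^p$ norm with $p\neq 2$ is not induced by any inner product (it fails the parallelogram law).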

Logical formula for the definition of linear dependence

A subset $S$ of a vector space $V$ is said to be linearly dependent if there exist finitely many distinct vectors $x_1, \ldots , x_n$ in $S$ and scalars $a_1 , \ldots ,a_n$, not all zero, such that $$ a_1 x_1 + \cdots + a_n x_n =0. $$ I want to translate this […]
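
One possible translation (written with the scalar field as $F$ and the quantifier over $n$ made explicit; the distinctness and "not all zero" clauses become conjuncts): $$\exists n \in \mathbb N\ \exists x_1,\ldots,x_n \in S\ \exists a_1,\ldots,a_n \in F\ \Big[\big(\forall i\neq j:\ x_i \neq x_j\big) \wedge \big(\exists i:\ a_i \neq 0\big) \wedge a_1x_1+\cdots+a_nx_n = 0\Big].$$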

Linear Algebra, eigenvalues and eigenvectors

For any $m \times m$ matrix, I will get a characteristic polynomial of degree $m$ with $m$ eigenvalues. But for the matrix $$A = \pmatrix{2 & 1 & 1 & 1 & 1 & 1 \\ 1 & 1 & 0 & 1 & 0 & 1 \\ 1 & 0 & 1 & 0 […]

How to prove $AB$ is a diagonalizable matrix?

Let $A$ be a positive definite matrix and $B$ a Hermitian matrix. How can one prove that $AB$ is diagonalizable?
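
A sketch of one standard argument, assuming the positive definite square root $A^{1/2}$ of $A$ may be used: $$A^{-1/2}(AB)A^{1/2} = A^{1/2} B A^{1/2},$$ and $A^{1/2} B A^{1/2}$ is Hermitian because $\big(A^{1/2} B A^{1/2}\big)^{*} = A^{1/2} B^{*} A^{1/2} = A^{1/2} B A^{1/2}$. So $AB$ is similar to a Hermitian matrix, hence diagonalizable (with real eigenvalues).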