Articles on matrices

Symbol for Euclidean norm (Euclidean distance)

Which symbol is more commonly used to denote the Euclidean norm: $\left\| \mathbf a \right\|$ or $\left| \mathbf a \right|$?

Prove this consequence of Cramer's theorem.

Prove that for every matrix $A$ in $K^{n\times n}$, where $K$ is a field, there exists a $B$ in $K^{n\times n}$ such that $AB = BA = (\det A) \times I$ ($I$ denotes the unit matrix). Later edit: Sure, for $\det A \ne 0$, $B = A^*$ satisfies the equalities, where $A^*$ is the adjoint […]
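
For context (a sketch, not part of the excerpt): the standard construction takes $B$ to be the classical adjugate $\operatorname{adj}A$ (the $A^*$ of the excerpt), whose $(i,j)$ entry is the $(j,i)$ cofactor $C_{ji}$. The Laplace expansion then gives, over any field $K$, $$ (A\,\operatorname{adj}A)_{ij} \;=\; \sum_{k=1}^{n} a_{ik}\,C_{jk} \;=\; \begin{cases}\det A, & i=j,\\ 0, & i\neq j,\end{cases} $$ since for $i\neq j$ the sum is the cofactor expansion of a determinant with two equal rows. The same argument applied to $A^T$ yields $(\operatorname{adj}A)\,A=(\det A)\,I$, with no invertibility assumption needed.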

Let $A$ be an $8\times 5$ matrix of rank 3, and let $b$ be a nonzero vector in $N(A^T)$. Show $Ax=b$ must be inconsistent.

Here’s the entire question: Let $A$ be an $8\times 5$ matrix of rank 3, and let $b$ be a nonzero vector in $N(A^T)$. a) Show that the system $Ax = b$ must be inconsistent. Gonna take a wild stab at this one… If the rank is 3, that means the dimension of the column […]
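
For context (a sketch, not part of the excerpt): $b\in N(A^T)$ means $A^{T}b=0$, i.e. $b$ is orthogonal to every column of $A$, so if $Ax=b$ had a solution then $$ \|b\|^{2} \;=\; b^{T}b \;=\; b^{T}Ax \;=\; (A^{T}b)^{T}x \;=\; 0, $$ forcing $b=0$ and contradicting the assumption that $b$ is nonzero. The specific size and rank of $A$ are not even needed for this part.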

The variance of the expected distortion of a linear transformation

Let $A: \mathbb{R}^n \to \mathbb{R}^n$ be a linear transformation. I am interested in the “average distortion” caused by the action of $A$ on vectors (i.e., stretching or contraction of the norm). Consider the uniform distribution on $\mathbb{S}^{n-1}$, and the random variable $X:\mathbb{S}^{n-1} \to \mathbb{R}$ defined by $X(x)=\|A(x)\|_2^2$. It is easy to see that $E(X)=\frac{1}{n}\sum_{i=1}^n \sigma_i^2$, […]
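
For context (a sketch, not part of the excerpt), the stated mean follows from $\mathbb{E}[xx^{T}]=\frac{1}{n}I$ for $x$ uniform on $\mathbb{S}^{n-1}$, where the $\sigma_i$ are the singular values of $A$: $$ E(X)\;=\;\mathbb{E}\big[x^{T}A^{T}Ax\big]\;=\;\operatorname{tr}\!\big(A^{T}A\,\mathbb{E}[xx^{T}]\big)\;=\;\frac{1}{n}\operatorname{tr}(A^{T}A)\;=\;\frac{1}{n}\sum_{i=1}^{n}\sigma_i^{2}. $$ Computing the variance then reduces to the fourth-moment calculation $\mathbb{E}\big[(x^{T}A^{T}Ax)^{2}\big]$ on the sphere.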

If $A$ is positive definite then so is $A^k$

I know how to show that the inverse of a positive definite matrix is positive definite, but I don’t know how to extend that. Suppose $A$ is positive definite; then $A$ is invertible, so define $y=Ax$ for $x\neq 0$. Then $y^TA^{-1}y=x^TA^TA^{-1}Ax=x^TAx>0$, so the inverse of $A$ is positive definite. How can I show that for other powers of […]
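
For context (a sketch, not part of the excerpt, assuming $A$ is symmetric as is usual in this setting): since $A$ is invertible, $A^{m}x\neq 0$ whenever $x\neq 0$, and splitting into even and odd powers gives $$ x^{T}A^{2m}x=\big(A^{m}x\big)^{T}\big(A^{m}x\big)=\|A^{m}x\|^{2}>0, \qquad x^{T}A^{2m+1}x=\big(A^{m}x\big)^{T}A\,\big(A^{m}x\big)>0. $$ Equivalently, an eigendecomposition $A=Q\Lambda Q^{T}$ gives $A^{k}=Q\Lambda^{k}Q^{T}$ with eigenvalues $\lambda_i^{k}>0$.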

Determine invariant subspaces

Imagine that the matrix of an endomorphism has characteristic polynomial $(\lambda-2)^2(\lambda-3)$. Now I was wondering: can all invariant subspaces be determined from $0$, $V$ and $\ker(A-2)^2$, $\ker(A-2)$, $\ker(A-3)$? Or how do I find them?
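
For context (not part of the excerpt, writing $I$ for the identity): because $(\lambda-2)^2$ and $(\lambda-3)$ are coprime, the primary decomposition gives $$ V=\ker(A-2I)^{2}\oplus\ker(A-3I), \qquad W=\big(W\cap\ker(A-2I)^{2}\big)\oplus\big(W\cap\ker(A-3I)\big) $$ for every $A$-invariant subspace $W$, so it suffices to classify the invariant subspaces of the restriction of $A$ to each summand. For instance, if the restriction to $\ker(A-2I)^{2}$ is $2I$ (i.e. $A$ is diagonalizable), then every line in that summand is invariant, not just $\ker(A-2I)$.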

Eigenvectors of the $2\times2$ zero matrix

I have been given a problem that involves the following matrix: $$\begin{bmatrix}-2 & 0\\0 & -2\end{bmatrix}$$ I calculated the eigenvalues to be $\lambda_{1,2} = -2$. When I go to calculate the eigenvectors, I get the following system: $$\begin{bmatrix}0 & 0\\0 & 0\end{bmatrix}\begin{bmatrix}x \\ y\end{bmatrix} = \begin{bmatrix}0 \\ 0\end{bmatrix}$$ The eigenvectors are clearly $\begin{bmatrix}1 \\ 0\end{bmatrix}$ […]
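
For context (not part of the excerpt): with $A=-2I$, the matrix $A-\lambda I$ is the zero matrix at $\lambda=-2$, so $$ (A-\lambda I)\,v=\begin{bmatrix}0 & 0\\0 & 0\end{bmatrix}v=\begin{bmatrix}0 \\ 0\end{bmatrix}\quad\text{for every }v\in\mathbb{R}^{2}, $$ i.e. every nonzero vector is an eigenvector. The eigenspace is all of $\mathbb{R}^{2}$, and $\begin{bmatrix}1 \\ 0\end{bmatrix}$, $\begin{bmatrix}0 \\ 1\end{bmatrix}$ is simply one convenient basis for it.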

Matrix sequence convergence vs. matrix power series convergence

Is my thinking correct? The sequence $A^n$ converges if each entry converges to a finite number. But for a matrix power series, $ I + A + \cdots + A^n + \cdots $ can never converge if it has, for example, a “1” in the upper left corner, in entry $a_{11}$. Take, for simplicity, $A$ […]
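
For context (my own illustration, not necessarily the matrix the truncated question goes on to use), the two notions really do come apart: with $$ A=\begin{bmatrix}1 & 0\\0 & 0\end{bmatrix},\qquad A^{n}=A\ \text{ for all } n\ge 1,\qquad I+A+\cdots+A^{n}=\begin{bmatrix}n+1 & 0\\0 & 1\end{bmatrix}, $$ the power sequence converges (it is eventually constant) while the partial sums blow up in the $(1,1)$ entry. The series $\sum_{k\ge 0} A^{k}$ converges exactly when the spectral radius satisfies $\rho(A)<1$.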

Induction for Vandermonde Matrix

Given real numbers $x_1<x_2<\cdots<x_n$, define the Vandermonde matrix by $V=(V_{ij}) = (x^j_i)$. That is, $$V = \left(\begin{array}{cccccc} 1 & x_1 & x^2_1 & \cdots & x^{n-1}_1 & x^n_1 \\ 1 & x_2 & x^2_2 & \cdots & x^{n-1}_2 & x^n_2 \\ \vdots & \vdots & \vdots & & \vdots & \vdots \\ 1 & x_{n-1} & x^2_{n-1} & \cdots & x^{n-1}_{n-1} & x^n_{n-1} \\ 1 & x_n & x^2_n & \cdots & x^{n-1}_n & x^n_n \end{array}\right)$$ […]
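
The excerpt is cut off before the actual statement to be proved, but for context, the identity most commonly proved by induction on $n$ concerns the square version $V_{ij}=x_i^{\,j-1}$ with $1\le i,j\le n$: $$ \det\left(\begin{array}{cccc} 1 & x_1 & \cdots & x^{n-1}_1 \\ 1 & x_2 & \cdots & x^{n-1}_2 \\ \vdots & \vdots & & \vdots \\ 1 & x_n & \cdots & x^{n-1}_n \end{array}\right) = \prod_{1\le i<j\le n}(x_j-x_i), $$ which is nonzero here because the $x_i$ are strictly increasing, hence distinct.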

Where did this matrix come from?

In my lecture my professor spoke of this function $R$ that takes a vector $\vec u=\left(\begin{matrix}a\\b\end{matrix}\right)$ and rotates it by $\frac{\pi}{6}$ radians counterclockwise. Then he talked about a matrix $M=\left(\begin{matrix}\frac{\sqrt{3}}{2} & -\frac12\\\frac12 & \frac{\sqrt{3}}{2}\end{matrix}\right)$. He showed how to find the side lengths of a triangle with hypotenuse of length $1$, and I have a feeling that the $x$ […]
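
For context (the standard derivation, not part of the excerpt): the columns of $M$ record where the rotation sends the standard basis vectors, and the triangle with hypotenuse $1$ supplies the legs $\cos\frac{\pi}{6}=\frac{\sqrt{3}}{2}$ and $\sin\frac{\pi}{6}=\frac12$: $$ R_\theta=\left(\begin{matrix}\cos\theta & -\sin\theta\\ \sin\theta & \cos\theta\end{matrix}\right), \qquad R_{\pi/6}=\left(\begin{matrix}\frac{\sqrt{3}}{2} & -\frac12\\ \frac12 & \frac{\sqrt{3}}{2}\end{matrix}\right)=M, \qquad R\!\left(\begin{matrix}a\\b\end{matrix}\right)=M\left(\begin{matrix}a\\b\end{matrix}\right). $$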