Articles on matrices

The trace of the matrix $I + M + M^2$

Let $\alpha = e^{\frac{2\pi i}{5}}$ and consider the matrix $$ M= \begin{pmatrix}1 & \alpha & \alpha^2 & \alpha^3 & \alpha^4\\ 0 & \alpha & \alpha^2 & \alpha^3 & \alpha^4\\ 0 & 0 & \alpha^2 & \alpha^3 & \alpha^4 \\ 0 & 0 & 0 & \alpha^3 & \alpha^4\\ 0 & 0 & 0 & 0 […]
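
A sketch of one approach, assuming the cut-off bottom-right entry of $M$ is $\alpha^4$: $M$ is upper triangular, so $I + M + M^2$ is too, and its trace is determined by the diagonal entries $1, \alpha, \alpha^2, \alpha^3, \alpha^4$ alone. Since $\alpha$ is a primitive 5th root of unity, both geometric sums below vanish:

$$\operatorname{tr}\!\left(I + M + M^2\right) = \sum_{k=0}^{4}\left(1 + \alpha^{k} + \alpha^{2k}\right) = 5 + \sum_{k=0}^{4}\alpha^{k} + \sum_{k=0}^{4}\alpha^{2k} = 5 + 0 + 0 = 5.$$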

Complex square matrices. Difficult proof.

$\det(I+A\bar{A}) \ge 0$. Is it possible to prove that this inequality holds for all complex square matrices $A$, where $I$ is the identity matrix and $\bar{A}$ is the entrywise complex conjugate of $A$?
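
Not a proof, but a quick randomized sanity check of the claim is easy to run; a minimal NumPy sketch (note that $\bar{A}$ is the entrywise conjugate, so `A.conj()` rather than the conjugate transpose):

```python
import numpy as np

rng = np.random.default_rng(0)
for _ in range(1000):
    n = int(rng.integers(1, 6))                  # random size 1..5
    A = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    d = np.linalg.det(np.eye(n) + A @ A.conj())  # entrywise conjugate, not A.conj().T
    assert abs(d.imag) < 1e-8                    # determinant comes out (numerically) real
    assert d.real >= -1e-8                       # and nonnegative
```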

Is always possible to find a generalized eigenvector for the Jordan basis M?

$A$ is a defective matrix, meaning that there are fewer linearly independent eigenvectors than eigenvalues; the algebraic multiplicity of $\lambda_1$ is $\nu_1 = 2$ while the geometric multiplicity is $\mu_1 = 1$: $$ A = \begin{bmatrix}1 & 1 \\ 0 & 1\end{bmatrix}, \lambda_1 = 1, e_1 = \begin{bmatrix}1 \\0\end{bmatrix} $$ The block diagonal matrix $J$ […]
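
For this particular $A$ the chain can be completed explicitly (a sketch; the truncated part of the question is not shown): a generalized eigenvector $v_2$ solves $(A - \lambda_1 I)v_2 = e_1$,

$$(A - I)\,v_2 = \begin{bmatrix}0 & 1\\ 0 & 0\end{bmatrix} v_2 = \begin{bmatrix}1\\0\end{bmatrix} \;\Longrightarrow\; v_2 = \begin{bmatrix}0\\1\end{bmatrix} \text{ (up to adding multiples of } e_1\text{)},$$

and $P = [\,e_1 \;\; v_2\,]$ then satisfies $P^{-1}AP = J = \begin{bmatrix}1 & 1\\ 0 & 1\end{bmatrix}$.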

What is the isomorphism function in $M_m(M_n(\mathbb R))\cong M_{mn}(\mathbb R)$?

What is the isomorphism function in $M_m(M_n(\mathbb R))\cong M_{mn}(\mathbb R)$? I tried $[[a_{ij}]_{kl}]\mapsto[a_{ijkl}]$, but I couldn't prove all the steps.
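
One concrete way to write the map (a sketch, not necessarily the asker's intended notation): send the block matrix $[B_{ij}]$ with $B_{ij}\in M_n(\mathbb R)$ to the $mn\times mn$ matrix whose $\bigl((i-1)n+k,\,(j-1)n+l\bigr)$ entry is $(B_{ij})_{kl}$. In 0-based indexing this is exactly a reshape, and multiplicativity can be sanity-checked numerically:

```python
import numpy as np

def flatten(B):
    """Map a block matrix, stored as an (m, m, n, n) array whose (i, j) entry is an
    n-by-n block, to the (m*n, m*n) matrix: block (i, j), entry (k, l) goes to
    row i*n + k, column j*n + l."""
    m, _, n, _ = B.shape
    return B.transpose(0, 2, 1, 3).reshape(m * n, m * n)

def block_matmul(A, B):
    """Multiplication in M_m(M_n(R)): C[i, j] = sum_p A[i, p] @ B[p, j]."""
    return np.einsum('ipkl,pjlm->ijkm', A, B)

rng = np.random.default_rng(1)
m, n = 3, 2
A = rng.normal(size=(m, m, n, n))
B = rng.normal(size=(m, m, n, n))

# The map is linear and bijective; the step that needs checking is multiplicativity:
assert np.allclose(flatten(block_matmul(A, B)), flatten(A) @ flatten(B))
```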

FLOSS tool to visualize 2- and 3-space matrix transformations

I’m looking for a FLOSS application (Windows or Ubuntu but preferably both) that can help me visualize matrix transformations in 2- and 3-space. So I’d like to be able to enter a vector or matrix, see it in 2-space or 3-space, enter a transformation vector or matrix, and see the result. For example, enter a […]
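
Not one of the asked-for dedicated applications, but as a baseline, a few lines of matplotlib (FLOSS, runs on both Windows and Ubuntu) already give the 2-space version of "enter a matrix, see what it does"; a minimal sketch with an example shear:

```python
import numpy as np
import matplotlib.pyplot as plt

# Transformation to visualize (example: a shear); edit freely.
T = np.array([[1.0, 1.0],
              [0.0, 1.0]])

square = np.array([[0, 1, 1, 0, 0],    # x-coordinates of the unit square
                   [0, 0, 1, 1, 0]])   # y-coordinates
image = T @ square                     # transformed square

fig, ax = plt.subplots()
ax.plot(square[0], square[1], 'b-', label='original')
ax.plot(image[0], image[1], 'r-', label='transformed')
ax.set_aspect('equal')
ax.legend()
plt.show()
```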

Minimum of a quadratic form

If $\bf{A}$ is a real symmetric matrix, we know that it has orthogonal eigenvectors. Now, say we want to find a unit vector $\bf{n}$ that minimizes the form: $${\bf{n}}^T{\bf{A}}\ {\bf{n}}$$ How can one prove that this vector is given by the eigenvector corresponding to the minimum eigenvalue of $\bf{A}$? I have a proof of my […]
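
One standard argument, sketched: let $u_1,\dots,u_N$ be an orthonormal eigenbasis of $\bf{A}$ with eigenvalues $\lambda_1\le\dots\le\lambda_N$, and write $\mathbf{n} = \sum_i c_i u_i$ with $\sum_i c_i^2 = 1$. Then

$$\mathbf{n}^T\mathbf{A}\,\mathbf{n} = \sum_i \lambda_i c_i^2 \;\ge\; \lambda_1 \sum_i c_i^2 = \lambda_1,$$

with equality when $\mathbf{n} = u_1$, so the minimum over unit vectors is the smallest eigenvalue, attained at a corresponding eigenvector.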

Proving that a right (or left) inverse of a square matrix is unique using only basic matrix operations

Proving that a right (or left) inverse of a square matrix is unique using only basic matrix operations, i.e. without any reference to higher-order notions such as rank, vector spaces or whatever :). More precisely, armed only with knowledge of: the rules for matrix equality, addition, and multiplication; the distributive law and friends; Gauss-Jordan elimination […]
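
A sketch of the easy half: uniqueness itself needs nothing beyond associativity. If $AB = I$ and $CA = I$, then

$$C = CI = C(AB) = (CA)B = IB = B.$$

The part that genuinely uses Gauss-Jordan elimination is showing that, for a square matrix, a one-sided inverse is automatically two-sided; once that is in hand, the computation above gives uniqueness.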

Proof of existence of square root of unitary and symmetric matrix

I’m struggling with this exercise. Let $U$ be a unitary and symmetric matrix ($U^T = U$ and $U^*U = I$). Prove that there exists a complex matrix $S$ such that: $S^2 = U$; $S$ is a unitary matrix; $S$ is symmetric; and every matrix that commutes with $U$ also commutes with $S$.
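
One standard route, sketched (not a full solution): write $U = X + iY$ with $X, Y$ real. From $U^T = U$ one gets $X^T = X$ and $Y^T = Y$; from $\bar U U = I$ one gets $X^2 + Y^2 = I$ and $XY = YX$. Commuting real symmetric matrices are simultaneously diagonalizable by a real orthogonal $Q$, so

$$U = Q\,\mathrm{diag}\!\left(e^{i\theta_1},\dots,e^{i\theta_n}\right)Q^T, \qquad S := Q\,\mathrm{diag}\!\left(e^{i\theta_1/2},\dots,e^{i\theta_n/2}\right)Q^T.$$

Then $S^2 = U$, $S^T = S$, and $S^*S = I$. For the last property, choose the square roots consistently on equal eigenvalues so that $S = p(U)$ for some polynomial $p$; then anything commuting with $U$ also commutes with $S$.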

When is the matrix $\text{diag}(\mathbf{x}) + \mathbf{1}$ invertible?

Given a vector $\mathbf{x} \in \mathbb{R}^N$, let’s define: $$\text{diag}(\mathbf{x}) = \begin{pmatrix} x_1 & 0 & \ldots & 0 \\ 0 & x_2 & \ldots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \ldots & x_N \end{pmatrix}.$$ Moreover, let $$\mathbf{1}= \begin{pmatrix} 1 & 1 & \ldots & 1 \\ […]
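
A sketch of one way to answer this: $\mathbf{1} = \mathbf{e}\mathbf{e}^T$ with $\mathbf{e}$ the all-ones vector, so the matrix is a rank-one update of $\text{diag}(\mathbf{x})$. If every $x_i \neq 0$, the matrix determinant lemma gives

$$\det\!\bigl(\text{diag}(\mathbf{x}) + \mathbf{e}\mathbf{e}^T\bigr) = \Bigl(\prod_{i=1}^N x_i\Bigr)\Bigl(1 + \sum_{i=1}^N \frac{1}{x_i}\Bigr),$$

so invertibility is equivalent to $1 + \sum_i 1/x_i \neq 0$. More generally (covering zero entries), $\det(\text{diag}(\mathbf{x}) + \mathbf{e}\mathbf{e}^T) = \prod_i x_i + \sum_j \prod_{i\neq j} x_i$, and the matrix is invertible exactly when this is nonzero.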

Eigenvalues of a matrix with binomial entries

I am trying to determine the eigenvalues of the following matrix: $$M_{ij} = 4^{-j}\binom{2j}{i}$$ where it is understood that the binomial coefficient $\binom{m}{k}$ is zero if $k<0$ or $k>m$. Here $i,j$ go from $0$ to $N$, therefore the matrix is $(N+1)\times(N+1)$. If an exact expression is not available, I would content myself with approximations valid […]
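
Not an exact answer, but a small NumPy sketch for computing these eigenvalues numerically is often useful for guessing the asymptotics before attempting a closed form:

```python
import numpy as np
from math import comb

def M(N):
    """Matrix with entries M[i, j] = 4**(-j) * binom(2j, i), for i, j = 0..N.
    math.comb(2j, i) already returns 0 when i > 2j."""
    return np.array([[comb(2 * j, i) / 4 ** j for j in range(N + 1)]
                     for i in range(N + 1)], dtype=float)

for N in (5, 10, 20):
    eig = np.linalg.eigvals(M(N))
    print(N, np.sort_complex(eig))
```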