
If $A \in M_n(\mathbb{R})$ is an anti-diagonal $n \times n$ matrix, is there a quick way to find its eigenvalues, similar to reading off the eigenvalues of a diagonal matrix? The standard method works for any $n \times n$ matrix but can be computationally painstaking. I'm wondering whether there is a shortcut for anti-diagonal matrices that avoids the standard route of computing the determinant of $A - \lambda I$, where $I$ is the $n \times n$ identity matrix, setting it equal to $0$, and solving for $\lambda$.


For ease of formatting and explanation, I’ll be doing everything for the $5 \times 5$ example. However, the same trick works for any $n \times n$ anti-diagonal matrix (though slightly differently for even $n$).

Suppose

$$

A =

\begin{pmatrix}0&0&0&0&a_{15}\\0&0&0&a_{24}&0\\0&0&a_{33}&0&0\\0&a_{42}&0&0&0\\a_{51}&0&0&0&0 \end{pmatrix}

$$

Here’s a neat trick: we note that

$$

A^2 = \pmatrix{

a_{15}a_{51}&&&&\\

&a_{24}a_{42}&&&\\

&&(a_{33})^2&&\\

&&&a_{24}a_{42}&\\

&&&&a_{15}a_{51}\\

}

$$

So, the eigenvalues of $A^2$ are precisely $\{a_{15}a_{51}, a_{24}a_{42}, (a_{33})^2\}$.

Now, note that if $\lambda$ is an eigenvalue of $A$, then $\lambda^2$ must be an eigenvalue of $A^2$. This gives you (at most) six candidates for the eigenvalues of $A$: the two square roots of each eigenvalue of $A^2$.
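The squaring step is easy to check numerically. Below is a minimal plain-Python sketch (the sample entries $2, 3, 5, 7, 11$ and the helper name `matmul` are illustrative assumptions, not part of the original) confirming that $A^2$ is diagonal with exactly the stated entries:

```python
# Verify that A^2 is diagonal for the 5x5 anti-diagonal example above.
# Sample values for a15, a24, a33, a42, a51 are chosen for illustration.

def matmul(X, Y):
    """Plain-Python matrix product."""
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

a15, a24, a33, a42, a51 = 2.0, 3.0, 5.0, 7.0, 11.0
A = [[0, 0, 0, 0, a15],
     [0, 0, 0, a24, 0],
     [0, 0, a33, 0, 0],
     [0, a42, 0, 0, 0],
     [a51, 0, 0, 0, 0]]

A2 = matmul(A, A)
diag = [A2[i][i] for i in range(5)]
print(diag)  # [a15*a51, a24*a42, a33**2, a24*a42, a15*a51]
off_diag_zero = all(A2[i][j] == 0 for i in range(5) for j in range(5) if i != j)
print(off_diag_zero)  # True
```

Note the symmetry of the diagonal of $A^2$: the products $a_{15}a_{51}$ and $a_{24}a_{42}$ each appear twice, which is why the six candidate eigenvalues pair up.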

In fact, with more thorough analysis, we can guarantee that the eigenvalues will be precisely $\lambda = \pm \sqrt{a_{i,(n+1-i)}a_{(n+1-i),i}}$ for $i = 1,\dots,\lfloor n/2\rfloor$ and, for odd $n$, $\lambda = a_{(n+1)/2,(n+1)/2}$.

**Proof that this is the case:** Let $e_1,\dots,e_n$ denote the standard basis vectors. Let $S_{ij}$ denote the span of the vectors $e_i$ and $e_j$.

Note that the subspace $S_{i,n+1-i}$ is invariant under $A$ for $i = 1,\dots,\lfloor n/2\rfloor$. We may then consider the restriction $A_{i,n+1-i}: S_{i,n+1-i} \to S_{i,n+1-i}$, which can be represented by the matrix

$$

\pmatrix{0 & a_{i,n+1-i}\\a_{n+1-i,i} & 0}

$$

It suffices to find the eigenvalues of this transformation.

For the case of odd $n$, it suffices to note that the central entry $a_{(n+1)/2,(n+1)/2}$ lies on the diagonal with zeros elsewhere in its row and column, so $e_{(n+1)/2}$ is an eigenvector with eigenvalue $a_{(n+1)/2,(n+1)/2}$.
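For the $2 \times 2$ restriction, the eigenvalues and eigenvectors can be written down explicitly: $\pmatrix{0&b\\c&0}$ has eigenvalues $\lambda = \pm\sqrt{bc}$ with eigenvector $(b, \lambda)$. A small sketch checking this (the values $b = 3$, $c = 12$ are arbitrary samples):

```python
# For the 2x2 restriction [[0, b], [c, 0]], the eigenvalues are
# lambda = ±sqrt(b*c) with eigenvector v = (b, lambda).  Here b and c
# are sample values standing in for a_{i,n+1-i} and a_{n+1-i,i}.
import math

b, c = 3.0, 12.0          # b*c = 36, so lambda = ±6
for lam in (math.sqrt(b * c), -math.sqrt(b * c)):
    v = (b, lam)                      # candidate eigenvector
    Av = (b * v[1], c * v[0])         # [[0,b],[c,0]] applied to v
    # Check A v = lambda v componentwise
    assert all(abs(Av[k] - lam * v[k]) < 1e-12 for k in range(2))
print("both square roots of b*c are eigenvalues")
```

The check works because $A v = (b\lambda, cb) = (\lambda b, \lambda^2) = \lambda (b, \lambda)$ whenever $\lambda^2 = bc$.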

Another explanation: denote the matrix

$S = \pmatrix{e_1 & e_{n} & e_2 & e_{n-1} & \cdots}$

Noting that $S$ is a permutation matrix, so that $S^{-1} = S^{\mathsf T}$, we find that

$$

S^{-1}AS =

\pmatrix{

0&a_{1,n}\\

a_{n,1}&0\\

&&0&a_{2,n-1}\\

&&a_{n-1,2}&0\\

&&&&\ddots

}

$$

This matrix is similar to $A$ and therefore has the same eigenvalues. However, it is also block diagonal, so its eigenvalues are those of the individual blocks.
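The change of basis can be verified directly. The sketch below (sample entries and the helper names `matmul`, `transpose`, `order` are illustrative assumptions) builds the permutation matrix $S$ with columns $e_1, e_5, e_2, e_4, e_3$ for the $5 \times 5$ example and checks that conjugating produces the block-diagonal form:

```python
# Check that reordering the basis as e_1, e_5, e_2, e_4, e_3 turns the
# 5x5 anti-diagonal example into the block-diagonal form shown above.
# Sample entries are for illustration.

def matmul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def transpose(X):
    return [list(row) for row in zip(*X)]

A = [[0, 0, 0, 0, 2],
     [0, 0, 0, 3, 0],
     [0, 0, 5, 0, 0],
     [0, 7, 0, 0, 0],
     [11, 0, 0, 0, 0]]

order = [0, 4, 1, 3, 2]               # columns e_1, e_5, e_2, e_4, e_3 (0-based)
S = [[1 if i == order[j] else 0 for j in range(5)] for i in range(5)]
# For a permutation matrix, the inverse is the transpose.
B = matmul(matmul(transpose(S), A), S)
print(B)
```

The result consists of the $2 \times 2$ blocks $\pmatrix{0&2\\11&0}$ and $\pmatrix{0&3\\7&0}$ followed by the lone diagonal entry $5$, exactly as claimed.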

Suppose $A$ has even size, say, $2m \times 2m$. Then by reordering the basis we can produce a block diagonal matrix with the same eigenvalues as the original.

$$

\left(

\begin{array}{cc}

0 & a_{1,2m} \\

a_{2m,1} & 0 \\

\end{array}

\right)

\oplus

\cdots

\oplus

\left(

\begin{array}{cc}

0 & a_{m,m+1} \\

a_{m+1,m} & 0 \\

\end{array}

\right).

$$

The characteristic polynomial is

$$\det(\lambda I - A) = \det\left(\lambda I - \left(

\begin{array}{cc}

0 & a_{1,2m} \\

a_{2m,1} & 0 \\

\end{array}

\right)\right)

\cdots \det\left(\lambda I - \left(

\begin{array}{cc}

0 & a_{m,m+1} \\

a_{m+1,m} & 0 \\

\end{array}

\right)\right)

= (\lambda^2 - a_{1,2m}a_{2m,1})\cdots(\lambda^2 - a_{m,m+1}a_{m+1,m}).

$$

The eigenvalues are therefore the roots of these quadratic factors, namely both square roots of each of the products $a_{1,2m}a_{2m,1}, \ldots, a_{m,m+1} a_{m+1,m}$.

If $A$ has odd size, say $(2m + 1) \times (2m + 1)$, then when we change basis we can send the middle ($(m+1)$st) basis vector to the end. The characteristic polynomial then takes the same form as in the even case (with indices greater than $m$ shifted up by one), but with an additional factor of $\lambda - a_{m+1, m+1}$; the extra eigenvalue in this case is just the central entry of the matrix.
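The full eigenvalue list from both cases can be checked against explicit eigenvectors: for each anti-diagonal pair $b = a_{i,n+1-i}$, $c = a_{n+1-i,i}$, the vector $b\,e_i + \lambda\,e_{n+1-i}$ is an eigenvector for $\lambda = \pm\sqrt{bc}$. A sketch doing this for an odd and an even example (the helper names `anti_diag_eigvec_check` and the sample entries are illustrative assumptions, and positive products are assumed so the square roots are real):

```python
# Verify lambda = ±sqrt(a_{i,n+1-i} a_{n+1-i,i}) (plus the centre entry
# for odd n) by checking A v = lambda v with v = b e_i + lambda e_{n+1-i}.
import math

def anti_diag_eigvec_check(entries):
    """entries[i] = a_{i, n-1-i} (0-based anti-diagonal, top to bottom).
    Assumes each product entries[i]*entries[n-1-i] is nonnegative."""
    n = len(entries)
    A = [[entries[i] if j == n - 1 - i else 0 for j in range(n)]
         for i in range(n)]
    ok = True
    for i in range(n // 2):
        b, c = entries[i], entries[n - 1 - i]
        for lam in (math.sqrt(b * c), -math.sqrt(b * c)):
            v = [0.0] * n
            v[i], v[n - 1 - i] = b, lam       # v = b e_i + lam e_{n+1-i}
            Av = [sum(A[r][k] * v[k] for k in range(n)) for r in range(n)]
            ok = ok and all(abs(Av[r] - lam * v[r]) < 1e-9 for r in range(n))
    if n % 2:                                  # odd n: centre entry is an eigenvalue
        mid = n // 2
        v = [0.0] * n
        v[mid] = 1.0
        Av = [sum(A[r][k] * v[k] for k in range(n)) for r in range(n)]
        ok = ok and all(abs(Av[r] - entries[mid] * v[r]) < 1e-9 for r in range(n))
    return ok

print(anti_diag_eigvec_check([2.0, 3.0, 5.0, 7.0, 11.0]))  # odd n = 5
print(anti_diag_eigvec_check([1.0, 4.0, 9.0, 16.0]))       # even n = 4
```

This exhibits $n$ linearly independent eigenvectors whenever the products are nonzero, so the listed values are indeed all of the eigenvalues.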

One can actually find the eigenvalues of $A$ **directly** without reordering the indices (note: I’m not saying that this is the quickest or preferred way to find those eigenvalues — this is not). The eigenvalues of $A$ are just the roots of the characteristic polynomial $\det(xI-A)$. When $n$ is odd, let $m=\frac{n+1}2$ and write

$$

A=\left[\begin{array}{c|c|c}0&0&A_{13}\\ \hline 0&a_{mm}&0\\ \hline A_{31}&0&0\end{array}\right]\tag{$\ast$}

$$

where $A_{13}$ and $A_{31}$ are two anti-diagonal matrices of size $m-1$. By Laplace expansion along the middle row, we get $\det(xI-A)=(x-a_{mm})\det\left[\begin{array}{c|c}xI&-A_{13}\\ \hline -A_{31}&xI\end{array}\right]$. Using the formula $\det\pmatrix{X&Y\\ Z&W}=\det(XW-YZ)$, valid when $Z$ and $W$ commute, we further obtain

\begin{align*}

\det(xI-A)&=(x-a_{mm})\det(x^2I-A_{13}A_{31})\\

&= (x-a_{mm})(x^2-a_{1n}a_{n1})(x^2-a_{2,n-1}a_{n-1,2})\cdots\left(x^2-a_{m-1,m+1}a_{m+1,m-1}\right)

\end{align*}

and finding its roots is a trivial matter.

When $n$ is even, the middle row and column in $(\ast)$ are absent, and we may skip the Laplace expansion step. The characteristic polynomial of $A$ is then

$$

(x^2-a_{1n}a_{n1})(x^2-a_{2,n-1}a_{n-1,2})\cdots\left(x^2-a_{\frac n2,\frac n2+1}a_{\frac n2+1,\frac n2}\right).

$$
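The factorisation of the characteristic polynomial can be spot-checked by evaluating both sides at a few sample values of $x$. The sketch below does this for the $5 \times 5$ example ($n$ odd, $m = 3$), using a small recursive Laplace-expansion determinant; the helper name `det`, the sample entries, and the sample points are illustrative assumptions:

```python
# Spot-check det(xI - A) = (x - a_33)(x^2 - a_15 a_51)(x^2 - a_24 a_42)
# for the 5x5 example by evaluating both sides at sample values of x.

def det(M):
    """Determinant by Laplace expansion along the first row (small n only)."""
    n = len(M)
    if n == 1:
        return M[0][0]
    total = 0
    for j in range(n):
        minor = [row[:j] + row[j + 1:] for row in M[1:]]
        total += (-1) ** j * M[0][j] * det(minor)
    return total

entries = [2, 3, 5, 7, 11]   # a_{1,5}, a_{2,4}, a_{3,3}, a_{4,2}, a_{5,1}
n = len(entries)
A = [[entries[i] if j == n - 1 - i else 0 for j in range(n)] for i in range(n)]

for x in (-3, 0, 1, 4):
    lhs = det([[x * (i == j) - A[i][j] for j in range(n)] for i in range(n)])
    rhs = (x - 5) * (x * x - 2 * 11) * (x * x - 3 * 7)
    assert lhs == rhs
print("factorisation verified at sample points")
```

Since both sides are monic polynomials of degree $5$, agreement at more than $5$ points would force equality; the few points above are just a sanity check.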
