Quick way to find eigenvalues of anti-diagonal matrix

If $A \in M_n(\mathbb{R})$ is an anti-diagonal $n \times n$ matrix, is there a quick way to find its eigenvalues, similar to reading off the eigenvalues of a diagonal matrix? The standard method works for any $n \times n$ matrix and is straightforward, but can be computationally painstaking. I'm wondering whether there is a quicker way for an anti-diagonal matrix that avoids the standard method of computing the determinant of $A - \lambda I$, where $I$ is the $n \times n$ identity matrix, setting it equal to $0$, and solving for $\lambda$.

Answers

For ease of formatting and explanation, I’ll be doing everything for the $5 \times 5$ example. However, the same trick works for any $n \times n$ anti-diagonal matrix (though slightly differently for even $n$).

Suppose
$$
A =
\begin{pmatrix}
0&0&0&0&a_{15}\\
0&0&0&a_{24}&0\\
0&0&a_{33}&0&0\\
0&a_{42}&0&0&0\\
a_{51}&0&0&0&0
\end{pmatrix}
$$

Here’s a neat trick: we note that
$$
A^2 = \pmatrix{
a_{15}a_{51}&&&&\\
&a_{24}a_{42}&&&\\
&&(a_{33})^2&&\\
&&&a_{24}a_{42}&\\
&&&&a_{15}a_{51}\\
}
$$
So, the eigenvalues of $A^2$ are precisely $\{a_{15}a_{51}, a_{24}a_{42}, (a_{33})^2\}$.

Now, note that if $\lambda$ is an eigenvalue of $A$, then $\lambda^2$ must be an eigenvalue of $A^2$. This gives you six candidates for the eigenvalues of $A$.


In fact, with more thorough analysis, we can guarantee that the eigenvalues will be precisely $\lambda = \pm \sqrt{a_{i,(n+1-i)}a_{(n+1-i),i}}$ for $i = 1,\dots,\lfloor n/2\rfloor$ and, for odd $n$, $\lambda = a_{(n+1)/2,(n+1)/2}$.
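As a sanity check, this closed form can be compared against a direct numerical eigenvalue solve. The sketch below (assuming NumPy is available; the entry values are arbitrary placeholders) builds the $5 \times 5$ example and verifies that $\pm\sqrt{a_{i,n+1-i}\,a_{n+1-i,i}}$ together with the centre entry match `np.linalg.eigvals`:

```python
import numpy as np

# Hypothetical entries for the 5x5 example: a_15, a_24, a_33, a_42, a_51.
vals = [2.0, 3.0, -1.0, 5.0, 4.0]
n = len(vals)
A = np.zeros((n, n))
for i, v in enumerate(vals):
    A[i, n - 1 - i] = v  # 0-based: entry (i, n+1-i) in 1-based indexing

# Predicted eigenvalues: +/- sqrt(a_{i,n+1-i} a_{n+1-i,i}) plus the centre entry.
predicted = []
for i in range(n // 2):
    p = complex(A[i, n - 1 - i] * A[n - 1 - i, i])
    predicted += [np.sqrt(p), -np.sqrt(p)]
predicted.append(complex(A[n // 2, n // 2]))

computed = np.linalg.eigvals(A).astype(complex)
match = np.allclose(np.sort_complex(np.array(predicted)),
                    np.sort_complex(computed))
print(match)  # True: the closed form agrees with the direct solve
```

Negative products $a_{i,n+1-i}\,a_{n+1-i,i}$ simply produce purely imaginary eigenvalue pairs, which is why the comparison is done over the complex numbers.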

Proof that this is the case: Let $e_1,\dots,e_n$ denote the standard basis vectors. Let $S_{ij}$ denote the span of the vectors $e_i$ and $e_j$.

Note that each subspace $S_{i,n+1-i}$ for $i = 1,\dots,\lfloor n/2\rfloor$ is invariant under $A$. We may then consider the restriction $A_{i,n+1-i}: S_{i,n+1-i} \to S_{i,n+1-i}$, which can be represented by the matrix
$$
\pmatrix{0 & a_{i,n+1-i}\\a_{n+1-i,i} & 0}
$$
It suffices to find the eigenvalues of this transformation.
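Explicitly, for a $2 \times 2$ block of this shape with anti-diagonal entries $a$ and $b$,
$$
\det\pmatrix{-\lambda & a\\ b & -\lambda} = \lambda^2 - ab,
$$
so the eigenvalues of the restriction are $\pm\sqrt{ab}$ (a double eigenvalue $0$ when $ab = 0$).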

For the case of an odd $n$, it is sufficient to note that $a_{(n+1)/2,(n+1)/2}$ lies on the diagonal with zeros elsewhere in its row and column, so $e_{(n+1)/2}$ is an eigenvector with eigenvalue $a_{(n+1)/2,(n+1)/2}$.


Another explanation: denote the matrix
$S = \pmatrix{e_1 & e_{n} & e_2 & e_{n-1} & \cdots}$

Since $S$ is a permutation matrix, $S^{-1} = S^T$, and we find that
$$
SAS^{-1} =
\pmatrix{
0&a_{1,n}\\
a_{n,1}&0\\
&&0&a_{2,n-1}\\
&&a_{n-1,2}&0\\
&&&&\ddots
}
$$
This matrix is similar to $A$, and therefore has the same eigenvalues. However, it is also block diagonal, so its eigenvalues can be read off block by block.
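The interleaving can be checked numerically. A minimal sketch (assuming NumPy; placeholder entries) builds the permutation matrix with columns $e_1, e_n, e_2, e_{n-1}, \dots$ and confirms that conjugating by it produces exactly the block diagonal pattern above:

```python
import numpy as np

# Build a 5x5 anti-diagonal A with placeholder entries a_15, ..., a_51.
n = 5
anti = [1.0, 2.0, 3.0, 4.0, 5.0]
A = np.zeros((n, n))
for i, v in enumerate(anti):
    A[i, n - 1 - i] = v

# Interleaved column order e_1, e_n, e_2, e_{n-1}, ... (0-based indices).
order = []
lo, hi = 0, n - 1
while lo <= hi:
    order.append(lo)
    if lo != hi:
        order.append(hi)
    lo, hi = lo + 1, hi - 1

S = np.eye(n)[:, order]  # permutation matrix with those columns
B = S.T @ A @ S          # S^{-1} = S^T, so B is similar to A

# B should be block diagonal: [[0, a_1n], [a_n1, 0]], [[0, a_2,n-1], ...], [a_33].
expected = np.zeros((n, n))
expected[0, 1], expected[1, 0] = A[0, n - 1], A[n - 1, 0]
expected[2, 3], expected[3, 2] = A[1, n - 2], A[n - 2, 1]
expected[4, 4] = A[2, 2]
print(np.allclose(B, expected))  # True
```

Since $(S^T A S)_{kl} = A_{\sigma(k)\sigma(l)}$ for the column permutation $\sigma$, the nonzero anti-diagonal entries land pairwise in adjacent positions, which is exactly the block structure claimed.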

Suppose $A$ has even size, say, $2m \times 2m$. Then by reordering the basis we can produce a block diagonal matrix with the same eigenvalues as the original.

$$
\left(
\begin{array}{cc}
0 & a_{1,2m} \\
a_{2m,1} & 0 \\
\end{array}
\right)
\oplus
\cdots
\oplus
\left(
\begin{array}{cc}
0 & a_{m,m+1} \\
a_{m+1,m} & 0 \\
\end{array}
\right).
$$
The characteristic polynomial is

$$\det(\lambda I - A) = \det\left(\lambda I - \left(
\begin{array}{cc}
0 & a_{1,2m} \\
a_{2m,1} & 0 \\
\end{array}
\right)\right)
\cdots \det\left(\lambda I - \left(
\begin{array}{cc}
0 & a_{m,m+1} \\
a_{m+1,m} & 0 \\
\end{array}
\right)\right)
= (\lambda^2 - a_{1,2m}a_{2m,1})\cdots(\lambda^2 - a_{m,m+1}a_{m+1,m}).
$$
and so the eigenvalues are the roots of these factor polynomials, namely both square roots of each of the products $a_{1,2m}a_{2m,1}, \ldots, a_{m,m+1} a_{m+1,m}$.
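This factorization is easy to spot-check numerically. The sketch below (assuming NumPy; a $4 \times 4$ example with placeholder entries) compares the characteristic polynomial of $A$ with the product of the quadratic factors:

```python
import numpy as np

# 4x4 (m = 2) example with placeholder anti-diagonal entries.
m = 2
n = 2 * m
anti = [1.0, -2.0, 3.0, 0.5]  # a_{1,4}, a_{2,3}, a_{3,2}, a_{4,1}
A = np.zeros((n, n))
for i, v in enumerate(anti):
    A[i, n - 1 - i] = v

# Coefficients of det(lambda I - A), computed from the eigenvalues.
charpoly = np.poly(A)

# Product of the quadratic factors lambda^2 - a_{i,n+1-i} a_{n+1-i,i}.
product = np.array([1.0])
for i in range(m):
    p = A[i, n - 1 - i] * A[n - 1 - i, i]
    product = np.polymul(product, [1.0, 0.0, -p])

print(np.allclose(charpoly, product))  # True
```

`np.poly` accepts a square matrix and returns the coefficients of its characteristic polynomial, so the comparison is a direct coefficient-by-coefficient check.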

If $A$ has odd size, say $(2m + 1) \times (2m + 1)$, then when we change basis we can send the middle ($(m+1)$st) basis element to the end. The characteristic polynomial then takes the same form as in the even case (with indices greater than $m+1$ shifted down by one), but with an additional factor of $\lambda - a_{m+1, m+1}$, so the additional eigenvalue in this case is just the middle entry of the matrix.

One can actually find the eigenvalues of $A$ directly without reordering the indices (though this is not the quickest or preferred way to do it). The eigenvalues of $A$ are just the roots of the characteristic polynomial $\det(xI-A)$. When $n$ is odd, let $m=\frac{n+1}2$ and write
$$
A=\left[\begin{array}{c|c|c}0&0&A_{13}\\ \hline 0&a_{mm}&0\\ \hline A_{31}&0&0\end{array}\right]\tag{$\ast$}
$$
where $A_{13}$ and $A_{31}$ are two anti-diagonal matrices of size $m-1$. By Laplace expansion along the middle row, we get $\det(xI-A)=(x-a_{mm})\det\left[\begin{array}{c|c}xI&-A_{13}\\ \hline -A_{31}&xI\end{array}\right]$. Using the formula $\det\pmatrix{X&Y\\ Z&W}=\det(XW-YZ)$ when $Z$ and $W$ commute, we further obtain
\begin{align*}
\det(xI-A)&=(x-a_{mm})\det(x^2I-A_{13}A_{31})\\
&= (x-a_{mm})(x^2-a_{1n}a_{n1})(x^2-a_{2,n-1}a_{n-1,2})\cdots\left(x^2-a_{m-1,m+1}a_{m+1,m-1}\right)
\end{align*}
and finding its roots is a trivial matter.
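The odd-size factorization can be verified in the same spirit as the even case. A minimal sketch (assuming NumPy; a $5 \times 5$ example with placeholder entries) multiplies out $(x-a_{mm})$ times the quadratic factors and compares with the characteristic polynomial:

```python
import numpy as np

# 5x5 (n = 5, m = 3) example; the centre entry is a_{33}.
n = 5
anti = [2.0, -1.0, 4.0, 3.0, 0.5]  # a_{15}, a_{24}, a_{33}, a_{42}, a_{51}
A = np.zeros((n, n))
for i, v in enumerate(anti):
    A[i, n - 1 - i] = v

charpoly = np.poly(A)  # coefficients of det(xI - A)

# (x - a_mm) times the quadratic factors x^2 - a_{i,n+1-i} a_{n+1-i,i}.
product = np.array([1.0, -A[n // 2, n // 2]])
for i in range(n // 2):
    p = A[i, n - 1 - i] * A[n - 1 - i, i]
    product = np.polymul(product, [1.0, 0.0, -p])

print(np.allclose(charpoly, product))  # True
```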

When $n$ is even, the middle row and column in $(\ast)$ are absent and we may skip the Laplace expansion step. The characteristic polynomial of $A$ is then
$$
(x^2-a_{1n}a_{n1})(x^2-a_{2,n-1}a_{n-1,2})\cdots\left(x^2-a_{\frac n2,\frac n2+1}a_{\frac n2+1,\frac n2}\right).
$$