
If $A \in M_n(\mathbb{R})$ is an anti-diagonal $n \times n$ matrix, is there a quick way to find its eigenvalues, similar to reading off the eigenvalues of a diagonal matrix? The standard method for finding the eigenvalues of any $n \times n$ matrix is usually straightforward, but can be computationally painstaking. I'm just wondering whether there is a quicker way to do this for an anti-diagonal matrix, without having to resort to the standard method of computing the determinant of $A - \lambda I$, where $I$ is the $n \times n$ identity matrix, setting it equal to $0$, and solving for $\lambda$.


For ease of formatting and explanation, I'll be doing everything for the $5 \times 5$ example. However, the same trick works for any $n \times n$ anti-diagonal matrix (though slightly differently for even $n$).

Suppose

$$
A = \begin{pmatrix}0&0&0&0&a_{15}\\0&0&0&a_{24}&0\\0&0&a_{33}&0&0\\0&a_{42}&0&0&0\\a_{51}&0&0&0&0\end{pmatrix}
$$

Here’s a neat trick: we note that

$$
A^2 = \pmatrix{
a_{15}a_{51}&&&&\\
&a_{24}a_{42}&&&\\
&&(a_{33})^2&&\\
&&&a_{24}a_{42}&\\
&&&&a_{15}a_{51}
}
$$

So, the eigenvalues of $A^2$ are precisely $\{a_{15}a_{51}, a_{24}a_{42}, (a_{33})^2\}$.

Now, note that if $\lambda$ is an eigenvalue of $A$, then $\lambda^2$ must be an eigenvalue of $A^2$. This gives you six candidates for the eigenvalues of $A$.

In fact, with more thorough analysis, we can guarantee that the eigenvalues will be precisely $\lambda = \pm \sqrt{a_{i,(n+1-i)}a_{(n+1-i),i}}$ for $i = 1,\dots,\lfloor n/2\rfloor$ and, for odd $n$, $\lambda = a_{(n+1)/2,(n+1)/2}$.
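As a quick numerical sanity check, here is a sketch in Python with NumPy that compares the eigenvalues returned by `numpy.linalg.eigvals` against the formula above for a $5 \times 5$ anti-diagonal matrix; the entries are arbitrary made-up values.

```python
import numpy as np

# Illustrative 5x5 anti-diagonal matrix (entries are arbitrary made-up values)
vals = np.array([3.0, -2.0, 5.0, 7.0, 4.0])   # a_{15}, a_{24}, a_{33}, a_{42}, a_{51}
A = np.fliplr(np.diag(vals))                  # place the entries on the anti-diagonal

n = A.shape[0]
predicted = []
for i in range(n // 2):
    # each anti-diagonal pair contributes +/- sqrt(a_{i,n+1-i} * a_{n+1-i,i})
    p = complex(A[i, n - 1 - i] * A[n - 1 - i, i])
    predicted += [np.sqrt(p), -np.sqrt(p)]
predicted.append(complex(A[n // 2, n // 2]))  # central entry, since n is odd

computed = np.linalg.eigvals(A)
assert np.allclose(np.sort_complex(np.array(predicted)), np.sort_complex(computed))
```

Note that a negative product (here $a_{24}a_{42} = -14$) produces a purely imaginary conjugate pair, which is why the comparison is done over the complex numbers.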

**Proof that this is the case:** Let $e_1,\dots,e_n$ denote the standard basis vectors. Let $S_{ij}$ denote the span of the vectors $e_i$ and $e_j$.

Note that each subspace $S_{i,(n+1-i)}$, for $i = 1,\dots,\lfloor n/2\rfloor$, is invariant under $A$. We may then consider the restriction $A_{i,(n+1-i)}: S_{i,(n+1-i)} \to S_{i,(n+1-i)}$, which can be represented by the matrix

$$
\pmatrix{0 & a_{i,n+1-i}\\a_{n+1-i,i} & 0}
$$

It suffices to find the eigenvalues of this transformation.

For the case of odd $n$, it suffices to note that the central entry $a_{(n+1)/2,(n+1)/2}$ lies on the diagonal with zeros in the rest of its row and column, so $e_{(n+1)/2}$ is an eigenvector with eigenvalue $a_{(n+1)/2,(n+1)/2}$.

Another explanation: denote the matrix

$$S = \pmatrix{e_1 & e_{n} & e_2 & e_{n-1} & \cdots}$$

Since $S$ is a permutation matrix, $S^{-1} = S^T$, and we find that

$$
S^{-1}AS =
\pmatrix{
0&a_{1,n}\\
a_{n,1}&0\\
&&0&a_{2,n-1}\\
&&a_{n-1,2}&0\\
&&&&\ddots
}
$$

This matrix is similar to $A$ and therefore has the same eigenvalues. Moreover, it is block diagonal, so its eigenvalues are just the eigenvalues of its $2 \times 2$ (and, for odd $n$, $1 \times 1$) diagonal blocks.
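In code, this basis reordering is just a simultaneous row/column permutation. The following NumPy sketch (with the same arbitrary example entries as above) checks that conjugating by the permutation matrix yields the block diagonal form and preserves the spectrum:

```python
import numpy as np

# Illustrative 5x5 example; reorder the basis as e1, e5, e2, e4, e3
vals = np.array([3.0, -2.0, 5.0, 7.0, 4.0])
A = np.fliplr(np.diag(vals))

perm = [0, 4, 1, 3, 2]        # 0-based positions of e1, e5, e2, e4, e3
S = np.eye(5)[:, perm]        # permutation matrix with those columns
B = S.T @ A @ S               # = S^{-1} A S, since S^{-1} = S^T for a permutation

# B is block diagonal: 2x2 blocks pairing a_{1,5}/a_{5,1} and a_{2,4}/a_{4,2},
# followed by the 1x1 block a_{3,3}
assert np.allclose(B, A[np.ix_(perm, perm)])
assert np.allclose(np.sort_complex(np.linalg.eigvals(B)),
                   np.sort_complex(np.linalg.eigvals(A)))
```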

Suppose $A$ has even size, say $2m \times 2m$. Then by reordering the basis we can produce a block diagonal matrix with the same eigenvalues as the original:

$$
\left(\begin{array}{cc} 0 & a_{1,2m} \\ a_{2m,1} & 0 \end{array}\right)
\oplus \cdots \oplus
\left(\begin{array}{cc} 0 & a_{m,m+1} \\ a_{m+1,m} & 0 \end{array}\right).
$$

The characteristic polynomial is

$$
\det(\lambda I - A) = \det\left(\lambda I - \left(\begin{array}{cc} 0 & a_{1,2m} \\ a_{2m,1} & 0 \end{array}\right)\right) \cdots \det\left(\lambda I - \left(\begin{array}{cc} 0 & a_{m,m+1} \\ a_{m+1,m} & 0 \end{array}\right)\right) = (\lambda^2 - a_{1,2m}a_{2m,1})\cdots(\lambda^2 - a_{m,m+1}a_{m+1,m}),
$$

and so the eigenvalues are the roots of these quadratic factors, namely the two square roots of each of the products $a_{1,2m}a_{2m,1}, \ldots, a_{m,m+1} a_{m+1,m}$.
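For instance, this factorization can be checked numerically for a $4 \times 4$ case (the entries below are arbitrary illustrative values):

```python
import numpy as np

# Hypothetical 4x4 anti-diagonal matrix: entries a_{14}, a_{23}, a_{32}, a_{41}
vals = np.array([2.0, 3.0, 6.0, 5.0])
A = np.fliplr(np.diag(vals))

P = np.polynomial.Polynomial
# characteristic polynomial (x^2 - a_{14}a_{41}) (x^2 - a_{23}a_{32}),
# with coefficients listed from lowest to highest degree
charpoly = P([-vals[0] * vals[3], 0, 1]) * P([-vals[1] * vals[2], 0, 1])

predicted = np.sort_complex(charpoly.roots())
computed = np.sort_complex(np.linalg.eigvals(A))
assert np.allclose(predicted, computed)
```

Here both products ($2 \cdot 5 = 10$ and $3 \cdot 6 = 18$) are positive, so all four eigenvalues are real.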

If $A$ has odd size, say $(2m + 1) \times (2m + 1)$, then when we change basis we can send the middle ($(m+1)$st) basis element to the end, in which case the characteristic polynomial takes the same form as in the even case (with indices $> m$ shifted up by one), but with an additional factor of $\lambda - a_{m+1, m+1}$. So the additional eigenvalue in this case is just the middle entry of the matrix.

One can actually find the eigenvalues of $A$ **directly**, without reordering the indices (note: I'm not saying this is the quickest or preferred way to find those eigenvalues; it is not). The eigenvalues of $A$ are just the roots of the characteristic polynomial $\det(xI-A)$. When $n$ is odd, let $m=\frac{n+1}2$ and write

$$
A=\left[\begin{array}{c|c|c}0&0&A_{13}\\ \hline 0&a_{mm}&0\\ \hline A_{31}&0&0\end{array}\right]\tag{$\ast$}
$$

where $A_{13}$ and $A_{31}$ are two anti-diagonal matrices of size $(m-1)\times(m-1)$. By Laplace expansion along the middle row, we get $\det(xI-A)=(x-a_{mm})\det\left[\begin{array}{c|c}xI&-A_{13}\\ \hline -A_{31}&xI\end{array}\right]$. Using the formula $\det\pmatrix{X&Y\\ Z&W}=\det(XW-YZ)$, which holds when $Z$ and $W$ commute, we further obtain

\begin{align*}
\det(xI-A)&=(x-a_{mm})\det(x^2I-A_{13}A_{31})\\
&= (x-a_{mm})(x^2-a_{1n}a_{n1})(x^2-a_{2,n-1}a_{n-1,2})\cdots\left(x^2-a_{m-1,m+1}a_{m+1,m-1}\right)
\end{align*}

and finding its roots is a trivial matter.

When $n$ is even, the central element in $(\ast)$ vanishes and we may skip the Laplace expansion step. The characteristic polynomial of $A$ is then

$$
(x^2-a_{1n}a_{n1})(x^2-a_{2,n-1}a_{n-1,2})\cdots\left(x^2-a_{\frac n2,\frac n2+1}a_{\frac n2+1,\frac n2}\right).
$$
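As a sketch, the direct factorization can be checked against the coefficients of $\det(xI - A)$ computed by `numpy.poly`; the $5 \times 5$ entries below are arbitrary illustrative values.

```python
import numpy as np

# Illustrative odd-size (5x5) check of the direct factorization
vals = np.array([3.0, -2.0, 5.0, 7.0, 4.0])   # a_{15}, a_{24}, a_{33}, a_{42}, a_{51}
A = np.fliplr(np.diag(vals))

P = np.polynomial.Polynomial
# (x - a_{33}) (x^2 - a_{15}a_{51}) (x^2 - a_{24}a_{42}),
# coefficients listed from lowest to highest degree
factored = (P([-vals[2], 1])
            * P([-vals[0] * vals[4], 0, 1])
            * P([-vals[1] * vals[3], 0, 1]))

# np.poly returns det(xI - A) coefficients from highest to lowest degree,
# so reverse to lowest-first before comparing with Polynomial.coef
charpoly = np.poly(A)[::-1]
assert np.allclose(factored.coef, charpoly)
```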
