How to prove $I-BA$ is invertible


Question 1: Let $A$ and $B$ be square matrices of the same order. Prove that $I-AB$ is invertible if and only if $I-BA$ is invertible.

Proof: Let $C$ be the inverse of $I-AB$, so that $(I-AB)C=C(I-AB)=I$. Then
$$I-BA=I-BIA=I-B(I-AB)CA=I-(I-BA)BCA,$$
which gives us
$$(I-BA)(I+BCA)=I.$$
The same computation using $C(I-AB)=I$ instead gives $I-BA=I-BCA(I-BA)$, i.e. $(I+BCA)(I-BA)=I$. Thus $I-BA$ is invertible with inverse $I+BCA$.
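This is not part of the original proof, but a quick NumPy sketch (random square matrices, arbitrary size and seed) can sanity-check the formula $(I-BA)^{-1}=I+BCA$ with $C=(I-AB)^{-1}$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))

I = np.eye(n)
C = np.linalg.inv(I - A @ B)   # C = (I - AB)^{-1}; generically exists for random A, B
candidate = I + B @ C @ A      # claimed inverse of I - BA

# Both products should equal the identity up to floating-point error.
print(np.allclose((I - B @ A) @ candidate, I))
print(np.allclose(candidate @ (I - B @ A), I))
```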

Question 2: Let $A$ be an $m\times n$ matrix and $B$ an $n\times m$ matrix with $m\le n$. Prove that $AB$ and $BA$ have the same nonzero eigenvalues, counting multiplicities, and that $BA$ has an additional $n-m$ eigenvalues equal to $0$.

Proof (following C. R. Johnson and E. Schreiner, American Mathematical Monthly 103 (1996), 578–582):

First notice that the $(m+n)\times (m+n)$ partitioned matrices
$$\left[
\begin{array}{cc}
AB & 0\\
B & 0
\end{array}
\right]
\qquad \text{and} \qquad
\left[
\begin{array}{cc}
0 & 0\\
B & BA
\end{array}
\right]$$
are similar to each other via the partitioned block calculation:
$$\left[
\begin{array}{cc}
AB & 0\\
B & 0
\end{array}
\right]
\left[
\begin{array}{cc}
I_m & A\\
0 & I_n
\end{array}
\right]=
\left[
\begin{array}{cc}
AB & ABA\\
B & BA
\end{array}
\right]=
\left[
\begin{array}{cc}
I_m & A\\
0 & I_n
\end{array}
\right]
\left[
\begin{array}{cc}
0 & 0\\
B & BA
\end{array}
\right].
$$
Since
$$\left[
\begin{array}{cc}
I_m & A\\
0 & I_n
\end{array}
\right]
$$
is invertible, it provides the similarity. Because
$$\left[
\begin{array}{cc}
AB & 0\\
B & 0
\end{array}
\right]
$$
is block triangular, its eigenvalues are those of the two diagonal blocks, $AB$ and the $n\times n$ zero matrix (a standard fact about eigenvalues of block triangular matrices). Similarly, the eigenvalues of
$$
\left[
\begin{array}{cc}
0 & 0\\
B & BA
\end{array}
\right]
$$
are the eigenvalues of $BA$, together with $m$ zeroes. Because the two partitioned matrices are similar, and similar matrices have the same eigenvalues, $AB$ and $BA$ must have the same nonzero eigenvalues (counting multiplicities), and the additional $n-m$ eigenvalues of $BA$ must all be $0$.
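The following NumPy sketch is not from the published proof; the sizes $m=3$, $n=5$ and the seed are arbitrary choices. It illustrates the block similarity and the resulting relation $p_{BA}(t)=t^{n-m}p_{AB}(t)$ between the characteristic polynomials:

```python
import numpy as np

rng = np.random.default_rng(1)
m, n = 3, 5
A = rng.standard_normal((m, n))
B = rng.standard_normal((n, m))

# The two (m+n) x (m+n) partitioned matrices from the proof.
M1 = np.block([[A @ B, np.zeros((m, n))], [B, np.zeros((n, n))]])
M2 = np.block([[np.zeros((m, m)), np.zeros((m, n))], [B, B @ A]])
X = np.block([[np.eye(m), A], [np.zeros((n, m)), np.eye(n)]])

# M1 X = X M2 with X invertible, so M1 and M2 are similar.
print(np.allclose(M1 @ X, X @ M2))

# p_{BA}(t) = t^(n-m) * p_{AB}(t): the coefficient vector of p_{AB}
# padded with n-m trailing zeros (highest-degree coefficient first).
p_AB = np.poly(A @ B)   # length m+1
p_BA = np.poly(B @ A)   # length n+1
print(np.allclose(p_BA, np.concatenate([p_AB, np.zeros(n - m)])))
```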


Alternate Proof (the proof suggested in Exercise 9, page 55, of Matrix Analysis by Roger A. Horn and Charles R. Johnson):

(a) First, suppose that $A,B\in M_n$ and that at least one of them is invertible. Show that $AB$ is similar to $BA$, and hence that the characteristic polynomials of $AB$ and $BA$ are the same. Hint: If $A$ is invertible, $BA=A^{-1}(AB)A$.

(b) Show that if $A,B\in M_n$ are both singular, $AB$ and $BA$ have the same eigenvalues, counting multiplicities. Hint: Consider the following analytic argument. For all sufficiently small $\varepsilon>0$, $A_{\varepsilon}:=A+\varepsilon I$ is invertible; thus $A_{\varepsilon}B$ and $BA_{\varepsilon}$ are similar and hence the characteristic polynomials of $A_{\varepsilon}B$ and $BA_{\varepsilon}$ are the same. If we now let $\varepsilon \to 0$, similarity may fail in the limit, but equality of the characteristic polynomials continues to hold since $p_{A_{\varepsilon}B}(t)=\det{(tI-A_{\varepsilon}B)}$ depends continuously on $\varepsilon$. Thus, $AB$ and $BA$ have the same characteristic polynomials and therefore the same eigenvalues, counting multiplicities.

(So far, parts (a) and (b) are the same as the answer of @A.G.)

(c) Finally, if $A\in M_{m, n}$ and $B\in M_{n, m}$ with $m<n$, show that $AB$ and $BA$ have the same eigenvalues, counting multiplicities, except that $BA$ has an additional $n-m$ eigenvalues equal to $0$; equivalently, $p_{BA}(t)=t^{n-m}p_{AB}(t)$. Hint: Make $n$ by $n$ matrices out of both $A$ (by appending $0$ rows) and $B$ (by appending $0$ columns), apply the last result, and compare the two new products (appropriately partitioned) to the old ones; a code sketch of this construction follows below.
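A small sketch of the padding construction in part (c), again with arbitrary sizes, seed, and random matrices: $\tilde B\tilde A$ equals $BA$ exactly, while $\tilde A\tilde B$ is $AB$ padded with zero blocks, so parts (a)/(b) applied to the square matrices $\tilde A,\tilde B$ give $p_{BA}(t)=t^{n-m}p_{AB}(t)$.

```python
import numpy as np

rng = np.random.default_rng(2)
m, n = 3, 5
A = rng.standard_normal((m, n))
B = rng.standard_normal((n, m))

# Pad A with n-m zero rows and B with n-m zero columns to get n x n matrices.
A_tilde = np.vstack([A, np.zeros((n - m, n))])
B_tilde = np.hstack([B, np.zeros((n, n - m))])

# B_tilde A_tilde is exactly BA, while A_tilde B_tilde is AB padded with zero blocks.
print(np.allclose(B_tilde @ A_tilde, B @ A))
print(np.allclose(A_tilde @ B_tilde,
                  np.block([[A @ B, np.zeros((m, n - m))],
                            [np.zeros((n - m, m)), np.zeros((n - m, n - m))]])))

# By parts (a)/(b), the two square products have equal characteristic polynomials.
print(np.allclose(np.poly(B_tilde @ A_tilde), np.poly(A_tilde @ B_tilde)))
```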

Suppose $I - AB$ is invertible.

Suppose $(I - BA)x = 0$.

Then :

$$BAx = x$$

so

$$ABAx = Ax$$

or, what is the same,

$$(I - AB)Ax = 0$$

Since $I - AB$ is invertible, this last equality implies

$$Ax = 0$$

Hence $x = BAx = B(Ax) = 0$. Thus the only solution of $(I - BA)x = 0$ is $x = 0$, so $I - BA$ is injective and hence invertible.

Let’s show that $AB$ and $BA$ have the same eigenvalues.

First, let $\lambda$ be a nonzero eigenvalue of $AB$; then $ABv=\lambda v$ for some $v\ne0$. Therefore $BA(Bv)=B(ABv)=B(\lambda v)=\lambda(Bv)$, and so $\lambda$ is an eigenvalue of $BA$ (note $Bv\ne0$, since otherwise $\lambda v=ABv=0$ would force $\lambda=0$).

If $0$ is an eigenvalue of $AB$, at least one of $A$ and $B$ is not invertible. Thus also $BA$ is not invertible and has the eigenvalue $0$.


Now, the eigenvalues of a matrix $M$ are the (complex) numbers $\lambda$ that make $M-\lambda I$ not invertible. Saying that $I-AB$ is invertible is the same as saying that $1$ is not an eigenvalue of $AB$; thus $1$ is not an eigenvalue of $BA$ either, and $I-BA$ is invertible as well.
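As an illustration (not part of the answer), the following NumPy sketch checks the eigenvector transfer $v\mapsto Bv$ used above on a random pair of square matrices; the size and seed are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 4
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))

# Take an eigenpair (lam, v) of AB with lam != 0 and check that Bv is an
# eigenvector of BA for the same eigenvalue.
eigvals, eigvecs = np.linalg.eig(A @ B)
k = int(np.argmax(np.abs(eigvals)))   # eigenvalue of largest modulus, nonzero generically
lam, v = eigvals[k], eigvecs[:, k]

print(np.allclose((B @ A) @ (B @ v), lam * (B @ v)))
```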

Hint: prove that
$$
\det(\lambda I-AB)=\det(\lambda I-BA).
$$

P.S. The matrices have to be square here; otherwise the statement about the eigenvalues is not correct as stated.

Edit:

There is a nice proof of this fact:

  1. If $A$ is invertible then
    $$
    AB=ABAA^{-1}=A(BA)A^{-1},
    $$
    and, hence, $AB$ and $BA$ are similar. Similar matrices have the same characteristic polynomial.
  2. If $A$ is singular then it can be perturbed to a nonsingular $A_\epsilon$ such that $A_\epsilon\to A$ as $\epsilon\to 0$ (for example, $A_\epsilon=A+\epsilon I$). By the first item
    $$
    \det(\lambda I-A_\epsilon B)=\det(\lambda I-BA_\epsilon).
    $$
    Now take the limit as $\epsilon\to 0$ and use the continuous dependence of the determinant on the matrix entries.
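A NumPy sketch of this perturbation argument (not part of the answer; the singular $A$, the size, and the values of $\epsilon$ are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 4
# Build a singular A (rank n-1) and a random B.
A = rng.standard_normal((n, n - 1)) @ rng.standard_normal((n - 1, n))
B = rng.standard_normal((n, n))
I = np.eye(n)

for eps in [1e-1, 1e-4, 1e-8]:
    A_eps = A + eps * I   # generically invertible for these eps
    # By item 1, the characteristic polynomials of A_eps B and B A_eps agree.
    print(eps, np.allclose(np.poly(A_eps @ B), np.poly(B @ A_eps)))

# In the limit eps -> 0, the characteristic polynomials of AB and BA are still equal.
print(np.allclose(np.poly(A @ B), np.poly(B @ A)))
```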