If $(A-\lambda{I})$ is $\lambda$-equivalent to $(B-\lambda{I})$ then $A$ is similar to $B$

While reading about the primary and rational canonical forms of matrices, I got stuck on this theorem:

The matrices $A,B\in K^{n\times n}$ are similar if and only if their characteristic $\lambda$-matrices $(A-\lambda{I})$ and $(B-\lambda{I})$ are $\lambda$-equivalent; more precisely if $P(\lambda), Q(\lambda) \in K[\lambda]^{n\times n}$ are invertible such that $(B-\lambda{I})=Q(\lambda)(A-\lambda{I})P(\lambda)$ then $P(\lambda)=P$ and $Q(\lambda)=Q$ are constant matrices satisfying $Q=P^{-1}$ and $B=Q A P$.

Theorem 21.5.1 on page 448 (in Slovak).

  • The main point in the proof of this theorem seems to be the assertion: If $P(\lambda), Q(\lambda)$ are invertible and $(B-\lambda{I})=Q(\lambda)(A-\lambda{I})P(\lambda)$ then $P(\lambda)=P$ and $Q(\lambda)=Q$ are constant matrices. What is the proof of this assertion?
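As a sanity check of the easy direction of the theorem: if $B=P^{-1}AP$ for a constant invertible $P$, then $(B-\lambda{I})=P^{-1}(A-\lambda{I})P$, so the characteristic matrices are $\lambda$-equivalent with constant $Q=P^{-1}$. A minimal sympy sketch; the matrices $A$ and $P$ are arbitrary illustrative choices:

```python
import sympy as sp

x = sp.symbols('x')  # the indeterminate lambda

A = sp.Matrix([[2, 1], [0, 3]])
P = sp.Matrix([[1, 1], [1, 2]])   # det = 1, so P is invertible
B = P.inv() * A * P               # B is similar to A by construction

# (B - x*I) equals P^{-1} (A - x*I) P, i.e. lambda-equivalence with constant P, Q:
lhs = B - x * sp.eye(2)
rhs = sp.expand(P.inv() * (A - x * sp.eye(2)) * P)
assert sp.expand(lhs - rhs) == sp.zeros(2, 2)
```

The substance of the theorem is the converse: constant $P$, $Q$ can always be extracted from polynomial ones.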

Answers

The only proof I know is non-trivial and involves division of matrices. For a reference see: Felix Gantmacher – The theory of matrices, Volume 1, Chapter VI, Theorem 6, p. 145.

If you are familiar with division of matrices, the proof is very readable, but a bit too long and technical to copy down here.

ps. I am not yet allowed to post this as a comment, but I believe it is too important not to mention.

Inspired by @Vincent’s hint and Gantmacher’s book:

  • A matrix polynomial $A(\lambda)$ of degree $n$ is a polynomial of the form
    $$A(\lambda)=A_n{\lambda}^n+A_{n-1}{\lambda}^{n-1}+\ldots+A_0,$$
    where the $A_i$ are constant matrices.
    A matrix polynomial is said to be regular if $A_n$ is regular (invertible).
    (Each matrix $A(\lambda)\in K[\lambda]^{n\times n}$ is a matrix polynomial.)

  • Assertion: The degree of a product of two matrix polynomials is at most the sum of the degrees of the factors. If at least one of the factors is regular, the degree of the product equals the sum of the degrees of the factors.
    (Easy to verify.)
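Both halves of the Assertion can be checked quickly in sympy; the matrix $N$ below is an illustrative nilpotent (hence singular) leading coefficient:

```python
import sympy as sp

x = sp.symbols('x')
N = sp.Matrix([[0, 1], [0, 0]])   # nilpotent: N*N == 0, so N is not regular
I2 = sp.eye(2)

F = N * x + I2                    # degree 1, singular leading coefficient
G = N * x + I2                    # degree 1, singular leading coefficient
# The x**2 term N*N*x**2 vanishes, so deg(F*G) = 1 < 1 + 1:
assert sp.expand(F * G) == 2 * N * x + I2

H = I2 * x + N                    # degree 1, regular leading coefficient I2
# With at least one regular factor the degrees add: deg(H*G) = 2:
assert sp.expand(H * G).has(x**2)
```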

  • Division of matrix polynomials, from the left or from the right respectively.
    $$ \begin{array}{rcccl}
    A(\lambda) & : & B(\lambda) & = & C(\lambda) \\
    R(\lambda) & & & & \\
    \text{dividend} & : & \text{divisor} & = & \text{quotient} \\
    \text{remainder} & & & &
    \end{array} $$
    $$ \begin{array}{rcccl}
    (A_n{\lambda}^n+\ldots+A_0) & : & (B_m{\lambda}^m+\ldots+B_0) & = & C_{n-m}{\lambda}^{n-m}+\ldots+C_0 \\
    R_{m-1}{\lambda}^{m-1}+\ldots+R_0 & & & &
    \end{array} $$
    It is the same process as division of scalar polynomials; only a quotient coefficient such as $C_{n-m}=A_n:B_m$ means $B_m^{-1}A_n$ or $A_nB_m^{-1}$ respectively, and the product $C_{n-m}\cdot B_m$ (when computing the remainder) means $B_mC_{n-m}$ or $C_{n-m}B_m$ respectively.
    Conclusion: Whenever $B(\lambda)$ is a regular matrix polynomial we have
    $$A(\lambda)=B(\lambda)C(\lambda)+R(\lambda) \quad\text{(division from the left)}$$
    $$A(\lambda)=C(\lambda)B(\lambda)+R(\lambda) \quad\text{(division from the right)}$$
    where the degree of $R(\lambda)$ is less than the degree of $B(\lambda)$.
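The division process described above can be implemented in a few lines. This is a sketch, representing a matrix polynomial as the list of its constant coefficient matrices (index = power), with hand-picked example coefficients:

```python
import sympy as sp

x = sp.symbols('x')

def left_divide(A, B):
    """Left division A(x) = B(x)*C(x) + R(x) with deg R < deg B.

    A, B are lists of constant sympy Matrices; A[k] is the coefficient of x**k.
    Assumes deg A >= deg B and the leading coefficient B[-1] regular (invertible).
    Returns the coefficient lists (C, R).
    """
    A = [M.copy() for M in A]
    m = len(B) - 1
    Bm_inv = B[m].inv()
    C = [sp.zeros(*A[0].shape) for _ in range(len(A) - m)]
    for k in range(len(A) - 1, m - 1, -1):
        T = Bm_inv * A[k]            # choose T so that B_m * T = A_k
        C[k - m] = T
        for j in range(m + 1):       # subtract B(x) * T * x**(k - m)
            A[j + k - m] -= B[j] * T
    return C, A[:m]                  # what is left over has degree < m

def to_expr(coeffs):
    """Assemble sum coeffs[k] * x**k as an explicit sympy Matrix."""
    return sum((c * x**k for k, c in enumerate(coeffs)),
               sp.zeros(*coeffs[0].shape))

# Example: divide a degree-2 matrix polynomial by x*I + B0 (leading coeff I is regular).
A = [sp.Matrix([[1, 2], [3, 4]]), sp.Matrix([[0, 1], [1, 0]]), sp.Matrix([[2, 0], [0, 3]])]
B = [sp.Matrix([[1, 1], [0, 1]]), sp.eye(2)]
C, R = left_divide(A, B)
# The defining identity A(x) = B(x)*C(x) + R(x) holds:
assert sp.expand(to_expr(B) * to_expr(C) + to_expr(R) - to_expr(A)) == sp.zeros(2, 2)
```

Division from the right is analogous, with the quotient placed on the left of the divisor ($A=C\,B+R$).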

Let $Q(\lambda)$ and $P(\lambda)$ be invertible matrices. This gives the existence of inverse matrices $M(\lambda)$ and $N(\lambda)$, i.e. $M(\lambda)Q(\lambda)=Q(\lambda)M(\lambda)=I$, and likewise for $P(\lambda)$ and $N(\lambda)$.

The equation $(B-\lambda{I})=Q(\lambda)(A-\lambda{I})P(\lambda)$ can then be rewritten as
$$M(\lambda)(B-\lambda{I})=(A-\lambda{I})P(\lambda). \;\;\;\;\;(1)$$
Let us divide $M(\lambda)$ from the left by $(A-\lambda{I})$:
$$M(\lambda)=(A-\lambda{I})R(\lambda)+M. \;\;\;\;\;(2)$$
Let us divide $P(\lambda)$ from the right by $(B-\lambda{I})$:
$$P(\lambda)=S(\lambda)(B-\lambda{I})+P. \;\;\;\;\;(3)$$
The matrices $P,M$ are constant, since the remainders have degree less than $\deg(A-\lambda{I})=\deg(B-\lambda{I})=1$.
After plugging (2) and (3) into (1) and a quick modification we obtain:
$$(A-\lambda{I})[R(\lambda)-S(\lambda)](B-\lambda{I})=(A-\lambda{I})P-M(B-\lambda{I}) \;\;\;\;\;(4)$$
The matrices $(A-\lambda{I}),(B-\lambda{I})$ are regular matrix polynomials of degree 1. By the Assertion, the degree of the left side of (4) is at least 2, provided $[R(\lambda)-S(\lambda)]$ is not the zero matrix. But the degree of the right side is at most 1.
It means $[R(\lambda)-S(\lambda)]=\mathbb{O}$ and consequently:
$$(A-\lambda{I})P=M(B-\lambda{I}).$$
Comparing the coefficients of $\lambda$ on both sides gives $P=M$, and comparing the constant terms gives $AP=MB$. To finish the proof, it's enough to show the regularity of $M$: then $B=M^{-1}AP=P^{-1}AP$.

Let us divide $Q(\lambda)$ by $(B-\lambda{I})$ from the left:
$$Q(\lambda)=(B-\lambda{I})U(\lambda)+Q. \;\;\;\;\;(5)$$
Let us plug $Q(\lambda)$ from (5) into $M(\lambda)Q(\lambda)=I$.
We obtain:
$$M(\lambda)(B-\lambda{I})U(\lambda)+M(\lambda)Q=I.$$
Now substitute $M(\lambda)(B-\lambda{I})$ in the first summand by $(A-\lambda{I})P(\lambda)$ according to (1), and substitute $M(\lambda)$ in the second summand by $(A-\lambda{I})R(\lambda)+M$ according to (2).
We obtain:
$$(A-\lambda{I})[P(\lambda)U(\lambda)+R(\lambda)Q]+MQ=I. \;\;\;\;\;(6)$$
The expression in brackets in (6) equals $\mathbb{O}$: otherwise, by the Assertion and the regularity of $(A-\lambda{I})$, the degree of the first summand in (6) would be greater than $0$, while the degree of the second summand and of the right side is $0$, a contradiction.
It gives:
$$MQ=I,$$
so $M$ is regular, with $M^{-1}=Q$. The regularity of $M$ is proved.

I’ll write $X$ for the indeterminate $\lambda$, which I think is less confusing.

Saying that $XI-A$ is equivalent over $K[X]$ to $XI-B$ means that these matrices have the same Smith normal form (over $K[X]$). That means that $A$ and $B$ have the same Rational Canonical Form (RCF; this is over $K$, but the RCF actually does not depend on the base field), because the blocks in the RCF of $A$ are the companion matrices of the non-constant diagonal entries of the Smith normal form of $XI-A$. But every square matrix over $K$ is similar over $K$ to its RCF, so $A,B$ having the same RCF means they are similar.
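To see concretely how the Smith normal form of $XI-A$ carries the RCF data, one can compute the determinantal divisors $d_k$ (the gcd of all $k\times k$ minors); the invariant factors are the quotients $d_k/d_{k-1}$, and for a companion matrix they come out as $1,\ldots,1,p$. A sympy sketch, where the cubic $p$ is an arbitrary example:

```python
import itertools
import functools
import sympy as sp

x = sp.symbols('x')
# companion matrix of p(x) = x**3 - 2*x**2 + 3*x - 5
C = sp.Matrix([[0, 0, 5],
               [1, 0, -3],
               [0, 1, 2]])
M = x * sp.eye(3) - C

def det_divisor(M, k):
    """gcd of all k x k minors of M."""
    n = M.rows
    minors = [M[list(r), list(c)].det()
              for r in itertools.combinations(range(n), k)
              for c in itertools.combinations(range(n), k)]
    return functools.reduce(sp.gcd, minors)

d1, d2, d3 = (det_divisor(M, k) for k in (1, 2, 3))
# invariant factors d1, d2/d1, d3/d2 are 1, 1, p(x):
assert d1 == 1 and d2 == 1
assert sp.expand(d3 - (x**3 - 2*x**2 + 3*x - 5)) == 0
```

So $XI-C$ has Smith normal form $\mathrm{diag}(1,1,p)$, and the single RCF block of $C$ is the companion matrix of $p$, i.e. $C$ itself.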