# How to tell if some power of my integer matrix is the identity?

Given an $n\times n$-matrix $A$ with integer entries, I would like to decide whether there is some $m\in\mathbb N$ such that $A^m$ is the identity matrix.

I can solve this by regarding $A$ as a complex matrix and computing its Jordan normal form; equivalently, I can compute the eigenvalues and check whether they are roots of unity and whether their geometric and algebraic multiplicities coincide.

Are there other ways to solve this problem, perhaps exploiting the fact that $A$ has integer entries? Edit: I am interested in conditions which are easy to verify for families of matrices in a proof.

Edit: Thanks to everyone for this wealth of answers. It will take me some time to read all of them carefully.

#### Answers

The following conditions on an $n$ by $n$ integer matrix $A$ are equivalent:

(1) $A$ is invertible and of finite order.

(2) The minimal polynomial of $A$ is a product of distinct cyclotomic polynomials.

(3) The elementary divisors of $A$ are cyclotomic polynomials.
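Condition (2) can be tested exactly with a computer algebra system: factor the characteristic polynomial, check that every irreducible factor is cyclotomic, and check that the product of the distinct factors already annihilates $A$ (so $A$ is diagonalizable over $\mathbb{C}$). A sketch using SymPy; the cutoff $k \le 2\deg(p)^2+2$ in the cyclotomic test comes from the bound $\phi(k)\ge\sqrt{k/2}$:

```python
import sympy as sp

x = sp.symbols('x')

def is_cyclotomic(p):
    """Is the monic integer polynomial p the k-th cyclotomic polynomial
    for some k?  Since deg Phi_k = phi(k) >= sqrt(k/2), only finitely
    many k need to be tried."""
    d = int(sp.degree(p, x))
    for k in range(1, 2 * d * d + 3):
        if sp.totient(k) == d and sp.expand(p - sp.cyclotomic_poly(k, x)) == 0:
            return True
    return False

def has_finite_order(A):
    """Condition (2): the minimal polynomial of A is a product of
    distinct cyclotomic polynomials."""
    p = A.charpoly(x).as_expr()
    factors = sp.factor_list(p, x)[1]   # [(irreducible factor, multiplicity), ...]
    if not all(is_cyclotomic(f) for f, _ in factors):
        return False
    # Distinctness: the squarefree part of the characteristic polynomial
    # must already annihilate A.  Evaluate it at A by Horner's rule.
    radical = sp.Poly(sp.prod(f for f, _ in factors), x)
    n = A.shape[0]
    M = sp.zeros(n)
    for c in radical.all_coeffs():
        M = M * A + c * sp.eye(n)
    return M == sp.zeros(n)
```

For example, `has_finite_order(sp.Matrix([[0, -1], [1, 0]]))` detects the order-4 rotation, while the unipotent matrix $\begin{pmatrix}1&1\\0&1\end{pmatrix}$ fails the distinctness (diagonalizability) check.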

Answer amended in view of Rasmus’s comment:

I’m not sure how useful it is, but here’s a remark. If $A$ has finite order, clearly
$\{\|A^{m}\|: m \in \mathbb{N} \}$ is bounded (any matrix norm you care to choose will do).

On the other hand, if the semisimple part (in its Jordan decomposition as a complex matrix) of $A$ does not have finite order, at least one of its eigenvalues has absolute value greater than $1$, so $\{ \|A^{m}\| :m \in \mathbb{N} \}$ is unbounded.

(I am using the facts that all eigenvalues of $A$ are algebraic integers and that the set of eigenvalues is closed under algebraic conjugation; it is a very old theorem of Kronecker, and fun to prove, that if all algebraic conjugates of an algebraic integer $\alpha$ have absolute value $1$, then $\alpha$ is a root of unity.)

On the other hand, if the semisimple part of $A$ has finite order but $A$ itself does not, then (a conjugate of) some power of $A$, say $A^h$, has, as a complex matrix, a Jordan block of size greater than $1$ associated to the eigenvalue $1$. The entries of the powers of $A^h$ then become arbitrarily large, and $\{ \| A^{m} \| : m \in \mathbb{N} \}$ is again unbounded.
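This boundedness criterion suggests a cheap numerical screen. The sketch below repeatedly squares the matrix in exact integer arithmetic and watches the largest entry; the step count and growth cap are arbitrary thresholds of my choosing, and staying bounded for a few steps is only evidence of finite order, not proof (unipotent parts blow up slowly):

```python
def looks_finite_order(A, steps=10, growth_cap=10**9):
    """Heuristic screen based on boundedness of {||A^m||}.  Squares A
    repeatedly (exact Python integers), so after k steps we hold A^(2^k).
    Exceeding the cap disproves finite order; staying small proves nothing."""
    n = len(A)

    def mat_mult(X, Y):
        return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
                for i in range(n)]

    M = A
    for _ in range(steps):
        M = mat_mult(M, M)
        if max(abs(x) for row in M for x in row) > growth_cap:
            return False    # entries blow up: A cannot have finite order
    return True             # bounded so far: possibly finite order
```

For instance the order-4 rotation $\begin{pmatrix}0&-1\\1&0\end{pmatrix}$ stays bounded, while $\mathrm{diag}(2,1)$ exceeds the cap within a few squarings.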

If $A$ is an $n\times n$ integer matrix and $A^m=1$, then $\phi(m)\le n$, where $\phi$ is Euler’s phi-function (because $\phi(m)$ is the degree of the $m$th cyclotomic polynomial, the minimal polynomial of a primitive $m$th root of unity, while $n$ is the degree of the characteristic polynomial of $A$). Given $n$, there are only finitely many $m$ with $\phi(m)\le n$, and a little elementary number theory lets you find the biggest such $m$. So all you have to do is calculate $A^j$ for all $j$ up to that biggest $m$; if you don’t get the identity matrix by then, you never will.

This may take less calculation than finding the eigenvalues, Jordan form, etc.

EDIT: Jyrki notes in the comments that it’s not so easy. It still may be salvageable.

FURTHER EDIT: A very nice paper which considers, among other things, the question of the maximal order of an $n\times n$ matrix with integer entries is James Kuzmanovich and Andrey Pavlichenkov, Finite groups of matrices whose entries are integers, The American Mathematical Monthly Vol. 109, No. 2, Feb., 2002, pages 173-186. With regard to Geoff Robinson’s comment, “It is larger than Landau’s function, but not by much,” the authors find that the ratio between the two functions is less than 51 for $n\lt100,000$ (the maximum in that range being 50.978, first achieved at $n=22434$), and they admit to not knowing whether the ratio is unbounded.
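For small $n$ this maximal order can be computed directly: the minimal polynomial of a finite-order $A$ is a product of distinct cyclotomics $\Phi_{m_i}$ with $\sum_i \phi(m_i)\le n$, the order is $\operatorname{lcm}(m_i)$, and block-diagonal companion matrices show every such lcm occurs. A brute-force sketch using SymPy's `totient` (the search is exponential in the number of candidate $m$, so this is only for small $n$):

```python
from math import lcm
from sympy import totient, eye, Matrix

def max_order(n):
    """Maximum order of a finite-order n x n integer matrix: maximize
    lcm(m_1, ..., m_k) over distinct m_i with sum of phi(m_i) <= n.
    Candidates with phi(m) <= n satisfy m <= 2n^2 since phi(m) >= sqrt(m/2)."""
    cands = [m for m in range(1, 2 * n * n + 3) if totient(m) <= n]
    best = 1

    def search(i, budget, cur):
        nonlocal best
        best = max(best, cur)
        for j in range(i, len(cands)):
            cost = totient(cands[j])
            if cost <= budget:
                search(j + 1, budget - cost, lcm(cur, cands[j]))

    search(0, n, 1)
    return best

def finite_order(A):
    """Return the order of A if some power of A is the identity, else None."""
    n = A.shape[0]
    B, I = A, eye(n)
    for m in range(1, max_order(n) + 1):
        if B == I:
            return m
        B = B * A
    return None
```

For example `max_order(2)` is $6$ (the companion matrix of $\Phi_6$), and `finite_order` then needs only finitely many matrix multiplications.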

There are the so-called Pascal matrices. These special matrices are convenient to work with for this problem, and they form a direct subclass of your solution set.

But notice that if such a matrix exists, then there is a property related to @Pierre-Yves’ answer, namely $A^{m-1}A=A^{m-2}A^2=\ldots = I$, hence the inverse of $A^p$ is $A^{m-p}$. So if such an $A$ exists, then $A^{-1}$ must, first of all, have integer entries, and moreover it must be some power of $A$; i.e., checking whether $A^{-1}=A^{m-1}$ should be sufficient. This can be done with a relatively fast computation on any machine.
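The first requirement here, that $A^{-1}$ have integer entries, can be screened cheaply on its own: an integer matrix has an integer inverse exactly when $\det A = \pm 1$. A sketch using SymPy for exact arithmetic:

```python
from sympy import Matrix

def integrally_invertible(A):
    """Necessary condition: if A^m = I then A^{-1} = A^{m-1} has integer
    entries, which for an integer matrix forces det(A) = +1 or -1."""
    return Matrix(A).det() in (1, -1)
```

Any matrix failing this test can be discarded before computing a single power.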

EDIT 1: As Geoff and Yuval commented below, the matrix inverse and its relatively low order powers already encode a lot of information that can be checked with ease.

EDIT 2: Bah, of course the obvious numerical check is whether $A = A^{m+1}$, which involves only matrix multiplication and a few lines of code in any environment 🙂

There is a classification of such matrices in the general case;

here is the paper:

Reginald Koo, A Classification of Matrices of Finite Order over $\mathbb{C}$, $\mathbb{R}$, and $\mathbb{Q}$, Mathematics Magazine, Vol. 76, No. 2 (Apr., 2003), pp. 143–148.

Depending on your actual situation you might want to use crude necessary criteria to weed out candidates.

For example, if $A$ has finite order then its trace is a sum of $n$ roots of unity, so $|\operatorname{tr} A| \le n$.

The same criterion can be re-checked while computing powers: $|\operatorname{tr}(A^k)|\le n$ must hold for every $k$.
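A sketch of this running check in exact integer arithmetic (the power cutoff is an arbitrary choice of mine; exceeding the trace bound disproves finite order, while passing it proves nothing):

```python
def passes_trace_test(A, max_power=50):
    """Necessary condition: if A has finite order, every eigenvalue of A^k
    is a root of unity, so |tr(A^k)| <= n for every k."""
    n = len(A)

    def mat_mult(X, Y):
        return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
                for i in range(n)]

    P = A
    for _ in range(max_power):
        if abs(sum(P[i][i] for i in range(n))) > n:
            return False    # trace too large: A cannot have finite order
        P = mat_mult(P, A)
    return True
```

For instance $\mathrm{diag}(2,1)$ fails immediately, since its trace $3$ exceeds $n = 2$.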

If we were to know that $A$ is normal, then we could use spectral calculus to conclude from $A^m=1$ that the operator norm satisfies $\|A\|=1$. But since $A$ has integer entries, the columns $Ae_i$ must then be unit vectors; otherwise the norm of the matrix would be at least $\sqrt{2}$. After all,

$$\|Ae_i\| = \Big\|\sum_j a_{ji} e_j\Big\| = \sqrt{\sum_j a_{ji}^2} \geq \sqrt{2} \quad \text{ if more than one } a_{ji} \text{ is nonzero}.$$

Since an integer unit vector is $\pm e_j$, this means that $A$ must be a signed permutation matrix.
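This structural conclusion is easy to test directly. An integer unit vector is $\pm e_j$, so the matrices passing the test are exactly those in which every row and every column has a single nonzero entry, equal to $\pm 1$. A pure-Python sketch:

```python
def is_signed_permutation(A):
    """True iff every row and every column of the square integer matrix A
    contains exactly one nonzero entry, and that entry is +1 or -1."""
    n = len(A)
    for row in A:
        if sum(1 for x in row if x != 0) != 1 or any(abs(x) > 1 for x in row):
            return False
    for j in range(n):
        if sum(1 for i in range(n) if A[i][j] != 0) != 1:
            return False
    return True
```

For example, the rotation $\begin{pmatrix}0&-1\\1&0\end{pmatrix}$ passes, while the unipotent $\begin{pmatrix}1&1\\0&1\end{pmatrix}$ (which is not normal) does not.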

I don’t know what happens when the matrix is not normal, i.e. when it does not commute with its transpose.