
Let $\mathrm A \in \mathbb C^{2 \times 2}$. How many $2 \times 2$ matrices $\mathrm A$ satisfy $\mathrm A^{3} = \mathrm A$? Infinitely many?

If it were a $3 \times 3$ matrix, then by applying the Cayley-Hamilton theorem I could have said that the given matrix is diagonalizable. Also, zero is an eigenvalue of $\mathrm A$, so it would be the collection of all singular diagonalizable matrices. But how does one count them?

Here I have $2 \times 2$ matrices, and I feel I can't apply the Cayley-Hamilton theorem.

I am stuck with these thoughts.


If $A^3=A$, then $A$ satisfies the polynomial $t^3-t = t(t^2-1)=t(t-1)(t+1)$.

Since you say you know about the minimal polynomial:

The minimal polynomial divides every polynomial that the matrix satisfies, so the minimal polynomial divides $t(t-1)(t+1)$. In particular, since it is squarefree, $A$ must be diagonalizable.

If the matrix has a single eigenvalue, then it is a scalar multiple of the identity; the three possibilities are $0$, $I$, and $-I$.

If the matrix has distinct eigenvalues, then the eigenvalues are either $0$ and $1$, $0$ and $-1$, or $1$ and $-1$. In any of these cases, there are infinitely many distinct matrices that satisfy these conditions. For example, for eigenvalues $0$ and $1$, any projection onto a 1-dimensional subspace will do; there are infinitely many different projections onto, say, the subspace $\{(x,0)\mid x\in\mathbb{C}\}$, one for every possible complement. That already solves the problem.
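To make the one-parameter family of projections concrete, here is a small sketch (my own illustration, plain Python): projecting onto the $x$-axis along the complement spanned by $(t,1)$ sends $(x,y)$ to $(x-ty,\,0)$, giving the matrix $\begin{pmatrix}1&-t\\0&0\end{pmatrix}$, a different solution of $A^3=A$ for each $t$.

```python
# A 2x2 matrix as a list of rows; multiplication written out by hand
# so the sketch needs nothing beyond the standard library.
def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# Projection onto the x-axis along the complement spanned by (t, 1):
# writing (x, y) = a*(1, 0) + b*(t, 1) forces b = y and a = x - t*y,
# so the projection matrix is [[1, -t], [0, 0]].
def proj(t):
    return [[1, -t], [0, 0]]

for t in [0, 1, -2, 5]:
    P = proj(t)
    assert matmul(P, P) == P                 # idempotent: P^2 = P
    assert matmul(matmul(P, P), P) == P      # hence P^3 = P
```

Distinct values of $t$ give distinct matrices, so this is already an infinite family with eigenvalues $0$ and $1$.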

If you don’t remember about the minimal polynomial, you can note that any eigenvalue $\lambda$ must be a root of the polynomial $t^3-t$ (more generally, if $A$ satisfies $f(t)$, then every eigenvalue of $A$ must be a root of $f(t)$). That leads you to the fact that $A$ has eigenvalues $0$, $1$, and $-1$, and from there you can obtain infinitely many different matrices as noted above.

Any projection matrix satisfies $A^2=A$ and so $A^3=A$. You can get one projection matrix for each line through the origin and so there are infinitely many solutions. As Arturo has noticed, this already answers the question.
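The "one projection per line" picture can be checked directly (a stdlib-only sketch; the formula $P = vv^{\top}/(v\cdot v)$ is the standard orthogonal projection onto the line spanned by $v$):

```python
# Orthogonal projection onto the line through the origin spanned by v = (a, b):
# P = v v^T / (v . v).  One projection per line, hence infinitely many solutions.
def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def close(A, B, eps=1e-12):
    return all(abs(A[i][j] - B[i][j]) < eps for i in range(2) for j in range(2))

def line_projection(a, b):
    n = a * a + b * b
    return [[a * a / n, a * b / n],
            [a * b / n, b * b / n]]

for a, b in [(1, 0), (1, 1), (2, -3), (0, 1)]:
    P = line_projection(a, b)
    assert close(matmul(P, P), P)                 # P^2 = P
    assert close(matmul(matmul(P, P), P), P)      # so P^3 = P
```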

If you want to find all matrices $A$ such that $A^3=A$, you can proceed like this:

The Cayley-Hamilton theorem implies that $A^2 = \alpha A + \beta I$ for some scalars $ \alpha $ and $\beta$. Then, $A=A^3= \alpha A^2 + \beta A=\alpha (\alpha A + \beta I)+ \beta A=(\alpha^2+\beta)A+\alpha\beta I$ implies $(1-\alpha^2-\beta)A=\alpha\beta I$.
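For a $2 \times 2$ matrix the Cayley-Hamilton theorem says concretely that $A^2 = (\operatorname{tr} A)\,A - (\det A)\,I$, so $\alpha = \operatorname{tr} A$ and $\beta = -\det A$. A quick check on an arbitrary complex matrix (illustration only):

```python
# Cayley-Hamilton for a 2x2 matrix: A^2 = tr(A) A - det(A) I,
# i.e. alpha = tr(A) and beta = -det(A) in the identity A^2 = alpha A + beta I.
def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[1 + 2j, 3], [-1j, 4 - 1j]]                  # an arbitrary complex matrix
alpha = A[0][0] + A[1][1]                         # trace
beta = -(A[0][0] * A[1][1] - A[0][1] * A[1][0])   # minus determinant

rhs = [[alpha * A[i][j] + (beta if i == j else 0) for j in range(2)]
       for i in range(2)]
assert matmul(A, A) == rhs                        # A^2 = alpha A + beta I
```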

If $1-\alpha^2-\beta\ne0$, we get $A=\gamma I$. In this case, we must have $\gamma^3=\gamma$ and so there are only three solutions: $A=0$, $A=I$, $A=-I$.

If $1-\alpha^2-\beta=0$, then $\alpha\beta=0$, so $\alpha=0$ or $\beta=0$; the solutions then come from $A^2=I$ (when $\alpha=0$ and $\beta=1$) or $A^2=\pm A$ (when $\beta=0$ and $\alpha=\pm1$). Now there are plenty of solutions for these. Even $A^2=I$ alone has infinitely many solutions; see for instance Involutory matrix. Alex's comment contains an infinite subfamily.
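One such infinite family of involutions (my own illustration; Alex's comment is not reproduced above) is $A_t = \begin{pmatrix}1 & t\\ 0 & -1\end{pmatrix}$, which satisfies $A_t^2 = I$ for every $t$:

```python
# An infinite family of involutions: A_t = [[1, t], [0, -1]] satisfies
# A_t^2 = I for every t, hence A_t^3 = A_t.
def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

I = [[1, 0], [0, 1]]

for t in [0, 1, -7, 42]:
    A = [[1, t], [0, -1]]
    assert matmul(A, A) == I                 # A^2 = I
    assert matmul(matmul(A, A), A) == A      # so A^3 = A
```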

All this is essentially Arturo’s answer without using minimal polynomials or eigenvalues.
