
Let $\mathrm A \in \mathbb C^{2 \times 2}$. How many $2 \times 2$ matrices $\mathrm A$ satisfy $\mathrm A^{3} = \mathrm A$? Infinitely many?

If it were a $3 \times 3$ matrix, then by applying the Cayley–Hamilton theorem I could have said that the given matrix is diagonalizable. Also, zero is an eigenvalue of $\mathrm A$, so the solutions would be the collection of all singular diagonalizable matrices. But how does one count them?

Here I have $2 \times 2$ matrices, and I feel I can't apply the Cayley–Hamilton theorem in the same way.

I am stuck on these thoughts:

- What kind of matrices are non-diagonalizable?

If $A^3=A$, then $A$ satisfies the polynomial $t^3-t = t(t^2-1)=t(t-1)(t+1)$.

Since you say you know about the minimal polynomial: the minimal polynomial divides every polynomial that the matrix satisfies, so here it divides $t(t-1)(t+1)$. In particular, since this polynomial is squarefree, $A$ must be diagonalizable.

If the matrix has a single eigenvalue, then it is a scalar multiple of the identity; the three possibilities are $0$, $I$, and $-I$.

If the matrix has distinct eigenvalues, then the eigenvalues are either $0$ and $1$, $0$ and $-1$, or $1$ and $-1$. In any of these cases, there are infinitely many distinct matrices that satisfy these conditions. For example, for eigenvalues $0$ and $1$, any projection onto a 1-dimensional subspace will do; there are infinitely many different projections onto, say, the subspace $\{(x,0)\mid x\in\mathbb{C}\}$, one for every possible complement. That already solves the problem.
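The "one projection for every possible complement" claim can be checked numerically. A minimal sketch (the helper name `projection` and the specific parametrization are illustrative, not from the original post): $P_t = \begin{pmatrix}1 & t\\ 0 & 0\end{pmatrix}$ projects onto $\{(x,0)\}$ along the complementary line spanned by $(-t, 1)$, giving one solution of $A^3 = A$ for every real $t$.

```python
import numpy as np

# Illustrative one-parameter family of (oblique) projections onto the
# subspace {(x, 0)}; each value of t chooses a different complement.
def projection(t):
    return np.array([[1.0, t],
                     [0.0, 0.0]])

for t in [0.0, 1.0, -2.5, 7.0]:
    P = projection(t)
    assert np.allclose(P @ P, P)        # idempotent: P^2 = P
    assert np.allclose(P @ P @ P, P)    # hence P^3 = P
```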

If you don’t remember about the minimal polynomial, you can note that any eigenvalue $\lambda$ must be a root of the polynomial $t^3-t$ (more generally, if $A$ satisfies $f(t)$, then every eigenvalue of $A$ must be a root of $f(t)$). That leads you to the fact that $A$ has eigenvalues $0$, $1$, and $-1$, and from there you can obtain infinitely many different matrices as noted above.
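The eigenvalue observation is easy to verify on a sample solution. A quick check (the sample matrix is an assumption chosen for illustration): its eigenvalues must be roots of $t^3 - t$, i.e. lie in $\{0, 1, -1\}$.

```python
import numpy as np

# A sample solution of A^3 = A (an oblique projection, so A^2 = A).
A = np.array([[1.0, 3.0],
              [0.0, 0.0]])
assert np.allclose(A @ A @ A, A)

# Every eigenvalue of A must be a root of t^3 - t.
eigs = np.linalg.eigvals(A)
assert np.allclose(sorted(eigs.real), [0.0, 1.0])
```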

Any projection matrix satisfies $A^2=A$ and so $A^3=A$. You can get one projection matrix for each line through the origin and so there are infinitely many solutions. As Arturo has noticed, this already answers the question.
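The "one projection matrix for each line through the origin" idea can be made concrete. A sketch, assuming the standard formula for orthogonal projection onto the line at angle $\theta$ (the helper name `line_projection` is mine): $P(\theta) = \begin{pmatrix}\cos^2\theta & \cos\theta\sin\theta\\ \cos\theta\sin\theta & \sin^2\theta\end{pmatrix}$.

```python
import numpy as np

# Orthogonal projection onto the line through the origin at angle theta.
def line_projection(theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c * c, c * s],
                     [c * s, s * s]])

# Each angle gives a distinct solution of A^3 = A.
for theta in np.linspace(0.0, np.pi, 7):
    P = line_projection(theta)
    assert np.allclose(P @ P, P)        # P^2 = P
    assert np.allclose(P @ P @ P, P)    # so P^3 = P
```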

If you want to find all matrices $A$ such that $A^3=A$, you can proceed like this:

The Cayley-Hamilton theorem implies that $A^2 = \alpha A + \beta I$ for some scalars $ \alpha $ and $\beta$. Then, $A=A^3= \alpha A^2 + \beta A=\alpha (\alpha A + \beta I)+ \beta A=(\alpha^2+\beta)A+\alpha\beta I$ implies $(1-\alpha^2-\beta)A=\alpha\beta I$.
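For a $2 \times 2$ matrix the Cayley–Hamilton relation is explicit: $A^2 = \operatorname{tr}(A)\,A - \det(A)\,I$, so $\alpha = \operatorname{tr}(A)$ and $\beta = -\det(A)$. A quick numerical sanity check of this identity (random test matrix is an assumption):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((2, 2))

alpha = np.trace(A)         # alpha = tr(A)
beta = -np.linalg.det(A)    # beta = -det(A)

# Cayley-Hamilton for 2x2 matrices: A^2 = tr(A) A - det(A) I
assert np.allclose(A @ A, alpha * A + beta * np.eye(2))
```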

If $1-\alpha^2-\beta\ne0$, we get $A=\gamma I$. In this case, we must have $\gamma^3=\gamma$ and so there are only three solutions: $A=0$, $A=I$, $A=-I$.

If $1-\alpha^2-\beta=0$, then $\alpha\beta=0$, and the only solutions come from $A^2=I$ and $A^2=\pm A$. Now there are plenty of solutions for these. Even $A^2=I$ has infinitely many solutions. See for instance Involutory matrix. Alex’s comment contains an infinite subfamily.
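One such infinite family of involutions can be written down directly. A sketch (the parametrization and helper name `involution` are illustrative): $A_t = \begin{pmatrix}1 & t\\ 0 & -1\end{pmatrix}$ satisfies $A_t^2 = I$, hence $A_t^3 = A_t$, for every real $t$.

```python
import numpy as np

# Illustrative one-parameter family of involutions: A^2 = I, so A^3 = A.
# This exhibits infinitely many non-scalar solutions.
def involution(t):
    return np.array([[1.0, t],
                     [0.0, -1.0]])

for t in [0.0, 2.0, -5.0, 13.0]:
    A = involution(t)
    assert np.allclose(A @ A, np.eye(2))   # A^2 = I
    assert np.allclose(A @ A @ A, A)       # so A^3 = A
```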

All this is essentially Arturo’s answer without using minimal polynomials or eigenvalues.
