It is said that one can prove that every 3×3 orthogonal matrix corresponds to a linear operator on $\mathbb{R}^3$ of one of the following types:

- Rotations about lines through the origin
- Reflections about planes through the origin
- A rotation about a line through the origin followed by a reflection about the plane through the origin that is perpendicular to the line

Obviously, we can tell whether a 2×2 orthogonal matrix $A$ represents a rotation or a reflection by looking at its determinant: it is a rotation if $\det(A)=1$ and a reflection if $\det(A)=-1$. For a 3×3 matrix, however, the situation is different. My textbook says that if $\det(A)=1$, it is a rotation, and if $\det(A)=-1$, it is of type 2 or type 3.

**My question is:** How can one tell whether a 3×3 orthogonal matrix with determinant $-1$ represents a type 2 or a type 3 operator? I heard that this has to do with an analysis of eigenvectors and eigenvalues, but could anyone shed some light on this please?

For example, how can one tell whether the following operator is of type 2 or type 3? (Actually, since I “invented” it myself, I am fairly sure that, barring a mistake, it is neither a rotation nor a pure reflection: it is of type 3.)

$$\begin{bmatrix}
\frac{1}{3} & 0 & \frac{4\sqrt{2}}{6} \\
\frac{2}{3} & \frac{1}{\sqrt{2}} & -\frac{\sqrt{2}}{6} \\
\frac{2}{3} & -\frac{1}{\sqrt{2}} & -\frac{\sqrt{2}}{6}
\end{bmatrix}$$

Eigenvalues (or more precisely, the *eigendecomposition*) can be used to establish that the three listed cases are the only possible ones. They are not needed, however, to distinguish type 2 from type 3.

If $A$ is a reflection about a plane through the origin, then applying $A$ a second time sends the reflected image back to its original position. Therefore $A^2=I$.

If $A$ is a rotation followed by a reflection through the rotation plane, pick any vector $u$ that has a nonzero component on the plane of rotation (i.e. pick any $u$ whose orthogonal projection onto the rotation plane is nonzero). Decompose $u$ into the sum $v+w$, where $v$ lies on the plane of rotation and $w$ is parallel to the rotation axis (and hence orthogonal to $v$). Then $A^2u=A^2v+w$. When the angle of rotation is not an odd multiple of $\pi$, we have $A^2v\ne v$, hence $A^2u\ne u$, and hence $A^2\ne I$. When the angle of rotation is an odd multiple of $\pi$, we get $A=-I$.

Therefore, when $A$ is real orthogonal and $\det A=-1$, it is of type 2 if and only if $A^2=I$ and $A\ne-I$, and it is of type 3 if and only if $A^2\ne I$ or $A=-I$.
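This $A^2$ criterion is easy to carry out numerically. Below is a minimal sketch in plain Python (standard library only), applied to the matrix from the question; the helper names `mat_mul`, `is_close_to`, and `classify` are my own, and the tolerance is an assumption for floating-point comparison:

```python
import math

def mat_mul(A, B):
    """Product of two 3x3 matrices stored as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def is_close_to(A, B, tol=1e-12):
    """Entrywise comparison of two 3x3 matrices."""
    return all(abs(A[i][j] - B[i][j]) < tol
               for i in range(3) for j in range(3))

I3 = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
minus_I3 = [[-x for x in row] for row in I3]

def classify(A):
    """For an orthogonal A with det A = -1:
    type 2 (pure reflection) iff A^2 = I and A != -I, else type 3."""
    if is_close_to(mat_mul(A, A), I3) and not is_close_to(A, minus_I3):
        return "type 2"
    return "type 3"

# The matrix from the question.
s2 = math.sqrt(2)
A = [[1/3,     0,  4*s2/6],
     [2/3,  1/s2,   -s2/6],
     [2/3, -1/s2,   -s2/6]]
print(classify(A))  # prints "type 3"
```

As a sanity check, `classify` also labels $\operatorname{diag}(1,1,-1)$ (a reflection in the $xy$-plane) as type 2 and $-I$ as type 3, matching the criterion above.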

If there’s a rotation involved, two of the eigenvalues will be a complex conjugate pair. If it’s a pure reflection, all of the eigenvalues will be real. There’s one exception to this—a rotation through an angle of $\pi$—but I’ll cover that below.

In more detail, a pure 3-D rotation matrix will have eigenvalues $1$ and $\cos\theta\pm i\sin\theta$, where $\theta$ is the rotation angle. If $\theta=n\pi$, this complex conjugate pair degenerates to a pair of real eigenvalues, both $1$ or both $-1$. In the positive case you just have the identity transformation; in the negative case you have a 180-degree rotation (which is equivalent to a reflection relative to the axis of rotation). The eigenspace of $1$ in the non-identity case is one-dimensional and represents the rotation axis. If you follow this rotation with a reflection in the plane normal to this axis, the eigenvalue $1$ will become $-1$.
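The eigenvalue pattern described here forces $\operatorname{tr}R = 1 + 2\cos\theta$ for a pure rotation, and $-1 + 2\cos\theta$ once the trailing reflection is applied. A minimal numerical check in plain Python, using a rotation about the $z$-axis as a convenient test case (`rot_z` is a helper introduced just for this sketch):

```python
import math

def rot_z(theta):
    """Rotation about the z-axis by angle theta."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c,  -s,  0.0],
            [s,   c,  0.0],
            [0.0, 0.0, 1.0]]

def trace(A):
    return sum(A[i][i] for i in range(3))

theta = 0.7  # an arbitrary test angle
R = rot_z(theta)

# Eigenvalues of a pure rotation are 1 and cos(theta) +/- i*sin(theta),
# so the trace must equal 1 + 2*cos(theta).
assert abs(trace(R) - (1 + 2 * math.cos(theta))) < 1e-12

# Following the rotation with a reflection in the plane normal to the
# axis (here z -> -z) flips the eigenvalue 1 to -1, so the trace
# becomes -1 + 2*cos(theta).
F = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, -1.0]]
FR = [[sum(F[i][k] * R[k][j] for k in range(3)) for j in range(3)]
      for i in range(3)]
assert abs(trace(FR) - (-1 + 2 * math.cos(theta))) < 1e-12
```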

On the other hand, the eigenvalues of a reflection will all be $\pm1$. The multiplicity of $-1$ tells you what sort of a reflection it is: a single $-1$ is a reflection in a plane, with the corresponding eigenspace normal to this plane; a multiplicity of 2 is a reflection in a line, which is equivalent to a rotation through an angle of $\pi$; a multiplicity of 3 represents reflection through the origin, which we see can be decomposed into a 180-degree rotation followed by a reflection relative to a plane.

To take your example, the matrix has $-1$ as an eigenvalue with multiplicity one, while the other eigenvalues aren’t $\pm1$, so it’s type 3. The axis of rotation is found by computing the kernel of $A+I$, giving $(2+\sqrt2,-2,-2-2\sqrt2)^T$. The angle of rotation can be read off from the trace: the eigenvalues of a type 3 matrix are $-1$ and $e^{\pm i\theta}$, so $\operatorname{tr}A=-1+2\cos\theta$. Here $\operatorname{tr}A=\frac{1+\sqrt2}{3}$, which gives $\cos\theta=\frac{4+\sqrt2}{6}$, i.e. $\theta\approx25.5^\circ$.
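As a quick numerical sanity check (plain Python, standard library only; the matrix entries and the axis vector are transcribed from the question), one can confirm that the stated axis lies in the kernel of $A+I$ and recover the rotation angle from the trace:

```python
import math

s2 = math.sqrt(2)
# The matrix from the question.
A = [[1/3,     0,  4*s2/6],
     [2/3,  1/s2,   -s2/6],
     [2/3, -1/s2,   -s2/6]]

# Claimed rotation axis: a vector in the kernel of A + I,
# i.e. an eigenvector for eigenvalue -1.
v = [2 + s2, -2, -2 - 2*s2]
Av = [sum(A[i][k] * v[k] for k in range(3)) for i in range(3)]
assert all(abs(Av[i] + v[i]) < 1e-12 for i in range(3))  # (A + I)v = 0

# The eigenvalues of a type 3 matrix are -1, e^{i*t}, e^{-i*t},
# so tr A = -1 + 2*cos(t).
tr = sum(A[i][i] for i in range(3))
cos_t = (tr + 1) / 2
print(math.degrees(math.acos(cos_t)))  # about 25.5 degrees
```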
