Commutativity of projection and rotation

How can one prove (if it is true, and I suppose it is) that orthogonal projection from an $N$-dimensional space onto a $2$-dimensional plane and a rotation in that plane

(I assume components of vectors outside the plane are not affected by the rotation)

are commutative operations, independently of the dimension $N$ of the vector space?


Let us work in $\mathbb{R}^4$. Your question amounts to saying that the following projection and rotation matrices
$$P=\begin{bmatrix}1&0&0&0\\0&1&0&0\\0&0&0&0\\0&0&0&0\end{bmatrix} \ \ \text{and} \ \
R=\begin{bmatrix}\cos \theta &-\sin \theta&0&0\\ \sin \theta& \cos \theta&0&0\\0&0&1&0\\0&0&0&1\end{bmatrix}$$

commute. This is true because all of the computation happens in the product of the upper-left $2 \times 2$ blocks, and the identity block of $P$ evidently commutes with the rotation block of $R$.
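As a quick numerical sanity check of this block computation (a sketch using NumPy, with the matrix names as above and an arbitrary angle):

```python
import numpy as np

theta = 0.7  # an arbitrary rotation angle

# Projection onto the plane spanned by e1, e2 in R^4
P = np.diag([1.0, 1.0, 0.0, 0.0])

# Rotation by theta in that same plane, identity elsewhere
R = np.eye(4)
R[0, 0] = R[1, 1] = np.cos(theta)
R[0, 1] = -np.sin(theta)
R[1, 0] = np.sin(theta)

# P @ R and R @ P agree: both reduce to the upper-left 2x2 product
print(np.allclose(P @ R, R @ P))  # True
```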

Following OP’s comments, I would like to clarify JeanMarie’s answer.

As you may know, matrices represent transformations with regard to a particular basis. In $\mathbb{R}^n$ we usually work with the “standard” basis, which I am sure you know: ($e_1, e_2, \dots, e_n$).

Now, a projection matrix onto an arbitrary plane is pretty nasty-looking. Depending on the plane it would be full of numbers, and we don’t want that. JeanMarie’s projection matrix looks like a truncated identity matrix; nice and concise.

Here we make use of a little theorem which says that you can construct an orthonormal basis from a set of $n$ linearly independent vectors in $\mathbb{R}^n$. What you do is, you pick two vectors in your plane, and $n-2$ outside the plane (all linearly independent). They form a basis for $\mathbb{R}^n$. Now, you can transform that basis so that it is orthonormal (by the Gram–Schmidt process). You don’t need to actually do this, but you need to know that it is possible. Once you have your $n$ orthonormal vectors, say $u_1, u_2, \dots, u_n$ such that $u_1$ and $u_2$ are in the plane, you can construct a matrix $M = [u_1 | u_2 | \dots | u_n]$ (with the basis vectors as columns).
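As a sketch of this construction (the plane-spanning vectors here are arbitrary examples), NumPy's QR factorization performs the Gram–Schmidt-style orthonormalization for us:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two (arbitrary) vectors spanning a plane in R^4
a = np.array([1.0, 2.0, 0.0, 1.0])
b = np.array([0.0, 1.0, 1.0, -1.0])

# Complete to a basis of R^4 with two more independent vectors,
# then orthonormalize: the columns of Q are the orthonormal basis
A = np.column_stack([a, b, rng.standard_normal(4), rng.standard_normal(4)])
M, _ = np.linalg.qr(A)

# Columns of M are orthonormal (M^T M = I) ...
print(np.allclose(M.T @ M, np.eye(4)))  # True

# ... and the first two columns span the same plane as a and b,
# so projecting a onto that span returns a unchanged
P2 = M[:, :2] @ M[:, :2].T
print(np.allclose(P2 @ a, a))  # True
```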

When you have done all this, you can express the standard-basis projection $P$ as $P'$ (the projection under the new basis) conjugated by $M$:

$$P = M P' M^{-1} = M P' M^T$$

The matrix M is a “change of basis” matrix, and since M is orthogonal, $M^{-1} = M^T$. Basically, the idea is:

  • $M^T$ => convert from standard basis to our nice basis
  • $P’$ => perform the projection under our nice basis
  • $M$ => convert back to standard basis

Under the new basis, $P’$ is simply the identity matrix in the first $2 \times 2$ block, $0$ everywhere else. Things become much easier to prove (including your skew-symmetric question; trivial under this representation).

The gist is that JeanMarie’s answer isn’t just a proof for a projection “on a specific plane”; if you include the change of basis business it is a proof for a projection on any plane.
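Putting the pieces together, here is a numerical check that projection and in-plane rotation commute for a plane in general position (a sketch; the dimension, angle, and randomly generated plane are arbitrary examples):

```python
import numpy as np

rng = np.random.default_rng(1)
n, theta = 5, 0.3  # arbitrary dimension and rotation angle

# Orthonormal basis M of R^n; its first two columns span a "random" plane
M, _ = np.linalg.qr(rng.standard_normal((n, n)))

# Projection and rotation expressed in the nice basis ...
P_prime = np.diag([1.0, 1.0] + [0.0] * (n - 2))
R_prime = np.eye(n)
R_prime[:2, :2] = [[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]]

# ... then conjugated back to the standard basis: P = M P' M^T, etc.
P = M @ P_prime @ M.T
R = M @ R_prime @ M.T

# The commutativity survives the change of basis
print(np.allclose(P @ R, R @ P))  # True
```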

This might be a lot to digest (and rather poorly explained) if you have never seen bases, change of basis, or conjugation before. I would advise you to do some reading on it; it’s a rather important topic.

I hope it helps!

PS: Note that any basis would do for this particular question, but since you asked about the proof for “skew-symmetric” in a post flagged as a duplicate, an orthonormal basis makes this easier.