How can I prove (if it is true — I suppose it is) that orthogonal projection from an $N$-dimensional space onto a 2-dimensional plane and rotation in this plane

(I assume components of vectors out of the plane are not affected by the rotation)

**are commutative** operations, independently of the dimension $N$ of the vector space?


Let us work in $\mathbb{R}^4$. Your question amounts to saying that the following projection and rotation matrices

$$P=\begin{bmatrix}1&0&0&0\\0&1&0&0\\0&0&0&0\\0&0&0&0\end{bmatrix} \ \ \text{and} \ \

R=\begin{bmatrix}\cos \theta &-\sin \theta&0&0\\ \sin \theta& \cos \theta&0&0\\0&0&1&0\\0&0&0&1\end{bmatrix}$$

commute. This is true because both matrices are block-diagonal, so the products $PR$ and $RP$ are computed block by block; in the upper left $2 \times 2$ block we multiply the identity by the rotation block, and these evidently commute (the remaining blocks involve a zero block, which commutes with everything).
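The blockwise computation above can be checked numerically. The following is a minimal numpy sketch (the angle $\theta = 0.7$ is an arbitrary choice for the check):

```python
import numpy as np

theta = 0.7  # arbitrary angle for the check

# Projection onto the plane spanned by e1 and e2
P = np.diag([1.0, 1.0, 0.0, 0.0])

# Rotation by theta inside that plane; identity on the orthogonal complement
R = np.eye(4)
R[:2, :2] = [[np.cos(theta), -np.sin(theta)],
             [np.sin(theta),  np.cos(theta)]]

print(np.allclose(P @ R, R @ P))  # True
```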

Following OP’s comments, I would like to clarify JeanMarie’s answer.

As you may know, matrices represent transformations with regard to a particular basis. In $\mathbb{R}^n$ we usually work with the “standard” basis, which I am sure you know: ($e_1, e_2, \dots, e_n$).

Now, a projection matrix onto an arbitrary plane is pretty nasty looking. Depending on the plane it would be full of numbers, and we don’t want that. JeanMarie’s projection matrix looks like the identity matrix; nice and concise.

Here we make use of a little theorem which says that you can construct an orthonormal basis from a set of $n$ linearly independent vectors in $\mathbb{R}^n$. What you do is: pick two vectors in your plane, and $n-2$ outside the plane, so that all $n$ vectors are linearly independent. They form a basis for $\mathbb{R}^n$. Now you can transform that basis so that it is orthonormal (by the Gram–Schmidt process). You don’t need to actually do this, but you need to know that it is possible. Once you have your $n$ orthonormal vectors, say $u_1, u_2, \dots, u_n$, such that $u_1$ and $u_2$ are in the plane, you can construct a matrix $M = [u_1 \mid u_2 \mid \dots \mid u_n]$ (with the basis vectors as columns).
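This construction can be sketched in numpy. The two plane-spanning vectors below are hypothetical, chosen only for illustration; `np.linalg.qr` performs the Gram–Schmidt orthonormalization, and its first two output columns still span the same plane as the first two input columns:

```python
import numpy as np

# Two hypothetical vectors spanning the plane in R^4 (any independent pair works)
a = np.array([1.0, 2.0, 0.0, 1.0])
b = np.array([0.0, 1.0, 1.0, -1.0])

# Extend to a basis of R^4 with two more vectors (the four are independent)
c = np.array([1.0, 0.0, 0.0, 0.0])
d = np.array([0.0, 0.0, 1.0, 0.0])

# QR factorization orthonormalizes the columns in order (Gram-Schmidt in
# disguise): the first two columns of M span the same plane as a and b.
M, _ = np.linalg.qr(np.column_stack([a, b, c, d]))

print(np.allclose(M.T @ M, np.eye(4)))  # True: M is orthogonal
```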

When you have done all this, you can express the projection $P$ in terms of $P'$ (the projection under the new basis) conjugated by $M$:

$$P = M P' M^{-1} = M P' M^{T}$$

The matrix $M$ is a “change of basis” matrix, and since $M$ is orthogonal, $M^{-1} = M^T$. Basically, the idea is:

- $M^T$ => convert from the standard basis to our nice basis
- $P'$ => perform the projection under our nice basis
- $M$ => convert back to the standard basis

Under the new basis, $P’$ is simply the identity matrix in the first $2 \times 2$ block, $0$ everywhere else. Things become much easier to prove (including your skew-symmetric question; trivial under this representation).
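The whole change-of-basis argument can be checked numerically. In the sketch below, $M$ is built by orthonormalizing a fixed invertible matrix (chosen purely for illustration), so its first two columns span some "tilted" plane; conjugating the nice-basis $P'$ and $R'$ by $M$ gives the projection and rotation for that plane, and they still commute:

```python
import numpy as np

theta = 0.4

# An arbitrary orthogonal change-of-basis matrix M; its first two columns
# span some tilted plane in R^4 (A is just a fixed invertible matrix).
A = np.array([[1.0,  0.0, 1.0, 0.0],
              [2.0,  1.0, 0.0, 0.0],
              [0.0,  1.0, 0.0, 1.0],
              [1.0, -1.0, 0.0, 0.0]])
M, _ = np.linalg.qr(A)

# Projection and rotation expressed in the nice basis
Pp = np.diag([1.0, 1.0, 0.0, 0.0])
Rp = np.eye(4)
Rp[:2, :2] = [[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]]

# Conjugate back to the standard basis: P = M P' M^T, R = M R' M^T
P = M @ Pp @ M.T
R = M @ Rp @ M.T

print(np.allclose(P @ R, R @ P))  # True: still commute, for an arbitrary plane
```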

The gist is that JeanMarie’s answer isn’t just a proof for a projection “on a specific plane”; if you include the change of basis business it is a proof for a projection on any plane.

This might be a lot to digest (and rather poorly explained) if you have never seen bases, change of basis, or conjugation before. I would advise you to do some reading on it; it’s a rather important topic.

I hope it helps!

*PS: Note that any basis would do for this particular question, but since you asked about the proof for “skew-symmetric” on a post flagged as duplicate, an orthonormal basis makes this easier.*
