# Projections onto ranges/subspaces

I’m stuck on a review problem.

Consider the matrix:

$$A = \left[ \begin{array}{cc} -1 & 1 \\ 1 & 1 \\ 2 & 1 \end{array} \right]$$

I’m asked to find a matrix $P$ which projects onto the range of $A$, with respect to the standard basis.

I’m not actually sure how to tackle this problem. I know that if you want to project a vector $u$ onto a vector $v$, you compute $\operatorname{proj}_v u = \frac{\langle u,v\rangle}{\langle v,v\rangle}v$. Does a similar procedure exist if you want to project onto the space spanned by the columns of a matrix?

Also, what do they mean by range of $A$? By googling, it appears to refer to the column space, but if that’s true, then what is the domain of a matrix? Does it even exist?

It’s easy to see that the two columns of $A$ are linearly independent, so they form a basis for $C(A)$. How should I go about solving the given problem? Also, what if I have to do it with respect to a non-standard basis?
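As an aside, the independence claim is easy to confirm numerically; here is a minimal NumPy sketch (the code is my addition, not part of the original question):

```python
import numpy as np

# The matrix from the problem statement.
A = np.array([[-1.0, 1.0],
              [1.0, 1.0],
              [2.0, 1.0]])

# The columns are linearly independent iff rank(A) equals the number of columns.
print(np.linalg.matrix_rank(A))  # prints 2
```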

#### Answers

Let your matrix be $A$. Then the projection matrix onto $R(A)$ is $P_{R(A)}=A(A^TA)^{-1}A^T$.

Indeed, the range of $A$, denoted $R(A)$, is the column space of $A$.

For a non-standard basis with change-of-basis matrix $S$ (whose columns are the new basis vectors in standard coordinates), the matrix of the same projection in the new coordinates is $S^{-1}PS$. Note that the formula $A(A^TA)^{-1}A^T$ itself assumes the standard inner product.
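The formula above can be checked numerically for the matrix in question. A sketch (the NumPy verification is my addition; it checks the standard properties of an orthogonal projection):

```python
import numpy as np

A = np.array([[-1.0, 1.0],
              [1.0, 1.0],
              [2.0, 1.0]])

# Projection onto the range (column space) of A.
P = A @ np.linalg.inv(A.T @ A) @ A.T

# An orthogonal projection matrix is symmetric, idempotent,
# and fixes every vector already in the column space.
print(np.allclose(P, P.T))    # symmetric
print(np.allclose(P @ P, P))  # idempotent
print(np.allclose(P @ A, A))  # fixes the columns of A
```

For this particular $A$, working by hand gives $A^TA = \begin{bmatrix} 6 & 2 \\ 2 & 3\end{bmatrix}$ and $P = \frac{1}{14}\begin{bmatrix} 13 & 3 & -2 \\ 3 & 5 & 6 \\ -2 & 6 & 10 \end{bmatrix}$, which the code reproduces.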

Edited:
The intuition behind the structure of the projection matrix $P$ is as follows.
The defining property of $P$ is that for any vector $x$ we must have $x - Px \perp R(A)$, or equivalently $x - Px \in R(A)^{\perp}$. Since $R(A)^{\perp} = N(A^T)$, this yields $x - Px \in N(A^T)$, i.e. $A^T(x - Px) = 0$ for all $x$.

Selecting $x = e_1, e_2, \dots, e_n$ (the standard basis) yields the $n$ systems $A^T P e_i = A^T e_i$, $i = 1, \dots, n$. The minimum-norm solution of the $i$-th system with respect to $Pe_i$ is $Pe_i = A(A^TA)^{-1}A^T e_i$. Grouping all these equations in matrix notation yields $P = A(A^TA)^{-1}A^T$.
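The defining property $A^T(x - Px) = 0$ can be verified directly for an arbitrary vector; a brief sketch (the random test vector is my addition for illustration):

```python
import numpy as np

A = np.array([[-1.0, 1.0],
              [1.0, 1.0],
              [2.0, 1.0]])
P = A @ np.linalg.inv(A.T @ A) @ A.T

# For any x, the residual x - Px lies in N(A^T), i.e. A^T (x - Px) = 0.
rng = np.random.default_rng(0)
x = rng.standard_normal(3)
print(np.allclose(A.T @ (x - P @ x), 0))  # prints True
```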

Manos posted $A(A^TA)^{-1}A^T$. Since the proof is so simple, I’ll add it here.

Any vector can be written as $u+v$, where $u$ is in the column space of $A$ and $v$ is orthogonal to the column space of $A$. What must be proved is then that
$A(A^TA)^{-1}A^T(u+v) = u$. Notice that $A^Tv = 0$, so we need only consider $A(A^TA)^{-1}A^Tu$. Since $u$ is in the column space of $A$, we can write $u = Aw$, where $w \in \mathbb{R}^2$ (not $\mathbb{R}^3$, since $A$ is $3 \times 2$). So
$$A(A^TA)^{-1}A^T u = \Big(A(A^TA)^{-1}A^T \Big) Aw = A(A^TA)^{-1} \Big( A^T A\Big)w.$$
Do the obvious cancellation to get $Aw$, and recall that $Aw = u$, which completes the proof.
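The decomposition argument can also be tested numerically: build a $u$ in the column space and a $v$ orthogonal to it, then check that $P$ recovers $u$. A sketch (the particular vectors $w$ and $x$ below are arbitrary choices of mine, not part of the answer):

```python
import numpy as np

A = np.array([[-1.0, 1.0],
              [1.0, 1.0],
              [2.0, 1.0]])
P = A @ np.linalg.inv(A.T @ A) @ A.T

w = np.array([2.0, -3.0])      # arbitrary coefficients for the demonstration
u = A @ w                      # u lies in the column space C(A)
x = np.array([1.0, 0.0, 0.0])
v = x - P @ x                  # the residual of any x is orthogonal to C(A)

print(np.allclose(A.T @ v, 0))      # v is orthogonal to C(A): True
print(np.allclose(P @ (u + v), u))  # projecting u + v recovers u: True
```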