Articles on inner product spaces

Some kind of projection in a non-orthogonal basis

Sorry if the title sounds convoluted; I couldn't find a better one. In $\mathbb R^d$, let $(e_1,\ldots, e_d)$ be a basis. Show that there exist $d$ vectors $(a_1,\ldots, a_d)$ of $\mathbb R^d$ such that $$\forall x \in \mathbb R^d,\quad x=\sum_{i=1}^{d} \langle x, e_i \rangle a_i $$ I tried to mimic the proof of Gram-Schmidt orthogonalization, but failed. After that, I […]
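Not a proof, but a NumPy sanity check of one natural candidate (an assumption on my part): take $E$ to be the matrix whose columns are the $e_i$ and let the $a_i$ be the columns of $(E^\top)^{-1}$; the reconstruction identity then holds on random examples.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4
E = rng.normal(size=(d, d))            # columns e_1, ..., e_d (almost surely a basis)
A = np.linalg.inv(E.T)                 # candidate a_i: columns of (E^T)^{-1}

x = rng.normal(size=d)
coeffs = E.T @ x                       # the inner products <x, e_i>
print(np.allclose(A @ coeffs, x))      # True: x = sum_i <x, e_i> a_i
```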

Proof that orthogonal projection is a linear transformation

Let $f_1,f_2$ be two orthogonal nonzero vectors that span a plane $\pi$. The projection of a vector $w$ onto the plane $\pi$ is given by $\operatorname{Proj}_{\pi}(w)= \frac{w\cdot f_1}{f_1\cdot f_1}\, f_1 + \frac{w\cdot f_2}{f_2\cdot f_2}\, f_2$. Prove that the function $T : \mathbb R^3 \to \mathbb R^3$ defined by $T(w) = \operatorname{Proj}_{\pi}(w)$ is a linear transformation. […]
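A quick NumPy check (with a concrete choice of $f_1, f_2$ of my own) that the projection formula respects sums and scalar multiples; the proof itself just expands the dot products in the formula.

```python
import numpy as np

f1 = np.array([1.0, 1.0, 0.0])
f2 = np.array([1.0, -1.0, 0.0])        # orthogonal to f1; together they span the plane pi

def proj(w):
    # Proj_pi(w) = (w.f1 / f1.f1) f1 + (w.f2 / f2.f2) f2
    return (w @ f1) / (f1 @ f1) * f1 + (w @ f2) / (f2 @ f2) * f2

rng = np.random.default_rng(1)
u, v, c = rng.normal(size=3), rng.normal(size=3), 2.5
print(np.allclose(proj(u + c * v), proj(u) + c * proj(v)))   # True: consistent with linearity
```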

Is the orthogonality between associated Legendre polynomials preserved on an interval?

So I am aware of the orthogonality between the associated Legendre polynomials on the interval $[-1,1]$, that is: \begin{equation} \int_{-1}^{1}P^m_k P^m_l \, dx \propto \delta_{k,l} \end{equation} where $\delta_{k,l}$ is the Kronecker delta (I am only interested in the case where the upper indices of the Legendre polynomials are equal, but feel free to also discuss the opposite case as […]
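A numerical illustration, with my own choice of indices $m=1$, $k=2$, $l=3$ and SciPy's `lpmv`: the inner product vanishes on the full interval $[-1,1]$ but is generally nonzero on a subinterval such as $[0,1]$.

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import lpmv

def inner(m, k, l, a, b):
    # integral of P_k^m(x) * P_l^m(x) over [a, b]
    return quad(lambda x: lpmv(m, k, x) * lpmv(m, l, x), a, b)[0]

m, k, l = 1, 2, 3
print(inner(m, k, l, -1.0, 1.0))   # ~ 0: orthogonality on the full interval
print(inner(m, k, l,  0.0, 1.0))   # generally nonzero on a subinterval
```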

Gradient of inner product in Hilbert space

Let $\mathcal{H}$ be a Hilbert space and \begin{align} f&\colon \mathcal{H} \to \mathbb{R}\\ f(x) &= \|x-c\|_\mathcal{H}^2 \end{align} for some constant $c \in \mathcal{H}$. Is the derivative of $f$ at $x$ equal to $2x-2c$? I used \begin{align} f(x) &= \langle x,x \rangle - 2\langle x,c \rangle + \langle c,c \rangle \end{align} and the standard rules for differentiation […]
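Assuming a real Hilbert space (so no real parts are needed), the expansion the question seems to be reaching for is
$$f(x+h)-f(x)=\langle (x-c)+h,\,(x-c)+h\rangle-\langle x-c,\,x-c\rangle=2\langle x-c,\,h\rangle+\|h\|^2,$$
so the Fréchet derivative at $x$ is $h\mapsto 2\langle x-c,\,h\rangle$, and the gradient obtained from it via the Riesz representation theorem is $2(x-c)=2x-2c$.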

To show that $S^\perp + T^\perp$ is a subset of $(S \cap T)^\perp$

I am also given that if $S$ and $T$ are subspaces of a vector space, then the two sets above are equal.
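For the stated inclusion itself (not the converse), one line suffices: if $s\in S^\perp$, $t\in T^\perp$, and $x\in S\cap T$, then $\langle s+t,\,x\rangle=\langle s,x\rangle+\langle t,x\rangle=0$, so $s+t\in(S\cap T)^\perp$.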

Does every infinite-dimensional inner product space have an orthonormal basis?

This question already has an answer here: "An orthonormal set cannot be a basis in an infinite dimension vector space?"

How can we compute the square root of an operator of the form $Cv=\sum_{n\in\mathbb N}\langle v,e_n\rangle_Ve_n$?

Let $\mathbb K\in\left\{\mathbb C,\mathbb R\right\}$, let $U$ and $V$ be $\mathbb K$-Hilbert spaces such that $U\subseteq V$ and the inclusion $\iota$ is Hilbert-Schmidt, let $C:=\iota^\ast$ denote the adjoint of $\iota$, and let $(e_n)_{n\in\mathbb N}$ be an orthonormal basis of $U$. We can show that $$Cv=\sum_{n\in\mathbb N}\langle v,e_n\rangle_Ve_n\;\;\;\text{for all }v\in V\;.$$ How can we compute the square root $C^{1/2}$ […]
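Setting the infinite-dimensional question aside, here is a finite-dimensional analogue (my own simplification, not the setting of the question): a matrix acting by $v\mapsto\sum_n\langle v,e_n\rangle e_n$ is symmetric positive semidefinite, and its square root can be read off from a spectral decomposition.

```python
import numpy as np

rng = np.random.default_rng(2)
d, N = 5, 3
E = rng.normal(size=(d, N))                  # columns play the role of the e_n
C = E @ E.T                                  # C v = sum_n <v, e_n> e_n (finite-dim analogue)

w, Q = np.linalg.eigh(C)                     # spectral decomposition C = Q diag(w) Q^T
C_half = Q @ np.diag(np.sqrt(np.clip(w, 0, None))) @ Q.T

print(np.allclose(C_half @ C_half, C))       # True: C_half is the positive square root
```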

Gram-Schmidt Orthogonalization for subspace of $L^2$

I am a little stuck on the following problem: By using the Gram-Schmidt Orthogonalization, find an orthonormal basis for the subspace of $L^2[0,1]$ spanned by $1,x, x^2, x^3$. OK, so I have defined: $$e_1 = 1$$ I would then assume that we proceed as follows: $$e_2 = x - \frac{\langle x,1 \rangle}{\langle 1,1 \rangle} \cdot […]
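In case it helps to check the hand computation, here is a short SymPy sketch of Gram-Schmidt on $\{1, x, x^2, x^3\}$ with the $L^2[0,1]$ inner product (the helper `ip` is mine):

```python
import sympy as sp

x = sp.symbols('x')

def ip(f, g):
    # the L^2[0,1] inner product <f, g> = int_0^1 f(x) g(x) dx
    return sp.integrate(f * g, (x, 0, 1))

ortho = []
for v in [sp.Integer(1), x, x**2, x**3]:
    u = v - sum(ip(v, e) * e for e in ortho)          # remove components along earlier vectors
    ortho.append(sp.simplify(u / sp.sqrt(ip(u, u))))  # normalize in L^2[0,1]

for e in ortho:
    print(sp.expand(e))   # shifted Legendre polynomials, up to sign and normalization
```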

Prove $\langle x,x \rangle < 0$ or $\langle x,x \rangle > 0$ for all $x \neq 0$

[Added by PLC: This question is a followup to this already answered question.] Keep the axioms for a real inner product (symmetry, linearity, and homogeneity). But make the fourth be $$\langle x,x \rangle = 0 \text{ if and only if } x = 0.$$ I want to prove that either $\langle x,x \rangle > 0$ […]
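Not a proof, but a numerical illustration of what the fourth axiom rules out: the indefinite form below takes both signs, and accordingly it vanishes on a nonzero vector, so it fails $\langle x,x\rangle=0\iff x=0$.

```python
import numpy as np

# A symmetric bilinear form that takes both signs: <x, y> = x[0]*y[0] - x[1]*y[1]
B = np.diag([1.0, -1.0])
form = lambda u, v: u @ B @ v

print(form(np.array([1.0, 0.0]), np.array([1.0, 0.0])))   #  1.0  (positive)
print(form(np.array([0.0, 1.0]), np.array([0.0, 1.0])))   # -1.0  (negative)
print(form(np.array([1.0, 1.0]), np.array([1.0, 1.0])))   #  0.0  on a nonzero vector
```

One standard route to the statement itself is the intermediate value theorem applied to $t\mapsto\langle tx+(1-t)y,\,tx+(1-t)y\rangle$ when $\langle x,x\rangle>0$ and $\langle y,y\rangle<0$ (such $x,y$ are necessarily independent, so the intermediate vector is nonzero).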

If $\|Tv\|=\|T^*v\|$ for all $v\in V$, then $T$ is a normal operator

I have solved a question but I am not sure about the last step. If someone can verify it, that would be great. Let $V$ be a finite-dimensional vector space with a complex inner product. Let $T: V \rightarrow V$ be a linear transformation. If $\|Tv\|=\|T^*v\|$ for all $v\in V$, then $T$ is a […]
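As a sanity check of the statement (not of the asker's last step, which isn't shown), a random normal matrix $T=U\,\mathrm{diag}(\lambda)\,U^*$ satisfies both conditions; the identity underlying the proof is $\|Tv\|^2-\|T^*v\|^2=\langle(T^*T-TT^*)v,\,v\rangle$.

```python
import numpy as np

rng = np.random.default_rng(3)

# Build a normal operator T = U diag(lam) U^* with U unitary and complex eigenvalues lam.
A = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
U, _ = np.linalg.qr(A)
lam = rng.normal(size=4) + 1j * rng.normal(size=4)
T = U @ np.diag(lam) @ U.conj().T
Tstar = T.conj().T

v = rng.normal(size=4) + 1j * rng.normal(size=4)
print(np.allclose(np.linalg.norm(T @ v), np.linalg.norm(Tstar @ v)))   # True
print(np.allclose(Tstar @ T, T @ Tstar))                               # True: T is normal
```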