Articles on orthogonality

Eigenvalue lower bound for this simple matrix

For orthonormal vectors ${\textbf q}_1, {\textbf q}_2, {\textbf q}_3 \in \mathbb{R}^3$, I want to show that the matrix $$\big(\begin{array}{c:c:c}2{\textbf q}_1 & -{\textbf q}_2 & -{\textbf q}_3\end{array} \big) \in \mathbb{R}^{3 \times 3}$$ has eigenvalues $\lambda_k$ fulfilling $1 \leq |\lambda_k| \leq 2$. Since the determinant has absolute value $2$, it suffices to show $1 \leq |\lambda_k|$. I know this […]
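A short note on the step the excerpt relies on, assuming the lower bound is what remains to be shown: the eigenvalues multiply to the determinant, so $|\lambda_1\lambda_2\lambda_3| = 2$, and once $|\lambda_j| \ge 1$ holds for every $j$,
$$|\lambda_k| = \frac{2}{\prod_{j \ne k}|\lambda_j|} \le \frac{2}{1 \cdot 1} = 2,$$
which is why establishing the lower bound also delivers the upper bound.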

Orthogonal curvilinear coordinates (derivatives of unit vectors)

Suppose that $\{u_i\}_{1\le i\le 3}$ is a set of orthogonal curvilinear coordinates with unit vectors $\{\mathbf{\hat{e}_i}\}_{1\le i\le 3}$. I proved that $$\frac{\partial \mathbf{\hat{e}_i}}{\partial u_j} = \frac{\mathbf{\hat{e}_j}}{h_i}\frac{\partial h_j}{\partial u_i},\tag{1}$$ where $h_i$ is a scale factor such that for a position vector $\mathbf r$ we have $\dfrac{\partial \mathbf r}{\partial u_i}= h_i \mathbf{\hat{e}_i}$. Eq. $(1)$ is valid for […]
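As a concrete sanity check of $(1)$ in the off-diagonal case $i \ne j$ (the case the truncated condition presumably refers to), take cylindrical coordinates $(r, \varphi, z)$ with $h_r = 1$, $h_\varphi = r$, $h_z = 1$:
$$\frac{\partial \mathbf{\hat{e}_r}}{\partial \varphi} = \frac{\mathbf{\hat{e}_\varphi}}{h_r}\frac{\partial h_\varphi}{\partial r} = \mathbf{\hat{e}_\varphi},$$
which agrees with differentiating $\mathbf{\hat{e}_r} = (\cos\varphi, \sin\varphi, 0)$ directly.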

Why is the matrix-defined Cross Product of two 3D vectors always orthogonal?

By matrix-defined, I mean $$\left<a,b,c\right>\times\left<d,e,f\right> = \left| \begin{array}{ccc} \mathbf{i} & \mathbf{j} & \mathbf{k}\\ a & b & c\\ d & e & f \end{array} \right|$$ …instead of the definition as the product of the magnitudes multiplied by the sine of their angle, in the orthogonal direction. If I try taking the cross product of two vectors with no […]
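A quick way to see the orthogonality directly from the determinant definition: dotting the result with $\left<a,b,c\right>$ amounts to replacing the symbolic first row by $(a, b, c)$, which produces a determinant with a repeated row,
$$\left<a,b,c\right>\cdot\big(\left<a,b,c\right>\times\left<d,e,f\right>\big) = \left| \begin{array}{ccc} a & b & c\\ a & b & c\\ d & e & f \end{array} \right| = 0,$$
and the same argument with $\left<d,e,f\right>$ in the first row handles the other factor.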

Basis to Hyperplane

Given a hyperplane $\{x\in\mathbb R^n \mid a^T x=0\}$ where $a\in\mathbb R^n$, I want to find an orthogonal basis for this hyperplane. I found many solutions for special cases, but none of them covers the general case. Thanks in advance!
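One general construction, sketched here as a possibility rather than the intended answer (and assuming $a$ is not already a multiple of $e_1$, in which case $e_2,\dots,e_n$ already work): build the Householder reflector
$$w = \frac{a - \|a\|\,e_1}{\big\|a - \|a\|\,e_1\big\|}, \qquad H = I - 2ww^T.$$
Then $H$ is orthogonal with $Ha = \|a\|\,e_1$, so $a^T H e_j = \|a\|\,e_1^T e_j = 0$ for $j = 2,\dots,n$, and the $n-1$ orthonormal columns $He_2,\dots,He_n$ form an orthonormal basis of the hyperplane.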

How to find an Orthonormal Basis for $\operatorname{Null}(A^T)$

I’m studying for an exam and I’m not sure how to do this. I understand what the definitions mean (for the most part) but I’m not sure how to apply them to the problem. Let $$A = \begin{pmatrix}1/2&-1/2\\1/2&-1/2\\1/2&1/2\\1/2&1/2 \end{pmatrix}.$$ a) Find an orthonormal basis for $\operatorname{Null}(A^T)$ and b) Determine the projection matrix Q […]
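A sketch of one way to organize part (a) for this particular $A$ (not necessarily the intended route): $\operatorname{Null}(A^T)$ consists of the vectors orthogonal to both columns of $A$, and the system $A^T x = 0$ reads
$$\tfrac12(x_1 + x_2 + x_3 + x_4) = 0, \qquad \tfrac12(-x_1 - x_2 + x_3 + x_4) = 0,$$
which is equivalent to $x_1 + x_2 = 0$ and $x_3 + x_4 = 0$; normalizing the obvious solutions gives the orthonormal basis $\tfrac{1}{\sqrt 2}(1,-1,0,0)^T$, $\tfrac{1}{\sqrt 2}(0,0,1,-1)^T$.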

Proving that Legendre Polynomials are orthogonal

This is from here: $$\int^1_{-1}f_n(x)P_n(x)\,dx = 2(-1)^n\frac{a_n}{2^n}\int^1_0(x^2-1)^n\,dx = 2(-1)^n\frac{a_n}{2^n}\,I_n. \tag{6}$$ I don’t understand this step: shouldn’t it instead be $$\int^1_{-1}f_n(x)P_n(x)\,dx = (-1)^n\frac{a_n}{2^n}\int^1_{-1}(x^2-1)^n\,dx = 0,$$ since the contributions should cancel out even though the integrand is non-zero? Edited: Lastly, how does $\int^1_{-1}f_n(x)P_n(x)\,dx$ show orthogonality?
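A remark on the cancellation point, offered as a sketch rather than the linked derivation’s own wording: the integrand $(x^2-1)^n$ is an even function, so the two halves of $[-1,1]$ add rather than cancel,
$$\int^1_{-1}(x^2-1)^n\,dx = 2\int^1_0(x^2-1)^n\,dx,$$
and since $x^2 - 1$ keeps a single sign on $(-1,1)$, this value is $(-1)^n$ times a positive number, not zero.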

Question About Orthogonality of Hermite Polynomials

It is known that if $m \ne n$: $$ \int_{-\infty}^{\infty} H_n(x) H_m(x) e^{-x^2}\,dx = 0. $$ Does this apply for any $f(x)$? $$ \int_{-\infty}^{\infty} H_n(f(x)) H_m(f(x)) e^{-f(x)^2}\,dx = 0 $$
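A sketch of what a substitution shows, under the assumption that $f$ is a differentiable increasing bijection of $\mathbb{R}$ onto itself: with $u = f(x)$,
$$\int_{-\infty}^{\infty} H_n(f(x)) H_m(f(x)) e^{-f(x)^2}\,dx = \int_{-\infty}^{\infty} H_n(u) H_m(u) e^{-u^2}\,\frac{du}{f'\!\big(f^{-1}(u)\big)},$$
so the orthogonality transfers when the Jacobian factor is trivial, e.g. $f(x) = x + c$, but the extra weight $1/f'$ means the integral need not vanish for a general $f$.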

Orthogonal in inner product space

Let $(X,\langle\cdot,\cdot\rangle)$ be an inner product space. Prove that $x$ and $y$ are orthogonal if and only if $||x+\alpha y|| \ge ||x||$ for every scalar $\alpha$. For the first direction: if $x$ and $y$ are orthogonal, then $||x+\alpha y||^2=||x||^2+|\alpha|^2||y||^2 \ge ||x||^2$, so $||x+\alpha y|| \ge ||x||$. But for the second direction, if $||x+\alpha y|| \ge ||x||$ for any scalar […]
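For the direction the excerpt breaks off at, one standard sketch (written for a real inner product space; the complex case is analogous with the real part of the cross term): if $||x+\alpha y|| \ge ||x||$ for every scalar $\alpha$, then
$$0 \le ||x+\alpha y||^2 - ||x||^2 = 2\alpha\langle x,y\rangle + \alpha^2||y||^2 \quad \text{for all } \alpha \in \mathbb{R}.$$
If $\langle x,y\rangle \ne 0$, then taking $\alpha = -t\operatorname{sgn}\langle x,y\rangle$ with $t > 0$ small makes the right-hand side $-2t|\langle x,y\rangle| + t^2||y||^2 < 0$, a contradiction; hence $\langle x,y\rangle = 0$.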

$\operatorname{Im} A = (\operatorname{ker} A^*)^\perp$

This question already has an answer here: Is the formula $(\ker A)^\perp=\operatorname{im} A^T$ necessarily true?
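For reference, the easy inclusion is a one-line computation, assuming finite-dimensional inner product spaces so that $\operatorname{Im} A$ is closed: for $y \in \ker A^*$ and any $x$,
$$\langle Ax, y\rangle = \langle x, A^*y\rangle = \langle x, 0\rangle = 0,$$
so $\operatorname{Im} A \subseteq (\ker A^*)^\perp$; comparing dimensions (or the argument in the linked duplicate) upgrades this to equality.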

Prove The Orthogonal Complement of an Intersection is the Sum of Orthogonal Complements

How does one prove that $(A\cap B)^\perp=A^\perp+B^\perp$? It seems a bit harder than proving $(A+B)^\perp=A^\perp\cap B^\perp$.
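One sketch of how the harder identity follows from the easier one, assuming $A$ and $B$ are subspaces of a finite-dimensional inner product space (so that $S^{\perp\perp}=S$): apply the easier identity to $A^\perp$ and $B^\perp$,
$$(A^\perp+B^\perp)^\perp = A^{\perp\perp}\cap B^{\perp\perp} = A\cap B,$$
and then take orthogonal complements of both sides to obtain $A^\perp+B^\perp=(A\cap B)^\perp$.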