I saw a definition of unitary matrices stating that a complex matrix $M: \mathbb{C}^{n} \rightarrow \mathbb{C}^{n}$ is unitary if

$\langle Mv, Mw \rangle = \langle v,w \rangle$ $\forall v,w \in \mathbb{C}^{n}$

Then, an equivalent definition was that $M$ is unitary if and only if $MM^{*}=\mathrm{Id}$. The proof I saw went as follows (it suffices to check on the standard basis, by (sesqui)linearity of the inner product):

$\langle Me_{i}, Me_{j} \rangle = \langle e_{i},e_{j} \rangle = \delta_{ij}$

Since $Me_{i}$ is the $i$-th column of $M$, it follows that $\langle Me_{i}, Me_{j} \rangle = \langle M^{*}Me_{i}, e_{j} \rangle$ is the $ij$-th entry of $M^{*}M$. However, the point I don't understand is why this inner product would give us the $ij$-th entry of the matrix. Are we assuming that this inner product is the standard inner product on $\mathbb{C}^{n}$? Or what would be the more precise definition of a unitary matrix that justifies this step?

Thanks for the help.

The inner product you’re considering is defined by

$$

\langle v,w\rangle=v^*w

$$

(or $w^*v$; the choice is immaterial, so make the corresponding changes if that is your convention).
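As a quick numerical sanity check of this convention (a sketch using NumPy; `np.vdot` conjugates its first argument, so it computes exactly $v^*w$):

```python
import numpy as np

v = np.array([1 + 2j, 3 - 1j])
w = np.array([2 - 1j, 0 + 1j])

# <v, w> = v* w: conjugate the first argument, then take the dot product.
manual = np.sum(np.conj(v) * w)

# np.vdot conjugates its first argument, matching the convention above.
assert np.isclose(np.vdot(v, w), manual)
print(manual)  # (-1-2j)
```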

Suppose $\langle Mv,Mw\rangle=\langle v,w\rangle$ for every $v,w$. This means

$$

(Mv)^*(Mw)=v^*w

$$

or

$$

v^*(M^*Mw)=v^*w

$$

so

$$

v^*(M^*Mw-w)=0

$$

Since this holds for every $v$ (in particular for $v=M^*Mw-w$ itself), we have that $M^*Mw-w=0$ for every $w$, and this is the same as $(M^*M-I)w=0$ for all $w$; hence $M^*M-I$ is the zero matrix.
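As an illustration (a sketch using NumPy; the matrix below is one concrete unitary, a real rotation scaled by a unit phase):

```python
import numpy as np

theta = 0.7
M = np.exp(0.3j) * np.array([[np.cos(theta), -np.sin(theta)],
                             [np.sin(theta),  np.cos(theta)]])

# M preserves <v, w> = v* w for arbitrary vectors ...
rng = np.random.default_rng(1)
v = rng.standard_normal(2) + 1j * rng.standard_normal(2)
w = rng.standard_normal(2) + 1j * rng.standard_normal(2)
assert np.isclose(np.vdot(M @ v, M @ w), np.vdot(v, w))

# ... and, equivalently, M* M = I.
assert np.allclose(M.conj().T @ M, np.eye(2))
```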

Conversely, if $M^*M=I$, we clearly have

$$

\langle Mv,Mw\rangle=(Mv)^*(Mw)=v^*(M^*M)w=v^*w=\langle v,w\rangle

$$

Whenever you compute $\langle Ae_i,e_j\rangle$ where $A$ is a Hermitian matrix, you're computing $(Ae_i)^*e_j=e_i^*A^*e_j=e_i^*Ae_j$: now $Ae_j$ is the $j$-th column of $A$, and multiplying on the left by $e_i^*$ picks out the coefficient in the $i$-th row. Hence we get the $(i,j)$ coefficient of $A$.

Finally, note that $M^*M$ is Hermitian.
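The entry-extraction step can be checked numerically (a sketch; `H` below is an arbitrary Hermitian matrix built just for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
B = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
H = B + B.conj().T  # Hermitian by construction: H* = H

I = np.eye(n)
for i in range(n):
    for j in range(n):
        # <H e_i, e_j> = (H e_i)* e_j equals H[i, j] because H* = H.
        assert np.isclose(np.vdot(H @ I[i], I[j]), H[i, j])
```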

If I understand your question correctly, your doubt arises from converting a linear operator written in Dirac notation into its matrix representation with respect to some basis. Let $A:V \to W$ be a linear operator, and let $\{v_1,v_2,\dots,v_m\}$ and $\{w_1,w_2,\dots,w_n\}$ be orthonormal bases for the Hilbert spaces $V$ and $W$ respectively. Then the operator acts as

$$A|v_i\rangle = \sum_j A_{ji}|w_j\rangle \tag{1}$$

Here the $A_{ji}$ are the entries of the matrix representation of $A$ with respect to the input basis $\{v_i\}$ and output basis $\{w_j\}$. Why is this so? You can find a detailed explanation at Matrix Representation for Linear Operators in Some Basis, or prove it yourself.

Now, by the completeness relation, if $V$ is a Hilbert space with an orthonormal basis $\{|i\rangle\}$, then $\sum_i |i \rangle \langle i|=I_V$ (the identity operator on $V$). So, using the definition of the linear operator and the completeness relation, you can write

$$A=\sum_{ij} \langle w_j|A|v_i\rangle\, |w_j\rangle \langle v_i| \tag{2}$$

Finally, comparing $(1)$ and $(2)$, it is easy to see that $A_{ji}=\langle w_j|A|v_i\rangle$. Coming back to your example, if we replace $w_j$ by $e_i$ and $v_i$ by $e_j$, we get

$A_{ij}=\langle e_i|A|e_j\rangle$ (and in your case $A=M^{*}M$). I hope this answers your question.
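With the standard basis, this matrix-element formula is easy to verify numerically (a sketch in NumPy, using `np.vdot` for the bra, since it conjugates its first argument):

```python
import numpy as np

n = 3
A = np.arange(n * n, dtype=complex).reshape(n, n) + 1j  # arbitrary complex matrix

I = np.eye(n)
for i in range(n):
    for j in range(n):
        # <e_i| A |e_j> = e_i* (A e_j) picks out the (i, j) entry of A.
        assert np.isclose(np.vdot(I[i], A @ I[j]), A[i, j])
```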
