Let $V$ be a finite-dimensional vector space over $\mathbb{F}$. In what follows, I assume as known $\operatorname{Bilinear}(V^* \times V, \mathbb{F}) \cong V \otimes V^*$ (since the derivation of this canonical isomorphism is covered in Greub’s *Multilinear Algebra*) and also that $V \cong V^{**}$. Thus what I want to show is:

**Lemma:** $\operatorname{Bilinear}(V^* \times V, \mathbb{F}) \cong \mathscr{L}(V,V)$.

Let $\varepsilon : V^* \times V \to \mathbb{F}$ be the bilinear map defined for all $f \in V^*, v \in V$ by: $$\varepsilon:(f,v) \mapsto f(v)\,. $$ Given a linear transformation $A \in \mathscr{L}(V,V)$, this induces the bilinear map: $$\Phi_A \in \operatorname{Bilinear}(V^* \times V, \mathbb{F}), \quad \Phi_A: (f,v) \mapsto \varepsilon(f, Av)\,. $$ By the bilinearity of $\varepsilon$ and the fact that $\mathscr{L}(V,V)$ is a vector space, it is fairly clear that the assignment $\quad A \mapsto \Phi_A \quad$ is a linear map from $\mathscr{L}(V,V)$ to $\operatorname{Bilinear}(V^* \times V, \mathbb{F})$. Thus, to conclude the proof it suffices to show that this assignment is bijective.
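To see the assignment $A \mapsto \Phi_A$ concretely: identifying $V$ with $\mathbb{F}^n$, covectors $f \in V^*$ with row vectors, and $\varepsilon(f,v)$ with the usual pairing $f \cdot v$, we get $\Phi_A(f,v) = f \cdot (Av)$. A minimal pure-Python sketch under these identifications (the particular matrices and vectors below are arbitrary choices, not from the text):

```python
def mat_vec(A, v):
    # matrix-vector product: (A v)_i = sum_j A[i][j] * v[j]
    return [sum(A[i][j] * v[j] for j in range(len(v))) for i in range(len(A))]

def eps(f, v):
    # the evaluation pairing eps(f, v) = f(v), with f a row vector
    return sum(fi * vi for fi, vi in zip(f, v))

def Phi(A):
    # the induced bilinear map Phi_A(f, v) = eps(f, A v)
    return lambda f, v: eps(f, mat_vec(A, v))

A = [[1, 2], [3, 4]]   # an arbitrary A in L(V, V), with dim V = 2
f = [5, -1]            # a covector (row vector)
v = [2, 7]             # a vector
print(Phi(A)(f, v))    # f . (A v) = 46
```

One can also check numerically that $\Phi_{A+B} = \Phi_A + \Phi_B$, i.e. the assignment is linear in $A$, as claimed.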

**Surjectivity:** Let $\beta \in \operatorname{Bilinear}(V^* \times V, \mathbb{F})$. Fix a basis $v^1, \dots, v^n$ of $V$.

Then for all $i$, we have that $\beta(\cdot,v^i) \in V^{**} \cong V$. By (bi)linearity, the behavior of $\beta$ is completely determined by the behavior of the $n$ functions $\beta(\cdot, v^1),\dots,\beta(\cdot, v^n)$.

Let $A \in \mathscr{L}(V,V)$ be such that for all $i$: $Av^i “=” \beta(\cdot, v^i)$ (this makes sense since the isomorphism $V \cong V^{**}$ is canonical). Then we have that $\beta = \Phi_A = \varepsilon(\cdot, A(\cdot))\,.$

**Injectivity:** Assume $\Phi_A=\Phi_{A'}$. Then $$0=\Phi_A-\Phi_{A'}=\varepsilon(\cdot,A(\cdot))-\varepsilon(\cdot,A'(\cdot))=\varepsilon(\cdot,(A-A')(\cdot))\,, $$ the last equality following from (bi)linearity.

For any $v\in V$, clearly, $\varepsilon(f, v)=f(v)=0$ for *all* $f \in V^*$ if and only if $v = 0$.

Since for each fixed $v$ we have $\varepsilon(f, (A-A')v)=0$ for *all* $f \in V^*$, it follows that $(A-A')v=0$ for every $v \in V$. Thus $A-A'=0$ and $A=A'$. $\square$
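The injectivity argument amounts to saying that $\Phi_A$ determines every entry of $A$: against the standard basis $e_j$ and dual basis $e^i$ (an identification assumed for this sketch, not made in the text above), $\Phi_A(e^i, e_j) = \varepsilon(e^i, Ae_j) = A_{ij}$. A quick check:

```python
def eps(f, v):
    # evaluation pairing f(v), with f a row vector
    return sum(fi * vi for fi, vi in zip(f, v))

def mat_vec(A, v):
    return [sum(A[i][j] * v[j] for j in range(len(v))) for i in range(len(A))]

def recover(Phi_A, n):
    # read off A entry-by-entry: A[i][j] = Phi_A(e^i, e_j)
    e = lambda k: [1 if t == k else 0 for t in range(n)]
    return [[Phi_A(e(i), e(j)) for j in range(n)] for i in range(n)]

A = [[1, 2], [3, 4]]  # arbitrary sample matrix
Phi_A = lambda f, v: eps(f, mat_vec(A, v))
print(recover(Phi_A, 2))  # [[1, 2], [3, 4]], i.e. A itself
```

So if $\Phi_A = \Phi_{A'}$, every entry of $A$ and $A'$ agrees, which is the coordinate form of the argument above.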

Is this correct?

The only part of the above that I feel unsure about is the claim:

Then $\beta = \Phi_A = \varepsilon(\cdot, A(\cdot))$.

since it skips so many steps, albeit all easy and simple steps. So to prove this, I will resort to extreme pedantry. Please bear with me.

Let $\xi : V \to V^{**}$ denote the canonical isomorphism $V \cong V^{**}$, with inverse $\xi^{-1}: V^{**} \to V$.

Then more precisely, $A$ is defined so that for all $i \in [n]=\{1,\dots,n\}$: $$Av^i = \xi^{-1}[\beta(\cdot,v^i)]\,. $$ Any linear transformation on a finite-dimensional vector space is uniquely determined by its action on a basis, so this determines $A$ without ambiguity.

So let us now calculate $\Phi_A$ and show that it is indeed equal to $\beta$, as claimed.

Let $v\in V$, $v=a_1v^1 + \dots + a_nv^n$, and $f \in V^*$. Then we have that $$\Phi_A(f,v)= \varepsilon(f,Av)=\varepsilon(f,A(a_1v^1 + \dots + a_nv^n))=\varepsilon(f,a_1Av^1 + \dots + a_n A v^n)\,, $$ the last equality following from the linearity of $A$. Then by bilinearity of $\varepsilon$: $$ = a_1\varepsilon(f,Av^1)+ \dots+a_n\varepsilon(f,Av^n)=a_1 \varepsilon(f,\xi^{-1}[\beta(\cdot, v^1)])+\dots+a_n\varepsilon(f,\xi^{-1}[\beta(\cdot,v^n)])\,. $$ Now the definition of $\xi$ is that for all $v\in V$, $\xi v=\varepsilon(\cdot,v)$, and for all $\zeta \in V^{**}$ and all $f \in V^*$, $$f(\xi^{-1}\zeta)=\zeta(f)\,. $$ So using the definition of $\varepsilon$, we rewrite the above again as: $$a_1 \cdot f(\xi^{-1}[\beta(\cdot, v^1)]) + \dots + a_n \cdot f(\xi^{-1}[\beta(\cdot,v^n)]) \,,$$ and then applying the definition of $\xi^{-1}$ as deduced above: $$a_1\beta(f,v^1)+\dots+a_n\beta(f,v^n)\,.$$ This allows us to use the bilinearity of $\beta$ to simplify: $$=\beta(f,a_1v^1)+\dots+\beta(f,a_nv^n)=\beta(f,a_1v^1+\dots+a_nv^n)=\beta(f,v)\,. $$

Thus, for all $f \in V^*, v \in V$, we have shown that $\Phi_A(f,v)=\beta(f,v)$, thus truly we do have $$\beta=\Phi_A =\varepsilon(\cdot, A(\cdot))$$ and thus the assignment $A \mapsto \Phi_A$ really *is* surjective.
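The surjectivity construction also translates directly into coordinates: given any bilinear $\beta$, let the $j$-th column of $A$ be the vector representing $\beta(\cdot, v^j)$ under $V^{**} \cong V$; then $\Phi_A$ agrees with $\beta$ everywhere. A sketch with $V = \mathbb{F}^2$, standard bases, and an arbitrary sample $\beta$ (all choices below are assumptions of this illustration):

```python
def eps(f, v):
    # evaluation pairing f(v), with f a row vector
    return sum(fi * vi for fi, vi in zip(f, v))

def mat_vec(A, v):
    return [sum(A[i][j] * v[j] for j in range(len(v))) for i in range(len(A))]

n = 2
e = lambda k: [1 if t == k else 0 for t in range(n)]

# an arbitrary bilinear map beta(f, v) on V* x V, with V = F^2
beta = lambda f, v: 2*f[0]*v[0] - f[0]*v[1] + 3*f[1]*v[0] + 5*f[1]*v[1]

# build A entry-by-entry: A[i][j] = beta(e^i, e_j), so column j
# represents beta(., e_j) as an element of V** ~ V
A = [[beta(e(i), e(j)) for j in range(n)] for i in range(n)]

# Phi_A reproduces beta on arbitrary inputs:
f, v = [1, 4], [-2, 3]
print(eps(f, mat_vec(A, v)) == beta(f, v))  # True
```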
