I am working through the exercises in “Lie Groups, Lie Algebras, and Representations” by Hall and can’t complete exercise 11 of chapter 3. My aim was to show that there does not exist a vector space isomorphism $A$ between $\mathfrak{su}(2)$ and $\mathfrak{sl}(2, \mathbb{R})$ that also preserves the commutator:

$$[AX, AY] = A[X, Y]$$

To this end I computed the following commutation relations on bases for the two spaces.

For the $\mathfrak{su}(2)$ basis matrices $e_1, e_2, e_3$ it holds that

$$[e_1, e_2] = 2e_3 \,\,\,\,\,\, [e_1, e_3] = -2e_2 \,\,\,\,\,\, [e_2, e_3] = 2e_1$$

For the $\mathfrak{sl}(2, \mathbb{R})$ basis matrices $f_1, f_2, f_3$ it holds that

$$[f_1, f_2] = 2f_2 \,\,\,\,\,\, [f_1, f_3] = -2f_3 \,\,\,\,\,\, [f_2, f_3] = f_1$$
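These relations can be spot-checked numerically. The concrete representatives below are an assumption (Hall’s exercise fixes its own basis matrices): $e_k = -i\sigma_k$ in terms of the Pauli matrices for $\mathfrak{su}(2)$, and $f_1 = H$, $f_2 = E$, $f_3 = F$ for $\mathfrak{sl}(2, \mathbb{R})$. These satisfy exactly the stated brackets.

```python
# Numerical check of the stated commutation relations.
# Basis representatives are an assumed (standard) choice, not Hall's own.
import numpy as np

def comm(X, Y):
    """Matrix commutator [X, Y] = XY - YX."""
    return X @ Y - Y @ X

# Pauli matrices
s1 = np.array([[0, 1], [1, 0]], dtype=complex)
s2 = np.array([[0, -1j], [1j, 0]], dtype=complex)
s3 = np.array([[1, 0], [0, -1]], dtype=complex)

# su(2): skew-Hermitian, traceless
e1, e2, e3 = -1j * s1, -1j * s2, -1j * s3

# sl(2, R): the usual H, E, F triple
f1 = np.array([[1., 0.], [0., -1.]])
f2 = np.array([[0., 1.], [0., 0.]])
f3 = np.array([[0., 0.], [1., 0.]])

assert np.allclose(comm(e1, e2), 2 * e3)
assert np.allclose(comm(e1, e3), -2 * e2)
assert np.allclose(comm(e2, e3), 2 * e1)

assert np.allclose(comm(f1, f2), 2 * f2)
assert np.allclose(comm(f1, f3), -2 * f3)
assert np.allclose(comm(f2, f3), f1)
```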


It is clear that the linear bijection $(e_1, e_2, e_3) \mapsto (f_1, f_2, f_3)$ does not preserve these relations, nor does any bijection sending the $e_i$ to a permutation of the $f_j$. However, I need to show that no invertible matrix $A$ satisfies

$$[AX, AY] = A[X, Y]$$

So from there I began to derive equations for the entries of $A$. They are ugly expressions in terms of the sub-determinants of $A$, and I can’t see a way to use them to conclude that $A$ cannot exist. Is there an easier way to finish the proof than deriving the equations for $A$?

Note: I have looked up solutions for this problem and the only technique I see hinted at is to consider Killing forms (which have not yet been covered in this book).


Your approach works without problems if you write the condition $[AX, AY] = A[X, Y]$ for all $X, Y$ in terms of the $9$ coefficients of the matrix $A$. The polynomial equations in these $9$ unknowns over $\mathbb{R}$ quickly yield $\det(A) = 0$, a contradiction.

Another elementary argument is the following. $\mathfrak{sl}(2,\mathbb{R})$ has a $2$-dimensional subalgebra, e.g., $\mathfrak{a}=\langle f_1,f_2\rangle$, but $\mathfrak{su}(2)$ has no $2$-dimensional subalgebra. Hence they cannot be isomorphic.

This is a Q&A-style answer, not meant to be the final word on the question; it completes the original technique for future readers. Thanks to Dietrich Burde for the motivation to continue with it.

As above, suppose $A : \mathfrak{su}(2) \to \mathfrak{sl}(2, \mathbb{R})$ is an isomorphism. Then

$$[AX, AY] = A[X, Y]$$

Write

$$Ae_i = \sum_j A_{ij} f_j$$

Using $[Ae_1, Ae_2] = A[e_1, e_2] = 2Ae_3$ and the commutation relations for the $f_j$ we obtain

$$\begin{vmatrix} A_{12} & A_{22} \\ A_{13} & A_{23} \end{vmatrix} f_1 +
2\begin{vmatrix} A_{11} & A_{21} \\ A_{12} & A_{22} \end{vmatrix} f_2 -
2\begin{vmatrix} A_{11} & A_{21} \\ A_{13} & A_{23} \end{vmatrix} f_3 = 2 (A_{31}f_1 + A_{32}f_2 + A_{33}f_3)$$

Doing the same for $[Ae_1, Ae_3] = -2Ae_2$ and $[Ae_2, Ae_3] = 2Ae_1$ and collecting only the $f_3$-components of the three resulting identities gives

$$A_{11}A_{23} - A_{13}A_{21} = -A_{33}$$

$$A_{11}A_{33} - A_{13}A_{31} = A_{23}$$

$$A_{21}A_{33} - A_{23}A_{31} = -A_{13}$$

Multiplying these three equations by $A_{33}$, $-A_{23}$, and $A_{13}$ respectively and adding, the cubic terms on the left cancel in pairs and we are left with

$$0 = -A_{13}^2 - A_{23}^2 - A_{33}^2$$

Hence $A_{13} = A_{23} = A_{33} = 0$: no $Ae_i$ has an $f_3$-component, so $\det(A) = 0$. This contradicts $A$ being a vector space isomorphism.
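The cancellation at the heart of this entry-wise approach can be verified symbolically. Below is a minimal SymPy sketch, assuming the convention $Ae_i = \sum_j A_{ij} f_j$: the $f_3$-components of the three commutator conditions give three polynomial equations, and a weighted sum of them telescopes to a sum of squares, which forces $\det(A) = 0$.

```python
# Symbolic check of the entry-wise approach with SymPy.
# Convention assumed: A e_i = sum_j A_ij f_j, with the brackets stated above.
import sympy as sp

A = sp.Matrix(3, 3, lambda i, j: sp.Symbol(f"A{i+1}{j+1}"))

# f3-components of [Ae1,Ae2] = 2Ae3, [Ae1,Ae3] = -2Ae2, [Ae2,Ae3] = 2Ae1
# (only [f1, f3] = -2 f3 contributes an f3-term on the left-hand side).
eq1 = (A[0, 0]*A[1, 2] - A[0, 2]*A[1, 0]) + A[2, 2]   # = 0
eq2 = (A[0, 0]*A[2, 2] - A[0, 2]*A[2, 0]) - A[1, 2]   # = 0
eq3 = (A[1, 0]*A[2, 2] - A[1, 2]*A[2, 0]) + A[0, 2]   # = 0

# The cubic cross terms telescope, leaving exactly a sum of squares:
combo = sp.expand(A[2, 2]*eq1 - A[1, 2]*eq2 + A[0, 2]*eq3)
assert sp.simplify(combo - (A[0, 2]**2 + A[1, 2]**2 + A[2, 2]**2)) == 0
```

So whenever all three equations hold, $A_{13}^2 + A_{23}^2 + A_{33}^2 = 0$ over $\mathbb{R}$, i.e. the $f_3$-row of coefficients vanishes.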

This is a Q&A-style answer, not meant to be the final word on the question; it fleshes out one of the techniques suggested by Dietrich Burde for future readers.

Another elementary argument is the following. $\mathfrak{sl}(2,\mathbb{R})$ has a $2$-dimensional subalgebra, e.g., $\mathfrak{a}=\langle f_1,f_2\rangle$, but $\mathfrak{su}(2)$ has no $2$-dimensional subalgebra. Hence they cannot be isomorphic.

$\mathfrak{sl}(2, \mathbb{R})$ has a two-dimensional subalgebra.

Consider matrices of the form $\alpha_1 f_1 + \alpha_2 f_2$. Clearly these form a subspace of $\mathfrak{sl}(2, \mathbb{R})$. We need to show this subspace is closed under the commutator:

$$[\alpha_1 f_1 + \alpha_2 f_2, \beta_1 f_1 + \beta_2 f_2] = 2(\alpha_1\beta_2 - \alpha_2\beta_1)f_2$$

$\mathfrak{su}(2)$ does not have a two-dimensional subalgebra.

Suppose for contradiction that there were a two-dimensional subalgebra, with basis $g_1, g_2$. Then

$$[\alpha_1 g_1 + \alpha_2 g_2, \beta_1 g_1 + \beta_2 g_2] = (\alpha_1\beta_2 - \alpha_2\beta_1)[g_1, g_2]$$

We must show that $g_1, g_2$ cannot be chosen such that $[g_1, g_2]$ is in the span of $g_1, g_2$. To this end let $g_1 = \sum_i a_i e_i$, $g_2 = \sum_i b_i e_i$. It can be shown through direct calculation that

$$[g_1, g_2] = \begin{vmatrix} 2 e_1 & a_1 & b_1 \\ 2 e_2 & a_2 & b_2 \\ 2 e_3 & a_3 & b_3 \end{vmatrix}$$

In other words, the commutator of $g_1$ and $g_2$ is twice their cross product, reading the coefficient vectors $a, b$ as elements of $\mathbb{R}^3$. Since the cross product of the linearly independent vectors $a, b$ is nonzero and perpendicular to both, $[g_1, g_2]$ does not lie in the span of $g_1, g_2$, and we are done.
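The cross-product identity admits a quick numerical spot-check, again assuming the representatives $e_k = -i\sigma_k$ (a standard but here hypothetical choice of basis matrices):

```python
# Spot-check: the su(2) bracket is twice the cross product of coordinate
# vectors, for the assumed representatives e_k = -i * sigma_k.
import numpy as np

s = [np.array([[0, 1], [1, 0]], dtype=complex),
     np.array([[0, -1j], [1j, 0]], dtype=complex),
     np.array([[1, 0], [0, -1]], dtype=complex)]
e = [-1j * sk for sk in s]

rng = np.random.default_rng(0)
a, b = rng.standard_normal(3), rng.standard_normal(3)
g1 = sum(ai * ei for ai, ei in zip(a, e))
g2 = sum(bi * ei for bi, ei in zip(b, e))

lhs = g1 @ g2 - g2 @ g1                 # [g1, g2]
c = 2 * np.cross(a, b)                  # coordinates of 2 (a x b)
rhs = sum(ci * ei for ci, ei in zip(c, e))
assert np.allclose(lhs, rhs)
```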

This is a Q&A-style answer, not meant to be the final word on the question; it fleshes out one of the techniques suggested by Mariano Suárez-Alvarez for future readers.

An isomorphism $f:\mathfrak{su}(2) \to \mathfrak{sl}(2, \mathbb{R})$ has to map a diagonalizable element to a diagonalizable element.

It isn’t quite the same technique, but it is inspired by it. Instead I will use the fact that if an isomorphism existed between $\mathfrak{su}(2)$ and $\mathfrak{sl}(2, \mathbb{R})$, then the induced homomorphism on their adjoint representations would have to preserve diagonalizability of matrices. This leads to a contradiction.

The following proposition is inspired by “Lie algebra homomorphisms preserve Jordan form”:

Suppose the Lie algebras $\mathfrak{g}, \mathfrak{h}$ are isomorphic, and denote the isomorphism by $\phi : \mathfrak{g} \to \mathfrak{h}$. Then for every diagonalizable $ad_X \in ad_\mathfrak{g}$, the image $\phi^*(ad_X) \in ad_\mathfrak{h}$ is diagonalizable (where $\phi^*$ is the induced homomorphism between the adjoint representations). In particular, if $\lambda_i$, $Y_i$ is an eigenvalue, eigenvector pair of $ad_X$, then $\lambda_i$, $\phi(Y_i)$ is an eigenvalue, eigenvector pair of $ad_{\phi(X)}$.

Suppose that $ad_X$ is diagonalizable with eigenvalues $\lambda_i$ and eigenvectors $Y_i$. Then

$$ad_X(Y_i) = \lambda_i Y_i$$

We want to show that $\phi(Y_i)$ is an eigenvector of $\phi^*(ad_X)$.

$$\begin{aligned}
\phi^*(ad_X)(\phi(Y_i)) &= ad_{\phi(X)}(\phi(Y_i)) \\
&= [\phi(X), \phi(Y_i)] \\
&= \phi([X, Y_i]) \\
&= \phi(ad_X(Y_i)) \\
&= \lambda_i\phi(Y_i)
\end{aligned}$$
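The proposition can be sanity-checked numerically in a case where an isomorphism does exist: conjugation $\phi(X) = gXg^{-1}$ by any invertible $g$ is an automorphism of $\mathfrak{sl}(2, \mathbb{R})$, and it should carry $ad$-eigenpairs to $ad$-eigenpairs with the same eigenvalues. The matrix $g$ below is an arbitrary illustrative choice.

```python
# Sanity check of the proposition for an automorphism of sl(2, R):
# phi(X) = g X g^{-1} maps ad-eigenpairs to ad-eigenpairs.
import numpy as np

f1 = np.array([[1., 0.], [0., -1.]])   # H
f2 = np.array([[0., 1.], [0., 0.]])    # E
f3 = np.array([[0., 0.], [1., 0.]])    # F

def coords(M):
    """Coordinates of a traceless 2x2 matrix in the basis (f1, f2, f3)."""
    return np.array([M[0, 0], M[0, 1], M[1, 0]])

def ad(X):
    """Matrix of ad_X in the basis (f1, f2, f3)."""
    return np.column_stack([coords(X @ B - B @ X) for B in (f1, f2, f3)])

g = np.array([[2., 1.], [1., 1.]])     # arbitrary invertible matrix
phi = lambda X: g @ X @ np.linalg.inv(g)

# ad_{f1} has eigenpairs (2, f2) and (-2, f3); phi should carry them to
# eigenpairs of ad_{phi(f1)} with the same eigenvalues.
for lam, Y in [(2., f2), (-2., f3)]:
    v = coords(phi(Y))
    assert np.allclose(ad(phi(f1)) @ v, lam * v)
```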

Now, using the commutation relations stated in the problem, we can calculate the adjoint representation of $\mathfrak{su}(2)$:

$$ ad_{e_1} = \begin{bmatrix} 0 & 0 & 0 \\ 0 & 0 & -2 \\ 0 & 2 & 0 \end{bmatrix} \,\,\,\,\,
ad_{e_2} = \begin{bmatrix} 0 & 0 & 2 \\ 0 & 0 & 0 \\ -2 & 0 & 0 \end{bmatrix} \,\,\,\,\,
ad_{e_3} = \begin{bmatrix} 0 & -2 & 0 \\ 2 & 0 & 0 \\ 0 & 0 & 0 \end{bmatrix}$$

For $\mathfrak{sl}(2, \mathbb{R})$ we find:

$$ ad_{f_1} = \begin{bmatrix} 0 & 0 & 0 \\ 0 & 2 & 0 \\ 0 & 0 & -2 \end{bmatrix} \,\,\,\,\,
ad_{f_2} = \begin{bmatrix} 0 & 0 & 1 \\ -2 & 0 & 0 \\ 0 & 0 & 0 \end{bmatrix} \,\,\,\,\,
ad_{f_3} = \begin{bmatrix} 0 & -1 & 0 \\ 0 & 0 & 0 \\ 2 & 0 & 0 \end{bmatrix}$$

Suppose $\phi$ were an isomorphism from $\mathfrak{sl}(2, \mathbb{R})$ to $\mathfrak{su}(2)$ with

$$\phi(f_1) = a_1 e_1 + a_2 e_2 + a_3 e_3$$

Now any real linear combination of the matrices $ad_{e_i}$ is skew-symmetric, and a real skew-symmetric matrix has purely imaginary eigenvalues. On the other hand, the matrix $ad_{f_1}$ has the real eigenvalues $0, -2, 2$. Consider the eigenvalue, eigenvector pair $-2$, $v$ of $ad_{f_1}$. By the proposition, $\phi(v)$ would have to be an eigenvector of $ad_{\phi(f_1)} = a_1\, ad_{e_1} + a_2\, ad_{e_2} + a_3\, ad_{e_3}$ with eigenvalue $-2$, but that matrix has no nonzero real eigenvalues, so we have a contradiction.
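The obstruction is easy to see numerically using the adjoint matrices computed above: a random real combination of the $ad_{e_i}$ is skew-symmetric with purely imaginary spectrum, while $ad_{f_1}$ has real eigenvalues $0, \pm 2$.

```python
# Numerical illustration of the eigenvalue obstruction.
import numpy as np

ad_e1 = np.array([[0., 0., 0.], [0., 0., -2.], [0., 2., 0.]])
ad_e2 = np.array([[0., 0., 2.], [0., 0., 0.], [-2., 0., 0.]])
ad_e3 = np.array([[0., -2., 0.], [2., 0., 0.], [0., 0., 0.]])
ad_f1 = np.array([[0., 0., 0.], [0., 2., 0.], [0., 0., -2.]])

rng = np.random.default_rng(1)
a = rng.standard_normal(3)
M = a[0] * ad_e1 + a[1] * ad_e2 + a[2] * ad_e3

assert np.allclose(M, -M.T)                       # skew-symmetric
assert np.allclose(np.linalg.eigvals(M).real, 0)  # purely imaginary spectrum
assert set(np.round(np.linalg.eigvals(ad_f1).real)) == {0., 2., -2.}
```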
