Let $A$ be a normal matrix in $\operatorname{Mat}_{n\times n}(\mathbb C)$; if $A$ is upper triangular then it is diagonal. (Normal means $AA^*=A^*A$, where $A^*$ is the conjugate transpose of $A$.) If I consider the diagonal of $AA^*$, write $A=(a_{ij})$ and $AA^*=(\hat a_{ij})_{i,j}$; then, since $AA^*=A^*A$, $\hat a_{ii}=\sum\limits_{k=1}^n a_{ik}\overline{a_{ik}}=\sum\limits_{k=1}^n\overline{a_{ki}}\,a_{ki}\implies\sum\limits_{k=1}^n|a_{ik}|^2=\sum\limits_{k=1}^n|a_{ki}|^2$. If I take $i=n$ then it follows that […]
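The row/column norm identity can be sanity-checked numerically. A minimal NumPy sketch (the specific $2\times 2$ matrix is just an illustration): an upper-triangular matrix with a nonzero strictly-upper entry violates the identity and hence cannot be normal, while its diagonal part is normal.

```python
import numpy as np

def is_normal(A, tol=1e-10):
    """Check A A* = A* A up to floating-point tolerance."""
    return np.allclose(A @ A.conj().T, A.conj().T @ A, atol=tol)

# Upper triangular with a nonzero off-diagonal entry: the row sums
# sum_k |a_ik|^2 and column sums sum_k |a_ki|^2 disagree, so A cannot
# be normal.
A = np.array([[1.0, 2.0],
              [0.0, 3.0]])
row_norms = np.sum(np.abs(A) ** 2, axis=1)   # sum_k |a_ik|^2
col_norms = np.sum(np.abs(A) ** 2, axis=0)   # sum_k |a_ki|^2
print(is_normal(A), np.allclose(row_norms, col_norms))   # False False

# Deleting the off-diagonal entry (making A diagonal) restores normality.
D = np.diag(np.diag(A))
print(is_normal(D))   # True
```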

There seem to be some inconsistencies in my mind that I’m trying to clear up regarding the null space and the dimension theorem. This is the problem: Find a matrix whose null space is spanned by the vectors: $(2, −3, 1, 1, −1), (1, 0, −2, 1, 1), (2, −2, 1, 0, −1), (−8, 3, […]
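One concrete way to produce such a matrix, sketched below using only the three vectors that appear in full above (the truncated fourth is omitted) and assuming SciPy is available: take as rows of the answer a basis of the orthogonal complement of the span.

```python
import numpy as np
from scipy.linalg import null_space

# Rows = spanning vectors of the desired null space (only the three
# vectors given in full; the truncated fourth vector is omitted here).
V = np.array([[ 2, -3,  1, 1, -1],
              [ 1,  0, -2, 1,  1],
              [ 2, -2,  1, 0, -1]], dtype=float)

# null_space(V) returns an orthonormal basis of {x : Vx = 0}, i.e. of the
# orthogonal complement of row-space(V).  Using those vectors as the rows
# of M makes null(M) contain span of the rows of V, and a rank-nullity
# dimension count shows the two spaces are in fact equal.
M = null_space(V).T              # shape (5 - rank V) x 5, here 2 x 5
print(M.shape)                   # (2, 5)
print(np.allclose(M @ V.T, 0))   # True: each given vector lies in null(M)
```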

I am stuck with the following equivalence about affine sets: “$L$ being an affine set is equivalent to $L$ being the solution set of a system of equations $Ax=b$ for some $A,b$.” In a more mathematical statement: $L$ is an affine set $\iff$ $L=\left\{x \mid Ax=b\right\}$ where $x \in \mathbb{R}^n$, for some matrix $A$ and vector $b$. Proving […]
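The “$\Leftarrow$” direction can at least be illustrated numerically: affine combinations of solutions of $Ax=b$ remain solutions, since $A(tx_1+(1-t)x_2)=tb+(1-t)b=b$. A small sketch with an arbitrary made-up $A$ and $b$:

```python
import numpy as np

A = np.array([[1.0, 2.0, -1.0],
              [0.0, 1.0,  3.0]])
b = np.array([4.0, 5.0])

x1 = np.linalg.lstsq(A, b, rcond=None)[0]   # a particular solution
x2 = x1 + np.array([7.0, -3.0, 1.0])        # add a null-space vector of A

# A(t x1 + (1-t) x2) = t b + (1-t) b = b, so the solution set is affine.
t = 0.3
x = t * x1 + (1 - t) * x2
print(np.allclose(A @ x1, b), np.allclose(A @ x2, b), np.allclose(A @ x, b))
```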

Consider the following Lie group: $$ \text{Sp}(2n,\mathbb{C})=\{g\in\text{Mat}_{2n}(\mathbb{C})\mid J=g^TJg\}\quad\text{where}\quad J=\begin{pmatrix} 0 & 1_n \\ -1_n & 0 \end{pmatrix} $$ and the corresponding Lie algebra: $$ \mathfrak{sp}(2n,\mathbb{C})=\{g\in\text{Mat}_{2n}(\mathbb{C})\mid g^TJ+Jg=0\} $$ Are there any basic proofs that $\text{Sp}(2n,\mathbb{C})$ is a Lie group and that $\mathfrak{sp}(2n,\mathbb{C})$ is the corresponding Lie algebra without using submersions (seen here: Why is $Sp(2m)$ as […]
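Not a proof, but the exponential-map relation between the two sets can be checked numerically (a sketch; the seed and size are arbitrary): if $X^TJ+JX=0$ then $\exp(X)^TJ\exp(X)=J$.

```python
import numpy as np
from scipy.linalg import expm

n = 2
J = np.block([[np.zeros((n, n)), np.eye(n)],
              [-np.eye(n),       np.zeros((n, n))]])

# X = J^{-1} S with S symmetric satisfies X^T J + J X = -S + S = 0
# (using S^T = S), so X lies in sp(2n, C).
rng = np.random.default_rng(0)
S = rng.standard_normal((2 * n, 2 * n)) + 1j * rng.standard_normal((2 * n, 2 * n))
S = 0.1 * (S + S.T)
X = np.linalg.solve(J, S)
print(np.allclose(X.T @ J + J @ X, 0))   # True

# exp(X) then satisfies the defining relation of Sp(2n, C).
g = expm(X)
print(np.allclose(g.T @ J @ g, J))       # True
```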

I am aware that a similar question has been asked here, among other questions, but I feel that my question is different because I am actually trying to write up a very rigorous proof that such a set spans the $n \times n$ symmetric matrices and that it is linearly independent. I am having trouble […]
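As a numerical sanity check of both claims (not the rigorous proof being asked for), one can flatten the candidate matrices $E_{ii}$ and $E_{ij}+E_{ji}$ into vectors and compare the rank with $n(n+1)/2$; a sketch for $n=3$:

```python
import numpy as np
from itertools import combinations_with_replacement

n = 3
# Candidate basis of Sym(n): E_ii for each i, and E_ij + E_ji for i < j.
basis = []
for i, j in combinations_with_replacement(range(n), 2):
    B = np.zeros((n, n))
    B[i, j] = 1.0
    B[j, i] = 1.0
    basis.append(B)

# Flatten each matrix into a column; linear independence of the matrices
# is exactly full column rank of the resulting n^2 x n(n+1)/2 matrix.
M = np.stack([B.ravel() for B in basis], axis=1)
rank = np.linalg.matrix_rank(M)
print(len(basis), rank)   # 6 6 -> independent, and 6 = dim Sym(3)
```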

Linear algebra and special-linear group experts, please help: It is known that in principle one can generate the matrix $C$ below from the matrices $A$ and $B$. Here $$ C=\begin{pmatrix} 0& -1& 0\\ 1& 0& 0\\ 0& 0& 1 \end{pmatrix} $$ from: $$ A=\begin{pmatrix} 0& 0& 1\\ 1& 0& 0\\ 0& 1& 0 \end{pmatrix}, \text{ […]

This question stems from the comments and answers of Alex G here. As can be seen in the question or the comments to his answer, we take $V,W$ to be real vector spaces, and say that for $S\subset V$, a function $f:S\rightarrow W$ is “linear”, we’ll say on $S$, if the following two conditions hold: […]

Why doesn’t the minimal polynomial of a matrix change if we extend the field? I appreciate any help or proof.
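One standard line of reasoning, sketched here as an outline rather than a full proof: the minimal polynomial is governed by homogeneous linear systems over the base field, and the dimension of the solution space of such a system does not change under field extension.

```latex
% Sketch: deg(mu_A) is the least k with I, A, ..., A^k linearly dependent.
% A dependence c_0 I + c_1 A + \cdots + c_k A^k = 0 is a homogeneous
% linear system in (c_0, ..., c_k) with coefficients in the base field F.
% Row reduction uses only operations in F, so the solution space has the
% same dimension over F as over any extension K containing F; hence the
% least such k, and the monic solution itself, are unchanged.
\[
  \deg \mu_A \;=\; \min\bigl\{\, k \ge 1 : \{I, A, A^2, \dots, A^k\}
  \text{ is linearly dependent} \,\bigr\},
\]
\[
  \mu_{A,F} \;=\; \mu_{A,K} \quad \text{for any field extension } K \supseteq F.
\]
```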

Let $(E, \langle \cdot, \cdot \rangle)$ be an $n$-dimensional Hilbert space and $A,B \colon E \to E$ linear isomorphisms. Does there exist a basis $\{e_{1},…,e_{n}\}$ of $E$ such that $\mathcal{A}=\{A(e_{1}),…,A(e_{n})\}$ and $\mathcal{B}=\{B(e_{1}),…,B(e_{n})\}$ are orthogonal bases? Hints or solutions are greatly appreciated.
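A sketch of one standard approach (shown for real matrices, and assuming SciPy): such a basis comes from simultaneously diagonalizing the two positive-definite Gram forms $A^*A$ and $B^*B$, which a generalized symmetric eigenproblem does.

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(1)
n = 4
A = rng.standard_normal((n, n))   # generically invertible
B = rng.standard_normal((n, n))

# <A e_i, A e_j> = e_i^T (A^T A) e_j and <B e_i, B e_j> = e_i^T (B^T B) e_j,
# so we need a basis orthogonal for both SPD Gram matrices.  Generalized
# eigenvectors of (A^T A) v = lam (B^T B) v are orthogonal in both forms.
lam, E = eigh(A.T @ A, B.T @ B)   # columns of E are the basis e_1, ..., e_n

GA = (A @ E).T @ (A @ E)          # Gram matrix of {A e_i}
GB = (B @ E).T @ (B @ E)          # Gram matrix of {B e_i}
off = lambda G: G - np.diag(np.diag(G))
print(np.allclose(off(GA), 0), np.allclose(off(GB), 0))   # True True
```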

Let $f$ be a parametrized surface, $f: \Omega \subset \mathbb{R}^2 \rightarrow \mathbb{R}^3$, and $N : \Omega \rightarrow Tf$ the Gauß map. Then the shape operator is defined as $L = -DN \circ Df^{-1}$. The problem is that $Df$ is a $3 \times 2$ matrix, so I cannot invert it directly. So how do […]
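One common resolution (a sketch, not necessarily the convention of the question's source): for a regular surface $Df$ is injective, so its Moore-Penrose pseudoinverse acts as the inverse on the tangent plane, and $L = -DN \cdot \operatorname{pinv}(Df)$. On the unit sphere with $N = f$ this gives $L = -\mathrm{id}$ on tangent vectors:

```python
import numpy as np

def jacobian(f, u, v, h=1e-6):
    """Numerical 3x2 Jacobian of f : R^2 -> R^3 via central differences."""
    du = (f(u + h, v) - f(u - h, v)) / (2 * h)
    dv = (f(u, v + h) - f(u, v - h)) / (2 * h)
    return np.column_stack([du, dv])

# Unit sphere patch; its outward Gauss map is N = f itself, so DN = Df.
f = lambda u, v: np.array([np.cos(u) * np.cos(v),
                           np.sin(u) * np.cos(v),
                           np.sin(v)])
u, v = 0.3, 0.7
Df = jacobian(f, u, v)
DN = Df

# pinv(Df) @ Df = I_2 because Df has full column rank, so L is well
# defined on the tangent plane even though Df is not square.
L = -DN @ np.linalg.pinv(Df)

t = Df @ np.array([1.0, -2.0])            # an arbitrary tangent vector
print(np.allclose(L @ t, -t, atol=1e-5))  # True
```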
