Articles of linear algebra

Find a basis for two operators

Let $(E, \langle \cdot, \cdot \rangle)$ be an $n$-dimensional Hilbert space and $A,B \colon E \to E$ linear isomorphisms. Does there exist a basis $\{e_{1},\dots,e_{n}\}$ of $E$ such that $\mathcal{A}=\{A(e_{1}),\dots,A(e_{n})\}$ and $\mathcal{B}=\{B(e_{1}),\dots,B(e_{n})\}$ are orthogonal bases? Hints or solutions are greatly appreciated.
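Not an answer, but a way to experiment numerically: the two conditions say the basis must diagonalise both Gram matrices $A^{*}A$ and $B^{*}B$ by congruence, and since both are Hermitian positive definite this is exactly a generalised Hermitian eigenproblem. A minimal NumPy/SciPy sketch under that reading (random complex $A$, $B$, invertible almost surely; all names below are mine):

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
B = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))

# Generalised Hermitian eigenproblem for the pair (B^*B, A^*A):
# eigh returns E with E^*(A^*A)E = I and E^*(B^*B)E diagonal.
_, E = eigh(B.conj().T @ B, A.conj().T @ A)

# Columns of E are the candidate basis e_1, ..., e_n; check both image families.
GA = (A @ E).conj().T @ (A @ E)     # Gram matrix of {A(e_i)}
GB = (B @ E).conj().T @ (B @ E)     # Gram matrix of {B(e_i)}
print(np.allclose(GA, np.diag(np.diag(GA))))   # True: {A(e_i)} orthogonal
print(np.allclose(GB, np.diag(np.diag(GB))))   # True: {B(e_i)} orthogonal
```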

Matrix representation of shape operator

Let $f$ be a parametrized surface $f: \Omega \subset \mathbb{R}^2 \rightarrow \mathbb{R}^3$ and $N : \Omega \rightarrow Tf$ the Gauß map. Then the shape operator is defined as $L = -DN \circ Df^{-1}.$ Now the thing is that $Df$ is a $3 \times 2$ matrix, so I cannot invert this matrix easily. So how do […]
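A standard workaround (not taken from the question, just the usual computation) is to work in the tangent basis $\{f_u, f_v\}$, where the shape operator is the $2\times 2$ matrix $S = \mathrm{I}^{-1}\,\mathrm{II}$ built from the first and second fundamental forms, instead of inverting the $3\times 2$ matrix $Df$ itself. A SymPy sketch for a sphere of radius $r$ (my choice of test surface):

```python
import sympy as sp

u, v, r = sp.symbols('u v r', positive=True)

# A sphere of radius r as a concrete test surface
f = sp.Matrix([r*sp.cos(u)*sp.sin(v), r*sp.sin(u)*sp.sin(v), r*sp.cos(v)])

fu, fv = f.diff(u), f.diff(v)
N = fu.cross(fv)
N = N / sp.sqrt(N.dot(N))              # unit normal (one choice of orientation)

# First (g) and second (b) fundamental forms in the basis {f_u, f_v}
g = sp.Matrix([[fu.dot(fu), fu.dot(fv)], [fu.dot(fv), fv.dot(fv)]])
b = sp.Matrix([[f.diff(u, 2).dot(N), fu.diff(v).dot(N)],
               [fu.diff(v).dot(N),   f.diff(v, 2).dot(N)]])

S = g.inv() * b                        # matrix of L = -DN o Df^{-1} in {f_u, f_v}
print(S.subs({r: 2, u: 0.3, v: 1.1}).evalf())   # ~ diag(1/2, 1/2): curvatures 1/r
```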

Wiedemann for solving sparse linear equation

I am a new member. I am researching the Wiedemann algorithm for finding the solution $x$ of $$Ax=b.$$ First, I will show Wiedemann's deterministic algorithm (Algorithm 2 in the paper):
1. Compute $A^i b$ for $i = 0, \ldots, 2n-1$, where $n$ is the size of the matrix $A$.
2. Set $k = 0$ and $g_0(z) = 1$.
3. Set $u_{k+1}$ to be the $(k+1)$st unit vector.
4. Extract from the result of step […]
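The excerpt cuts off, but the idea behind computing the vectors $A^i b$ is this: once a polynomial $p$ with $p(A) = 0$ and nonzero constant term is known (Wiedemann recovers the minimal polynomial of the projected Krylov sequence with Berlekamp–Massey; the toy sketch below cheats and uses NumPy's characteristic polynomial instead), the solution $x = A^{-1}b$ falls out as a linear combination of those vectors:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 6
A = rng.standard_normal((n, n))          # assumed invertible (true almost surely)
b = rng.standard_normal(n)

# Krylov sequence b, Ab, A^2 b, ... (step 1 of the algorithm above)
krylov = [b]
for _ in range(n - 1):
    krylov.append(A @ krylov[-1])

# Cheat: take the characteristic polynomial instead of the minimal polynomial
# recovered by Berlekamp-Massey; the algebra that follows is the same.
c = np.poly(A)                           # c[0] z^n + c[1] z^(n-1) + ... + c[n]

# p(A) = 0 and c[n] != 0  =>  x = A^{-1} b = -(1/c[n]) * sum_{i<n} c[i] A^{n-1-i} b
x = -sum(c[i] * krylov[n - 1 - i] for i in range(n)) / c[n]

print(np.allclose(A @ x, b))             # True (up to round-off)
```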

Set of all unitary matrices – compactness and connectedness.

Let $U$ denote the set of all $n\times n$ matrices $A$ with complex entries such that $A$ is unitary. Then $U$ as a topological subspace of $\mathbb{C}^{n^2}$ is
a) compact but not connected.
b) connected but not compact.
c) connected and compact.
d) neither connected nor compact.
We can say a set is compact if it […]
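Not a proof, but both properties can be checked numerically: every unitary matrix has Frobenius norm $\sqrt{n}$ (so $U$ is bounded), and writing a unitary $Q = V\,\mathrm{diag}(e^{i\theta_k})\,V^{-1}$ and scaling down the angles gives a continuous path inside the unitary group from the identity to $Q$. A small NumPy sketch along those lines (random unitary via QR; everything below is my own construction):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4

# A random unitary matrix from the QR factorisation of a complex Gaussian matrix
Z = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
Q, _ = np.linalg.qr(Z)

# Boundedness: every unitary matrix has Frobenius norm sqrt(n)
print(np.isclose(np.linalg.norm(Q), np.sqrt(n)))              # True

# Path-connectedness: Q = V diag(e^{i*theta}) V^{-1}; shrinking the angles
# traces a continuous path of unitary matrices from the identity to Q
w, V = np.linalg.eig(Q)
theta = np.angle(w)
for t in np.linspace(0.0, 1.0, 11):
    Qt = V @ np.diag(np.exp(1j * t * theta)) @ np.linalg.inv(V)
    assert np.allclose(Qt @ Qt.conj().T, np.eye(n))           # stays unitary
print(np.allclose(V @ np.diag(np.exp(1j * theta)) @ np.linalg.inv(V), Q))  # ends at Q
```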

Show $\alpha$ is self-adjoint.

This question already has an answer here: $TT^*=T^2$, show that $T$ is self-adjoint (2 answers)

For self-adjoint operators, eigenvectors that correspond to distinct eigenvalues are orthogonal

So I was looking for a proof of the following theorem. Let $V$ be an inner product space and $T: V\rightarrow V$ a self-adjoint linear map. Suppose $\lambda_{1},\lambda_{2} \in \mathbb{F}$ with $\lambda_{1} \neq \lambda_{2}$, and $v_{1},v_{2} \in V$ are nonzero vectors with $v_{1} \neq v_{2}$, $T(v_{1}) = \lambda_{1}v_{1}$, and $T(v_{2}) = \lambda_{2}v_{2}$. Then $\langle […]
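For reference, the standard argument (which also uses that a self-adjoint operator has real eigenvalues, so $\overline{\lambda_{2}} = \lambda_{2}$) is
$$\lambda_{1}\langle v_{1},v_{2}\rangle = \langle T v_{1}, v_{2}\rangle = \langle v_{1}, T v_{2}\rangle = \overline{\lambda_{2}}\,\langle v_{1},v_{2}\rangle = \lambda_{2}\langle v_{1},v_{2}\rangle,$$
so $(\lambda_{1}-\lambda_{2})\langle v_{1},v_{2}\rangle = 0$, and $\lambda_{1}\neq\lambda_{2}$ forces $\langle v_{1},v_{2}\rangle = 0$.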

Smallest eigenvalue of a rank-one matrix minus its diagonal

Let $x$ be a $d$-dimensional real vector with $\| x\| = 1$. Define $X := xx^T - \mathrm{diag}(xx^T)$. Is it possible to show that $\lambda_{\mathrm{min}}( X ) \geq - 1/2$? Running a bunch of random trials in Python seems to suggest this is true, but I’m not sure how to show it. The best I […]
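Not a proof either, but for anyone who wants to reproduce the random trials mentioned above, here is one possible version (dimensions and trial counts are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
worst = np.inf
for _ in range(10_000):
    d = rng.integers(2, 50)                       # random dimension
    x = rng.standard_normal(d)
    x /= np.linalg.norm(x)                        # unit vector
    X = np.outer(x, x) - np.diag(x**2)            # xx^T minus its diagonal
    worst = min(worst, np.linalg.eigvalsh(X).min())

print(worst)   # empirically stays above -1/2 in these trials
```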

Is a characterisation of degree-2 nilpotent matrices (i.e. $M^2=0$) known?

$M$ is an $n\times n$ real (or complex) matrix. Also $M$ is nilpotent of degree 2, i.e. $M^2=0.$ Question. What does $M$ look like? I just calculated that a $2\times 2$ matrix must have the following form $$\begin{bmatrix} gh & \pm g^2 \\ \mp h^2 & -gh \end{bmatrix}.$$ I wanted to compute conditions on $3\times 3$, $4\times 4$ and look […]
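A quick SymPy check of the $2\times 2$ family above, together with the coordinate-free condition that might be the characterisation to aim for in higher dimensions ($M^2 = 0$ exactly when the column space of $M$ lies inside its null space):

```python
import sympy as sp

g, h = sp.symbols('g h')

# The 2x2 family from the question (upper sign choice); M*M should be the zero matrix
M = sp.Matrix([[g*h, g**2], [-h**2, -g*h]])
print(sp.simplify(M * M))

# Coordinate-free description to aim for in general: M^2 = 0 exactly when the
# column space of M is contained in its null space (so in particular rank(M) <= n/2).
```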

Probability distribution for a three-row matrix-vector product

Consider a fixed (non-random) $3$ by $n$ matrix $M$ whose elements are chosen from $\{-1,1\}$. Assume $n$ is even. I am trying to work out what the probability mass function of $Mx$ is when $x$ is a random vector with elements chosen independently and uniformly at random from $\{-1,1\}$. Each of the three elements of $y […]
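One brute-force way to look at the distribution of $y = Mx$ empirically (the matrix, $n$, and the number of trials below are arbitrary choices):

```python
import numpy as np
from collections import Counter

rng = np.random.default_rng(0)
n = 8                                   # assumed even, as in the question
M = rng.choice([-1, 1], size=(3, n))    # one fixed +-1 matrix

# Empirical pmf of y = Mx over random +-1 vectors x
counts = Counter()
trials = 200_000
for _ in range(trials):
    x = rng.choice([-1, 1], size=n)
    counts[tuple(M @ x)] += 1

for y, c in sorted(counts.items()):
    print(y, c / trials)
```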

Understanding a Proof for Why $\ell^2$ is Complete

Setting: Let $(x_n)$ be Cauchy in $\ell^2$ over $\mathbb{F} = \mathbb{C}$ or $\mathbb{R}$. I’m trying to show that $(x_n) \rightarrow x \in \ell^2$. That is, I’m trying to show that $\ell^2$ is complete in a particular way outlined below. I only used the first few steps of the proof because once I understand the third […]