Any matrix $A \in Gl(n, \mathbb{C})$ can be written as a finite linear combination of elements $U_i\in U(n)$:

$$ A = \sum_{i} \lambda_i U_i$$

Is this true, and if so, how could I prove it?

The reference to the MathOverflow question is a good one. If $A$ is a complex matrix, you can normalize so that $\|A\| \le 1$. Then

$$

A = B + iC

$$

where $B$, $C$ are selfadjoint and given by

$$

B = \frac{1}{2}(A+A^{\star}),\;\;\; C=\frac{1}{2i}(A-A^{\star}).

$$

These selfadjoint operators also satisfy $\|B\| \le 1$ and $\|C\|\le 1$, which means that their eigenvalues (which must be real) lie in $[-1,1]$. Then you can decompose $B$ and $C$ as

$$

B = \frac{1}{2}(U_{B}+V_{B}),\;\;\; C=\frac{1}{2}(U_{C}+V_{C})

$$

where $U_{B}, V_{B}, U_{C}, V_{C}$ are unitary and given by

$$

U_{B} = B + i\sqrt{I-B^2},\;\; V_{B}=B-i\sqrt{I-B^2} \\

U_{C} = C + i\sqrt{I-C^2},\;\; V_{C}=C-i\sqrt{I-C^2}

$$

This makes sense because $I-B^2$ and $I-C^2$ are selfadjoint with eigenvalues in $[0,1]$, so their square roots are well defined and also have eigenvalues in $[0,1]$. You can check that

$$

U_{B}U_{B}^{\star}= U_{B}^{\star}U_{B} = (B-i\sqrt{I-B^2})(B+i\sqrt{I-B^2})=B^2+(I-B^2)=I.

$$

Then $\frac{1}{2}(U_{B}+V_{B})=B$, $\frac{1}{2}(U_{C}+V_{C})=C$ and $A=B+iC$ is a linear combination of unitary matrices.
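The construction above can be checked numerically. The following is a minimal NumPy sketch (the matrix `A` and the helper `unitary_pair` are illustrative, not part of the original argument): it builds $B$ and $C$, forms the four unitaries via the spectral decomposition of each selfadjoint part, and verifies that they recombine to $A$.

```python
import numpy as np

rng = np.random.default_rng(0)

# A hypothetical example matrix, normalized so that ||A|| <= 1.
A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
A = A / np.linalg.norm(A, ord=2)

# Selfadjoint parts: A = B + iC.
B = (A + A.conj().T) / 2
C = (A - A.conj().T) / (2j)

def unitary_pair(H):
    """Split selfadjoint H with ||H|| <= 1 as H = (U + V)/2 with U, V unitary."""
    # sqrt(I - H^2) via the spectral decomposition of H; its eigenvalues
    # lie in [-1, 1], so 1 - w^2 >= 0 (clipped to guard against rounding).
    w, Q = np.linalg.eigh(H)
    root = Q @ np.diag(np.sqrt(np.clip(1 - w**2, 0, None))) @ Q.conj().T
    return H + 1j * root, H - 1j * root

UB, VB = unitary_pair(B)
UC, VC = unitary_pair(C)

# A is a linear combination of the four unitaries.
A_rebuilt = (UB + VB) / 2 + 1j * (UC + VC) / 2

I = np.eye(4)
assert np.allclose(UB @ UB.conj().T, I)   # U_B is unitary
assert np.allclose(A_rebuilt, A)          # A = B + iC recovered
```

Note that $\sqrt{I-B^2}$ commutes with $B$ because it is a function of $B$, which is exactly why the cross terms in $U_B U_B^{\star}$ cancel.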

It is known that every complex square matrix $A$ can be written as a linear combination of at most **two** unitary matrices. First, by scaling, you may assume that $\|A\|\le1$. Then, by singular value decomposition, you may also assume that

$$

A=\operatorname{diag}(s_1,\ldots,s_n)

$$

where the singular values $s_j$ are real, nonnegative, and bounded above by $1$. Now, since $s_j=\frac12(z_j+\bar{z}_j)$, where $z_j=s_j+i\sqrt{1-s_j^2}$ has unit modulus, it follows that $A$ is the average of two unitary matrices.
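This two-unitary average is also easy to verify numerically. A minimal sketch, with an assumed example matrix `A`: undoing the SVD reductions, $A = \frac12(U Z V^{\star} + U \bar{Z} V^{\star})$ where $Z = \operatorname{diag}(z_1,\ldots,z_n)$, and each factor is unitary as a product of unitaries.

```python
import numpy as np

rng = np.random.default_rng(1)

# A hypothetical example, scaled so the largest singular value is <= 1.
A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
A = A / np.linalg.norm(A, ord=2)

U, s, Vh = np.linalg.svd(A)          # A = U diag(s) Vh with 0 <= s_j <= 1

# z_j = s_j + i*sqrt(1 - s_j^2) has |z_j| = 1 (clip guards rounding error).
z = s + 1j * np.sqrt(np.clip(1 - s**2, 0, None))

W1 = U @ np.diag(z) @ Vh             # unitary: product of unitaries
W2 = U @ np.diag(z.conj()) @ Vh      # unitary

I = np.eye(4)
assert np.allclose(W1 @ W1.conj().T, I)   # W1 is unitary
assert np.allclose((W1 + W2) / 2, A)      # A is the average of W1 and W2
```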

There is also an open conjecture that every real square matrix is a linear combination of at most **four** real orthogonal matrices. See Chi-Kwong Li and Edward Poon, *Additive Decomposition of Real Matrices*, Linear and Multilinear Algebra, 50(4):321-326, 2002.

I do not see an elementary linear algebra proof right now, but an answer is given at this MO question: in a $C^*$-algebra, every operator is a linear combination of four unitary operators. The algebra $M(n, \mathbb{C})$ of $n \times n$ matrices over $\mathbb{C}$ becomes a $C^*$-algebra if we regard matrices as operators on the Euclidean space $\mathbb{C}^n$ and use the operator norm $\|\cdot\|$ on matrices.
