
There are two matrices $A$ and $B$ which cannot be diagonalized simultaneously. Is it possible to block-diagonalize them? What if the matrices have a special pattern?

The physics of the problem is that I have $2$ layers of atoms, where $A$ is the connectivity within layer $1$ itself and $B$ is the connectivity between layers $1$ and $2$.

By **block diagonal matrix** I mean a matrix whose diagonal consists of blocks of size $M \times M$ (where $M$ is much smaller than the size of the full matrix but larger than $1$).

For example, a $3 \times 3$ block diagonal matrix:

$$\begin{array}{ccc}
[a] & [0] & [0] \\
[0] & [b] & [0] \\
[0] & [0] & [c]
\end{array}$$
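As a concrete sketch of this structure, the following assembles small blocks into one block diagonal matrix (a minimal helper written out by hand; `scipy.linalg.block_diag` does the same thing):

```python
import numpy as np

def block_diag(*blocks):
    """Assemble square blocks along the diagonal; zeros everywhere else."""
    n = sum(b.shape[0] for b in blocks)
    out = np.zeros((n, n))
    i = 0
    for b in blocks:
        m = b.shape[0]
        out[i:i + m, i:i + m] = b
        i += m
    return out

# Three 2x2 blocks a, b, c give a 6x6 block diagonal matrix.
a = np.array([[1., 2.], [3., 4.]])
b = np.array([[5., 6.], [7., 8.]])
c = np.eye(2)
M = block_diag(a, b, c)
```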

I have attached figures showing the sparsity patterns of an example $A$ and $B$.


I will propose a method for finding the optimal simultaneous block-diagonalization of two matrices $A$ and $B$, assuming that $A$ is diagonalizable (and its eigenvalues are not too degenerate).

(Something similar may work with the Jordan normal form of $A$ as well.)

By optimal I mean that none of the corresponding blocks are simultaneously block-diagonalizable.

Note that using the trivial block-diagonalization $A=(A)$ and $B=(B)$ this can always be done.

Let me start by verifying Victor Liu’s comment that the problem is equivalent to expressing your vector space as a direct sum of invariant spaces.

I will write $C\oplus D=\begin{pmatrix}C&0\\0&D\end{pmatrix}$ and similarly for several matrices.

I assume that the matrices operate in the vector space $V=\mathbb R^n$ (or $\mathbb C^n$, this is irrelevant).

If the two matrices $A$ and $B$ can be block-diagonalized simultaneously, $A=A_1\oplus A_2$ and $B=B_1\oplus B_2$, then $V$ is a direct sum $V=V_1\oplus V_2$, where the spaces $V_i$ are invariant under $A$ and $B$.

A subspace $U\subset V$ is said to be invariant under $A$ if $A(U)\subset U$.

Similarly, if $V$ is a direct sum of subspaces invariant under both $A$ and $B$, then $A$ and $B$ can be simultaneously block-diagonalized.
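This equivalence is easy to see numerically. The following is a minimal sketch with a hypothetical $4 \times 4$ matrix $A = A_1 \oplus A_2$: for any vector $u$ in $V_1 = \operatorname{span}(e_1, e_2)$, the image $Au$ has no component in $V_2$, so $V_1$ is invariant.

```python
import numpy as np

# Hypothetical 4x4 example: A = A1 (+) A2 with a 2+2 split.
A1 = np.array([[1., 2.], [0., 3.]])
A2 = np.array([[4., 0.], [5., 6.]])
A = np.zeros((4, 4))
A[:2, :2] = A1
A[2:, 2:] = A2

# V1 = span(e1, e2): for u in V1, the last two coordinates of A @ u
# vanish, i.e. A maps V1 into V1.
u = np.array([1., -2., 0., 0.])
Au = A @ u
```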

Now what are the subspaces of $V$ that are invariant with respect to both $A$ and $B$?

Suppose $U\neq\{0\}$ is such a subspace.

Suppose furthermore that there is another subspace $U'\subset V$ such that $U$ and $U'$ intersect only at the origin but together span all of $V$: $V=U\oplus U'$.

(In other words, $U$ is an invariant subspace complementable by an invariant subspace. The spaces in the decomposition need to be of this type.)

Now $U$ is invariant under $A$.

Let us first diagonalize $A$.

Let $\lambda_1,\dots,\lambda_m$ be the eigenvalues and $V_1,\dots,V_m$ the corresponding eigenspaces.

We can thus write $V$ as a direct sum $V=V_1\oplus\dots\oplus V_m$.

Since $U$ is invariant under $A$, it is necessarily of the form $U=U_1\oplus\dots\oplus U_m$, where $U_i$ is a subspace of $V_i$ (it can be zero or the whole eigenspace).

Suppose one of the eigenvalues of $A$, $\lambda_1$, is simple (perhaps the ground state if you have a quantum mechanical system).

Then $V_1=\langle v_1\rangle$ has to belong to $U$ or its complementing space; suppose it belongs to $U$.

We also need to have $Bv_1\in U$.

Now $Bv_1=x_1\oplus\dots\oplus x_m\in V_1\oplus\dots\oplus V_m$.

By the direct sum form of $U$, we must have $x_i\in U$ for all $i$.

Now we have a list of new vectors $x_1,\dots,x_m$ in $U$.

For each of them we can run the same argument: $Bx_i=\dots\in V_1\oplus\dots\oplus V_m$ must be in $U$ and so must each of its components (w.r.t. the eigenspace decomposition).

Differently put, if $\pi_i:V\to V_i$ is the orthogonal projection onto the eigenspace, the following is true: if $u\in U$, then $\pi_iBu\in U$ for all $i$.

It can be enlightening to calculate the matrices $\pi_iB$.

(You could probably see something at a glance, but I have not developed intuition for it.)

This is a (at least computationally feasible) method to find an invariant subspace $U$ containing $V_1$.

It can well happen that $U=V$ and only the trivial simultaneous block-diagonalization exists.

If not, after some amount of iteration you will notice that the above procedure no longer produces more linearly independent vectors and $U$ is a nontrivial invariant subspace.
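The iteration just described can be sketched numerically. This is a minimal sketch, assuming $A$ is symmetric with distinct eigenvalues (so each $\pi_i$ is the orthogonal projection onto a single eigenvector); the function name `invariant_subspace` and its arguments are hypothetical.

```python
import numpy as np

def invariant_subspace(A, B, start, tol=1e-10):
    """Grow the smallest subspace containing `start` that is closed under
    u -> pi_i B u for all eigenspace projections pi_i of A.
    Assumes A symmetric with distinct eigenvalues."""
    n = A.shape[0]
    lam, P = np.linalg.eigh(A)   # columns of P: orthonormal eigenvectors
    basis = [start / np.linalg.norm(start)]
    changed = True
    while changed:
        changed = False
        for u in list(basis):
            Bu = B @ u
            for i in range(n):
                x = P[:, i] * (P[:, i] @ Bu)   # pi_i B u
                # Gram-Schmidt against the current basis; keep new directions.
                for v in basis:
                    x = x - (v @ x) * v
                if np.linalg.norm(x) > tol:
                    basis.append(x / np.linalg.norm(x))
                    changed = True
    return np.column_stack(basis)

# Toy pair that is secretly block diagonal with a 2+2 split: starting
# from the eigenvector e1 of A, the iteration closes at span(e1, e2).
A = np.diag([1., 2., 3., 4.])
B = np.zeros((4, 4))
B[:2, :2] = [[0., 1.], [1., 0.]]
B[2:, 2:] = [[2., 0.], [0., 3.]]
U = invariant_subspace(A, B, np.array([1., 0., 0., 0.]))
```

The loop terminates because the basis stays orthonormal and can contain at most $n$ vectors; when a full pass adds nothing new, $U$ is closed under every $\pi_iB$ and hence invariant under both $A$ and $B$.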

It does not yet follow that $U$ would be complementable by an invariant subspace.

Assuming there is another simple eigenvalue (whose eigenspace is not contained in $U$), you can start the same construction with that one and produce a new invariant subspace $U'$.

You will automatically have $U\cap U'=\{0\}$.

You can also start at any eigenvector of $B$ corresponding to a simple eigenvalue.

If you somehow know that a particular vector needs to be in an invariant subspace, you can use it as well.

(But be careful: an eigenvector corresponding to a degenerate eigenvalue need not be in an invariant subspace.)

With this method you can generate some amount of invariant subspaces.

If these subspaces span the whole space, the corresponding block-diagonalization is optimal.

If they do not span all of $V$, at least the rest of the problem is easier.

(Note that even if $A$ is symmetric, a complementing invariant subspace need not be orthogonal to $U$.)
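Once a decomposition $V = U \oplus U'$ into invariant subspaces has been found, the block-diagonalization itself is just a change of basis: stack the bases of $U$ and $U'$ as columns of $S$, and $S^{-1}AS$ and $S^{-1}BS$ come out block diagonal. A minimal sketch with a hypothetical toy pair and a made-up decomposition:

```python
import numpy as np

# Toy pair that is simultaneously block diagonal with a 2+2 split.
A = np.diag([1., 2., 3., 4.])
B = np.zeros((4, 4))
B[:2, :2] = [[0., 1.], [1., 0.]]
B[2:, 2:] = np.eye(2)

# Suppose the procedure produced U = span(e1+e2, e1-e2) and
# U' = span(e3, e4); note the basis of U need not be orthogonal
# eigenvectors, only a basis of an invariant subspace.
S = np.array([[1.,  1., 0., 0.],
              [1., -1., 0., 0.],
              [0.,  0., 1., 0.],
              [0.,  0., 0., 1.]])
Ab = np.linalg.inv(S) @ A @ S
Bb = np.linalg.inv(S) @ B @ S
# Ab and Bb vanish outside the two 2x2 diagonal blocks.
```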

This is not a complete answer to your question, but perhaps you can solve your problem with something along these lines.

And do ask if you need clarification.
