Articles on linear algebra

Can the induced matrix norms be induced by inner products?

Consider the induced matrix p-norm. My question is: can these norms be induced by an inner product? It suffices to check whether the norm satisfies the parallelogram law, but I can't find an immediate way to either produce a counterexample or prove that it holds. I generated some examples using MATLAB. It seems that the […]
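Not part of the original excerpt, but a quick numerical check (a sketch in NumPy rather than MATLAB, with matrices chosen here for illustration) shows that the induced 2-norm already violates the parallelogram law, so it cannot come from an inner product:

```python
import numpy as np

# An inner-product norm must satisfy the parallelogram law:
#   ||A+B||^2 + ||A-B||^2 = 2||A||^2 + 2||B||^2.
A = np.array([[1.0, 0.0], [0.0, 0.0]])
B = np.array([[0.0, 0.0], [0.0, 1.0]])

spec = lambda M: np.linalg.norm(M, 2)  # induced 2-norm = largest singular value

lhs = spec(A + B) ** 2 + spec(A - B) ** 2  # A+B = I and A-B = diag(1,-1), so 1 + 1 = 2
rhs = 2 * spec(A) ** 2 + 2 * spec(B) ** 2  # 2 + 2 = 4
print(lhs, rhs)  # 2.0 4.0 -> parallelogram law fails
```

The same pair of matrices works as a counterexample for every induced p-norm, since each of $A$, $B$, $A+B$, $A-B$ has induced p-norm 1.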

Is there a connection between the diagonalization of a matrix $A$ and that of the product $DA$ with a diagonal matrix $D$?

Given a diagonalizable matrix $A = P_0\Lambda_0 P_0^{-1}$ and a diagonal matrix $D$ with $\det D=1$, is there any connection between $P_0$ and the matrix $P$ of the diagonalization of $DA = P\Lambda P^{-1}$?

Cholesky factorisation: $P = LL^T = R^TR$ vs $P = UU^T = L'^TL'$

I’m (reasonably) familiar with factoring a positive definite matrix as $\mathbf{P} = \mathbf{L} \mathbf{L}^T = \mathbf{R}^T \mathbf{R}$, which is supported by MATLAB and Eigen. However, I have also seen a factorization of the (same) $\mathbf{P} = \mathbf{U} \mathbf{U}^T = \mathbf{L'}^T \mathbf{L'}$. The following illustrates: >> A = rand(3, 4) A = 0.2785 0.9649 0.9572 0.1419 0.5469 […]
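This is not from the original post, but a sketch (in NumPy; variable names chosen here) of how the $\mathbf{U}\mathbf{U}^T$ factorization can be obtained from the standard lower-triangular Cholesky by flipping the row/column order:

```python
import numpy as np

# Standard Cholesky gives P = L @ L.T with L lower triangular.
# For the "reverse" factorization P = U @ U.T (U upper triangular):
# if J reverses row/column order, then J P J is also SPD with
# Cholesky factor Lt, and U = J Lt J is upper triangular.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
P = A @ A.T + 4 * np.eye(4)  # symmetric positive definite

L = np.linalg.cholesky(P)               # P = L @ L.T
Lt = np.linalg.cholesky(P[::-1, ::-1])  # Cholesky of the flipped matrix
U = Lt[::-1, ::-1]                      # flipping back gives upper triangular U

assert np.allclose(U, np.triu(U))  # U really is upper triangular
assert np.allclose(U @ U.T, P)     # P = U @ U.T
```

The identity behind the flip: with $J$ the reversal permutation, $P = J(JPJ)J = (J L' J)(J L'^T J) = UU^T$, and conjugating a lower-triangular matrix by $J$ produces an upper-triangular one.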

Prove that the projection is independent of basis

I am a little stuck on how to proceed with this question. Let $W$ be a subspace of a finite-dimensional inner product space $V$. Prove that $\operatorname{proj}_W(v)$ is independent of basis. I think I have to show that if I have two orthonormal bases, say $\{v_1,\dots,v_k\}$ and $\{w_1,\dots,w_k\}$, then I need to […]
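One possible route (a sketch of the standard argument, not the poster's solution): the projection is characterized basis-free by the orthogonal decomposition, so any orthonormal basis yields the same vector.

```latex
Write $v = w + w^\perp$ with $w \in W$ and $w^\perp \in W^\perp$; this
decomposition is unique. For any orthonormal basis $\{v_1,\dots,v_k\}$ of $W$,
\[
  p = \sum_{i=1}^{k} \langle v, v_i \rangle v_i
\]
satisfies $p \in W$ and $\langle v - p, v_j \rangle = 0$ for all $j$, hence
$v - p \in W^\perp$. By uniqueness of the decomposition, $p = w$ for every
choice of orthonormal basis, so $\operatorname{proj}_W(v) = w$ is
basis-independent.
```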

To Find $A^{50}$

This question already has answers here: For a $3 \times 3$ matrix $A$, the value of $A^{50}$ (3 answers)

For what values of $k$ will the system have multiple solutions?

For what value(s) of $k$, if any, will the system have a) no solutions, b) a unique solution, c) infinitely many solutions? $$kx + 2y = 3$$ $$2x - 4y = -6$$ I know how to get the answer to (c): it holds when $k = -1$. I have no clue about the other two […]
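A sketch of the determinant argument (not from the excerpt) that settles the other two cases:

```latex
The coefficient matrix has determinant
\[
  \det\begin{pmatrix} k & 2 \\ 2 & -4 \end{pmatrix} = -4k - 4,
\]
which is nonzero iff $k \neq -1$, giving a unique solution (case b).
At $k = -1$ the first equation, $-x + 2y = 3$, is $-\tfrac{1}{2}$ times the
second, so the right-hand sides are proportional as well and the system has
infinitely many solutions (case c). Hence no value of $k$ yields an
inconsistent system (case a).
```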

Consider the trace map $M_n (\mathbb{R}) \to \mathbb{R}$. What is its kernel?

The map is the trace map. That is, it takes any $n \times n$ matrix $A \in M_n (\mathbb{R})$ and associates to it the number $\mathrm{Tr}(A) = \sum_{i=1}^n a_{ii}$. I need to find the kernel of this map and give a basis and its dimension (which is easy once I have the […]
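A sketch of the standard answer (not from the excerpt), which may be what the poster is after:

```latex
The trace map is linear and surjective, so by rank--nullity
$\dim \ker(\operatorname{Tr}) = n^2 - 1$. A convenient basis for the kernel
(the traceless matrices) is
\[
  \{\, E_{ij} : i \neq j \,\} \;\cup\; \{\, E_{ii} - E_{i+1,\,i+1} : 1 \le i \le n-1 \,\},
\]
where $E_{ij}$ denotes the matrix with a $1$ in position $(i,j)$ and zeros
elsewhere: $n^2 - n$ off-diagonal matrices plus $n - 1$ diagonal differences,
$n^2 - 1$ matrices in total.
```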

The identity $\det(A) = \exp(\operatorname{Tr}(\ln A))$ for general $A$

I understand the proof of the identity in the title for $A$ Hermitian. One uses that any Hermitian matrix can be diagonalized as $A = X \Lambda X^{-1}$, so that $$ \det{A} = \prod_i \lambda_i, $$ and we have $$ \exp(\operatorname{Tr}(\log A)) = \exp(\operatorname{Tr}(X (\log\Lambda) X^{-1})) = \exp\Big(\sum_i \log \lambda_i\Big) = \prod_i \lambda_i. $$ However, is it possible […]
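The same eigendecomposition argument goes through numerically for a non-Hermitian but diagonalizable matrix, provided the eigenvalues avoid the logarithm's branch cut. A sketch in NumPy (the test matrix is chosen here for illustration):

```python
import numpy as np

# Non-Hermitian but diagonalizable, with positive eigenvalues (1 and 3),
# so the principal logarithm is well defined.
A = np.array([[1.0, 2.0], [0.0, 3.0]])

# log(A) via the eigendecomposition A = V diag(w) inv(V).
w, V = np.linalg.eig(A)
logA = V @ np.diag(np.log(w)) @ np.linalg.inv(V)

lhs = np.linalg.det(A)           # det(A) = 1 * 3 = 3
rhs = np.exp(np.trace(logA))     # exp(log 1 + log 3) = 3
print(lhs, np.real(rhs))
```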

Proving that if $A$ is diagonalizable with non-negative eigenvalues, then $A=B^2$ for some $B$

Let $A$ be a diagonalizable $n \times n$ matrix with non-negative eigenvalues. Prove that there exists a matrix $B$ such that $$A=B^2.$$ I honestly don’t have a clue what to do. Could anyone please help me out?
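The usual construction is $B = P\,\Lambda^{1/2} P^{-1}$, since then $B^2 = P \Lambda P^{-1} = A$. A sketch verifying this numerically (NumPy, with a test matrix chosen here):

```python
import numpy as np

# A is diagonalizable with non-negative eigenvalues (9 and 1).
A = np.array([[5.0, 4.0], [4.0, 5.0]])

# B = P diag(sqrt(w)) inv(P), using the eigendecomposition A = P diag(w) inv(P).
w, P = np.linalg.eig(A)
B = P @ np.diag(np.sqrt(w)) @ np.linalg.inv(P)

assert np.allclose(B @ B, A)  # B^2 = P diag(w) inv(P) = A
```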

Further help needed: what is a linear map restricted to a subspace?

A related question was posted here. I have a further question, as follows. If $x_1,x_2$ is an orthonormal basis of an eigenspace of $A$ pertaining to some eigenvalue $\lambda$, why can $(x_1,x_2)^T A (x_1,x_2)$ be viewed as the linear map $A$ restricted to the eigenspace spanned by $\{x_1,x_2\}$? I am puzzled that the matrix $(x_1,x_2)^T A (x_1,x_2)$ is a […]
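A sketch of why this works (assuming, as in the question, that the columns are orthonormal and span an $A$-invariant subspace):

```latex
Let $X = (x_1, x_2)$, an $n \times 2$ matrix with orthonormal columns
spanning the eigenspace $W$. Take $w \in W$ with coordinate vector $c$, so
$w = Xc$. Invariance of $W$ under $A$ gives $Aw \in W$, and since
$X^T X = I_2$, the coordinate vector of $Aw$ in the basis $\{x_1,x_2\}$ is
\[
  X^T A w = (X^T A X)\, c .
\]
Thus $X^T A X$ is exactly the $2 \times 2$ matrix of $A|_W$ in the basis
$\{x_1, x_2\}$; its $(i,j)$ entry is $\langle x_i, A x_j \rangle$. (For an
eigenspace with eigenvalue $\lambda$ this matrix is simply
$\lambda I_2$.)
```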