Articles on linear algebra

Finding the kernel of a linear map

The exercise is to find all solutions of the equation $Ax = 0$, among others, for the following matrix $$A =\begin{pmatrix} 6 & 3 & -9 \\ 2 & 1 & -3 \\ -4 & -2 & 6 \end{pmatrix}.$$ This amounts to finding the kernel, and obviously the rows of the matrix are multiples of […]
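A quick way to sanity-check such a computation (a sketch, not part of the original question) is sympy's `nullspace`:

```python
# Minimal sketch using sympy: nullspace() returns a basis of ker(A).
from sympy import Matrix

A = Matrix([[ 6,  3, -9],
            [ 2,  1, -3],
            [-4, -2,  6]])

print(A.nullspace())
# Every row is a multiple of (2, 1, -3), so the kernel is the
# two-dimensional plane 2x + y - 3z = 0.
```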

Lagrange diagonalization theorem – what if we omit the assumption that the form is symmetric?

I know that for every symmetric bilinear form $f: U \times U \rightarrow \mathbb{K}$, with $\operatorname{char}\mathbb{K} \neq 2$, there exists a basis in which $f$’s matrix is diagonal. Could you tell me what happens if we omit the assumption that $f$ is symmetric? Could you give me an example of a non-symmetric bilinear form $f$ which cannot be […]
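One standard observation, sketched here rather than taken from the excerpt: a change of basis acts on the matrix of $f$ by congruence, $A \mapsto P^T A P$, and since $(P^T A P)^T = P^T A^T P$, the matrix is symmetric in one basis iff it is symmetric in every basis. So any non-symmetric form, e.g. $$f(x,y) = x_1 y_2, \qquad A = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix},$$ can never have a diagonal matrix.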

Counting invariant subspaces of a vector space

Well, I was reading about invariant subspaces and related things, and this question came to my mind: if I choose a vector space and fix a linear transformation from it to itself, how many invariant subspaces will there be? Is there any formula for this, or material I can read?
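A small contrast, offered as a sketch rather than a full answer: the count depends heavily on the transformation. For the identity map on $K^2$ every subspace is invariant, so over an infinite field there are infinitely many; for a diagonalizable map with $n$ distinct eigenvalues, the invariant subspaces are exactly the spans of subsets of an eigenbasis, giving $2^n$ of them.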

Prove that a change of basis matrix is unitary

As in the title, let $(V,\langle\cdot,\cdot\rangle)$ be a complex inner product space and assume $S_1=(u_1,\ldots,u_n)$, $S_2=(v_1,\ldots,v_n)$ are orthonormal bases of $V$. Prove that the change of basis matrix $M_{I_V}(S_2,S_1)$ is a unitary matrix. (There is a hint: let $S$ be the operator such that $S(u_i)=v_i$ and prove that it is a unitary operator.)
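Before attempting the proof, a numeric sketch (my construction, not part of the question): build two random orthonormal bases via QR and check that the resulting change of basis matrix is unitary.

```python
# Sketch: columns of U and V are orthonormal bases of C^n obtained
# from QR; the change of basis matrix M satisfies V = U M.
import numpy as np

n = 4
rng = np.random.default_rng(0)
U, _ = np.linalg.qr(rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n)))
V, _ = np.linalg.qr(rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n)))

M = U.conj().T @ V  # U is unitary, so U^{-1} = U^*
print(np.allclose(M.conj().T @ M, np.eye(n)))  # True: M is unitary
```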

Why does this least-squares approach to proving the Sherman–Morrison Formula work?

Suppose $A$ is an invertible square matrix and $u,v$ are column vectors. Suppose furthermore that $1 + v^T A^{-1}u \neq 0$. Then the Sherman–Morrison formula states that: $$ (A+uv^T)^{-1} = A^{-1} - \frac{A^{-1}uv^T A^{-1}}{1 + v^T A^{-1}u} $$ Here, $uv^T$ is the outer product of two vectors $u$ and $v$. A proof I […]
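A quick numeric check of the identity (a sketch with random data, not the proof the question is about):

```python
# Verify Sherman–Morrison numerically for a random well-conditioned A.
import numpy as np

rng = np.random.default_rng(1)
n = 5
A = rng.normal(size=(n, n)) + n * np.eye(n)  # comfortably invertible
u = rng.normal(size=(n, 1))
v = rng.normal(size=(n, 1))

Ainv = np.linalg.inv(A)
denom = 1 + (v.T @ Ainv @ u).item()          # assumed nonzero
rhs = Ainv - (Ainv @ u @ v.T @ Ainv) / denom
lhs = np.linalg.inv(A + u @ v.T)
print(np.allclose(lhs, rhs))                 # True
```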

How can I characterize the type of solution vector that comes out of a matrix?

$Ax = b$. I need a way to analyze a square matrix $A$ to see if its solution vector $x$ will always be positive when $b$ is positive. This question arises from solving the radiosity equation: I’m interested to know when $A$ is incorrect, which would be when $x$ has negative values even though $b$ […]
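One way to frame it, as a sketch: since $x = A^{-1}b$, the solution is positive for every positive $b$ exactly when $A^{-1}$ has no negative entries (an "inverse-positive", or monotone, matrix); radiosity-style matrices of the form $I - \rho F$ are typically diagonally dominant M-matrices, which have this property.

```python
# Sketch of the test: A qualifies iff A^{-1} is entrywise nonnegative.
import numpy as np

def always_positive_solution(A, tol=1e-12):
    return np.all(np.linalg.inv(A) >= -tol)

# A strictly diagonally dominant Z-matrix (an M-matrix), as in radiosity.
A = np.array([[ 1.0, -0.3, -0.2],
              [-0.1,  1.0, -0.4],
              [-0.2, -0.2,  1.0]])
print(always_positive_solution(A))  # True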

On monomial matrices (generalized permutation matrices)

A matrix $A\in GL_{n}(F)$ is said to be monomial if each row and column has exactly one non-zero entry. Let $N$ denote the set of all monomial matrices. I want to prove that the following are equivalent: (1) $A\in N$; (2) there exist a non-singular diagonal matrix $D$ and a permutation matrix $P$ such that $A=DP$ […]
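The $A = DP$ direction can be read off mechanically; here is a sketch (my code, not the requested proof):

```python
# Decompose a monomial matrix as A = D P by reading the unique
# nonzero entry in each row into D and its column position into P.
import numpy as np

def monomial_decompose(A):
    n = A.shape[0]
    D = np.zeros((n, n))
    P = np.zeros((n, n))
    for i in range(n):
        cols = np.flatnonzero(A[i])   # exactly one nonzero per row
        assert cols.size == 1
        j = cols[0]
        D[i, i] = A[i, j]
        P[i, j] = 1.0
    return D, P

A = np.array([[0.0, 5.0,  0.0],
              [0.0, 0.0, -2.0],
              [3.0, 0.0,  0.0]])
D, P = monomial_decompose(A)
print(np.allclose(D @ P, A))  # True
```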

If both $A-\frac{1}{2}I$ and $A + \frac{1}{2}I$ are orthogonal matrices, then…

Problem: If both $A-\frac{1}{2}I$ and $A + \frac{1}{2}I$ are orthogonal matrices, then which one of the following is correct: (1) $A$ is orthogonal; (2) $A$ is a skew-symmetric matrix of even order; (3) $A^2 = \frac{3}{4}I$. Solution: $(A'-\frac{1}{2}I)(A-\frac{1}{2}I) =I$ and $(A'+\frac{1}{2}I)(A+\frac{1}{2}I) =I$ $\Rightarrow A +A' =0$ $\Rightarrow A' =-A $ $\Rightarrow A^2 […]
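A sketch of the remaining algebra, since the excerpt is cut off: expanding the two identities gives $$A'A - \tfrac{1}{2}(A'+A) + \tfrac{1}{4}I = I, \qquad A'A + \tfrac{1}{2}(A'+A) + \tfrac{1}{4}I = I,$$ and adding them yields $A'A = \tfrac{3}{4}I$; combined with $A' = -A$ this gives $A^2 = -\tfrac{3}{4}I$, so the sign in option (3) is worth double-checking.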

Uncountable linearly independent family in $K^\mathbb{N}$

Let $K$ be a field. Consider the vector space $K^{\mathbb{N}}$ of $K$-sequences. Is there an uncountable linearly independent set of vectors in this vector space? If yes, can you name one explicitly? Does this work for modules as well?
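One classical construction, offered as a sketch rather than the answer: identify $\mathbb{N}$ with the set of finite $0$–$1$ strings, and for each infinite string $x \in \{0,1\}^{\mathbb{N}}$ let $A_x \subseteq \mathbb{N}$ be the set of its finite initial segments. Distinct branches share only finitely many initial segments, so in any finite relation $\sum_i c_i \mathbf{1}_{A_{x_i}} = 0$ each $A_{x_i}$ retains infinitely many coordinates outside the union of the others, forcing $c_i = 0$. This yields $2^{\aleph_0}$ linearly independent indicator sequences over any field, and the same coordinate argument works in $R^{\mathbb{N}}$ for any nonzero ring $R$.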

Calculate the slope of a line passing through the intersection of two lines

Let’s say I have this figure: I know slope $m_1$, slope $m_2$, $(x_1, y_1)$, $(x_2, y_2)$ and $(x_3, y_3)$. I need to calculate slope $m_3$. Note that the line with slope $m_3$ will always bisect the angle between the line with slope $m_1$ and the line with slope $m_2$.
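Under the angle-bisector reading of the question, one way to compute $m_3$ (a sketch using unit direction vectors, not a formula from the original post):

```python
# The bisector of lines with slopes m1, m2 has direction u1 + u2,
# where ui = (1, mi) / sqrt(1 + mi^2). This picks the bisector of the
# two rightward-pointing directions; the other bisector is
# perpendicular to it.
import math

def bisector_slope(m1, m2):
    n1 = math.hypot(1.0, m1)
    n2 = math.hypot(1.0, m2)
    dy = m1 / n1 + m2 / n2
    dx = 1.0 / n1 + 1.0 / n2  # always positive, so no zero division
    return dy / dx

print(bisector_slope(0.0, 1.0))  # ~0.4142, i.e. tan(22.5 degrees)
```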