Articles on linear algebra

Why does the reduced row echelon form have the same null space as the original matrix?

What is the proof of this, and what is the intuitive explanation for why the reduced row echelon form has the same null space as the original matrix?
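The key fact can be checked mechanically: row reduction multiplies $A$ on the left by an invertible matrix $E$ (a product of elementary matrices), so $Ax=0$ if and only if $(EA)x = Rx = 0$. A minimal pure-Python sketch of this, tracking $E$ alongside the reduction (helper names are illustrative, not from any particular text):

```python
from fractions import Fraction

def rref_with_transform(A):
    """Row-reduce A to RREF R while recording the row operations in E,
    so that E @ A = R with E a product of elementary (invertible) matrices."""
    m, n = len(A), len(A[0])
    R = [[Fraction(x) for x in row] for row in A]
    E = [[Fraction(1 if i == j else 0) for j in range(m)] for i in range(m)]
    pr = 0  # current pivot row
    for col in range(n):
        piv = next((r for r in range(pr, m) if R[r][col] != 0), None)
        if piv is None:
            continue
        R[pr], R[piv] = R[piv], R[pr]
        E[pr], E[piv] = E[piv], E[pr]
        s = R[pr][col]
        R[pr] = [v / s for v in R[pr]]
        E[pr] = [v / s for v in E[pr]]
        for r in range(m):
            if r != pr and R[r][col] != 0:
                f = R[r][col]
                R[r] = [a - f * b for a, b in zip(R[r], R[pr])]
                E[r] = [a - f * b for a, b in zip(E[r], E[pr])]
        pr += 1
        if pr == m:
            break
    return R, E

def matvec(M, x):
    return [sum(a * b for a, b in zip(row, x)) for row in M]

A = [[1, 2, 3], [2, 4, 6], [1, 1, 1]]  # second row is twice the first
R, E = rref_with_transform(A)
# Since E is invertible, Ax = 0  <=>  (E A)x = Rx = 0: same null space.
x = [1, -2, 1]                    # a null vector of A
print(matvec(A, x), matvec(R, x))  # both are the zero vector
```

Because $E$ is invertible, every solution of $Ax=0$ solves $Rx=0$ and vice versa, which is exactly the claim.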

Does projection onto a finite dimensional subspace commute with intersection of a decreasing sequence of subspaces: $\cap_i P_W(V_i)=P_W(\cap_i V_i)$?

Let $V$ be an infinite dimensional linear space over some field $k$ (you can take $k=\mathbb{C}$, or further assume $V$ is a complex Hilbert space). Assume $W$ is a finite dimensional subspace of $V$, and denote the projection from $V$ onto $W$ by $P_W$. Let $V_1\supseteq V_2\supseteq V_3\supseteq\cdots$ be a decreasing sequence of […]

Why does the Gram-Schmidt procedure divide by 0 on a linearly dependent list of vectors?

Let $v_1, \dots, v_m$ be a linearly dependent list of vectors. If $v_1 \ne 0$, then there is some $v_j$ in the span of $v_1, \dots, v_{j-1}$. If we let $j$ be the smallest integer with this property and apply the Gram-Schmidt procedure to produce an orthonormal list $(e_1, \dots, e_{j-1})$, then $v_j$ is in […]
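The division by zero can be seen concretely: when $v_j$ lies in the span of the earlier vectors, the residual left after subtracting all the projections is the zero vector, so its norm is $0$ and normalization fails. A small sketch of classical Gram-Schmidt illustrating this (names are mine):

```python
import math

def gram_schmidt(vectors):
    """Classical Gram-Schmidt; raises ZeroDivisionError exactly when some
    v_j lies in the span of the vectors before it (residual has norm 0)."""
    es = []
    for v in vectors:
        w = list(v)
        for e in es:
            c = sum(a * b for a, b in zip(w, e))   # <w, e>
            w = [a - c * b for a, b in zip(w, e)]  # subtract the projection onto e
        norm = math.sqrt(sum(a * a for a in w))
        if norm == 0.0:
            raise ZeroDivisionError("dependent vector: residual is 0, cannot normalize")
        es.append([a / norm for a in w])
    return es

gram_schmidt([(1.0, 0.0), (0.0, 2.0)])      # independent list: works fine
try:
    gram_schmidt([(1.0, 0.0), (2.0, 0.0)])  # v_2 in span(v_1): residual is 0
except ZeroDivisionError as err:
    print(err)
```

With exact arithmetic the residual is identically zero precisely for the smallest such $j$; in floating point one would test against a tolerance instead.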

Tensor products of maps

Let $V, W, U, X$ be $R$-modules, where $R$ is a ring. At what level of generality, if any, is it true that the maps (I always mean linear maps) from $V \otimes W$ to $U \otimes X$ can be identified with $L(V, U)\otimes L(W, X)$, where $L(\cdot, \cdot)$ is the space of maps, via the […]

Basis for the intersection of two integer lattices

If $B_1$ and $B_2$ are the bases of two integer lattices $L_1$ and $L_2$, i.e. $L_1=\{B_1n:n\in\mathbb Z^d\}$ and $L_2=\{B_2n:n\in\mathbb Z^d\}$, is there an easy way to determine a basis for $L_1\cap L_2$? Answers of the form “Plug the matrices into a computer and ask for Hermite Normal Form, etc” are perfectly acceptable as this is […]
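One concrete route, assuming $B_1$ and $B_2$ are square nonsingular integer matrices: $x\in L_1\cap L_2$ iff $B_1n=B_2m$ for integer $n,m$, so a basis comes from the integer kernel of $[B_1\mid -B_2]$, which unimodular column operations (a Hermite-normal-form style reduction) compute. A rough, unoptimized pure-Python sketch (function names are mine):

```python
def integer_kernel(A):
    """Basis for {x in Z^c : A x = 0}, via unimodular column operations."""
    A = [row[:] for row in A]
    r, c = len(A), len(A[0])
    U = [[int(i == j) for j in range(c)] for i in range(c)]  # records column ops

    def colop(j, k, q):  # column_j -= q * column_k (unimodular)
        for i in range(r):
            A[i][j] -= q * A[i][k]
        for i in range(c):
            U[i][j] -= q * U[i][k]

    def swap(j, k):
        for i in range(r):
            A[i][j], A[i][k] = A[i][k], A[i][j]
        for i in range(c):
            U[i][j], U[i][k] = U[i][k], U[i][j]

    piv = 0
    for i in range(r):
        while True:
            nz = [j for j in range(piv, c) if A[i][j] != 0]
            if len(nz) <= 1:
                break
            k = min(nz, key=lambda j: abs(A[i][j]))  # Euclid-style pivot choice
            for j in nz:
                if j != k:
                    colop(j, k, A[i][j] // A[i][k])
        nz = [j for j in range(piv, c) if A[i][j] != 0]
        if nz:
            swap(piv, nz[0])
            piv += 1
    # Columns of A that ended up zero correspond to kernel vectors in U.
    return [[U[i][j] for i in range(c)] for j in range(piv, c)]

def lattice_intersection_basis(B1, B2):
    """B1, B2: square nonsingular integer matrices whose columns generate L1, L2."""
    d = len(B1)
    stacked = [B1[i] + [-x for x in B2[i]] for i in range(d)]  # [B1 | -B2]
    # Map each kernel vector (n, m) to B1 n, a point of the intersection.
    return [[sum(B1[i][j] * k[j] for j in range(d)) for i in range(d)]
            for k in integer_kernel(stacked)]

B1 = [[2, 0], [0, 2]]  # columns generate 2Z^2
B2 = [[3, 0], [0, 3]]  # columns generate 3Z^2
basis = lattice_intersection_basis(B1, B2)
print(basis)  # a basis of 6Z^2 (up to sign and order)
```

In practice the "plug into a computer" answer is exactly this: any CAS with a Hermite Normal Form routine that returns the transformation matrix can do the kernel step for you.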

For $W \leqslant V$, prove $\dim W + \dim W^\perp = \dim V$

I want to prove that if $W$ is a subspace of an inner product space $V$, then $\dim W + \dim W^\perp = \dim V$. I have defined $x^\perp = \{ y : x \cdot y = 0\}$, where $\cdot$ denotes the dot product. It is a pretty elementary result, but I’m not sure how […]
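When $V=\mathbb{R}^n$ with the dot product, the identity is rank-nullity in disguise: if the rows of a matrix $M$ form a basis of $W$, then $W^\perp$ is exactly the null space of $M$, so $\dim W^\perp = n - \operatorname{rank} M = n - \dim W$. A small sketch of that bookkeeping (helper names are mine):

```python
from fractions import Fraction

def rank(M):
    """Row rank via Gaussian elimination over the rationals."""
    M = [[Fraction(x) for x in row] for row in M]
    m, n = len(M), len(M[0])
    r = 0
    for col in range(n):
        piv = next((i for i in range(r, m) if M[i][col] != 0), None)
        if piv is None:
            continue
        M[r], M[piv] = M[piv], M[r]
        for i in range(r + 1, m):
            f = M[i][col] / M[r][col]
            M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

# Rows of M form a basis of W, a subspace of R^4.
M = [[1, 1, 0, 0], [0, 1, 1, 0]]
n = 4
dim_W = rank(M)            # dim W = rank M
dim_W_perp = n - rank(M)   # W^perp = null space of M, so use rank-nullity
assert dim_W + dim_W_perp == n
```

For a general inner product space the same argument works after choosing a basis, or more abstractly by extending an orthonormal basis of $W$ to one of $V$ via Gram-Schmidt.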

Linear algebra problem from Dummit & Foote

Let $V$ be a finite dimensional vector space over $\mathbb{Q}$ and suppose $T$ is a nonsingular linear transformation of $V$ such that $T^{-1} = T^2 + T$. Prove that the dimension of $V$ is divisible by $3$. If the dimension of $V$ is precisely $3$, prove that all such transformations $T$ are similar. So applying […]
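The relation $T^{-1}=T^2+T$ is equivalent to $T^3+T^2-I=0$, and $p(x)=x^3+x^2-1$ has no rational roots ($p(\pm1)\ne0$), hence is irreducible over $\mathbb{Q}$; the minimal polynomial of $T$ is therefore $p$ itself, which forces $3\mid\dim V$. The companion matrix of $p$ gives a concrete such $T$, and the relation can be checked directly (a sketch; the matrix below is just the standard companion matrix):

```python
def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

# Companion matrix of p(x) = x^3 + x^2 - 1 (irreducible over Q).
C = [[0, 0, 1],
     [1, 0, 0],
     [0, 1, -1]]
I3 = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
C2 = matmul(C, C)
C_inv_candidate = [[C2[i][j] + C[i][j] for j in range(3)] for i in range(3)]
# T^{-1} = T^2 + T  <=>  C (C^2 + C) = I
print(matmul(C, C_inv_candidate) == I3)  # True
```

In dimension exactly $3$ the rational canonical form of any such $T$ must be this single companion block, which is the similarity statement.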

Constructing a non-linear system with prescribed conditions on the nature of its critical points

An exercise from the book I am reading is: “Construct a non-linear system that has four critical points: two saddle points, one stable focus, and one unstable focus.” I have tried many systems. I found one quickly, but I was lucky, even if I had a few clues thanks to my previous trials. I wonder if there […]
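One way to sanity-check a candidate system, assuming all its equilibria are hyperbolic: linearize at each critical point and classify by the eigenvalues of the $2\times2$ Jacobian (real eigenvalues of opposite sign give a saddle; a complex pair gives a focus, stable iff the real part is negative). The classifier below is illustrative, not from the book:

```python
import cmath

def classify_2x2(J, tol=1e-9):
    """Classify the equilibrium of x' = J x by the eigenvalues of the Jacobian J."""
    a, b = J[0]
    c, d = J[1]
    tr, det = a + d, a * d - b * c
    disc = cmath.sqrt(tr * tr - 4 * det)
    l1, l2 = (tr + disc) / 2, (tr - disc) / 2
    if abs(l1.imag) > tol:          # complex pair: focus or center
        if l1.real < -tol:
            return "stable focus"
        if l1.real > tol:
            return "unstable focus"
        return "center"
    r1, r2 = l1.real, l2.real
    if r1 * r2 < 0:
        return "saddle"
    return "node (or degenerate)"

print(classify_2x2([[0, 1], [-1, -1]]))  # tr=-1, det=1: stable focus
print(classify_2x2([[1, 0], [0, -1]]))   # eigenvalues +1, -1: saddle
```

Evaluating the Jacobian of a candidate system at each of its four critical points and feeding the results to such a classifier turns the trial-and-error into a quick automated check.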

Orthogonal Decomposition

[Ciarlet 1.2-2] Let $O$ be an orthogonal matrix. Show that there exists an orthogonal matrix $Q$ such that $$Q^{-1}OQ\ =\ \begin{pmatrix} 1 & & \\ & \ddots & \\ & & 1 \end{pmatrix} […]$$

Why don't $A^TAx = A^Tb$ and $Ax=b$ have the same solution?

Suppose $A$ is an $m\times n$ matrix, $x$ is a vector of unknowns, and $b$ is a vector of known entries. Why don’t $$Ax=b$$ and $$A^TAx = A^Tb$$ have the same solution ($x$)? It seems to me that I could get from the first equation to the second by simply multiplying both sides by […]
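The gap is that $A^T$ is generally not invertible (it is not even square when $m\ne n$), so multiplying both sides by it can enlarge the solution set: the normal equations always have a solution (the least-squares one) even when $Ax=b$ has none. A tiny worked example where $Ax=b$ is inconsistent but $A^TAx=A^Tb$ is solvable:

```python
# Overdetermined system: A x = b asks for a scalar x with x = 0 and x = 1 at once.
A = [[1.0], [1.0]]
b = [0.0, 1.0]

# Normal equations: (A^T A) x = A^T b, here a 1x1 system.
AtA = sum(row[0] * row[0] for row in A)          # = 2.0
Atb = sum(row[0] * bi for row, bi in zip(A, b))  # = 1.0
x = Atb / AtA                                    # least-squares solution
residual = [row[0] * x - bi for row, bi in zip(A, b)]
print(x)         # 0.5
print(residual)  # [0.5, -0.5] != 0: Ax = b itself has no solution
```

So the implication only runs one way: every solution of $Ax=b$ solves the normal equations, but not conversely, unless $A^T$ has a left inverse (e.g. $A$ square and nonsingular).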