My linear algebra notes state the following lemma: If $(v_1, …,v_m)$ is linearly dependent in $V$ and $v_1 \neq 0$ then there exists $j \in \{2,…,m\}$ such that $v_j \in span(v_1,…,v_{j-1})$ where $(…)$ denotes an ordered list.

But if at least one $v_i$ is $\neq 0$, then the list can be reordered so that this vector comes first and the lemma applied. So is the hypothesis $v_1 \neq 0$ just another way of saying $v_i \neq 0$ for at least one $i$?

Your book is correct, but silly. It should not have excluded $v_1=0$, but allowed $j=1$ instead. By convention $\operatorname{span}()=\{0\}$ (it is important that the span of *every* set of vectors is a subspace, so the empty set should give the null subspace), and $v_1=0$ (which all by itself makes the set linearly dependent) would not be an exception, because one can then take $j=1$ (and if there is no linear dependence among the remaining vectors, one *has to* take $j=1$). So it is just another case of unfounded fear of the void.

The result states (or should) that given an ordered sequence of linearly dependent vectors, there is always one of them that is in the span of the set of vectors preceding it. This is always true. Indeed, you can always take this vector to be the first one that makes the sequence-up-to-there linearly dependent. The empty sequence is always independent, and a sequence with one vector is linearly dependent only if that vector is zero, in which case it is in the (empty) span of the (empty) set of preceding vectors. If the first is not zero but after some independent vectors a zero vector comes along, then it too is in the span of the set of preceding vectors, but that span now has positive dimension. Of course a *nonzero* vector in the same span, in place of that zero vector, would also have made the sequence dependent; this is in fact the more common case.

$v_1\ne0$ simply means that $v_1\ne 0$. All other $v_j$ may be zero.

Note that $v_1$ does play a special role insofar as it is the only one of the given vectors that is definitely in all of the spans $\operatorname{span}(v_1,\ldots, v_{j-1})$ of the claim. Note that the claim as stated fails if $v_1=0$ and $v_2,\ldots, v_m$ are linearly independent.

You can *apply* the theorem to different permutations of the $v_i$. But note that the claim differs if you permute the vectors! What you get is that *some* vector is a linear combination of the others (and that holds even if all vectors are zero). The theorem at hand is concerned with a given *ordered* sequence of vectors and wants a vector to be a linear combination of *preceding* vectors.
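To see concretely that the ordered claim depends on the ordering, here is a small self-contained check (my own sketch, with a hypothetical rank-based helper `in_span`):

```python
import numpy as np

def in_span(v, vectors):
    """Check numerically whether v lies in the span of the given vectors
    (the span of an empty list is {0})."""
    if not vectors:
        return not np.any(v)
    A = np.vstack(vectors)
    return np.linalg.matrix_rank(np.vstack([A, v])) == np.linalg.matrix_rank(A)

e1, e2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
s = e1 + e2  # dependent on e1 and e2

# Ordered as (e1, e2, e1+e2): the *last* vector is in the span of its predecessors.
assert in_span(s, [e1, e2])
# Reordered as (e1+e2, e1, e2): e1+e2 now has no predecessors, so it is not a witness...
assert not in_span(s, [])
# ...instead the witness is e2, which lies in span(e1+e2, e1).
assert in_span(e2, [s, e1])
```

In both orderings *some* vector is a combination of the others, but *which* vector lies in the span of its predecessors changes with the permutation.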

You’re essentially correct. But it’s another way of saying $v_i\neq 0$ for some $v_{i}$, not all $v_{i}$.
