
Find a set of vectors {u, v} in $\mathbb R^4$ that spans the solution set of the equations

$x – y + 2z +3w = 0$

$4x + 2y – z + 3w = 0$

$u = \begin{bmatrix} \phantom{0}\\\phantom{0}\\\phantom{0}\\\phantom{0}\end{bmatrix}, \quad v = \begin{bmatrix} \phantom{0}\\\phantom{0}\\\phantom{0}\\\phantom{0}\end{bmatrix}$ (the entries are left blank in the exercise)

How exactly can I do that, and what do they mean by a set "that *spans*" the solution set?

- Prove that if matrix $A$ is nilpotent, then $I+A$ is invertible.
- Is the unique least norm solution to $Ax=b$ the orthogonal projection of b onto $R(A)$?
- Obtain a basis of invertible matrices for $M_n(D)$, where $D$ is an integral domain
- Where does the Pythagorean theorem “fit” within modern mathematics?
- Polynomial Interpolation and Security
- basis of a vector space
- Linearly independent functionals
- Can the induced matrix norms be induced by inner products?
- Rotation of conics sections using linear algebra
- What is the significance of reversing the polarity of the negative eigenvalues of a symmetric matrix?

First off, to solve this you do exactly what you’ve been doing in all of your other problems. Since you weren’t sure of your answer, I went ahead and worked it out for you.

The augmented matrix equation that you’re trying to solve is this one:

$$\left[\begin{array}{cccc|c} 1 & -1 & 2 & 3 & 0 \\ 4 & 2 & -1 & 3 & 0\end{array}\right]$$

Now I’ll do Gaussian elimination to solve it. Note that I’m going to leave off the final column of $0$’s because no elementary row operation will actually change them. So just imagine that column still being there:

$$\begin{align}\begin{bmatrix} 1 & -1 & 2 & 3 \\ 4 & 2 & -1 & 3\end{bmatrix} &\sim_{R_2 \to R_2-4R_1} \begin{bmatrix} 1 & -1 & 2 & 3 \\ 0 & 6 & -9 & -9\end{bmatrix} \\ &\sim_{R_2\to \frac 16R_2} \begin{bmatrix} 1 & -1 & 2 & 3 \\ 0 & 1 & -\frac 32 & -\frac 32\end{bmatrix} \\ &\sim_{R_1\to R_1+R_2} \begin{bmatrix} 1 & 0 & \frac 12 & \frac 32 \\ 0 & 1 & -\frac 32 & -\frac 32\end{bmatrix}\end{align}$$

This is the RREF of your matrix. Now we see that columns $3$ and $4$ don't have pivots, so $z$ and $w$ are free variables: set $z=s$ and $w=t$ with $s,t\in \Bbb R$. Then we write down the equations that the above matrix represents and substitute these new variables:

$$\begin{cases} x +\frac 12z+\frac 32w=0 \\ y-\frac 32z-\frac 32w=0\end{cases} \iff \begin{cases} x= -\frac 12s-\frac 32t \\ y=\frac 32s+\frac 32 t\end{cases}$$

So each element of the solution set is of the form $$\begin{bmatrix} x \\ y \\ z \\ w\end{bmatrix} = \begin{bmatrix} -\frac 12s-\frac 32t \\ \frac 32s+\frac 32 t \\ s \\ t\end{bmatrix} = s\begin{bmatrix} -\frac 12 \\ \frac 32 \\ 1 \\ 0\end{bmatrix} + t\begin{bmatrix} -\frac 32 \\ \frac 32 \\ 0 \\ 1\end{bmatrix},\quad s,t\in\Bbb R$$

Thus the set $\{(-\frac 12, \frac 32, 1, 0),(-\frac 32,\frac 32, 0, 1)\}$ spans the space. But fractions are a little annoying. So what we can do is multiply each vector by $2$. Really what we’re doing here is defining new variables $s’=\frac 12s$ and $t’=\frac 12t$. Then plugging these into the above and moving the constant inside the vectors we get $$\bbox[5px,border:2px solid red]{\{(-1,3,2,0),(-3,3,0,2)\}}$$

Note that these vectors are not unique. There are infinitely many other pairs of vectors which will span this space. For instance, Vineet's solution is another perfectly valid pair.
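As a quick sanity check (a sketch, assuming NumPy is available), we can verify numerically that both boxed vectors solve the system and are linearly independent:

```python
import numpy as np

# Coefficient matrix of the homogeneous system
A = np.array([[1, -1,  2, 3],
              [4,  2, -1, 3]])

# The scaled spanning vectors from the boxed answer
u = np.array([-1, 3, 2, 0])
v = np.array([-3, 3, 0, 2])

# Both must lie in the null space of A ...
assert np.all(A @ u == 0)
assert np.all(A @ v == 0)

# ... and be linearly independent (rank 2 when stacked as rows)
assert np.linalg.matrix_rank(np.vstack([u, v])) == 2
```

Two independent vectors inside a two-dimensional solution set necessarily span it, so these checks are enough.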

Now that we've seen that this exercise just asks you to solve the system the same way as all the others you've done so far, I'll tackle your question about what the word *span* means. Let $S=\{\mathbf u_1, \dots, \mathbf u_k\}\subset \Bbb R^n$.

**Definition:** A *linear combination* of $S$ is a vector $\mathbf w$ such that $$\mathbf w = w_1\mathbf u_1 + \cdots + w_k\mathbf u_k$$ for some $w_1, \dots, w_k \in \Bbb R$.

Example: The vector $(-7,-2,3)$ is a linear combination of $\{(1,2,3),(3,2,1)\}$ because $$(-7,-2,3) = 2(1,2,3)-3(3,2,1)$$

However $(1,0,0)$ is **not** a linear combination of $\{(1,2,3),(3,2,1)\}$ because there are no numbers $a,b$ such that $$(1,0,0) = a(1,2,3)+b(3,2,1)$$
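Such membership checks can be done mechanically. Here is a small sketch (assuming NumPy) that solves for the coefficients by least squares: when the target is in the span the fit is exact, and when it is not, even the best fit misses the target:

```python
import numpy as np

# Columns are the vectors (1,2,3) and (3,2,1)
M = np.array([[1, 3],
              [2, 2],
              [3, 1]], dtype=float)

# (-7,-2,3) IS a linear combination: least squares recovers a=2, b=-3 exactly
coef, _, _, _ = np.linalg.lstsq(M, np.array([-7., -2., 3.]), rcond=None)
assert np.allclose(coef, [2, -3])
assert np.allclose(M @ coef, [-7, -2, 3])

# (1,0,0) is NOT: the best least-squares fit still misses the target
coef2, _, _, _ = np.linalg.lstsq(M, np.array([1., 0., 0.]), rcond=None)
assert not np.allclose(M @ coef2, [1, 0, 0])
```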

**Definition:** The *span* of $S$ is the set of **all** linear combinations of $S$. I.e. $$\operatorname{span}(S) = \{\mathbf w \in \Bbb R^n \mid\mathbf w = w_1\mathbf u_1 + \cdots + w_k\mathbf u_k \text{ for some } w_1, \dots, w_k \in \Bbb R\}$$

Examples:

$$\begin{align}(1)\quad &\operatorname{span}\{(1,0,0),(0,1,0)\} = \operatorname{span}\{(10,9,0),(e,-\pi,0)\} = \text{the $xy$-plane} \\ (2)\quad &\operatorname{span}\{(1,2,3)\} = \text{the line containing $O$ parallel to $(1,2,3)$}\end{align}$$

**Lemma:** The span of any set of vectors in $\Bbb R^n$ is a subspace of $\Bbb R^n$. (*Note: I’m not going to prove this lemma here, it’s just important to know.*)

So this question is asking you to find a set of two vectors whose span is the solution set to your system of linear equations.
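The whole hand computation above can also be reproduced with a computer algebra system; a minimal sketch, assuming SymPy is available:

```python
from sympy import Matrix, Rational

# Coefficient matrix of the homogeneous system
A = Matrix([[1, -1, 2, 3],
            [4, 2, -1, 3]])

# nullspace() performs the same RREF reduction carried out by hand above
basis = A.nullspace()

# Two free variables, hence two basis vectors, each solving the system
assert len(basis) == 2
for b in basis:
    assert A * b == Matrix([0, 0])

# The basis matches the hand computation (before scaling away fractions)
assert basis[0] == Matrix([Rational(-1, 2), Rational(3, 2), 1, 0])
assert basis[1] == Matrix([Rational(-3, 2), Rational(3, 2), 0, 1])
```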

Let $x$ and $y$ be the free variables. To get $u$, put $x=0$, $y=1$; for $v$, put $x=1$, $y=0$; in each case solve the resulting two equations for $z$ and $w$. Finally you will get

$u=[0,3,3,-1]^T$ and $v=[1,0,1,-1]^T$

Please note that the magnitudes of these vectors are irrelevant, so you can multiply them by any nonzero scalar you want. In the case of $u$, I multiplied the vector by $3$ to clear the fraction.

Hope this will be helpful!
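Since two different-looking pairs of vectors have now been proposed, it is worth checking that they span the same subspace; a sketch assuming NumPy:

```python
import numpy as np

A = np.array([[1, -1,  2, 3],
              [4,  2, -1, 3]])

# The pair from this answer ...
u1 = np.array([0, 3, 3, -1])
v1 = np.array([1, 0, 1, -1])
# ... and the pair from the other answer
u2 = np.array([-1, 3, 2, 0])
v2 = np.array([-3, 3, 0, 2])

# Each vector solves the system
for w in (u1, v1, u2, v2):
    assert np.all(A @ w == 0)

# All four vectors stacked together still have rank 2, so the two
# pairs span the same 2-dimensional subspace
assert np.linalg.matrix_rank(np.vstack([u1, v1, u2, v2])) == 2
```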

A remark on what this means: the set of solutions $(x, y, z, w)\in\Bbb{R}^4$ of an equation of the type $ax + by + cz + dw = 0$, with $a,b,c,d\in \Bbb{R}$, is the kernel of the linear form
$$(x, y, z, w)\mapsto ax + by + cz + dw.$$
By the rank-nullity theorem (in the case where $a, b, c, d$ are not all zero), this kernel is a hyperplane of $\Bbb{R}^4$, i.e. a subspace of dimension $3$.

The solution set of $n$ equations of this type is the subspace obtained as the intersection of the solution sets of the individual equations. In your case $n=2$, and the question presupposes that this intersection can be generated by two vectors $u$ and $v$; that is, every solution has the form $au + bv$ with $a,b\in\Bbb{R}$.
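The dimension count behind this argument can be checked numerically; a sketch assuming NumPy:

```python
import numpy as np

# Each equation cuts out a hyperplane (the kernel of a rank-1 linear form).
# By rank-nullity, the solution set of the whole system has dimension
# 4 - rank(A); here the two rows are independent, so that dimension is 2.
A = np.array([[1, -1,  2, 3],
              [4,  2, -1, 3]])
rank = np.linalg.matrix_rank(A)
assert rank == 2
assert A.shape[1] - rank == 2  # two spanning vectors u, v suffice
```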
