
Let’s say we have a basis $B$ of $\mathbb{R}^n$ consisting of vectors $\vec{v}_1$ through $\vec{v}_n$, and some other basis $C$ of $\mathbb{R}^n$. Then, would $[\vec{v}_1]_C$ through $[\vec{v}_n]_C$ be a basis for $\mathbb{R}^n$ as well? It seems obvious, but I am not sure how to go about this.

I am trying to prove this by building a change-of-basis matrix $S$ whose columns are $[\vec{v}_1]_C$ through $[\vec{v}_n]_C$, so that $CS = B$. I know that the kernels of both $C$ and $B$ are zero, since they are square matrices whose columns are linearly independent (being bases), but I am not sure what that tells me about $S$. Maybe there is a way to show that a matrix with independent columns ($C$) times a matrix with dependent columns ($S$) can never yield a matrix with independent columns ($B$).


Yes: the coordinate vectors of $\mathbf{v}_1,\ldots,\mathbf{v}_n$ with respect to the basis $C$ will be a basis for $\mathbb{R}^n$.

Note that you are abusing notation somewhat: if $B$ and $C$ are bases, then they aren’t matrices; what you really want is to have a basis $\beta$, and $B$ is the matrix whose columns are the vectors in $\beta$; and another basis $\gamma$, and $C$ is the matrix whose columns are the vectors in $\gamma$.

From what you have: since $CS = B$, and the kernel of $B$ is zero, then so is the kernel of $S$: if $\mathbf{x}$ lies in the kernel of $S$, then

$$\mathbf{0} = C\mathbf{0} = C(S\mathbf{x}) = (CS)\mathbf{x} = B\mathbf{x}.$$

But since the kernel of $B$ is zero, that means that $\mathbf{x}=\mathbf{0}$. So the kernel of $S$ is zero, which means that its columns are linearly independent, hence a basis. But the columns of $S$ are the coordinate vectors of the elements of $\beta$ written in terms of $\gamma$, which gives your result.
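As a sanity check, the argument can be sketched numerically with NumPy. The random matrices $B$ and $C$ below are illustrative stand-ins for the basis matrices (a random Gaussian matrix is invertible with probability 1, and we assert it anyway); solving $CS = B$ recovers the matrix $S$ whose columns are the coordinate vectors $[\mathbf{v}_i]_\gamma$, and its full rank confirms they form a basis:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# Illustrative bases: the columns of B and C are the basis vectors.
B = rng.standard_normal((n, n))
C = rng.standard_normal((n, n))
assert abs(np.linalg.det(B)) > 1e-9 and abs(np.linalg.det(C)) > 1e-9

# Solve C S = B; column i of S is the coordinate vector [v_i]_C.
S = np.linalg.solve(C, B)
assert np.allclose(C @ S, B)

# S has zero kernel (full rank), so its columns form a basis.
assert np.linalg.matrix_rank(S) == n
```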

For your final question, yes, you are correct. Here’s a general statement you can prove (using the same technique as above):

If $A$ is $n\times p$, $B$ is $p\times m$, and $C$ is $n\times m$ with $AB=C$, then $\mathrm{ker}(B)\subseteq\mathrm{ker}(C)$. That is, if $\mathbf{x}\in\mathbb{R}^m$ is in the kernel of $B$, then it is also in the kernel of $C$.
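A small NumPy illustration of this statement, with hypothetical matrices $A$ and $B$ chosen so that $B$ has a nontrivial kernel; any kernel vector of $B$ is automatically a kernel vector of $C = AB$:

```python
import numpy as np

# Hypothetical example: A is 3x2, B is 2x4 with a nontrivial kernel.
A = np.array([[1., 0.], [2., 1.], [0., 3.]])
B = np.array([[1., 2., 0., 1.],
              [0., 0., 1., 1.]])
C = A @ B

# x = (2, -1, 0, 0) lies in ker(B) ...
x = np.array([2., -1., 0., 0.])
assert np.allclose(B @ x, 0)

# ... hence also in ker(C), since C x = A (B x) = A 0 = 0.
assert np.allclose(C @ x, 0)
```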

Another way of attacking the problem is to prove that $[\mathbf{v}_1]_{\gamma},\ldots,[\mathbf{v}_n]_{\gamma}$ are linearly independent directly. But this is easy: suppose that

$$\alpha_1[\mathbf{v}_1]_{\gamma} + \cdots + \alpha_n[\mathbf{v}_n]_{\gamma} = \mathbf{0}.$$ Then we have:

$$\mathbf{0} = \alpha_1[\mathbf{v}_1]_{\gamma} + \cdots + \alpha_n[\mathbf{v}_n]_{\gamma} = [\alpha_1\mathbf{v}_1+\cdots+\alpha_n\mathbf{v}_n]_{\gamma}.$$

But this means that $\alpha_1\mathbf{v}_1+\cdots+\alpha_n\mathbf{v}_n=\mathbf{0}$, and since $\mathbf{v}_1,\ldots,\mathbf{v}_n$ are linearly independent, that means $\alpha_1=\cdots=\alpha_n=0$.

In essence, it’s the same argument as you have above, but playing directly with the vectors instead of the associated change-of-basis matrices.

Yes. If $SB = C$ then $S = CB^{-1}$, which is also an invertible matrix since $B$ and $C$ are invertible.
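This identity is easy to check numerically; the random invertible $B$ and $C$ below are illustrative assumptions, not tied to any particular bases:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3
B = rng.standard_normal((n, n))
C = rng.standard_normal((n, n))

# If S B = C then S = C B^{-1}, invertible as a product of invertibles.
S = C @ np.linalg.inv(B)
assert np.allclose(S @ B, C)
assert abs(np.linalg.det(S)) > 1e-12
```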

If $B$ and $C$ are bases, there is a change-of-basis matrix $S$ (or rather, a linear map $f$ associated to $S$) that takes $B$ to $C$.

But $f$ is not only a map that takes this one basis to another: it takes any basis to another basis.

If $(e_1,\ldots,e_n)$ is the standard basis of $\mathbb{R}^n$, then $f$ takes it to a new basis whose vectors are the columns of its matrix representation $S$.

The maps $f$ that take bases to bases are precisely the bijective linear maps from $\mathbb{R}^n$ to $\mathbb{R}^n$.

It is easy to check that if $f$ is a change of basis, then it is bijective; and if $f$ is bijective, it takes any basis to another basis.
