
I earlier asked this question The basis of a matrix representation. I now have another question related to the same topic.

I have seen the vector space $\Bbb{R}^n$ defined as the set of all $n$-tuples of real numbers$^1$. But I think a more intuitive definition would be that $\Bbb{R}^n$ is simply the $n$-dimensional real space, such as a plane for $n=2$ or a line for $n=1$. My problem with the first definition is that it seems like we have already specified a basis, namely $(i,j,k)$ for $n=3$, which themselves have no meaning as column vectors until we define them to be $(1,0,0),(0,1,0)$ and $(0,0,1)$ respectively. Thus if we write vectors in $\Bbb{R}^n$ as column vectors, we have already chosen our basis, and the tuples are therefore not basis-independent objects.

So my point is that if we change basis, the $n$-tuples will change, but the actual ‘arrow’ from one point to another in the $n$-dimensional space will not. So surely this ‘arrow’ is the actual element of $\Bbb{R}^n$, and the $n$-tuple is just its coordinate representation.


Is this right or wrong? If it is right, could you please give me a source that states it explicitly? I have tried to find one without success.

$^1$ from http://www.math.vt.edu/people/dlr/m2k_svb01_vecspc.pdf


No, it is wrong to consider $\Bbb R^n$ as an abstract vector space of dimension$~n$ without given basis. There is no concrete vector space that *precisely* models *the* abstract vector space of dimension$~n$ without given basis, though one can give many examples of vector spaces of dimension$~n$ that do not obviously come equipped with a basis. That is what abstraction is about; if you want to work in a basis independent way, just start “let $V$ be an $\Bbb R$-vector space of dimension$~n$” and you won’t have a preferred basis.

But $\Bbb R^n$ is a concrete “gold standard” vector space that can serve as reference; any *other* vector space of dimension$~n$ is brought into isomorphism with $\Bbb R^n$ precisely by choosing an ordered basis in it. But $\Bbb R^n$ does not need this; it is in canonical bijection with itself (by the identity map), obviously. So no choice of basis is needed in $\Bbb R^n$. If you *must* talk about bases, then the mentioned identity isomorphism corresponds to the standard basis of $\Bbb R^n$, but this really serves as a “unit to use for quantities that need no unit”. So talking about things in $\Bbb R^n$ being basis independent is really meaningless, as no basis is needed.

I’ve said more about this in this answer, from which I will here just paraphrase: the spaces $\Bbb R^n$ are the worst possible examples of vector spaces from which to learn the meaning and usefulness of bases. (Unfortunately, they are also the easiest examples to talk about concretely.) When we use a basis of $\Bbb R^n$ other than the standard basis, we get from it an isomorphism $\Bbb R^n\to\Bbb R^n$ other than the identity. So one needs to distinguish between “abstract vectors” on one side of the isomorphism and “their coordinates” on the other side, even though both live in the same space and are in general unequal. You are quite right that this is utterly confusing. So when doing things basis-independently, *don’t work in $\Bbb R^n$*.

$\Bbb R^n$ is the set of all $n$-tuples. But because there is a theorem which says that every $n$-dimensional vector space over the reals is isomorphic to $\Bbb R^n$, mathematicians (being lazy) don’t often distinguish between their $n$-dimensional vector space $V$ and $\Bbb R^n$ unless they have to.

So while $\Bbb R^n$ is defined to mean the set of all $n$-tuples, it’s generally ALSO conflated with Euclidean space, “arrow” space (the set of all oriented line segments), and “coordinate” space.

$\Bbb R^n$ does not have an intrinsic basis. We could in fact choose the basis $\left\{ f_1=\begin{bmatrix} 1 \\ 1 \\ 0 \end{bmatrix}, f_2=\begin{bmatrix} 0 \\ 1 \\ 1 \end{bmatrix}, f_3 = \begin{bmatrix} 1 \\ 0 \\ 1\end{bmatrix}\right\}$ or an uncountably infinite number of others for $\Bbb R^3$. It just turns out that the standard basis is more easily manipulated.
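To make this concrete, here is a minimal numerical sketch (my own illustration, not part of the answer) that computes the coordinates of a vector in the basis $\{f_1,f_2,f_3\}$ above. The matrix `F` has $f_1,f_2,f_3$ as its columns, and the test vector `v` is an arbitrary choice:

```python
import numpy as np

# Columns are the basis vectors f_1, f_2, f_3 from the text.
F = np.array([[1, 0, 1],
              [1, 1, 0],
              [0, 1, 1]], dtype=float)

v = np.array([2.0, 2.0, 2.0])    # a vector given in standard coordinates

# Coordinates of v in the basis {f_1, f_2, f_3}: solve F @ c = v.
c = np.linalg.solve(F, v)
print(c)                         # [1. 1. 1.], since f_1 + f_2 + f_3 = (2, 2, 2)
```

The same vector thus has coordinates $(2,2,2)$ in the standard basis but $(1,1,1)$ in $\{f_1,f_2,f_3\}$; the arrow itself never changed.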

I am not sure that I completely understand what you are asking. But, anyway…

It is all about definitions. You can take, as a definition, that

$$\mathbb{R}^n = \left\{\pmatrix{x_1 \\ x_2 \\ \vdots \\ x_n} \mid x_i\in \mathbb{R}\right\}.$$

This defines what $\mathbb{R}^n$ is as a set. We make this into a vector space by defining the usual vector addition and scalar multiplication. All this we do without even thinking about what a basis is.

After defining what $\mathbb{R}^n$ is we then realize that we can write all the elements/vectors in $\mathbb{R}^n$ in a unique way as a linear combination of the vectors

$$

e_1 = \pmatrix{1 \\ 0 \\ \vdots \\ 0},\quad e_2 = \pmatrix{0 \\ 1 \\ \vdots \\ 0},\quad \dots,\quad e_n = \pmatrix{0 \\ 0 \\ \vdots \\ 1}.

$$

Saying this is exactly saying that the collection of vectors $\{e_1,\dots,e_n\}$ is a basis.

Now, it turns out that you can find other bases. No matter what basis you pick, the elements of the space are the same. The space/set doesn’t depend on the choice of basis that you might (or might not) make.

After this we now understand what $\mathbb{R}^n$ is and what a basis for $\mathbb{R}^n$ is. It turns out that it makes sense to talk about general vector spaces. $\mathbb{R}^n$ is just an example of a vector space.

A general vector space is a set that satisfies certain requirements: http://en.wikipedia.org/wiki/Vector_space#Definition

It then turns out that if $U$ is a general vector space of dimension $n$, then as vector spaces $U$ is isomorphic to $\mathbb{R}^n$. They are not equal, but isomorphic. So in a sense there is only one vector space of dimension $n$ (for each $n$). We say that up to isomorphism there is only one vector space of dimension $n$.
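As an illustration of this isomorphism (my own example, not from the answer): the real polynomials of degree less than $3$ form a $3$-dimensional vector space, and sending $a_0+a_1x+a_2x^2$ to the tuple $(a_0,a_1,a_2)$ is a linear bijection onto $\mathbb{R}^3$. A quick NumPy check that the coefficient map respects addition:

```python
import numpy as np

# Assumed example: polynomials a0 + a1*x + a2*x^2 of degree < 3 form a
# 3-dimensional real vector space, isomorphic to R^3 via the coefficient map.
def coords(p):
    """Coordinates of a polynomial (a0, a1, a2) in the basis {1, x, x^2}."""
    return np.array(p, dtype=float)

p = (1.0, 2.0, 0.0)   # 1 + 2x
q = (0.0, 1.0, 3.0)   # x + 3x^2

# The coordinate map is linear: coords(p + q) == coords(p) + coords(q).
sum_pq = tuple(a + b for a, b in zip(p, q))
print(np.array_equal(coords(sum_pq), coords(p) + coords(q)))   # True
```

The two spaces are isomorphic but not equal: a polynomial is not literally a triple of numbers, which is exactly the distinction made above.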

I would say that $\mathbb{R}^n$ is the unique $n$-dimensional $\mathbb{R}$-vector space (up to isomorphism).

Now there are several ways to define it. As a Cartesian product, it is the set

$$

\{f\colon\{1,\dots,n\}\to\mathbb{R} \mid f\text{ is a function}\}

$$

Then, for convenience, we usually write the elements of $\mathbb{R}^n$ as $n$-tuples.

However, in full generality, suppose we have a field $k$ and a finite-dimensional $k$-vector space $V$. Then, by standard results, we know that $V$ has a basis $\mathscr{B}=\{u_1,\dots,u_n\}$. In particular, for any vector $u\in V$ we have

$$

u=\sum_{i=1}^n\lambda_iu_i,

$$

where $\lambda_i\in k$. Thus we can define the coordinate function:

$$

c_{\mathscr{B}}\colon V\to k^n,\ u\mapsto(\lambda_1,\dots,\lambda_n),

$$

which is an isomorphism of vector spaces. (This is the reason why I said that $\mathbb{R}^n$ is the unique $n$-dimensional $\mathbb{R}$-vector space.)

If we consider another basis $\mathscr{B}'=\{v_1,\dots,v_n\}$, then we have another coordinate function $c_{\mathscr{B}'}$.

If we want to change basis then we have to extend by linearity the map:

$$

\varphi_{\mathscr{B},\mathscr{B}'}\colon V\to V,\ u_i\mapsto v_i.

$$

In coordinates, this map can be seen as $c_{\mathscr{B}'}\circ\varphi_{\mathscr{B},\mathscr{B}'}\circ c_{\mathscr{B}}^{-1}\colon k^n\to k^n$. So this is the map you apply to your $n$-tuples of numbers when you change basis.

Notice that if you consider $\mathbb{R}^2$ as the plane, draw two orthogonal axes, and draw the vector (or arrow) $(3,2)$, then this is $3(1,0)+2(0,1)$. If you change basis, the coordinates of this vector will change, but the arrow itself will not. Try drawing this example, and then the same arrow in the basis $\{(1,0),(1,1)\}$, so that $(3,2)=1\cdot(1,0)+2\cdot(1,1)$.
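The planar example above can be checked numerically; a quick sketch (my own, assuming NumPy), where the columns of `P` are the new basis vectors $(1,0)$ and $(1,1)$:

```python
import numpy as np

# Columns of P are the basis vectors (1, 0) and (1, 1) from the example.
P = np.array([[1, 1],
              [0, 1]], dtype=float)

v = np.array([3.0, 2.0])       # the arrow (3, 2) in standard coordinates

# Its coordinates in the basis {(1, 0), (1, 1)}: solve P @ c = v.
c = np.linalg.solve(P, v)
print(c)                       # [1. 2.], i.e. (3, 2) = 1*(1, 0) + 2*(1, 1)
```

The tuple of numbers changes from $(3,2)$ to $(1,2)$, but both describe the same arrow in the plane.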

The definition involves the Cartesian product of sets: $\mathbb{R}^{n}=\underset{n\text{ times}}{\underbrace{\mathbb{R}\times\dots\times\mathbb{R}}}$, which consists of ordered $n$-tuples. Elements of $\mathbb R$ happen to be real numbers. So to me, it doesn’t seem “like we have already specified a basis”. We haven’t even said yet that it is a vector space.
