Let’s take $n$ vectors in $\mathbb{R}^n$ at random. What is the probability that these vectors are linearly independent? (i.e. they form a basis of $\mathbb{R}^n$)

(Of course the problem is equivalent to: "given a matrix taken at random from $M_{\mathbb{R}}(n,n)$, what is the probability that its determinant is $\neq 0$?")

I don’t know whether this question is difficult to answer or not. Please share any information about it! 🙂

(The $n$ vectors are meant to have real entries, but I’m also interested in solutions over $\mathbb{N}$, $\mathbb{Q}$, or whatever field you like.)
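The determinant/rank criterion in the question can be checked numerically. A minimal sketch using NumPy; the dimension, seed, and the choice of a standard Gaussian distribution are illustrative assumptions, not part of the question:

```python
import numpy as np

# Illustration of the equivalence: n vectors in R^n are linearly
# independent exactly when the n x n matrix having them as rows has
# nonzero determinant (equivalently, full rank).
rng = np.random.default_rng(0)  # assumed seed, for reproducibility

n = 4
vectors = rng.standard_normal((n, n))  # n random vectors as rows

det = np.linalg.det(vectors)
rank = np.linalg.matrix_rank(vectors)

# The two criteria agree: nonzero determinant iff rank == n.
print(det != 0, rank == n)
```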

As others have pointed out, the main problem is what “taking a vector at random” means. Probability theory requires that one specify a probability measure on ${\mathbb R}^n$ before one can make any predictions about outcomes of experiments concerning the chosen vectors. E.g., if it is totally unlikely, meaning the probability is zero, that a vector with $x_n\ne 0$ is chosen, then the probability that $n$ vectors chosen independently are linearly independent is $0$, since with probability $1$ they all lie in the plane $x_n=0$.

A reasonable starting point is to install a *rotationally invariant* probability measure. Since the lengths of the $n$ chosen vectors do not affect their linear dependence or independence, this means that we are choosing $n$ independent vectors uniformly distributed on the sphere $S^{n-1}$. (This informal description has a precise mathematical meaning.)

Under this hypothesis the probability that the $n$ chosen vectors $X_k$ are linearly independent is $=1$.

*Proof.* The first vector $X_1$ is linearly independent with probability $1$, as $|X_1|=1$. Assume that $1< r\leq n$ and that the first $r-1$ vectors are linearly independent with probability $1$. Then with probability $1$ these $r-1$ vectors span a subspace $V$ of dimension $r-1$, which intersects $S^{n-1}$ in an $(r-2)$-dimensional “subsphere” $S_V^{r-2}$. This subsphere has $(n-1)$-dimensional measure $0$ on $S^{n-1}$. Therefore the probability that $X_r$ lies in this subsphere is zero. It follows that with probability $1$ the vectors $X_1$, $\ldots$, $X_{r-1}$, $X_r$ are linearly independent.
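The inductive argument above can be illustrated with a Monte Carlo sketch (an empirical check, not a proof); the dimension, seed, and number of trials are assumptions for illustration. Each row is made uniform on $S^{n-1}$ by normalising a standard Gaussian vector:

```python
import numpy as np

# Monte Carlo check: sample n vectors uniformly on S^{n-1} and count
# how often they are linearly independent (full rank).
rng = np.random.default_rng(1)  # assumed seed, for reproducibility

n, trials = 5, 10_000
full_rank = 0
for _ in range(trials):
    X = rng.standard_normal((n, n))
    X /= np.linalg.norm(X, axis=1, keepdims=True)  # rows uniform on S^{n-1}
    if np.linalg.matrix_rank(X) == n:
        full_rank += 1

print(full_rank / trials)  # empirically 1.0, matching the probability-1 claim
```

Numerically, a Gaussian matrix is essentially never rank-deficient, so every trial should come out full rank, in line with the result above.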

This depends on how you take the vectors. For the case $n=2$, for example, given a vector $v_1$, the probability of picking a vector linearly dependent with $v_1$ is $0$. So the probability of getting two linearly independent vectors is $1$.
