
Suppose $(e_1,e_2,\dots,e_n)$ is an orthonormal basis of the inner product space $V$ and $v_1,v_2,\dots,v_n$ are vectors of $V$ such that $$||e_j-v_j||< \frac{1}{\sqrt{n}}$$ for each $j \in \left\{1,2,\dots,n \right\}$. Prove that $(v_1,v_2,\dots,v_n)$ is a basis of $V$.

I am completely lost and just starting to learn about inner product spaces. Could someone provide a proof with the explanation of how you got there?


Note that it’s enough to show that the $v_i$ are linearly independent, since $n$ linearly independent vectors in the $n$-dimensional space $V$ form a basis. Suppose, for contradiction, that there are scalars $c_1,\dots,c_n$, not all zero, such that $\sum_ic_iv_i=0$. Then

$$ 0=\sum_{i}c_iv_i=\sum_{i}c_i(v_i-e_i)+\sum_ic_ie_i$$

hence

$$ \Big|\Big|\sum_{i}c_i(v_i-e_i)\Big|\Big|=\Big|\Big|\sum_{i}c_ie_i\Big|\Big|$$

However,

$$\Big|\Big|\sum_{i}c_i(v_i-e_i)\Big|\Big|\leq \sum_i|c_i|\cdot ||v_i-e_i||<\frac{1}{\sqrt{n}}\sum_{i=1}^n|c_i|\leq \Big[\sum_{i=1}^n|c_i|^2\Big]^{\frac{1}{2}} $$

(with the last step using Cauchy-Schwarz), while since the $e_i$ are orthonormal we have

$$ \Big|\Big|\sum_{i}c_ie_i\Big|\Big|^2=\sum_{i=1}^n|c_i|^2 $$

so that

$$ \Big|\Big|\sum_{i}c_ie_i\Big|\Big|=\Big[\sum_{i=1}^n|c_i|^2\Big]^{\frac{1}{2}} $$

This contradiction shows that the $v_i$ are linearly independent.
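The argument above can be sanity-checked numerically. The following sketch (assuming NumPy; not part of the proof) perturbs the standard orthonormal basis of $\mathbb{R}^n$ by random vectors of norm strictly less than $1/\sqrt{n}$ and confirms the perturbed family is still a basis:

```python
# Numerical sanity check: perturb an orthonormal basis of R^n by vectors
# with norm strictly below 1/sqrt(n) and verify the result is still a basis.
import numpy as np

rng = np.random.default_rng(0)
n = 5
E = np.eye(n)  # columns are the standard orthonormal basis e_1, ..., e_n

# Random perturbations d_j, each rescaled so that ||d_j|| = 0.99/sqrt(n).
D = rng.standard_normal((n, n))
D *= (0.99 / np.sqrt(n)) / np.linalg.norm(D, axis=0)
V = E + D  # columns are v_j = e_j + d_j

# The hypothesis ||e_j - v_j|| < 1/sqrt(n) holds for every column j.
assert np.all(np.linalg.norm(V - E, axis=0) < 1 / np.sqrt(n))

rank = np.linalg.matrix_rank(V)
print(rank)  # 5: the v_j are linearly independent, hence a basis
```

In matrix terms the proof says $\|D\|_2 < 1$, so $V = I + D$ is invertible; the rank computation just checks this for one random instance.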

Hint (not sure if this will work or not):

Since there are $n$ vectors in the set $v_1, \ldots, v_n$, you only need to show that they are linearly independent. So, suppose $\lambda_1 v_1 + \cdots + \lambda_n v_n = 0$. We need to show that this implies $\lambda_1 = \cdots = \lambda_n = 0$. Write $v_i = e_i + d_i$, where the difference vectors $d_i$ are small ($||d_i|| < \frac{1}{\sqrt{n}}$). Try to deduce that $\lambda_1 e_1 + \cdots + \lambda_n e_n = 0$.

Start with $\sum_i \lambda_iv_i = 0$, assume some $\lambda_i\neq 0$, and denote $d_i = v_i - e_i$ (so that $v_i = e_i + d_i$).

$\Rightarrow \sum_i \lambda_i(e_i +d_i) =0$

$\Rightarrow \sum_i \lambda_ie_i =- \sum_i \lambda_id_i$

$\Rightarrow ||\sum_i\lambda_i e_i ||= ||\sum_i \lambda_id_i||$

(Take $d=\max_i\{||d_i||\}$; note that $d < \frac{1}{\sqrt{n}}$.)

$$\Rightarrow \Big|\Big|\sum_i\lambda_i e_i \Big|\Big|^2= \Big|\Big|\sum_i \lambda_id_i\Big|\Big|^2
\leq \Big(\sum_i |\lambda_i|\cdot ||d_i||\Big)^2
\leq \Big(\sum_i |\lambda_i|\cdot d\Big)^2
< \Big(\sum_i |\lambda_i|\cdot \frac{1}{\sqrt{n}}\Big)^2
= \frac{1}{n} \Big(\sum_i |\lambda_i| \Big)^2
\leq \frac{1}{n}\cdot n \sum_i |\lambda_i|^2$$

The last inequality uses the Cauchy–Schwarz inequality, $\big(\sum_i |\lambda_i|\big)^2 \leq n\sum_i |\lambda_i|^2$.

Now, since $\{e_i\}$ is an orthonormal basis, we know that $||\sum_i\lambda_i e_i ||^2= \sum_i|\lambda_i|^2$, and thus:

$\sum_i|\lambda_i|^2= ||\sum_i\lambda_i e_i ||^2< \frac{1}{n}\cdot n (\sum_i |\lambda_i|^2) = \sum_i|\lambda_i|^2$.

Contradiction.
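As a side note, the constant $1/\sqrt{n}$ in the hypothesis is sharp: if the strict inequality is weakened to $||e_j - v_j|| \leq \frac{1}{\sqrt{n}}$, the conclusion can fail. A quick numerical illustration (assuming NumPy; this equality-case example is my own, not from the answers above): take $u = \frac{1}{\sqrt{n}}\sum_i e_i$ and $v_j = e_j - \frac{1}{\sqrt{n}}u$, so each $||e_j - v_j|| = \frac{1}{\sqrt{n}}$ exactly, yet $\sum_j v_j = 0$.

```python
# Equality case of the bound: with ||e_j - v_j|| = 1/sqrt(n) exactly,
# the perturbed vectors can be linearly dependent.
import numpy as np

n = 4
E = np.eye(n)                       # columns e_1, ..., e_n
u = E.sum(axis=1) / np.sqrt(n)      # unit vector (e_1 + ... + e_n)/sqrt(n)
V = E - u[:, None] / np.sqrt(n)     # columns v_j = e_j - u/sqrt(n)

norms = np.linalg.norm(E - V, axis=0)
print(np.allclose(norms, 1 / np.sqrt(n)))  # True: ||e_j - v_j|| = 1/sqrt(n)
print(np.allclose(V.sum(axis=1), 0))       # True: v_1 + ... + v_n = 0
```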
