My professor said that for an $n \times n$ matrix $A$, if $\operatorname{null}(A) = \mathbb{R}^n$, then $A = 0_{n}$. Why is this true? I understand what it's saying: if every vector times this matrix is zero, then the matrix has to be zero.

The intuition is simple enough with numbers, but could someone explain why this is true for matrices? Thanks

If $\operatorname{null}(A)=\Bbb R^n$, then for all $x\in \Bbb R^n$ it holds that $Ax=0_{\Bbb R^n}$. In particular for all $i\in \{1, \ldots, n\}$ it is true that $Ae_i=0_{\Bbb R^n}$.

So $Ae_1=0_{\Bbb R^n}\land Ae_2=0_{\Bbb R^n}\land \ldots \land Ae_n=0_{\Bbb R^n}$.

Consider the matrix whose column $i$ is $e_i$, $[e_1\mid \ldots \mid e_n]$.

The above tells you that $0_{n\times n}=[Ae_1\mid \ldots \mid Ae_n]$, but $[Ae_1\mid \ldots \mid Ae_n]=A[e_1\mid \ldots \mid e_n]$.

Now note that $[e_1\mid \ldots \mid e_n]=I_n$ and conclude.
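The column-by-column argument above can be checked numerically. A minimal sketch in plain Python (the helpers `matmul` and `e` are illustrative names, not from any library):

```python
# Verify: column i of A equals A @ e_i, and [e_1 | ... | e_n] = I_n,
# so A = A [e_1 | ... | e_n] = [Ae_1 | ... | Ae_n].

def matmul(A, B):
    """Multiply two matrices given as lists of rows."""
    n, m, p = len(A), len(B), len(B[0])
    return [[sum(A[i][k] * B[k][j] for k in range(m)) for j in range(p)]
            for i in range(n)]

def e(i, n):
    """Standard basis column vector e_i as an n x 1 matrix (0-based i)."""
    return [[1] if j == i else [0] for j in range(n)]

n = 3
A = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]

# A e_i is exactly column i of A:
for i in range(n):
    assert matmul(A, e(i, n)) == [[A[r][i]] for r in range(n)]

# [e_1 | ... | e_n] is the identity, so A [e_1 | ... | e_n] = A.
I = [[1 if i == j else 0 for j in range(n)] for i in range(n)]
assert matmul(A, I) == A
```

So if $Ae_i = 0$ for every $i$, each column of $A$ is zero, forcing $A = 0_{n\times n}$.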

Show that if the matrix $A$ has at least one non-zero entry, then you can find a vector $x$ such that $Ax \ne 0$. The easiest vectors to consider are those that are all zero except for one entry, which is one.

If $A \neq 0$ then some column of $A$ is not the $0$ (column) vector. Say the $i$th column of $A$ is not the $0$ vector. Then $$A e_i = i\text{th column of } A$$

where $e_i$ is the (column) vector with $1$ in the $i$th position and zero otherwise. Why does this imply that $\operatorname{null}(A)$ isn't all of $\Bbb{R}^n$?
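The contrapositive direction can also be sketched numerically: a nonzero column of $A$ yields a basis vector outside the nullspace. The helper name `matvec` is illustrative:

```python
# If column i of A is nonzero, then A e_i != 0, so e_i is not in null(A).

def matvec(A, x):
    """Multiply matrix A (list of rows) by vector x (list of entries)."""
    return [sum(a * b for a, b in zip(row, x)) for row in A]

# A nonzero matrix: its second column (index 1) is nonzero.
A = [[0, 7, 0],
     [0, 0, 0],
     [0, 0, 0]]

e2 = [0, 1, 0]               # e_2 (1 in the second position)
print(matvec(A, e2))         # the second column of A: [7, 0, 0], nonzero
```

Since $Ae_2 \ne 0$, the nullspace of this $A$ cannot be all of $\Bbb{R}^3$.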

A linear transformation can be visualized as taking a grid over space and skewing/distorting it. The grid lines all have to stay lines, and parallel lines remain parallel (i.e. it respects vector addition and scalar multiplication), but that's it. So "stretch the z-axis by 2" and "rotate around the y-axis by 30$^\circ$" are valid linear transformations, represented by particular matrices.

In particular, squashing $\mathbb{R}^n$ onto some subspace is also a linear transformation (e.g. “squish everything down into the xz plane”). If we end up with a subspace, then that means that some other subspace (in this case the y-axis) got mapped to the origin. When this happens we say that this subspace is part of the *nullspace* of the matrix. So $\operatorname{null}(A)=\mathbb{R}^n$ is saying “this matrix squishes everything to the origin”. The zero matrix multiplies everything by zero, so it squishes everything to the origin. What your professor has proved is that the *only* matrix that does this is the zero matrix: the “squish everything to zero” property is enough to pin it down, because nothing else can do this.

In particular, any matrix with a nonzero entry in its $j$th column can be multiplied by the basis vector $(0,\ldots,0,1,0,\ldots,0)$ with $1$ at position $j$, which will then produce that column vector as output (which by hypothesis is nonzero). So any nonzero matrix leaves *something* unsquished, and the conclusion follows.