
I know Cramer’s rule works for a system of 3 linear equations, and I know all the steps for computing the solutions. But I don’t know why (or how) Cramer’s rule gives us the solutions.

Why do we get $x=\frac{\Delta_1}{\Delta}$, and $y$ and $z$ in the same way?

I want to know how these steps give us the solutions.


It’s actually simple; I’ll explain it here in two variables, but the principle is the same in general.

Say you have an equation

$$\begin{pmatrix}a&b\\c&d\end{pmatrix}\begin{pmatrix}x\\y\end{pmatrix}=\begin{pmatrix}p\\q \end{pmatrix}$$

Now you can see that the following holds

$$\begin{pmatrix}a&b\\c&d\end{pmatrix}\begin{pmatrix}x&0\\y&1\end{pmatrix}=\begin{pmatrix}p&b\\q &d\end{pmatrix}$$

Finally, just take the determinant of this last equation; $\det$ is multiplicative and $\det\begin{pmatrix}x&0\\y&1\end{pmatrix}=x$, so you get $$\Delta x=\Delta_1,$$ where $\Delta$ is the determinant of the coefficient matrix and $\Delta_1$ is that determinant with its first column replaced by the right-hand side.
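
If you want a quick numerical sanity check of this identity, here is a minimal sketch in Python/numpy (the matrix and right-hand side are arbitrary values chosen only for illustration):

```python
import numpy as np

# Arbitrary 2x2 system A [x, y]^T = [p, q]^T, chosen only for illustration
A = np.array([[2.0, 3.0],
              [1.0, 4.0]])
rhs = np.array([7.0, 6.0])

x, y = np.linalg.solve(A, rhs)

# Delta = det(A); Delta_1 = det of A with its first column replaced by the right-hand side
delta = np.linalg.det(A)
delta_1 = np.linalg.det(np.column_stack([rhs, A[:, 1]]))

# det is multiplicative, so Delta * x should equal Delta_1
print(delta * x, delta_1)               # both ~ 10.0
print(np.isclose(delta * x, delta_1))   # True
```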

Cramer’s rule is very easy to discover because if you solve the linear system of equations

\begin{align*}
a_{11} x_1 + a_{12} x_2 + a_{13} x_3 &= b_1 \\
a_{21} x_1 + a_{22} x_2 + a_{23} x_3 &= b_2 \\
a_{31} x_1 + a_{32} x_2 + a_{33} x_3 &= b_3
\end{align*}

by hand, just using a standard high school approach of eliminating variables, then out pops Cramer’s rule! In my opinion, this is the most likely way that a mathematician would discover the determinant in the first place, and Cramer’s rule is discovered simultaneously.
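
To see what “out pops” means, here is the two-variable case worked out (the smallest case, but the three-variable one goes the same way). Start from

$$\begin{align*}
ax + by &= p \\
cx + dy &= q
\end{align*}$$

Multiply the first equation by $d$, the second by $b$, and subtract to eliminate $y$:

$$(ad - bc)\,x = pd - bq
\qquad\Longrightarrow\qquad
x = \frac{pd - bq}{ad - bc}
  = \frac{\begin{vmatrix} p & b \\ q & d \end{vmatrix}}{\begin{vmatrix} a & b \\ c & d \end{vmatrix}},$$

so the coefficient determinant and the “replaced column” determinant of Cramer’s rule both appear on their own.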

I remember thinking that it must be quite difficult to prove Cramer’s rule for an $n \times n$ matrix, but it turns out to be surprisingly easy (once you take the right approach). We’ll prove it below.

The most useful way of looking at the determinant, in my opinion, is this: the function $M \mapsto \det M$ is an alternating multilinear function of the columns of $M$ which satisfies $\det(I) = 1$. This characterization of the determinant gives us a quick, simple proof of Cramer’s rule.

For simplicity, I’ll assume $A$ is a $3 \times 3$ matrix with columns $a_1, a_2, a_3$. Suppose that $$b = Ax = x_1 a_1 + x_2 a_2 + x_3 a_3.$$ Then

\begin{align*}
\begin{vmatrix} b & a_2 & a_3 \end{vmatrix}
&= \begin{vmatrix} x_1 a_1 + x_2 a_2 + x_3 a_3 & a_2 & a_3 \end{vmatrix} \\
&= x_1 \begin{vmatrix} a_1 & a_2 & a_3 \end{vmatrix}
 + x_2 \begin{vmatrix} a_2 & a_2 & a_3 \end{vmatrix}
 + x_3 \begin{vmatrix} a_3 & a_2 & a_3 \end{vmatrix} \\
&= x_1 \det A.
\end{align*}

The second and third determinants vanish because each has a repeated column (the determinant is alternating), which gives the last equality. If $\det A \neq 0$, it follows that

$$
x_1 = \frac{\begin{vmatrix} b & a_2 & a_3 \end{vmatrix}}{\det A}.
$$
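
The same computation is easy to check numerically; here is a small sketch in Python/numpy with an arbitrary $3 \times 3$ example (note that the two determinants with a repeated column come out as $0$):

```python
import numpy as np

# Arbitrary invertible 3x3 matrix and a known solution x (for illustration only)
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
x = np.array([1.0, 2.0, 3.0])
b = A @ x
a1, a2, a3 = A.T                # the columns of A

det = np.linalg.det
print(det(np.column_stack([b,  a2, a3])))    # equals x_1 * det(A)  (~ 8.0)
print(det(np.column_stack([a2, a2, a3])))    # ~ 0, repeated column
print(det(np.column_stack([a3, a2, a3])))    # ~ 0, repeated column
print(x[0] * det(A))                         # ~ 8.0, matches the first value
```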

I learned this proof from Section 4.4, Problem 16 (“Quick proof of Cramer’s rule”) in Gilbert Strang’s book *Linear Algebra and Its Applications*.

Here is a very simple solution that only uses some properties of determinants. Consider the following system:

$$
\left\{
\begin{array}{c}
a_1x+b_1y+c_1z=d_1 \\
a_2x+b_2y+c_2z=d_2 \\
a_3x+b_3y+c_3z=d_3
\end{array}
\right.
$$

Assume $\Delta\neq0$, where $\Delta$ is the determinant of the coefficient matrix and $\Delta_1$ is obtained from it by replacing the first column with $d_1, d_2, d_3$. Then
$$\begin{align}
\Delta_1
&= \begin{vmatrix}
d_1 & b_1 & c_1 \\
d_2 & b_2 & c_2 \\
d_3 & b_3 & c_3
\end{vmatrix}
&&\text{by definition of }\Delta_1 \\[6pt]
&= \begin{vmatrix}
a_1x+b_1y+c_1z & b_1 & c_1 \\
a_2x+b_2y+c_2z & b_2 & c_2 \\
a_3x+b_3y+c_3z & b_3 & c_3
\end{vmatrix}
&&\text{by the system of equations} \\[6pt]
&= \begin{vmatrix}
(a_1x+b_1y+c_1z)-(b_1y+c_1z) & b_1 & c_1 \\
(a_2x+b_2y+c_2z)-(b_2y+c_2z) & b_2 & c_2 \\
(a_3x+b_3y+c_3z)-(b_3y+c_3z) & b_3 & c_3
\end{vmatrix}
&&\text{adding a multiple of one column to another does not change the determinant} \\[6pt]
&= \begin{vmatrix}
a_1x & b_1 & c_1 \\
a_2x & b_2 & c_2 \\
a_3x & b_3 & c_3
\end{vmatrix}
&&\text{simplifying} \\[6pt]
&= x\begin{vmatrix}
a_1 & b_1 & c_1 \\
a_2 & b_2 & c_2 \\
a_3 & b_3 & c_3
\end{vmatrix}
&&\text{factoring $x$ out of the first column: scaling a column by $k$ scales the determinant by $k$} \\[6pt]
&= x\,\Delta
&&\text{by definition of }\Delta
\end{align}$$
Thus $x=\dfrac{\Delta_1}{\Delta}$. The proof is due to D. E. Whitford and M. S. Klamkin (“On an Elementary Derivation of Cramer’s Rule”, *American Mathematical Monthly*, vol. 60 (1953), pp. 186–187).
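
The two determinant properties used in this chain are easy to check numerically as well; here is a minimal sketch in Python/numpy with an arbitrary matrix:

```python
import numpy as np

# Arbitrary 3x3 matrix, used only to illustrate the two column properties
M = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

# 1) Adding a multiple of one column to another leaves the determinant unchanged
M2 = M.copy()
M2[:, 0] += 5.0 * M[:, 1]
print(np.isclose(np.linalg.det(M2), np.linalg.det(M)))        # True

# 2) Multiplying a column by k multiplies the determinant by k
M3 = M.copy()
M3[:, 0] *= 7.0
print(np.isclose(np.linalg.det(M3), 7.0 * np.linalg.det(M)))  # True
```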

Here is another way to look at it. Suppose $Ax = b$, where $A$ is invertible. First we define the following matrix, namely the identity matrix with its $i$-th column replaced by the vector $x$:

$$I_i(x) = [\,e_1\;\; e_2\;\; \cdots\;\; e_{i-1}\;\; x\;\; e_{i+1}\;\; \cdots\;\; e_n\,]$$

By the definition of matrix multiplication:

$$AI_i(x) = [\,Ae_1\;\; Ae_2\;\; \cdots\;\; Ae_{i-1}\;\; Ax\;\; Ae_{i+1}\;\; \cdots\;\; Ae_n\,]$$

Since $Ae_j$ is simply the $j$-th column of $A$ and $Ax = b$, the right-hand side is exactly $A$ with its $i$-th column replaced by $b$; call this matrix $A_i(b)$:

$$AI_i(x) = A_i(b)$$

$$\det\bigl(AI_i(x)\bigr) = \det A_i(b)$$

The determinant of the product of two matrices is the product of the determinants, so:

$$\det A \cdot \det I_i(x) = \det A_i(b)$$

Let’s now look at $\det I_i(x)$. The two determinants below are equal because we can eliminate every $x_j$ with $j\ne i$ by row reduction; the vector $x$ sits in the $i$-th column (drawn as the second column below for readability):

$$\det I_i(x) = \det
\begin{bmatrix}
1 & x_1 & \cdots & 0 \\
0 & x_2 & & 0 \\
\vdots & \vdots & \ddots & \vdots \\
0 & x_n & \cdots & 1
\end{bmatrix}
= \det
\begin{bmatrix}
1 & 0 & \cdots & 0 \\
0 & x_i & & 0 \\
\vdots & \vdots & \ddots & \vdots \\
0 & 0 & \cdots & 1
\end{bmatrix}
= 1 \cdots 1 \cdot x_i \cdot 1 \cdots 1 = x_i$$

So $\det I_i(x)$ is simply $x_i$, and we can conclude that

$$x_i = \frac{\det A_i(b)}{\det A}$$
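
Putting this together, here is a minimal sketch of Cramer’s rule as a Python function built from the $A_i(b)$ construction above (the name `cramer_solve` is just for illustration):

```python
import numpy as np

def cramer_solve(A, b):
    """Solve Ax = b by Cramer's rule: x_i = det(A_i(b)) / det(A),
    where A_i(b) is A with its i-th column replaced by b."""
    A = np.asarray(A, dtype=float)
    b = np.asarray(b, dtype=float)
    det_A = np.linalg.det(A)
    if np.isclose(det_A, 0.0):
        raise ValueError("Cramer's rule requires det(A) != 0")
    x = np.empty(len(b))
    for i in range(len(b)):
        A_i = A.copy()
        A_i[:, i] = b                       # replace the i-th column by b
        x[i] = np.linalg.det(A_i) / det_A
    return x

# Quick check against numpy's built-in solver (arbitrary system)
A = [[2, 1, 0], [1, 3, 1], [0, 1, 2]]
b = [4, 10, 8]
print(cramer_solve(A, b))        # ~ [1. 2. 3.]
print(np.linalg.solve(A, b))     # same answer
```

Of course, for numerical work `np.linalg.solve` is the better tool; the point here is only that the formula really does reproduce the solution.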
