
For the following square matrix:

$$ \left( \begin{array}{ccc} 3 & 0 & 1 \\ -4 & 1 & 2 \\ -6 & 0 & -2 \end{array} \right)$$

Decide which, if any, of the following vectors are eigenvectors of that matrix and give the corresponding eigenvalue.

$$ \left( \begin{array}{c} 2 \\ 2 \\ -1 \end{array} \right)\quad \left( \begin{array}{c} -1 \\ 0 \\ 2 \end{array} \right)\quad \left( \begin{array}{c} -1 \\ 1 \\ 3 \end{array} \right)\quad \left( \begin{array}{c} 0 \\ 1 \\ 0 \end{array} \right)\quad \left( \begin{array}{c} 3 \\ 2 \\ 1 \end{array} \right)$$

If I’ve understood correctly, I must first multiply the matrix by each vector. If the result is a scalar multiple of that vector, then it’s an eigenvector. Only the fourth vector passes this test. But how should I calculate its corresponding eigenvalue?
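That check can be automated. Here is a small sketch using NumPy (the helper name `eigenvalue_if_eigenvector` is mine, not from the question): for each candidate $v$ it computes $Av$, tests whether $Av$ is parallel to $v$ via the cross product, and if so reads the eigenvalue off the ratio of a nonzero component.

```python
import numpy as np

A = np.array([[3, 0, 1],
              [-4, 1, 2],
              [-6, 0, -2]])

candidates = [
    np.array([2, 2, -1]),
    np.array([-1, 0, 2]),
    np.array([-1, 1, 3]),
    np.array([0, 1, 0]),
    np.array([3, 2, 1]),
]

def eigenvalue_if_eigenvector(A, v):
    """Return lambda if A v is a scalar multiple of v, else None."""
    w = A @ v
    # w and v are parallel exactly when their cross product vanishes
    if np.allclose(np.cross(w, v), 0):
        i = np.argmax(np.abs(v))   # pick a nonzero component of v
        return w[i] / v[i]         # the corresponding eigenvalue
    return None

for k, v in enumerate(candidates, 1):
    lam = eigenvalue_if_eigenvector(A, v)
    print(f"vector {k}:", "eigenvalue", lam) if lam is not None else print(f"vector {k}: not an eigenvector")
```

Running this confirms that only the fourth vector is an eigenvector, with eigenvalue $1$.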


Gigili, this is how eigenvalues and eigenvectors are usually calculated. Compute the determinant:

$$\det(\lambda I-A)=\begin{vmatrix}\lambda-3&0&-1\\4&\lambda-1&-2\\6&0&\lambda+2\end{vmatrix}=(\lambda-1)(\lambda-3)(\lambda+2)+6(\lambda-1)=\lambda(\lambda-1)^2\Longrightarrow$$

$$\Longrightarrow \lambda=0\,,\,1\,\,\,\text{are the eigenvalues of the matrix}$$

Now, to find the eigenvectors corresponding to an eigenvalue, you form a homogeneous linear system by substituting $\,\lambda\,$ in the above matrix expression with the corresponding value. Call the unknowns $\,x,y,z\,$ and note that since the determinant is zero, the system *always* has a non-trivial solution (why?):

$$\lambda=0:\;\;\;\;\begin{cases}-3x&&\;\;-\;z=0\\\;\;\;4x&-y&-2z=0\\\;\;\;6x&&+2z=0\end{cases}\;\;\;\;\Longrightarrow\;\;\;\begin{cases}z=-3x\\y=4x-2z=10x\end{cases}$$

Thus, any eigenvector corresponding to the eigenvalue $\,\lambda=0\,$ has the form

$$\begin{pmatrix}\;\;x\\10x\\\!\!-3x\end{pmatrix}\,\,,\,\,\text{for example}\,\,\,\begin{pmatrix}\;\;\;1\\10\\-3\end{pmatrix}$$

Try now to mimic the above for the eigenvalue $\,\lambda=1\,$. Once again you get a system of rank 2, so there is again only one linearly independent eigenvector.
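As a quick numerical cross-check of the computation above (a sketch using NumPy, not part of the original answer): the eigenvalues of $A$ should come out as $0,1,1$, and the eigenvector for $\lambda=0$ spans the null space of $A$.

```python
import numpy as np

A = np.array([[3, 0, 1],
              [-4, 1, 2],
              [-6, 0, -2]], dtype=float)

# Eigenvalues: roots of det(lambda*I - A) = lambda*(lambda - 1)^2
vals, vecs = np.linalg.eig(A)
print(np.sort(np.round(vals.real, 6)))   # eigenvalues 0, 1, 1

# Eigenvector for lambda = 0: a basis vector of the null space of A
i = np.argmin(np.abs(vals))              # index of the eigenvalue closest to 0
v = vecs[:, i].real
v = v / v[0]                             # scale so the first component is 1
print(v)                                 # any multiple of this solves A v = 0
```

Note that $\lambda=1$ has algebraic multiplicity $2$ but only a one-dimensional eigenspace, so the matrix is not diagonalizable.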

If $\mathbf{M}\mathbf{v} = \lambda \mathbf{v}$ for a scalar $\lambda$ and a nonzero vector $\mathbf{v}$, then $\mathbf{v}$ is an eigenvector of $\mathbf{M}$, and $\lambda$ is its corresponding eigenvalue.

To expand on DonAntonio’s answer: an eigenvalue and an eigenvector of an $n \times n$ matrix $A$ are defined, respectively, as a scalar $\lambda$ and a vector $x \neq 0$ such that $Ax = \lambda x$. But this is equivalent to $Ax - \lambda x = (A-\lambda I)x = 0$. We now want to solve this equation for $x$ with $x \neq 0$. If $A-\lambda I$ were invertible, we would have only the trivial solution $x = (A-\lambda I)^{-1} 0 = 0$, which is not what we want. Instead, we require the matrix $A-\lambda I$ to be singular.

Recall that a matrix is singular if and only if its determinant is zero, so we calculate the determinant of $A - \lambda I$ and equate it to zero:

$$
0 = \det(A-\lambda I) = (-1)^n(\lambda - \lambda_1)^{m_1} \cdots (\lambda - \lambda_k)^{m_k}
$$

This is called the characteristic equation of $A$. The left-hand side is a polynomial of degree $n$ in $\lambda$, with $k \leq n$ distinct roots. These roots are the eigenvalues of $A$, since these values of $\lambda$ make $A-\lambda I$ singular. The exponents $m_1$ through $m_k$ are called the algebraic multiplicities of the corresponding eigenvalues, by the way, and they give an upper bound on the dimensions of the eigenspaces, which I will explain next.
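For a concrete check of the characteristic polynomial and the algebraic multiplicities, one can use SymPy (a sketch; the matrix is the one from the question above):

```python
from sympy import Matrix, symbols, factor, roots

lam = symbols('lambda')
A = Matrix([[3, 0, 1],
            [-4, 1, 2],
            [-6, 0, -2]])

# charpoly returns det(lambda*I - A) as a polynomial in lambda
p = A.charpoly(lam).as_expr()
print(factor(p))       # factors as lambda*(lambda - 1)**2

# roots() maps each eigenvalue to its algebraic multiplicity
print(roots(p, lam))   # multiplicities: 1 for lambda=0, 2 for lambda=1
```

Here $m_1 = 1$ and $m_2 = 2$, so the eigenspace for $\lambda = 1$ can be at most two-dimensional.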

So now we have a matrix $A - \lambda_i I$ that is singular, and we want to solve $(A-\lambda_i I)x_i = 0$. Every nonzero vector $x_i$ that solves this equation is an eigenvector of $A$ corresponding to the eigenvalue $\lambda_i$. But this vector is not unique! Note that if $x_i$ is an eigenvector, then so is $c x_i$ for every scalar $c \neq 0$. It can even happen that there are two or more linearly independent eigenvectors for the same eigenvalue. The space of all solutions of this equation is called the eigenspace of $A$ for the eigenvalue $\lambda_i$. It is the same as the null space of $A-\lambda_i I$, whose dimension is determined by the rank-nullity theorem. This theorem (look it up in your linear algebra book 😉) states that the number of columns of a matrix equals the sum of its rank (the dimension of the row space) and its nullity (the dimension of its null space). It can also be shown that the dimension of the eigenspace is necessarily less than or equal to the algebraic multiplicity of the eigenvalue. This dimension is called the geometric multiplicity of $\lambda_i$.

So, to summarize the calculation of eigenvalues and corresponding eigenvectors:

- Write down the characteristic equation of $A$:
$$\det(A-\lambda I) = 0.$$
- Solve the characteristic equation. The solutions $\lambda_i$ are the eigenvalues of $A$.
- Write down the system $(A-\lambda_i I) x = 0$ and solve it for the vector $x$ (by Gaussian elimination or something like that). The nonzero solutions $x_i$ are the eigenvectors of $A$ corresponding to $\lambda_i$. The subspace of all eigenvectors corresponding to $\lambda_i$ (together with the zero vector) is the eigenspace of $A$ for $\lambda_i$.
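The steps above can be carried out symbolically for the matrix from the question, e.g. with SymPy’s `eigenvals` and `nullspace` (a sketch; any computer algebra system works similarly):

```python
from sympy import Matrix, eye

A = Matrix([[3, 0, 1],
            [-4, 1, 2],
            [-6, 0, -2]])

# Steps 1-2: eigenvalues with their algebraic multiplicities
print(A.eigenvals())

# Step 3: for each eigenvalue, solve (A - lambda*I) x = 0;
# nullspace() returns a basis of the eigenspace
for lam in A.eigenvals():
    basis = (A - lam * eye(3)).nullspace()
    print(lam, [list(v) for v in basis])
```

Each eigenspace here turns out to be one-dimensional, so the geometric multiplicity of $\lambda = 1$ (namely $1$) is strictly less than its algebraic multiplicity ($2$).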

Hope that helped a bit. 😉
