What is the importance of eigenvalues/eigenvectors?


Short Answer

Eigenvectors make understanding linear transformations easy. They are the “axes” (directions) along which a linear transformation acts simply by “stretching/compressing” and/or “flipping”; eigenvalues give you the factors by which this stretching or compression occurs.

The more directions you have along which you understand the behavior of a linear transformation, the easier it is to understand the linear transformation; so you want to have as many linearly independent eigenvectors as possible associated to a single linear transformation.


Slightly Longer Answer

There are a lot of problems that can be modeled with linear transformations, and the eigenvectors give very simple solutions. For example, consider the system of linear differential equations
\begin{align*}
\frac{dx}{dt} &= ax + by\\
\frac{dy}{dt} &= cx + dy.
\end{align*}
This kind of system arises when you describe, for example, the growth of the populations of two species that affect one another. For example, you might have that species $x$ is a predator on species $y$; the more $x$ you have, the fewer $y$ will be around to reproduce; but the fewer $y$ that are around, the less food there is for $x$, so fewer $x$s will reproduce; but then fewer $x$s are around, so that takes pressure off $y$, which increases; but then there is more food for $x$, so $x$ increases; and so on and so forth. It also arises in certain physical phenomena, such as a particle moving in a fluid, where the velocity vector depends on the position along the fluid.

Solving this system directly is complicated. But suppose that you could do a change of variables so that instead of working with $x$ and $y$, you could work with $z$ and $w$ (which depend linearly on $x$ and $y$; that is, $z=\alpha x+\beta y$ for some constants $\alpha$ and $\beta$, and $w=\gamma x + \delta y$ for some constants $\gamma$ and $\delta$) and the system transformed into something like
\begin{align*}
\frac{dz}{dt} &= \kappa z\\
\frac{dw}{dt} &= \lambda w
\end{align*}
that is, you can “decouple” the system, so that now you are dealing with two independent functions. Then solving this problem becomes rather easy: $z=Ae^{\kappa t}$, and $w=Be^{\lambda t}$. Then you can use the formulas for $z$ and $w$ to find expressions for $x$ and $y$.

Can this be done? Well, it amounts precisely to finding two linearly independent eigenvectors for the matrix $\left(\begin{array}{cc}a & b\\c & d\end{array}\right)$! $z$ and $w$ correspond to the eigenvectors, and $\kappa$ and $\lambda$ to the eigenvalues. By taking an expression that “mixes” $x$ and $y$, and “decoupling it” into one that acts independently on two different functions, the problem becomes a lot easier.
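
As a quick numerical sketch of this decoupling (the matrix entries and the initial condition below are made up for illustration; they are not from the discussion above), NumPy's `np.linalg.eig` does the work:

```python
import numpy as np

# Hypothetical coefficients for dx/dt = a x + b y, dy/dt = c x + d y.
A = np.array([[1.0, 2.0],
              [2.0, 1.0]])

# Eigendecomposition: columns of V are eigenvectors, lam holds eigenvalues.
lam, V = np.linalg.eig(A)

# Initial condition (x(0), y(0)) -- arbitrary for this sketch.
x0 = np.array([1.0, 0.0])

# Decouple: express the initial condition in eigenvector coordinates.
c = np.linalg.solve(V, x0)

def solution(t):
    """Each eigen-coordinate evolves independently as c_i * exp(lambda_i * t);
    mapping back through V gives (x(t), y(t))."""
    return V @ (c * np.exp(lam * t))

print(solution(0.5))  # (x(0.5), y(0.5))
```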

That is the essence of what one hopes to do with the eigenvectors and eigenvalues: “decouple” the ways in which the linear transformation acts into a number of independent actions along separate “directions”, that can be dealt with independently. A lot of problems come down to figuring out these “lines of independent action”, and understanding them can really help you figure out what the matrix/linear transformation is “really” doing.

A short explanation

Consider a matrix $A$, for example one representing a physical transformation. When this matrix is used to transform a given vector $x$, the result is $y = Ax$.

Now an interesting question is

Are there any vectors $x$ which do not change their direction under this transformation, but only have their magnitude scaled by a factor $\lambda$?

Such a question takes the form $$A x = \lambda x.$$

Such special vectors $x$ are called eigenvectors, and the change in magnitude is given by the eigenvalue $\lambda$.
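
A quick numerical check of this relation (a sketch with a made-up symmetric matrix, not part of the original answer):

```python
import numpy as np

# A made-up 2x2 matrix for illustration.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)

# Verify A v = lambda v for each eigenpair.
for lam, v in zip(eigenvalues, eigenvectors.T):
    print(lam, np.allclose(A @ v, lam * v))  # True for each pair
```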

The behaviour of a linear transformation can be obscured by the choice of basis. For some transformations, this behaviour can be made clear by choosing a basis of eigenvectors: the linear transformation is then a (non-uniform in general) scaling along the directions of the eigenvectors. The eigenvalues are the scale factors.

I think if you want a better answer, you need to tell us more precisely what you
may have in mind: are you interested in theoretical aspects of eigenvalues; do
you have a specific application in mind? Matrices by themselves are just arrays of
numbers, which take meaning once you set up a context. Without the context, it seems
difficult to give you a good answer. If you use matrices to describe adjacency relations,
then eigenvalues/eigenvectors may mean one thing; if you use them to represent linear maps,
they mean something else, etc.

One possible application: in some cases you may be able to diagonalize your matrix $M$ using the eigenvalues, which gives you a nice expression for $M^k$. Specifically, you may be able to decompose your matrix into a product $M = SDS^{-1}$, where $D$ is diagonal with the eigenvalues as entries, and $S$ is the matrix whose columns are the associated eigenvectors. Then $M^k = SD^kS^{-1}$, and $D^k$ is trivial to compute.
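
A small numerical illustration of this (a sketch, assuming a made-up diagonalizable matrix):

```python
import numpy as np

M = np.array([[4.0, 1.0],
              [2.0, 3.0]])   # made-up diagonalizable matrix

lam, S = np.linalg.eig(M)    # M = S D S^{-1}, with D = diag(lam)

k = 5
# M^k = S D^k S^{-1}; raising a diagonal matrix to a power is elementwise.
Mk = S @ np.diag(lam**k) @ np.linalg.inv(S)

print(np.allclose(Mk, np.linalg.matrix_power(M, k)))  # True
```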

Mr. Arturo: interesting approach! This seems to connect with the theory of characteristic curves in PDEs (who knows if it can be generalized to dimensions higher than one), which are curves along which a PDE becomes an ODE, i.e., as you so brilliantly said, curves along which the PDE decouples.

When you apply transformations to systems or objects represented by matrices, and you need to understand certain characteristics of those matrices, you calculate their eigenvectors and eigenvalues.

“Having an eigenvalue is an accidental property of a real matrix (since it may fail to have an eigenvalue), but every complex matrix has an eigenvalue.” (Wikipedia)

Eigenvalues characterize important properties of linear transformations, such as whether a system of linear equations has a unique solution (a zero eigenvalue means the matrix is singular, so the solution is not unique). In many applications eigenvalues also describe physical properties of a mathematical model.

Some important applications –

  • Principal Components Analysis (PCA) in object/image recognition;

  • Physics – stability analysis, the physics of rotating bodies;

  • Market risk analysis – to determine whether a matrix is positive definite (all eigenvalues positive);

  • PageRank from Google.
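
To make the PageRank item above concrete: the ranking vector is the dominant eigenvector of a column-stochastic link matrix, and it can be found by power iteration (ignoring the damping factor used in practice). The 3-page web below is made up for illustration.

```python
import numpy as np

# Made-up column-stochastic link matrix for a 3-page web:
# column j spreads page j's score equally over the pages it links to.
L = np.array([[0.0, 0.0, 1.0],
              [0.5, 0.0, 0.0],
              [0.5, 1.0, 0.0]])

rank = np.full(3, 1/3)      # start from a uniform ranking
for _ in range(100):        # power iteration toward the dominant eigenvector
    rank = L @ rank
    rank /= rank.sum()      # keep it a probability vector

print(rank)  # dominant eigenvector of L (eigenvalue 1): the page ranks
```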

In data analysis, the eigenvectors of a covariance (or correlation) matrix are usually calculated.


Eigenvectors form the set of basis functions that most efficiently describe the data variability. They also define the coordinate system in which the covariance matrix becomes diagonal, so that the new variables referenced to this coordinate system are uncorrelated. The eigenvalues measure the data variance explained by each of the new coordinate axes.


They are used to reduce the dimension of large data sets by selecting only a few modes with significant eigenvalues, and to find new variables that are uncorrelated; this is very helpful for least-squares regressions of badly conditioned systems. It should be noted that the link between these statistical modes and the true dynamical modes of a system is not always straightforward because of sampling problems.
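
A minimal sketch of this idea with NumPy (the data below are random and made up, not from the original answer): diagonalize the covariance matrix and keep only the directions with the largest eigenvalues.

```python
import numpy as np

rng = np.random.default_rng(0)
# Made-up data: 200 samples of 5 correlated variables.
X = rng.normal(size=(200, 5)) @ rng.normal(size=(5, 5))

Xc = X - X.mean(axis=0)              # center the data
C = np.cov(Xc, rowvar=False)         # covariance matrix

eigvals, eigvecs = np.linalg.eigh(C)     # eigh: C is symmetric
order = np.argsort(eigvals)[::-1]        # sort by decreasing variance
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Keep the two directions explaining the most variance (the "modes").
scores = Xc @ eigvecs[:, :2]

print(eigvals / eigvals.sum())        # fraction of variance per mode
print(np.cov(scores, rowvar=False))   # ~diagonal: new variables uncorrelated
```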

I would like to direct you to an answer that I posted here: Importance of eigenvalues

I feel it is a nice example to motivate students who ask this question, in fact I wish it were asked more often. Personally, I hold such students in very high regard.

An eigenvector $v$ of a matrix $A$ is a direction unchanged by the linear transformation: $Av=\lambda v$.
An eigenvalue of a matrix is unchanged by a change of coordinates: if $Av = \lambda v$ and $v = Bu$, then $(B^{-1}AB)u = \lambda u$, so the eigenvalue $\lambda$ survives the change of basis. These are important invariants of linear transformations.

Let's go back to the historical background to get the motivation!

Consider the linear transformation $T:V\to V$.

Given a basis of $V$, $T$ is characterized beautifully by a matrix $A$, which tells you everything about $T$.

The bad part about $A$ is that it changes with a change of basis of $V$.

Why is that bad?

Because the same linear transformation, under two different choices of basis, is given by two distinct matrices. Believe me, you cannot relate the two matrices just by looking at their entries. Really not interesting!

Intuitively, there should exist some strong relation between two such matrices.

Our aim is to capture, mathematically, the thing they have in common.

Now, equal eigenvalues are a necessary condition for this, but not a sufficient one.

Let me make my statement clear.

“The invariance of a subspace of $V$ under a linear transformation of $V$” is such a property.

That is, if $A$ and $B$ represent the same $T$, then they must have all their eigenvalues equal. But the converse is not true! Hence equal eigenvalues are necessary but not sufficient.

The relation is “$A$ is similar to $B$”, i.e. $A=PBP^{-1}$ where $P$ is a non-singular matrix.
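
A quick numerical check of that similarity invariance (a sketch with a made-up matrix $B$ and change-of-basis matrix $P$):

```python
import numpy as np

rng = np.random.default_rng(1)
B = rng.normal(size=(3, 3))      # made-up matrix representing T in one basis
P = rng.normal(size=(3, 3))      # made-up (almost surely invertible) change of basis
A = P @ B @ np.linalg.inv(P)     # A represents the same T in another basis

# Similar matrices share the same eigenvalues (up to ordering / round-off).
print(np.sort(np.linalg.eigvals(A)))
print(np.sort(np.linalg.eigvals(B)))
```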

This made it clear for me: https://www.youtube.com/watch?v=PFDu9oVAE-g

There are a lot of other linear algebra videos from 3Blue1Brown as well.