
There is some obvious relationship between the root solutions of a homogeneous linear difference equation (viewed as a recurrence relation) and eigenvalues that I'm trying to see. I have read sections 3.2 and 3.4 of the Wikipedia article, where the roots are hinted to be the eigenvalues ($\lambda$), but I'm still not sure why these must be eigenvalues of some matrix, say $A_0$, or what the meaning of $A_0$ might be.

It seems that to solve a homogeneous linear difference equation we find the "characteristic polynomial" simply by reading the coefficients off the difference equation itself. However, typically to find the characteristic polynomial I would solve the characteristic equation of some matrix,

$$A_0 = \begin{pmatrix} 1 & 0 & 0 \\ 0 & -2 & 0 \\ 0 & 0 & 3 \end{pmatrix}$$


$(A_0 - \lambda I)\mathbf x = \mathbf 0$,

then set the determinant equal to $0$ and solve for each $\lambda$, e.g.

$\det(A_0 - \lambda I) = 0$

$(1 - \lambda)(2 + \lambda)(3 - \lambda) = 0$

Now suppose this also happens to be a solution to some linear difference equation; here the characteristic polynomial is $\lambda^3 - 2\lambda^2 - 5\lambda + 6 = 0$, and the corresponding difference equation is

$y_{k+3} - 2y_{k+2} - 5y_{k+1} + 6y_k = 0$. Then, for example, $\lambda = 3$ is a root, so $y_k = 3^k$ satisfies the equation for all $k$.
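As a quick sanity check (my own illustration, not part of the original question), we can verify numerically that $y_k = \lambda^k$ solves the recurrence for each root $\lambda$ of the characteristic polynomial:

```python
# Residual of the recurrence y_{k+3} - 2*y_{k+2} - 5*y_{k+1} + 6*y_k = 0
# for a candidate sequence y (given as a function of k).
def residual(y, k):
    return y(k + 3) - 2 * y(k + 2) - 5 * y(k + 1) + 6 * y(k)

# The roots of lambda^3 - 2*lambda^2 - 5*lambda + 6 are 1, -2, 3;
# each geometric sequence lambda^k should give residual 0 for all k.
for lam in (1, -2, 3):
    y = lambda k, lam=lam: lam ** k
    assert all(residual(y, k) == 0 for k in range(10))
```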

Now, given we have found this solution to this difference equation, how can we explain some special relationship to $A_0$, other than $\lambda = 3$ happens to be an eigenvalue of $A_0$? Is there any *meaning* to make of $A_0$?

(cf. §4.8, *Linear Algebra and Its Applications*, 4th ed., D. Lay)


You have a mistake in your expansion of the characteristic polynomial; it should be $\lambda^3 - 2\lambda^2 - 5\lambda + 6$.

To see the connection between this polynomial and the matrix $A_0$, it helps to reduce the difference equation to a first-order equation in several variables.

Let $x_k^1 = y_k, x_k^2 = y_{k+1}, x_k^3 = y_{k+2}$. Then the difference equation becomes

$$x_{k+1}^1 = x_k^2$$

$$x_{k+1}^2 = x_k^3$$

$$x_{k+1}^3 = -6 x_k^1 + 5 x_k^2 + 2 x_k^3,$$

or, in matrix terms, with $x_k = (x_k^1,x_k^2,x_k^3)^T$:

$$ x_{k+1} = A x_k = \begin{bmatrix} 0 & 1 & 0 \\ 0 & 0 & 1 \\ -6 & 5 & 2 \end{bmatrix} x_k.$$

Note the connection between the characteristic polynomial coefficients and the bottom row of the matrix (this is called the controllable canonical form in control circles). If you work out the eigenvalues of $A$ you will find that they are $\{-2,1,3\}$. In fact, the characteristic polynomial of $A$ is $\lambda^3-2 \lambda^2-5 \lambda +6$.
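This is easy to confirm numerically (a sketch of mine, assuming numpy is available):

```python
import numpy as np

# Companion-form matrix A for y_{k+3} - 2 y_{k+2} - 5 y_{k+1} + 6 y_k = 0,
# as written in the answer above.
A = np.array([[ 0, 1, 0],
              [ 0, 0, 1],
              [-6, 5, 2]], dtype=float)

# Its eigenvalues are the roots of the characteristic polynomial.
print(sorted(np.linalg.eigvals(A).real))   # -2, 1, 3

# np.poly recovers the characteristic polynomial coefficients of A,
# highest degree first: lambda^3 - 2 lambda^2 - 5 lambda + 6.
print(np.poly(A))
```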

Since the eigenvalues are distinct, $A$ is diagonalizable by some matrix $V$, and you have $A_0 = V^{-1} A V$, where $A_0$ is the diagonal matrix above.

They share the same characteristic polynomial because $\det(\lambda I - A_0) = \det(\lambda I - V^{-1} A V) = \det V^{-1} \det(\lambda I - A) \det V = \det(\lambda I - A)$.
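The similarity can also be checked numerically (again a sketch of mine, with numpy assumed; the names $A$, $V$, $A_0$ follow the answer above):

```python
import numpy as np

# Companion matrix of lambda^3 - 2 lambda^2 - 5 lambda + 6.
A = np.array([[ 0, 1, 0],
              [ 0, 0, 1],
              [-6, 5, 2]], dtype=float)

w, V = np.linalg.eig(A)          # columns of V are eigenvectors of A
A0 = np.linalg.inv(V) @ A @ V    # diagonal, up to floating-point round-off

print(np.round(A0, 10))          # diagonal matrix of eigenvalues
# Similar matrices share the characteristic polynomial:
print(np.poly(A), np.poly(A0))
```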

Relevant links:

http://en.wikipedia.org/wiki/Companion_matrix

http://en.wikipedia.org/wiki/State_space_representation#Canonical_realizations

The roots *are* eigenvalues. What is the operator they are eigenvalues of? It is the *shift* operator.

Consider the vector space $V$ of sequences $a_0, a_1, a_2, …$ (say of complex numbers). Define the *left shift operator*

$$S(a)_i = a_{i+1}.$$

In other words, $S$ takes as input a sequence $a_0, a_1, a_2, \ldots$ and returns the sequence $a_1, a_2, a_3, \ldots$. Now we need the following fundamental observations.

**Observation 1:** The linear homogeneous recurrence with characteristic polynomial $p$ is precisely described by the equation $p(S) a = 0$. In other words, the sequences satisfying such a linear recurrence are precisely the sequences in the kernel of $p(S)$.

**Observation 1.5:** The shift operator $S$ acts on the space of solutions to any equation of the form $p(S)a = 0$.

**Observation 2:** Over the complex numbers, we can factor $p(S) = \prod (S – \lambda_i)$. Thus if $(S – \lambda_i) a = 0$, or equivalently if $a$ is an eigenvector of $S$ with eigenvalue $\lambda_i$, then $p(S) a = 0$.

**Observation 3:** $(S – \lambda_i) a = 0$ if and only if $a_n = c \lambda_i^n$ for some constant $c$.
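These observations can be made concrete on truncated sequences (a sketch of mine, using plain Python lists to represent the first few terms):

```python
# Left shift operator S on a (truncated) sequence: drop the first term.
def shift(a):
    return a[1:]

# p(S) a  for  p(x) = x^3 - 2x^2 - 5x + 6, applied termwise
# (only the terms we can compute from a finite prefix).
def p_of_S(a):
    return [a[n + 3] - 2 * a[n + 2] - 5 * a[n + 1] + 6 * a[n]
            for n in range(len(a) - 3)]

lam = 3
a = [lam ** n for n in range(12)]

# (S - lam) a = 0: shifting the sequence equals scaling it by lam,
# i.e. a is an eigenvector of S with eigenvalue lam (Observation 3).
assert shift(a) == [lam * x for x in a[:-1]]

# Hence p(S) a = 0: a satisfies the recurrence (Observations 1 and 2).
assert all(v == 0 for v in p_of_S(a))
```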

In some sense $S$ resembles the differentiation operator; indeed, $S$ is sent to differentiation if one maps $V$ to the space of formal power series via

$$(a_0, a_1, a_2, …) \mapsto \sum_{n=0}^{\infty} \frac{a_n}{n!} x^n.$$

Thus everything above applies (at least formally) to linear homogeneous ODEs with constant coefficients and their characteristic polynomials as well. The corresponding eigenvectors are the functions $e^{\lambda_i x}$.
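For the ODE analogue, a symbolic check (assuming sympy is available; the ODE below is the constant-coefficient equation with the same characteristic polynomial as the recurrence):

```python
import sympy as sp

# e^{3x} is an eigenfunction of d/dx with eigenvalue 3, one of the roots
# of lambda^3 - 2 lambda^2 - 5 lambda + 6, so it should solve
# y''' - 2 y'' - 5 y' + 6 y = 0.
x = sp.symbols('x')
y = sp.exp(3 * x)
residual = sp.diff(y, x, 3) - 2 * sp.diff(y, x, 2) - 5 * sp.diff(y, x) + 6 * y
assert sp.simplify(residual) == 0
```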
