I often get confused about

- when a Markov chain has an equilibrium distribution;
- when this equilibrium distribution is unique;
- which starting states converge to the equilibrium distribution; and
- how finite and countably infinite Markov chains differ with respect to the above.

(Google isn’t quite clearing up my confusion.) Is the following correct/am I missing anything?

An irreducible Markov chain (finite or countably infinite) has a unique equilibrium distribution if and only if all states are positive recurrent. (What about reducible Markov chains? A reducible Markov chain has a non-unique equilibrium distribution iff all states are positive recurrent?) However, not all starting states necessarily converge to the unique equilibrium, unless the Markov chain is also aperiodic; that is, an irreducible Markov chain converges to its unique equilibrium regardless of initial state, if and only if all states are positive recurrent and aperiodic.
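The role of aperiodicity can be seen in a minimal numpy sketch (this toy chain is my own illustration, not from the question): a two-state chain that deterministically swaps states is irreducible and positive recurrent, so it has a unique equilibrium, but it has period 2 and the law of $X_n$ never converges to it.

```python
import numpy as np

# A two-state chain that deterministically swaps states: irreducible and
# positive recurrent, hence a unique equilibrium exists, but period 2.
P = np.array([[0.0, 1.0],
              [1.0, 0.0]])

# The unique equilibrium solves pi P = pi: a left eigenvector for eigenvalue 1.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.isclose(vals, 1.0)][:, 0])
pi /= pi.sum()
print(pi)  # [0.5 0.5]

# Periodicity: started from state 0, the law of X_n oscillates forever.
phi = np.array([1.0, 0.0])
for n in range(5):
    phi = phi @ P
    print(n + 1, phi)  # alternates between [0. 1.] and [1. 0.]
```

Making the chain lazy (e.g. replacing $P$ by $\tfrac12(I+P)$) destroys the periodicity and restores convergence from every start.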

For a Markov chain with $N<\infty$ states, the set $I$ of invariant probability vectors is a non-empty simplex in ${\mathbb R}^N$ whose extreme points correspond to the recurrent classes of the chain. Thus, the vector is unique iff there is exactly one recurrent class; the transient states (if any) play absolutely no role (as in Jens’s example). The set $I$ is a point, line segment, triangle, etc. exactly when there are one, two, three, etc. recurrent classes.
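A small sketch of the simplex picture, under my own toy example (not from the answer): a 3-state chain whose state 0 is transient and whose states 1 and 2 are each absorbing. There are two recurrent classes, so $I$ is a line segment between two extreme points, and every invariant vector puts zero mass on the transient state.

```python
import numpy as np

# Hypothetical 3-state chain: state 0 is transient (it leaves for 1 or 2),
# while {1} and {2} are two absorbing (recurrent) classes.
P = np.array([[0.0, 0.5, 0.5],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]])

# Extreme points of the set I of invariant vectors: one per recurrent class.
e1 = np.array([0.0, 1.0, 0.0])
e2 = np.array([0.0, 0.0, 1.0])
assert np.allclose(e1 @ P, e1) and np.allclose(e2 @ P, e2)

# Every convex combination is again invariant, so I is a line segment and
# the invariant vector is not unique; state 0 always carries zero mass.
for c in (0.0, 0.3, 0.7, 1.0):
    nu = c * e1 + (1 - c) * e2
    assert np.allclose(nu @ P, nu)
print("all convex combinations invariant")
```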

If the invariant vector $\pi$ is unique, then there is only one recurrent class and the chain will eventually end up there. The vector $\pi$ necessarily puts zero mass on all transient states. Letting $\phi_n$ be the law of $X_n$, as you say, we have $\phi_n\to \pi$ only if the recurrent class is aperiodic. However, in general we have Cesàro convergence:

$${1\over n}\sum_{j=1}^n \phi_j\to\pi.$$
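The Cesàro statement can be checked numerically on a periodic two-state swap chain (my own illustration): the laws $\phi_j$ oscillate, but their running average converges to $\pi=(1/2,1/2)$.

```python
import numpy as np

# Periodic two-state swap chain: phi_n itself oscillates, but the Cesàro
# averages (1/n) * sum_{j=1}^n phi_j converge to pi = [0.5, 0.5].
P = np.array([[0.0, 1.0],
              [1.0, 0.0]])
phi = np.array([1.0, 0.0])   # start in state 0
running_sum = np.zeros(2)
n = 1000
for _ in range(n):
    phi = phi @ P
    running_sum += phi
cesaro = running_sum / n
print(cesaro)  # [0.5 0.5]
```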

An infinite state space Markov chain need not have any recurrent states, and may have the zero measure as the only invariant measure, finite or infinite. Consider the chain on the positive integers which jumps to the right at every time step.
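A quick simulation of that example (truncating the infinite state space is my own device for illustration): the mass in any fixed window of states drains away, so no invariant probability can exist.

```python
import numpy as np

# The chain on the positive integers that always jumps one step right has
# no recurrent states.  On a large truncation of the state space, the mass
# in any fixed window vanishes, so no invariant probability can exist.
N = 100                       # truncation size (assumption for the demo)
phi = np.zeros(N)
phi[0] = 1.0                  # start at state 1 (index 0)
for _ in range(50):
    phi = np.roll(phi, 1)     # deterministic jump one step to the right
    phi[0] = 0.0              # no mass re-enters from the left
print(phi[:10].sum())         # mass left in states 1..10 after 50 steps: 0.0
```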

Generally, a Markov chain with countable state space has invariant probabilities iff there are positive recurrent classes. If so, every invariant probability vector $\nu$ is a convex combination of the unique invariant vectors $m_j$ corresponding to the positive recurrent classes $j\in J$, i.e.,

$$\nu=\sum_{j\in J} c_j m_j,\qquad c_j\geq 0,\quad \sum_{j\in J}c_j=1.$$

This result is Corollary 3.23 in Wolfgang Woess’s *Denumerable Markov Chains*.

The answers you have given are true at least for finite Markov chains. (My courses did not cover any others, I am afraid. And all references I have are German, so of limited use to you =) ).

The part

> A reducible Markov chain has a non-unique equilibrium distribution iff all states are positive recurrent?

is not true. Consider the Markov chain on the states $0$ and $1$ that goes from $0$ to $1$ with probability $1$ and then stays there. It has a unique equilibrium distribution ($\delta_1$), even though state $0$ is not positive recurrent.
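This counterexample is easy to verify numerically; the sketch below (my own check, not part of the answer) confirms that $\delta_1$ is the only invariant probability.

```python
import numpy as np

# The counterexample: from state 0 jump to state 1 w.p. 1, then stay at 1.
P = np.array([[0.0, 1.0],
              [0.0, 1.0]])

# delta_1 is invariant...
pi = np.array([0.0, 1.0])
assert np.allclose(pi @ P, pi)

# ...and it is the only invariant probability: any pi = (a, 1 - a) maps
# to (0, 1) under P, so invariance forces a = 0.  Check on a grid:
for a in np.linspace(0.0, 1.0, 11):
    cand = np.array([a, 1.0 - a])
    if np.allclose(cand @ P, cand):
        print(cand)  # only [0. 1.] is printed
```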
