# Can Markov Chain state space be continuous?

I looked for a formal definition of a Markov chain and was confused that all the definitions I found restrict the chain's state space to be countable. I don't understand the purpose of such a restriction, and I have a feeling that it does not make sense.

So my question is: can the state space of a Markov chain be a continuum? And if not, why not?

#### Answers

It seems you are thinking of a Markov process rather than a Markov chain.
A Markov chain is usually defined as a Markov process that has a discrete (finite or countable) state space.

“Often, the term Markov chain is used to mean a Markov process which
has a discrete (finite or countable) state-space” (ref)

A Markov process is (usually) a slightly more general thing: it is only required to satisfy the Markov property, which makes sense for uncountable state spaces (and uncountable "time"), or more general supports. So, for example, a continuous random walk is normally considered a Markov process but not a Markov chain.
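As a concrete illustration (not part of the original answer), here is a minimal sketch of such a continuous random walk on the real line: the state space is all of $\mathbb{R}$, yet the process is Markov because each step depends only on the current position. The function name and parameters are my own choices:

```python
import random

def gaussian_random_walk(x0, n_steps, seed=0):
    """Simulate a Markov process on the uncountable state space R:
    each step adds an independent N(0, 1) increment to the current state."""
    rng = random.Random(seed)
    x = x0
    path = [x]
    for _ in range(n_steps):
        x += rng.gauss(0.0, 1.0)  # next state depends only on the current state
        path.append(x)
    return path

path = gaussian_random_walk(0.0, 100)  # 101 real-valued states, almost surely all distinct
```

Since each increment is a continuous random variable, the set of states the process can visit is uncountable, which is exactly why a countable-state definition cannot cover this example.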

Yes, it can. In some quarters the “chain” in Markov chain refers to the discreteness of the time parameter. (A notable exception is the work of K.L. Chung.) The evolution in time of a Markov chain $(X_0,X_1,X_2,\ldots)$ taking values in a measurable state space $(E, {\mathcal E})$ is governed by a one-step transition kernel $P(x,A)$, $x\in E$, $A\in{\mathcal E}$:
$${\bf P}[ X_{n+1}\in A|X_0,X_1,\ldots,X_n] = P(X_n,A).$$
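To make the kernel $P(x,A)$ concrete, here is a hedged sketch (my own example, not from the answer) for the Gaussian random walk on $E=\mathbb{R}$: with $X_{n+1}=X_n+N(0,1)$, the kernel evaluated on a half-line $A=(-\infty,a]$ is $P(x,A)=\Phi(a-x)$, where $\Phi$ is the standard normal CDF. A Monte Carlo simulation of one step agrees with the closed form:

```python
import math
import random

def kernel_prob(x, a):
    """One-step transition kernel P(x, (-inf, a]) for the Gaussian random walk:
    X_{n+1} = X_n + N(0, 1), so P(x, (-inf, a]) = Phi(a - x)."""
    return 0.5 * (1.0 + math.erf((a - x) / math.sqrt(2.0)))

def monte_carlo_prob(x, a, n=100_000, seed=1):
    """Estimate the same probability by simulating one step of the chain n times."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n) if x + rng.gauss(0.0, 1.0) <= a)
    return hits / n

exact = kernel_prob(0.0, 1.0)        # Phi(1), approximately 0.8413
approx = monte_carlo_prob(0.0, 1.0)  # close to the exact value
```

Here $P(x,\cdot)$ is a genuine probability measure on $(\mathbb{R},\mathcal{B}(\mathbb{R}))$ for every $x$, which is all the general definition requires; no countability of $E$ is needed.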
Two fine references for the subject are *Markov Chains* by D. Revuz and *Markov Chains and Stochastic Stability* by S. Meyn and R. Tweedie.