Markov Chain

Tags :: Bayesian Statistics

A Markov Chain is a sequence of random variables $X_0, X_1, \dots$ for which the future state is conditionally independent of all past states given the current state. In other words, knowing the current state is enough to determine the probabilities of all future states.

The Markov property can be written as

\[ P(X_{n+1} = j | X_n = i, X_{n-1} = i_{n-1}, \dots, X_0 = i_0) = P(X_{n+1} = j | X_n = i) \]

The figure below shows four examples: an oversimplified weather model, a deterministic die, and two more abstract chains without a complete definition.

![Example Markov chains](markovexample.png)
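As a concrete sketch of the Markov property, the snippet below simulates a hypothetical two-state weather chain (the states and transition probabilities are made up for illustration): each next state is sampled using only the transition-matrix row of the current state, never the earlier history.

```python
import numpy as np

# Hypothetical two-state weather chain: 0 = sunny, 1 = rainy.
# Row i of T gives P(X_{n+1} = j | X_n = i); each row sums to 1.
T = np.array([
    [0.9, 0.1],   # sunny -> sunny 0.9, sunny -> rainy 0.1
    [0.5, 0.5],   # rainy -> sunny 0.5, rainy -> rainy 0.5
])

rng = np.random.default_rng(0)

def simulate(T, start, n_steps):
    """Sample a path X_0, ..., X_n; the next state depends only on the current one."""
    path = [start]
    for _ in range(n_steps):
        current = path[-1]
        path.append(rng.choice(len(T), p=T[current]))
    return path

print(simulate(T, start=0, n_steps=10))
```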

Properties of a Markov chain are defined both for individual states and for the chain as a whole. A state is recurrent if the chain, once there, is guaranteed to return to it (and so visits it over and over); a state is transient if there is a positive probability that the chain leaves it and never returns. A chain is irreducible if it is possible to get from any state to any other state in a finite number of steps, as in the reachability check sketched below.
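A minimal sketch of an irreducibility check, assuming the transition matrix is a NumPy array: every state must be reachable from every other, and reachability within $n-1$ steps ($n$ = number of states) can be read off the matrix $(I + T)^{n-1}$.

```python
import numpy as np

def is_irreducible(T):
    """Irreducible iff every state can reach every other state.

    If j is reachable from i at all, it is reachable in at most n-1 steps,
    so it suffices to check that (I + T)^(n-1) has all entries positive.
    """
    n = len(T)
    reach = np.linalg.matrix_power(np.eye(n) + T, n - 1)
    return bool(np.all(reach > 0))

# The hypothetical weather chain above is irreducible.
T = np.array([[0.9, 0.1],
              [0.5, 0.5]])
print(is_irreducible(T))  # True

# A chain with an absorbing state is not: once in state 1 it never leaves.
T_absorbing = np.array([[0.5, 0.5],
                        [0.0, 1.0]])
print(is_irreducible(T_absorbing))  # False
```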

Transience and recurrence are important for understanding long-run behavior. If a chain has both transient and recurrent states, it will spend some time in the transient states but will eventually remain among the recurrent states forever. So how much time will the chain spend in each state? The answer is given by the stationary distribution.

The stationary distribution \(s\) is a PMF satisfying \(sT = s\), where \(T\) is the transition matrix; that is, a distribution that is left unchanged by the transition matrix. This does not mean the chain stops moving; it means the chain moves in such a way that the long-run fraction of time it spends in each state is the one given by \(s\). The stationary distribution is also called the steady-state distribution.
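A minimal sketch of computing \(s\): since \(sT = s\) says \(s\) is a left eigenvector of \(T\) with eigenvalue 1, it can be found from the eigenvectors of the transpose of \(T\). The hypothetical weather chain from above is reused, and a long simulation shows the empirical visit fractions approaching \(s\).

```python
import numpy as np

# Hypothetical weather chain from above (for illustration only).
T = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# Solve s T = s: s is a right eigenvector of T transposed with eigenvalue 1,
# normalized so its entries sum to 1.
eigvals, eigvecs = np.linalg.eig(T.T)
s = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1))])
s = s / s.sum()
print("stationary distribution:", s)   # approximately [0.833, 0.167]
print("s @ T:", s @ T)                 # unchanged by the transition matrix

# Long-run check: the fraction of time spent in each state approaches s.
rng = np.random.default_rng(0)
state, counts = 0, np.zeros(len(T))
for _ in range(100_000):
    counts[state] += 1
    state = rng.choice(len(T), p=T[state])
print("empirical visit fractions:", counts / counts.sum())
```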


Links to this note