Markov Chain

Introduction

A Markov chain is a sequence of random values x_1, x_2, ... in which the probability distribution of a value x_i depends only on the preceding value x_(i-1). Put another way, it is a random process over a set of possible "states" in which the probability of moving to any given next state is expressible as a probability conditioned on the current state (or, for higher-order chains, on a fixed number of recent states), not on the full history of the process. Any process in which the outcome of one experiment influences only the outcome of the next experiment is a Markov chain.
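A minimal sketch of this idea in Python, using a hypothetical two-state weather model with made-up transition probabilities (the states and numbers are illustrative assumptions, not from the text):

import random

# Hypothetical transition probabilities: each row gives the distribution
# over the next state, conditioned only on the current state.
transitions = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def simulate(start, steps):
    """Walk the chain: each next state depends only on the current one."""
    state = start
    path = [state]
    for _ in range(steps):
        row = transitions[state]
        state = random.choices(list(row), weights=list(row.values()))[0]
        path.append(state)
    return path

print(simulate("sunny", 10))

Note that the simulation never inspects anything but the current state; that locality is the defining (Markov) property.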

A zeroth-order Markov chain means that the probability of being in any state is independent of the previous state. A first-order Markov chain means that the probability of being in a given state depends only on the immediately previous state; a second-order chain depends on the previous two states, and so on. The sketch below illustrates a second-order chain.
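As an illustration of a higher-order chain, here is a sketch of a second-order word model in Python: the next word is drawn from counts keyed on the previous two words. The sample text and the starting pair are invented for the example:

import random
from collections import defaultdict

text = "the cat sat on the mat the cat ran on the mat".split()

# Build the second-order model: map each pair of consecutive words to
# the list of words observed to follow that pair.
model = defaultdict(list)
for a, b, c in zip(text, text[1:], text[2:]):
    model[(a, b)].append(c)

state = ("the", "cat")            # the previous two words
out = list(state)
for _ in range(8):
    followers = model.get(state)
    if not followers:             # dead end: no observed continuation
        break
    nxt = random.choice(followers)
    out.append(nxt)
    state = (state[1], nxt)       # slide the two-word window forward

print(" ".join(out))

A second-order chain over single words is equivalent to a first-order chain whose states are word pairs, which is exactly how the sliding window above treats it.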