**Markov Models**

**1. Description:**

In probability theory, a Markov model is a stochastic model that assumes the Markov property. Generally, this assumption enables reasoning and computation with the model that would otherwise be intractable.

**2. Key Points:**

1. Set of states: {s_1, s_2, …, s_N}.

2. The process moves from one state to another, generating a sequence of states s_{i1}, s_{i2}, …, s_{ik}, …

3. Markov chain property: the probability of each subsequent state depends only on the previous state: P(s_{ik} | s_{i1}, …, s_{ik−1}) = P(s_{ik} | s_{ik−1}).

4. To define a Markov model, the following probabilities have to be specified: transition probabilities a_{ij} = P(s_j | s_i) and initial probabilities π_i = P(s_i).

**3. Example of Markov Model:**

1. Two states: ‘Rain’ and ‘Dry’.

2. Transition probabilities: P(‘Rain’|‘Rain’) = 0.3, P(‘Dry’|‘Rain’) = 0.7, P(‘Rain’|‘Dry’) = 0.2, P(‘Dry’|‘Dry’) = 0.8.

3. Initial probabilities: say P(‘Rain’) = 0.4, P(‘Dry’) = 0.6.
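The example model above can be written down directly as a small data structure. This is a sketch; the dictionary layout and variable names are my own choices, not from the source:

```python
# Weather Markov model from the example above (names are illustrative).
# transition[i][j] = P(next state is j | current state is i)
transition = {
    "Rain": {"Rain": 0.3, "Dry": 0.7},
    "Dry":  {"Rain": 0.2, "Dry": 0.8},
}

# initial[i] = P(first state is i)
initial = {"Rain": 0.4, "Dry": 0.6}

# Sanity check: every probability distribution must sum to 1.
for dist in list(transition.values()) + [initial]:
    assert abs(sum(dist.values()) - 1.0) < 1e-9
```

Each row of `transition` is the conditional distribution over next states given the current state, which is why every row must sum to 1.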

**4. Calculation of Sequence Probability:**

1. By the Markov chain property, the probability of a state sequence can be found by the formula: P(s_{i1}, s_{i2}, …, s_{ik}) = P(s_{ik} | s_{ik−1}) P(s_{ik−1} | s_{ik−2}) … P(s_{i2} | s_{i1}) P(s_{i1}).

2. Suppose we want to calculate a probability of a sequence of states in our example, {‘Dry’,’Dry’,’Rain’,Rain’}. P({‘Dry’,’Dry’,’Rain’,Rain’} ) = P(‘Rain’|’Rain’) P(‘Rain’|’Dry’) P(‘Dry’|’Dry’) P(‘Dry’)= 0.3*0.2*0.8*0.6