Markov Chains

Introduction

A Markov chain is a model describing a sequence of possible events, where the probability of an event occurring depends only on the state of the system after the previous event. On this learn page we will see how Markov chains can be represented, and how they can be used to work out probabilities, long-run proportions and expected values.

Example

"It is known that if it is dry today then there is a 1/3 chance that it will be wet tomorrow, and if it is wet today there is a 3/4 chance it will be wet tomorrow"

This is an example of a Markov chain because the weather tomorrow depends only on the weather today.

Objectives

What sort of things will we be working out in this chapter?

  • What will the weather be like in 5 days, given that it is wet today?
  • Over time what will be the proportion of dry days?
  • What is the expected number of dry days in a row?

We will also be looking at different types of Markov chains. In the example above I represented the Markov chain as a paragraph outlining all of the probabilities; however, there are better ways of representing Markov chains...


Representing Markov Chains

Transition Diagrams

One way of representing Markov chains is by using a Transition diagram. This diagram has a node for each possible state, and arrows between them carrying information about the probabilities of "transitioning" between the states. This is an example of a transition diagram for the example mentioned in the introduction:

"It is known that if it is dry today then there is a 1/3 chance that it will be wet tomorrow, and if it is wet today there is a 3/4 chance it will be wet tomorrow"

This, of course, can be extended to however many states are required; for example, rolling a die would have 6 states (one for each possible "score").

Transition Matrices

We can also represent Markov chains as transition matrices. These allow for calculations, and are the reason situations are often modelled with Markov chains: probabilities that would take hours to find with tree diagrams can be computed in seconds. To carry on with this topic you will need to know at least basic matrix multiplication and how to find inverses, both of which can be found on the Learn pages on matrices.

The same example can be represented as a 2x2 transition matrix, where each column gives the current state and each row gives the next state:

             W     D
    W  [  3/4   1/3  ]
    D  [  1/4   2/3  ]

The probabilities in the "W" or wet column show the probability of staying wet and the probability of being dry the next day, given that today is wet. The same can be said for the "D" or dry column. You may also notice that the probabilities in each column add up to 1. This isn't a coincidence:

Each column of a transition matrix is a probability vector, and so must add up to one.
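
As an illustration, here is a minimal sketch (assuming Python with numpy, which is not part of the original page) of this transition matrix, with a check that each column sums to one:

    import numpy as np

    # Transition matrix for the wet/dry example.
    # Columns give the current state, rows the next state (order: W, D).
    P = np.array([[3/4, 1/3],
                  [1/4, 2/3]])

    # Each column is a probability vector, so the column sums must be 1.
    print(P.sum(axis=0))   # [1. 1.]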

Initial States

The initial state is the state in which a Markov chain begins. This could be one particular state (known with certainty), or any one of the states with given probabilities. The initial state is given as a probability vector.

Let's take the weather example used above. If on the first day we are given that it is dry, the probability vector will look like this:

    [ 0 ]   W
    [ 1 ]   D

Whereas if the only thing you are told is that on the first day there is a 50% chance of it being dry, then the initial state probability vector will look like this:

    [ 0.5 ]   W
    [ 0.5 ]   D

These will be used when calculating probabilities of events happening, given an initial state.
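
As numpy arrays (continuing the sketch above; the variable names are just illustrative), these two initial state vectors would be:

    import numpy as np

    # Initial state vectors, ordered (W, D) to match the matrix above.
    x0_certain = np.array([0.0, 1.0])   # definitely dry on the first day
    x0_unsure  = np.array([0.5, 0.5])   # 50% chance of dry on the first day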

Calculating Probabilities

Questions such as "Given that it was dry the previous day, what is the probability that it is wet in 10 days time?" would take an eternity to answer with a tree diagram. However, the matrix representation makes this very quick and easy: if x0 is the initial state vector and P is the transition matrix, then the state probabilities after n steps are given by P^n x0, so this answer can be read off from P^10 x0.
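
A minimal numpy sketch of that calculation (using np.linalg.matrix_power to form P^10; again an illustration, not the page's original working):

    import numpy as np

    P  = np.array([[3/4, 1/3],
                   [1/4, 2/3]])   # rows/columns ordered W, D
    x0 = np.array([0.0, 1.0])     # dry on the first day

    # State probabilities after 10 steps: P^10 x0.
    x10 = np.linalg.matrix_power(P, 10) @ x0
    print(x10[0])   # probability of wet in 10 days' time, ≈ 0.571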

Steady States
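
One standard way to see the long-run proportions asked about in the objectives ("over time, what will be the proportion of dry days?") is to raise the transition matrix to a large power: every column then approaches the steady-state vector. A sketch, under the same numpy assumption as above:

    import numpy as np

    P = np.array([[3/4, 1/3],
                  [1/4, 2/3]])

    # For large n, every column of P^n approaches the steady-state vector.
    print(np.linalg.matrix_power(P, 100))
    # Both columns approach [4/7, 3/7]: in the long run roughly 4/7 of
    # days are wet and 3/7 are dry, whatever the first day's weather was.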

Expected Values

"Given that on the first day it was wet, how many consecutive wet days are expected?"

We can answer questions like these by considering the probabilities in a Markov chain. Let's take the wet-dry weather scenario, and look at the transition matrix we established earlier on this page:

Given that the first day is wet, what is the probability that NO following days are wet? Well, this is the same as the probability of the Markov chain transitioning from wet to dry! From the transition matrix we can see that this probability is 1/4. What is the probability of only the next day being wet? Well, this is just the probability of the chain staying wet for one more day and then transitioning to dry (these are probabilities we can get from the transition matrix!). This is of course (3/4)(1/4). Applying the same logic we can make the following table, where r is equal to the number of consecutive wet days that follow:

    r        0      1            2                ...   r
    P(R=r)   1/4    (3/4)(1/4)   (3/4)^2 (1/4)    ...   (3/4)^r (1/4)

This is an example of a discrete random variable distribution, therefore we can find the expected value in the following way:

    E(R) = Σ r·P(R = r)     (summing over r = 0, 1, 2, ...)

Applying the formula to the values for our Markov chain:

    E(R) = (1/4)(0) + (3/4)(1/4)(1) + (3/4)^2 (1/4)(2) + (3/4)^3 (1/4)(3) + ...

Factoring out (3/4)(1/4) leaves a recognizable series:

    E(R) = (3/4)(1/4)[1 + 2(3/4) + 3(3/4)^2 + ...]

The bit in the square brackets is equal to the binomial expansion of (1-x)^-2 with x = 3/4:

    (1 - x)^-2 = 1 + 2x + 3x^2 + ...   so   1 + 2(3/4) + 3(3/4)^2 + ... = (1 - 3/4)^-2 = 16

This allows us to substitute in for our infinite series in the square brackets of the expected value, allowing us to evaluate it:

    E(R) = (3/4)(1/4)(16) = 3
This means that the expected number of consecutive wet days following a wet first day is 3.
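
As a quick sanity check, here is a small simulation sketch (plain Python, not from the original page; the estimate should land close to 3):

    import random

    # Number of consecutive wet days that follow a wet first day:
    # each day the weather stays wet with probability 3/4.
    def wet_run_length():
        run = 0
        while random.random() < 3/4:
            run += 1
        return run

    trials = 100_000
    print(sum(wet_run_length() for _ in range(trials)) / trials)   # ≈ 3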

General case: Formula

In general, calling the probability of the event whose expected number of consecutive occurrences we want to calculate alpha (α) gives the following random variable distribution:

    P(R = r) = α^r (1 - α)     for r = 0, 1, 2, ...

Doing the same thing we did before, but with α in place of 3/4, leaves us with the following general formula:

    E(R) = α(1 - α)(1 - α)^-2 = α / (1 - α)
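
Applying this formula to the third objective ("what is the expected number of dry days in a row?") with α = P(dry stays dry) = 2/3, via a small hypothetical helper:

    # Expected number of consecutive further occurrences of an event,
    # where alpha is the probability that the event repeats each step.
    def expected_run(alpha):
        return alpha / (1 - alpha)

    print(expected_run(3/4))   # wet days: 3.0
    print(expected_run(2/3))   # dry days: ≈ 2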

Absorbing States

Reflecting Barriers

Periodicity
