
### Introduction

#### Example

*"It is known that if it is dry today then there is a 1/3 chance that it will be wet tomorrow, and if it is wet today there is a 3/4 chance it will be wet tomorrow"*

#### Objectives

This is an example of a Markov chain because the weather tomorrow only depends on the weather today. What sort of things will we be working out in this chapter?

* *What will the weather be like in 5 days, given that it is wet today?*
* *Over time, what will be the proportion of dry days?*
* *What is the expected number of dry days in a row?*

We will also be looking at different types of Markov chains. In the example above I represented the Markov chain as a paragraph and spelled out all the probabilities; however, there are better ways of representing Markov chains...

### Representing Markov Chains

#### Transition Diagrams

One of the clearest ways is a **transition diagram**. This diagram has a node for each possible state, and arrows between them carrying information about the probabilities of *"transitioning"* between the states. For the example mentioned in the introduction,

*"It is known that if it is dry today then there is a 1/3 chance that it will be wet tomorrow, and if it is wet today there is a 3/4 chance it will be wet tomorrow"*

the diagram has two nodes, Dry and Wet: an arrow from Dry to Wet labelled 1/3, a loop from Dry back to itself labelled 2/3, an arrow from Wet to Dry labelled 1/4, and a loop from Wet back to itself labelled 3/4.

This, of course, can be extended to however many states are required; rolling a die, for example, would need 6 states (one for each possible "score").

#### Transition Matrices

The same example can be represented as a 2x2 transition matrix, with one column for each of today's states (Dry, then Wet) and one row for each of tomorrow's states:

$$P = \begin{pmatrix} 2/3 & 1/4 \\ 1/3 & 3/4 \end{pmatrix}$$

*Each column of a transition matrix is a probability vector, and thus must add up to one.*

### Initial States

The **initial state** is the state in which a Markov chain begins. This could be one particular state (certain), or it could be any one of the states, each with a given probability. Either way, the initial state is given as a probability vector.

Let's take the weather example used above. If on the first day we are given that it is dry, the probability vector will look like this:

$$x_0 = \begin{pmatrix} 1 \\ 0 \end{pmatrix}$$

(the first entry is the probability of Dry, the second of Wet).
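As a concrete sketch (the Dry-then-Wet ordering is an assumption to match the column convention used on this page), the transition matrix and initial probability vector can be written with NumPy, and stepping forward one day is then a matrix-vector product:

```python
import numpy as np

# Columns are today's state (Dry, Wet); rows are tomorrow's state.
# Each column is a probability vector, so each column sums to 1.
P = np.array([[2/3, 1/4],
              [1/3, 3/4]])

# Initial state: certainly dry on day 0 -> probability vector (1, 0).
x0 = np.array([1.0, 0.0])

# Distribution for tomorrow: multiply the transition matrix by the vector.
x1 = P @ x0
print(x1)  # [2/3, 1/3] -> 2/3 chance dry, 1/3 chance wet tomorrow
```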

### Calculating Probabilities

Answering a question like the one below with a tree diagram would take an eternity, working out the probabilities branch by branch. Using the matrix method of representing Markov chains, however, makes this very quick and easy: each multiplication by the transition matrix steps the probability vector forward one day, so multiplying the initial vector by the matrix ten times gives the distribution ten days later.
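A quick sketch of this ten-step computation (assuming the Dry/Wet matrix established above), using a matrix power rather than ten separate multiplications:

```python
import numpy as np
from numpy.linalg import matrix_power

P = np.array([[2/3, 1/4],   # columns: today Dry, today Wet
              [1/3, 3/4]])  # rows: tomorrow Dry, tomorrow Wet

x0 = np.array([1.0, 0.0])   # day 0: certainly dry

# Ten days later: multiply by P ten times, i.e. by P^10.
x10 = matrix_power(P, 10) @ x0
print(f"P(wet in 10 days | dry today) = {x10[1]:.4f}")
```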

**"Given that it was dry the previous day, what is the probability that it is wet in 10 days time?"**

### Steady States
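A steady state is a probability vector that the transition matrix leaves unchanged (Px = x); it gives the long-run proportions asked about in the objectives. A minimal sketch, assuming the Dry/Wet matrix above, finds it by repeated multiplication until the vector settles:

```python
import numpy as np

P = np.array([[2/3, 1/4],
              [1/3, 3/4]])

# Start from any probability vector and multiply by P until it stops changing.
x = np.array([1.0, 0.0])
for _ in range(200):
    x = P @ x

print(x)  # approaches (3/7, 4/7): 3/7 of days dry, 4/7 wet in the long run
```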

### Expected Values

*"Given that on the first day it was wet, how many consecutive wet days are expected?"*

We can answer questions like these by considering the probabilities in a Markov chain. Let's take the wet-dry weather scenario, and look at the transition matrix we established earlier on in this page:

$$P = \begin{pmatrix} 2/3 & 1/4 \\ 1/3 & 3/4 \end{pmatrix}$$

What is the probability that **NO** following days are wet? Well, this is the same as the probability of the Markov chain transitioning from wet to dry! From the transition matrix we can see that this probability is **1/4**. What is the probability of only the next day being wet? Well, this is just the probability of the next day being wet and the day after dry (both probabilities we can get from the transition matrix!). This is of course **(3/4)(1/4)**. Applying the same logic we can make the following table, where **r** is equal to the number of consecutive wet days following the first:

| r | 0 | 1 | 2 | ... | r |
| --- | --- | --- | --- | --- | --- |
| P(R = r) | 1/4 | (3/4)(1/4) | (3/4)²(1/4) | ... | (3/4)ʳ(1/4) |
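This distribution can be checked with a quick Monte Carlo sketch (`consecutive_wet_days` is a hypothetical helper, assuming the 3/4 stay-wet probability from the example): starting from a wet day, count how many following days stay wet in a row.

```python
import random

random.seed(0)

def consecutive_wet_days(p_stay_wet=3/4):
    """Given a wet first day, count how many FOLLOWING days are wet in a row."""
    r = 0
    while random.random() < p_stay_wet:  # each day stays wet with prob 3/4
        r += 1
    return r

samples = [consecutive_wet_days() for _ in range(100_000)]

# P(R = 0) should be close to 1/4, and P(R = 1) close to (3/4)(1/4).
print(sum(s == 0 for s in samples) / len(samples))
print(sum(s == 1 for s in samples) / len(samples))
```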

This is a **discrete random variable** distribution, therefore we can find the expected value in the following way:

$$E[R] = \sum_{r=0}^{\infty} r \left(\frac{3}{4}\right)^r \left(\frac{1}{4}\right)$$

Factoring out **(3/4)(1/4)** leaves a recognizable series:

$$E[R] = \frac{3}{4} \cdot \frac{1}{4} \sum_{r=1}^{\infty} r \left(\frac{3}{4}\right)^{r-1}$$

The sum is the binomial expansion of **(1-x)^-2** with **x=3/4**, so

$$E[R] = \frac{3}{16} \left(1 - \frac{3}{4}\right)^{-2} = \frac{3}{16} \cdot 16 = 3$$

So the expected number of consecutive **wet** days following the first, given the initial condition that the weather is wet, is **3 days**.

#### General case: Formula

Replacing the probability 3/4 of a wet day being followed by another wet day with a general probability **(α)** gives the following random variable distribution:

$$P(R = r) = \alpha^r (1 - \alpha)$$

Summing the series exactly as before, with **(α)** in place of 3/4, leaves us with the following general formula:

$$E[R] = \frac{\alpha}{1 - \alpha}$$
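The general formula can be wrapped in a small (hypothetical) helper and sanity-checked against the wet-weather value worked out above:

```python
def expected_consecutive(alpha: float) -> float:
    """Expected number of consecutive following days in the same state,
    E[R] = alpha / (1 - alpha), where alpha is the stay-put probability."""
    return alpha / (1 - alpha)

print(expected_consecutive(3/4))  # 3.0, matching the wet-weather example
print(expected_consecutive(1/2))  # 1.0
```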