Objectives

This is an example of a Markov chain because the weather tomorrow depends only on the weather today. What sort of things will we be working out in this chapter?
- What will the weather be like in 5 days, given that it is wet today?
- Over time, what proportion of days will be dry?
- What is the expected number of dry days in a row?
We will also be looking at different types of Markov chains. In the example above I represented the Markov chain as a paragraph and spelled out all the probabilities in words; however, there are better ways of representing Markov chains...
Representing Markov Chains
This, of course, can be extended to however many states are required; for example, rolling a die would have 6 states (one for each possible "score").
The same example can be represented as a 2x2 transition matrix:
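Since the actual probabilities from the example are not reproduced here, a sketch of what such a transition matrix might look like, using hypothetical values (each row holds the probabilities of moving from today's state to each possible state tomorrow, so each row must sum to 1):

```python
import numpy as np

# Hypothetical transition probabilities -- the real values come from
# the weather example earlier in the chapter.
# Rows = today's state, columns = tomorrow's state, order = [dry, wet]
P = np.array([[0.8, 0.2],   # dry today -> dry / wet tomorrow
              [0.4, 0.6]])  # wet today -> dry / wet tomorrow

# Each row is a probability distribution, so the rows sum to 1
print(P.sum(axis=1))  # -> [1. 1.]
```

The convention used here (rows are the current state) means we multiply a row vector of current-state probabilities on the left of the matrix; some texts use the transpose instead.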
Let's take the weather example (used above). If on the first day we are given that it is dry the probability vector will look like this:
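With the state order [dry, wet], "given that it is dry" means all the probability mass sits on the dry state. A minimal sketch:

```python
import numpy as np

# Day 0 is known to be dry, so the probability vector (state order
# [dry, wet]) puts probability 1 on "dry" and 0 on "wet":
v0 = np.array([1.0, 0.0])
```

Multiplying this vector by the transition matrix once gives the distribution over tomorrow's weather; multiplying repeatedly pushes the forecast further into the future.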
We can answer questions like these by considering the probabilities in a Markov chain. Let's take the wet-dry weather scenario and look at the transition matrix we established earlier on this page:
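Using the same hypothetical numbers as before (the chapter's actual values may differ), the objectives listed at the top can each be computed with a few lines of NumPy: the 5-day forecast is the initial vector times the 5th power of the matrix, the long-run proportion of dry days is the stationary distribution the powers converge to, and the expected length of a run of dry days is geometric with parameter P(dry -> wet):

```python
import numpy as np

# Hypothetical transition matrix, state order [dry, wet]
P = np.array([[0.8, 0.2],
              [0.4, 0.6]])

# "What will the weather be like in 5 days, given that it is wet today?"
v0 = np.array([0.0, 1.0])                 # wet today
v5 = v0 @ np.linalg.matrix_power(P, 5)    # distribution on day 5

# "Over time, what proportion of days will be dry?"
# Repeated multiplication converges to the stationary distribution
v_long = v0 @ np.linalg.matrix_power(P, 100)

# "What is the expected number of dry days in a row?"
# A dry run ends when we move dry -> wet, so its length is geometric
# with success probability P[0, 1], giving expectation 1 / P[0, 1]
expected_dry_run = 1 / P[0, 1]

print(v5, v_long, expected_dry_run)
```

With these particular numbers the chain settles to about two-thirds dry days regardless of today's weather, and a dry spell lasts 5 days on average; different transition probabilities would of course give different answers.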