In this lesson we explore the concept of a Markov chain, which is a way of using matrix methods to analyze a process that changes over time according to given probabilities. Applications and examples will be provided.
Introduction to Markov chains
When we study a system that can change over time, we need a way to keep track of those changes. A Markov chain is a particular model for keeping track of systems that change according to given probabilities. As we shall see, a Markov chain may allow one to predict future events, but the predictions become less useful for events farther into the future (much like predictions of the stock market or weather).
This lesson requires prior knowledge of matrix arithmetic.
A state is any particular situation that is possible in the system. For example, if we are studying rainy days, then there are two states:
It's raining today.
It's not raining today.
The system could have many more than two states, but we will stick to two for this small example.
The term Markov chain refers to any system with a fixed set of states and fixed probabilities of moving from each state to any other state (including back to itself).
That's a lot to take in at once, so let's illustrate using our rainy days example. The probabilities for our system might be:
If it rains today (R), then there is a 40% chance it will rain tomorrow, and 60% chance of no rain.
If it doesn't rain today (N), then there is a 20% chance it will rain tomorrow and 80% chance of no rain.
It may help to organize this data in what we call a state diagram.
The left circle represents rain (R), and the right circle represents no rain (N). Arrows indicate the probability of changing state; for example, the arrow from R to N is labeled 0.6 because there is a 60% chance that if it rains today, it will not rain tomorrow.
The Transition Matrix
If a Markov chain has k states, the transition matrix is the k by k matrix (a table of numbers) whose entries record the probability of moving from each state to every other state (in decimal form, rather than as percentages). The rows of the matrix correspond to the current state, and the columns correspond to the next state. For example, the entry in row 1, column 2 records the probability of moving from state 1 to state 2. (Note: the transition matrix could be defined the other way around, but then the formulas would also be reversed.)
Let's build a transition matrix together! First we choose an order for our states. Let's say R always comes before N. That means the first row and first column will concern R while the second row and column will concern N. Remember, rows mean ''from'' and columns mean ''to.''
The transition from R to R is 0.4, so we put 0.4 in the upper left of the matrix. The transition from R to N is 0.6 (upper right). N to R is 0.2 (lower left), and N to N is 0.8 (lower right).
Collecting these values gives the transition matrix:

P = | 0.4  0.6 |
    | 0.2  0.8 |

where the first row and column correspond to R and the second row and column correspond to N.
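The matrix above can be written down directly in code. Here is a minimal sketch using plain Python lists (the variable names are our own choices, not part of the lesson); the check at the end confirms a basic property of every transition matrix: each row must sum to 1, since from any state the system must end up in some state tomorrow.

```python
# Transition matrix for the rain example, in the order (R, N).
# Rows are the current state ("from"), columns are the next state ("to").
P = [
    [0.4, 0.6],  # from R: 40% chance of rain tomorrow, 60% chance of no rain
    [0.2, 0.8],  # from N: 20% chance of rain tomorrow, 80% chance of no rain
]

# Sanity check: every row of a transition matrix must sum to 1.
for row in P:
    assert abs(sum(row) - 1.0) < 1e-9
```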
At the start of the process, we might know exactly which state the system is in (rain or no rain), but from the second step onward, we only know the probabilities of the system being in each state. To keep track of these probabilities, we use a state vector.
A state vector is a vector (list) that records the probabilities that the system is in any given state at a particular step of the process.
For example, if we know for sure that it is raining today, then the state vector for today will be (1, 0). But tomorrow is another day! We only know there's a 40% chance of rain and 60% chance of no rain, so tomorrow's state vector is (0.4, 0.6).
But what about the day after tomorrow? Or the day after that? Using matrix arithmetic, we can find the state vector for any step in the Markov process!
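The matrix arithmetic mentioned above amounts to multiplying the current state vector by the transition matrix: each multiplication advances the chain one step. Here is a small sketch (the function name `step` is our own label, not a standard term) that starts from certain rain today, (1, 0), and computes the state vector for the next few days.

```python
def step(v, P):
    """One step of the Markov chain: the vector-matrix product v * P."""
    k = len(P)
    return [sum(v[i] * P[i][j] for i in range(k)) for j in range(k)]

# Transition matrix from the rain example, order (R, N).
P = [[0.4, 0.6],
     [0.2, 0.8]]

v = [1.0, 0.0]  # it is certainly raining today
for day in range(1, 4):
    v = step(v, P)
    print(f"day {day}: P(rain) = {v[0]:.3f}, P(no rain) = {v[1]:.3f}")
```

Running this reproduces tomorrow's vector (0.4, 0.6), and each further step blends the probabilities a little more, which matches the observation that predictions grow less sharp the farther ahead we look.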