So the final sort of probabilistic model that I'm going to show you is called a Markov chain. Now, a Markov chain is a dynamic model. It's probabilistic and it's discrete. And what it does is model transitions through a discrete set of states in discrete time. Now, that's a bit of a mouthful, so I'm going to immediately give you an example so you can understand what we mean by that. And the example that I'm thinking about is what a public policy person might do when they're trying to understand an individual's employment status. Now, obviously, unemployment and employment are key features of the economy, and we'd like to understand them. We can understand them at a point in time by doing a survey and asking people whether they're employed or not employed. But we're also frequently interested in the dynamics of that process: how long do people stay unemployed, and how likely are they to transition from employment to non-employment? So in this particular example, I'm going to treat time not as a continuous variable but as a discrete variable, and I'm going to consider time in six-month blocks. And I'm going to consider an individual's employment status as being in one of three possible categories. The first one is that you're employed; you've got a job. The second one is that you're unemployed and looking for a job. And the third one is that you're unemployed and you're not looking for a job. Now, it's quite possible to move from one of those states to another, and that's what we mean by state: there are three states here. So you could be employed and you get fired. That's going to take you to being unemployed, and maybe you're upset about being fired and you're going to try and find a job, so you're unemployed and you're looking. Or maybe you've said, that's enough, I'm unemployed and I'm not going to look. So you could go from state one to either state two or state three. Likewise, you could be unemployed and looking for a job and then get employed.
So from one time period to the next you could go from state two to state one. But if you're unemployed and not looking, well, you can't go from state three to state one, because if you're not looking for a job, you're not going to become employed. So you can see there are some transitions that you can make between these three states and others that you can't. Now, what a Markov chain does for you is model the probability of transitions between those three states. So if you have a look at the graphic on the right-hand side here, you can see the three possible states that an individual is in. They're employed, that's state one; they're unemployed and looking, that's state two; or unemployed and not looking, that's state three. And I've drawn arrows that show you the possible transitions. So if you are employed, you could certainly move to the unemployed and looking state, and you could also move to the unemployed and not looking state. It's important to realize that you could also stay employed in the next time period, which is why there's a darker blue arrow from the state back into itself. You can certainly stay in the same state. Likewise, you could move between the looking and not looking states as well. But notice the absence of an arrow between not looking and employed, because if you're not looking, you're not going to be able to transition to the employed state. So that graphically represents the chain. Now, on the left-hand side we have what is called the probability transition matrix, and what that does is provide the probability of moving from one state to another. So let's have a look at the top row in that matrix. It corresponds to the current state being one; in other words, you're employed. And the probabilities tell you the chances of transitioning to each of the states. So the 0.8 is the probability that you transition to state one; in other words, you stay employed.
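As a concrete sketch, the transition matrix just described might be written down as follows. Only the top row (0.8, 0.1, 0.1) and the zero in the bottom-left corner come from the lecture; the remaining entries are made-up placeholders, chosen only so that each row sums to one. This assumes numpy.

```python
import numpy as np

# Probability transition matrix P: entry P[i, j] is the probability of
# moving from state i+1 to state j+1 over one six-month block.
# Only the top row and the bottom-left zero are from the lecture;
# the other entries are illustrative placeholders.
P = np.array([
    [0.8, 0.1, 0.1],   # state 1: employed (values from the lecture)
    [0.4, 0.4, 0.2],   # state 2: unemployed, looking (assumed values)
    [0.0, 0.3, 0.7],   # state 3: unemployed, not looking (only the 0 is from the lecture)
])

# Each row must sum to one: something has to happen next period.
assert np.allclose(P.sum(axis=1), 1.0)
```

Reading across a row gives the full distribution over next-period states; reading down a column tells you which states can feed into a given state.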
So according to this model, 80% of people retain their job over the next six months. And there's a 0.1 chance that you lose your job and move to the looking state, and likewise a 0.1 chance that you lose your job and move to the not looking state. Notice that those probabilities across the row add up to one: something has to happen. Likewise, I've got a set of transition probabilities from state two into states one, two, and three. And if you have a look at state three, notice there's a zero in the bottom left-hand side of the matrix, because if you are unemployed and not looking, there's a zero probability of becoming employed. If you're not looking for a job, you're not going to get a job, and so there can certainly be zeros in this matrix. So that's the idea of a probability transition matrix, and they can be very useful for modeling these probabilistic dynamic processes. Now, this model is called a Markov chain because it satisfies a certain condition called the Markov property, which, more generally, would be understood as a lack-of-memory characterization. And what the Markov property states is that the transition probabilities only depend on the current state, not on prior states. A more elegant way of stating that is: given the present, the future does not depend on the past. And that's definitely an assumption in this particular model, and the assumption may or may not be correct, and so one would want to check that assumption. But in this classic Markov chain, that is a simplifying assumption that is made. So there's a fourth example of a probabilistic model. We've talked about regression models, we've talked about tree models, we've talked about Monte Carlo approaches to solving problems, and we've seen a Markov model here at the end. And so these are all examples of probabilistic models, and they have the potential to be applied to a wide number of business processes.
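To make the lack-of-memory idea concrete, here is a minimal simulation sketch: each draw of the next state uses only the current state, never the earlier history, and chaining two six-month steps by squaring the matrix gives one-year transition probabilities. As before, only the top row of the matrix and the bottom-left zero come from the lecture; the other entries, and the `step` helper, are illustrative assumptions.

```python
import numpy as np

# Six-month transition matrix; only the top row and the bottom-left zero
# are from the lecture, the rest are assumed placeholders.
P = np.array([
    [0.8, 0.1, 0.1],   # employed
    [0.4, 0.4, 0.2],   # unemployed, looking (assumed)
    [0.0, 0.3, 0.7],   # unemployed, not looking (only the 0 is given)
])

rng = np.random.default_rng(0)

def step(state, P, rng):
    """The Markov property in action: the draw depends only on the
    current state (0-indexed), not on how the chain got there."""
    return rng.choice(len(P), p=P[state])

# Simulate one individual over five years (ten six-month blocks),
# starting from employment.
path = [0]
for _ in range(10):
    path.append(step(path[-1], P, rng))
print("states visited:", path)

# Chaining two six-month steps: P @ P holds the one-year transition
# probabilities, e.g. the chance of being employed in a year given
# employment now is 0.8*0.8 + 0.1*0.4 + 0.1*0.0 = 0.68.
P_year = P @ P
print("stay employed over one year:", round(P_year[0, 0], 2))  # 0.68
```

Note that however long the simulation runs, a path can never step directly from state three back to state one, because that entry of the matrix is zero.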
So hopefully, you get the idea there's lots of opportunity to use these ideas in practice.