Mathematical Modeling with Markov Chains and Stochastic Methods
A stochastic model is a tool that you can use to estimate probable outcomes when one or more model variables changes randomly. A Markov chain — also called a discrete-time Markov chain — is a stochastic process that acts as a mathematical method to chain together a series of randomly generated variables representing the present state in order to model how changes in those present-state variables affect future states.
Imagine that you love to travel but that you travel only to places that are a) tropical paradises, b) ultramodern cities, or c) majestic mountain regions. When choosing where to travel next, you always make your decisions according to the following rules:
You travel exactly once every two months.
If you travel somewhere tropical today, next you will travel to an ultramodern city (with a probability of 7/10) or to a place in the mountains (with a probability of 3/10), but you will not travel to another tropical paradise next.
If you travel to an ultramodern city today, you will travel next to a tropical paradise or a mountainous region with equal probability, but definitely not to another ultramodern city.
If you travel to the mountains today, you will travel next to a tropical paradise (with a probability of 7/10), an ultramodern city (with a probability of 2/10), or a different mountainous region (with a probability of 1/10).
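The three rules above can be collected into a transition matrix, where each row gives the probabilities of moving from one destination type to each of the others. A minimal sketch in NumPy (the state ordering tropical, city, mountains is an arbitrary choice for illustration):

```python
import numpy as np

# Transition matrix for the travel scenario.
# Rows = today's destination, columns = next destination,
# in the (assumed) order: tropical, city, mountains.
P = np.array([
    [0.0, 0.7, 0.3],   # from a tropical paradise
    [0.5, 0.0, 0.5],   # from an ultramodern city
    [0.7, 0.2, 0.1],   # from the mountains
])

# Each row is a probability distribution over next states,
# so every row must sum to 1.
print(P.sum(axis=1))  # [1. 1. 1.]
```

Note that the zeros on the diagonal of the first two rows encode the "never repeat a tropical paradise or an ultramodern city" rules, while the 1/10 in the last row allows a repeat visit to the mountains.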
Because your choice of where to travel next depends solely on where you travel today and not on where you’ve traveled in the past, you can use a special kind of statistical model known as a Markov chain to model your destination decision making. What’s more, you could use this model to generate statistics to predict how many of your future vacation days you will spend in a tropical paradise, a mountainous region, or an ultramodern city.
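Because each next destination depends only on the current one, simulating the chain is a simple loop: look up the current state's row in the transition matrix and draw the next state from it. A short sketch under the same assumed state ordering (the seed and helper name are illustrative):

```python
import numpy as np

states = ["tropical", "city", "mountains"]
P = np.array([
    [0.0, 0.7, 0.3],
    [0.5, 0.0, 0.5],
    [0.7, 0.2, 0.1],
])
rng = np.random.default_rng(42)  # fixed seed, for reproducibility

def simulate(start, n_trips):
    """Walk the chain: each next destination depends only on the current one."""
    current = states.index(start)
    path = [start]
    for _ in range(n_trips):
        current = rng.choice(3, p=P[current])  # sample from the current row
        path.append(states[current])
    return path

print(simulate("tropical", 5))  # e.g. a 6-stop itinerary starting tropical
```

Running many such simulated itineraries and tallying the visits is one way to estimate how often each destination type comes up in the long run.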
Looking a little closer at what’s going on here, the above-described scenario represents both a stochastic model and a Markov chain method. The model includes one or more random variables and shows how changes in these variables affect the predicted outcomes. In Markov methods, future states must depend only on the value of the present state and be conditionally independent of all past states.
You can use Markov chains as a data science tool by building a model that generates predictive estimates for the value of future data points based solely on what you know about the value of the current data points in a dataset.
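Those predictive estimates fall out of the transition matrix directly: if a row vector holds today's state probabilities, multiplying by the matrix k times gives the distribution k steps ahead. A sketch using the hypothetical travel matrix from earlier:

```python
import numpy as np

# Assumed transition matrix (state order: tropical, city, mountains).
P = np.array([
    [0.0, 0.7, 0.3],
    [0.5, 0.0, 0.5],
    [0.7, 0.2, 0.1],
])

# You know today's trip is tropical, so all probability mass starts there.
v = np.array([1.0, 0.0, 0.0])

# The distribution over destinations k trips ahead is v @ P^k.
for k in range(1, 4):
    v = v @ P
    print(f"trip {k}: {v.round(3)}")
```

After one step the vector is simply the "tropical" row of the matrix, [0, 0.7, 0.3]; each further multiplication spreads the probability mass according to the rules.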
Markov chains are extremely useful in modeling a variety of real-world processes. They’re commonly used in stock-market exchange models, in financial asset-pricing models, in speech-to-text recognition systems, in webpage search and rank systems, in thermodynamic systems, in gene-regulation systems, in state-estimation models, for pattern recognition, and for population modeling.
Markov chains also underpin an important class of methods known as Markov chain Monte Carlo (MCMC). Under mild conditions, a Markov chain will eventually reach a steady state — a long-term set of probabilities for the chain’s states. You can use this characteristic to derive probability distributions and then sample from those distributions by using Monte Carlo sampling to generate long-term estimates of future states.
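For the travel chain, that steady state can be approximated by power iteration: starting from any distribution, repeated multiplication by the transition matrix converges to a vector pi that no longer changes, i.e. pi = pi @ P. A minimal sketch, again assuming the earlier state ordering:

```python
import numpy as np

# Assumed transition matrix (state order: tropical, city, mountains).
P = np.array([
    [0.0, 0.7, 0.3],
    [0.5, 0.0, 0.5],
    [0.7, 0.2, 0.1],
])

# Power iteration: repeatedly applying P drives any starting
# distribution toward the steady state pi, where pi = pi @ P.
pi = np.array([1.0, 0.0, 0.0])
for _ in range(200):
    pi = pi @ P

print(pi.round(4))               # long-run share of trips per destination
print(np.allclose(pi, pi @ P))   # True: pi is stationary
```

The resulting vector is the long-run fraction of trips spent at each destination type — exactly the kind of long-term estimate the paragraph above describes.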