Andrei Andreevich Markov (1856–1922)
It is profoundly important to read around the area that you are working on; even more so in mathematics. Theorems that you see in texts are usually the result of arduous efforts, long-drawn fights between mathematicians, episodes of extreme mental turmoil (Cantor, for example), family feuds (the Bernoullis), or open-air competitions (the solutions to the cubic and quartic polynomials).
It is important for anyone building stochastic models to have a thorough understanding of Markov chains, as they typically lead to random walks and Brownian motion. One usually comes across some form of limit theorem in elementary courses on probability or statistics. In a crude form, one has an intuitive understanding that "if you take a large collection of numbers and average them, the average converges to some number." It is tucked away in most books under a title containing the words "law of large numbers." However, you miss the entire story if you skip the historical development of this concept and, more importantly, the application of this law to dependent variables. Almost all the forms of the weak law that one usually comes across carry an "independent and identically distributed" assumption. This is where Markov's work becomes important: roughly a century ago he generalized the weak law to dependent variables. Historical developments such as these are often not highlighted in math books, and one needs to read around the material to appreciate the immense contributions of these mathematicians.
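The crude intuition above can be made concrete with a short simulation. This is a minimal sketch (the function name and seed are my own, not from the note): the average of many independent coin flips concentrates around the true success probability, which is the weak law in its simplest, i.i.d. form.

```python
import random

def running_average_of_flips(n, p=0.5, seed=42):
    """Average of n independent Bernoulli(p) draws.

    By the weak law of large numbers, this sample average
    concentrates around p as n grows.
    """
    rng = random.Random(seed)
    flips = [1 if rng.random() < p else 0 for _ in range(n)]
    return sum(flips) / n

# As n grows, the averages cluster ever more tightly around 0.5.
for n in (100, 10_000, 1_000_000):
    print(n, running_average_of_flips(n))
```

Running this with increasing `n` shows the fluctuation around 0.5 shrinking, which is exactly the convergence in probability the weak law asserts.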
This brief on the life of Markov is elegantly written. The takeaways from this note are:
- Markov's animosity towards his colleague Pavel Alekseevich Nekrasov was a key reason for the development of Markov chains; he desperately wanted to prove Nekrasov wrong.
- Markov was a student of Chebyshev, the mathematician behind the famous Chebyshev inequality.
- Markov emphasized problem-solving as the way to understand various aspects of mathematics.
- Prelude to Markov's work on the extension of the weak law:
  - 1713 – Jacob Bernoulli proved it for independent binary random variables.
  - 1837 – Poisson extended it to variables where the probabilities of success and failure depend on the trial number.
  - 1867 – Pafnuty Chebyshev's paper "On Mean Values" generalized the weak law of large numbers to any sequence of independent random variables with bounded second moments.
  - Markov generalized the law to cases where the variances do not exist.
  - 1909 – Markov extended the weak law to dependent variables.
- Markov paid extraordinary attention to detail and looked down upon statistics; he always held that mathematics stood on a far higher pedestal. Over the years, though, his correspondence with the statistician Chuprov changed his mindset.
The note concludes:
Markov proved that the independence of random variables was not a necessary condition for the validity of the weak law of large numbers and the central limit theorem. He introduced a new sequence of dependent variables, called a chain, as well as a few basic concepts of chains such as transition probabilities, irreducibility and stationarity. His ideas were taken up and developed further by scientists around the world, and now the theory of Markov chains is one of the most powerful tools for analyzing various phenomena of the world.
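The concepts named in the conclusion can be seen in a tiny example. The sketch below (my own illustration, not from the note; the transition probabilities are arbitrary) simulates a two-state chain. Successive states are clearly dependent, yet the long-run fraction of time spent in state 1 still converges, to the stationary probability rather than an i.i.d. mean, which is the essence of Markov's extension of the weak law.

```python
import random

# Hypothetical two-state chain with states 0 and 1 and transition
# probabilities P(0 -> 1) = 0.3 and P(1 -> 0) = 0.6.  Its stationary
# distribution is pi = (0.6/0.9, 0.3/0.9) = (2/3, 1/3), so the
# long-run fraction of time in state 1 approaches 1/3 even though
# consecutive states are dependent.
LEAVE_PROB = {0: 0.3, 1: 0.6}  # probability of leaving the current state

def fraction_of_time_in_state_1(n_steps, seed=0):
    """Simulate the chain and return the fraction of steps spent in state 1."""
    rng = random.Random(seed)
    state, visits = 0, 0
    for _ in range(n_steps):
        if rng.random() < LEAVE_PROB[state]:
            state = 1 - state  # jump to the other state
        visits += state
    return visits / n_steps

print(fraction_of_time_in_state_1(1_000_000))  # close to 1/3
```

The chain is irreducible (each state is reachable from the other), which is what guarantees a unique stationary distribution and the convergence of the time average.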
Once you read this note, I am certain that you will start looking at Markov chains with awe and admiration. The very practice of simulating random variables has undergone profound changes with the application of Markov chains.