Definition of Markov chain in US English:

Markov chain

(also Markov model)


  • A stochastic model describing a sequence of possible events in which the probability of each event depends only on the state attained in the previous event.

    • ‘This model represents a Markov chain in which each state is interpreted as the probability that the switch complex is in the corresponding state.’
    • ‘He applied a technique involving so-called Markov chains to calculate the required probabilities over the course of a long game with many battles.’
    • ‘Ecologists have used simple diffusion, correlated random walk, and Markov chain models to describe dispersal data for various insects.’
    • ‘He also took up new topics, writing several papers on probability theory, in particular on Markov chains.’
    • ‘Like the previously discussed models, Markov models have serious limitations.’
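The defining property above — each event's probability depends only on the current state — can be illustrated with a small simulation. The sketch below uses a hypothetical two-state weather chain; the states and transition probabilities are invented for illustration.

```python
import random

# A minimal Markov chain sketch (hypothetical two-state weather model;
# all probabilities are invented for illustration). The key property:
# the next state depends only on the current state, never on earlier ones.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def simulate(start, steps, seed=0):
    """Walk the chain for `steps` transitions starting from `start`."""
    rng = random.Random(seed)
    state, path = start, [start]
    for _ in range(steps):
        # Sample the next state using only the current state's row.
        state = rng.choices(list(TRANSITIONS[state]),
                            weights=list(TRANSITIONS[state].values()))[0]
        path.append(state)
    return path

print(simulate("sunny", 5))
```

Each call to `simulate` looks up only the current state's transition row before sampling, which is exactly the "memoryless" behaviour the definition describes.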


1930s: named after Andrei A. Markov (1856–1922), Russian mathematician.
