Definition of Markov chain in English:

Markov chain

(also Markov model)

Pronunciation: /ˈmärˌkôf/, /-ˌkôv/

noun

Statistics
  • A stochastic model describing a sequence of possible events in which the probability of each event depends only on the state attained in the previous event (a minimal simulation sketch follows the examples below).

    • ‘Like the previously discussed models, Markov models have serious limitations.’
    • ‘He also took up new topics, writing several papers on probability theory, in particular on Markov chains.’
    • ‘This model represents a Markov chain in which each state is interpreted as the probability that the switch complex is in the corresponding state.’
    • ‘He applied a technique involving so-called Markov chains to calculate the required probabilities over the course of a long game with many battles.’
    • ‘Ecologists have used simple diffusion, correlated random walk, and Markov chain models to describe dispersal data for various insects.’
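
To make the definition concrete, here is a minimal sketch in Python of a two-state Markov chain. The states ("sunny"/"rainy") and the transition probabilities are illustrative assumptions, not part of the dictionary entry.

    import random

    # Transition probabilities for a hypothetical two-state weather chain.
    # Each row gives P(next state | current state); each row sums to 1.
    TRANSITIONS = {
        "sunny": {"sunny": 0.8, "rainy": 0.2},
        "rainy": {"sunny": 0.4, "rainy": 0.6},
    }

    def step(state):
        # Sample the next state using only the current state.
        probs = TRANSITIONS[state]
        return random.choices(list(probs), weights=list(probs.values()))[0]

    def simulate(start, n):
        # Generate a length-n sequence of states starting from `start`.
        states = [start]
        for _ in range(n - 1):
            states.append(step(states[-1]))
        return states

    print(simulate("sunny", 10))

Note that step() consults only the current state; no earlier history enters the calculation, which is exactly the property the definition describes.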

Origin

Mid 20th century: named after Andrei A. Markov (1856–1922), Russian mathematician.
