Markov process

noun

Mathematics
  • Any stochastic process for which the probabilities, at any one time, of the different future states depend only on the existing state and not on how that state was arrived at.
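  As an illustration, assuming a discrete-time process $X_0, X_1, X_2, \ldots$ (the entry itself covers the general case), this property can be written

    $P(X_{n+1} = x \mid X_n = x_n, X_{n-1} = x_{n-1}, \ldots, X_0 = x_0) = P(X_{n+1} = x \mid X_n = x_n)$

  i.e. conditioning on the entire history gives the same distribution over the next state as conditioning on the current state alone.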

Origin

1930s; earliest use found in Transactions of the American Mathematical Society. After German Markoffsche Prozess.