Markov process

English

Noun

Markov process (plural Markov processes)

  1. (probability theory) A stochastic process in which the conditional probability distribution of future states depends only on the present state, not on the sequence of past states that led to it.
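     Formally, for a discrete-time process with states X_0, X_1, X_2, … (notation chosen here for illustration; it is not part of the original entry), the defining Markov property can be stated as

         P(X_{n+1} = x \mid X_n = x_n, \dots, X_0 = x_0) = P(X_{n+1} = x \mid X_n = x_n).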
