Definify.com
Markov process
English
Noun
Markov process (plural Markov processes)
- (probability theory) A stochastic process in which the conditional probability distribution of future states depends only on the present state, independent of the path of past states that led to it.
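The definition can be illustrated with a minimal sketch: a hypothetical two-state weather model (the states, transition probabilities, and function names below are invented for illustration). The key point is that the next state is sampled using only the current state, never the earlier history.

```python
import random

# Hypothetical transition probabilities for a two-state Markov process.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state, rng):
    """Sample the next state; note it depends only on `state` (Markov property)."""
    probs = TRANSITIONS[state]
    return rng.choices(list(probs), weights=list(probs.values()))[0]

def simulate(start, n, seed=0):
    """Generate a path of n steps from `start`, using a seeded RNG."""
    rng = random.Random(seed)
    path = [start]
    for _ in range(n):
        path.append(step(path[-1], rng))
    return path

print(simulate("sunny", 5))
```

Because each call to `step` receives only the current state, the simulated path satisfies the Markov property by construction.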
Related terms
- Markov property
- Markov chain