Markov chain
English
Noun
Markov chain (plural Markov chains)
- (probability theory) A discrete-time stochastic process with the Markov property.
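Stated formally (a minimal sketch of the defining property, using standard notation that is not part of the original entry): for a process $(X_n)_{n \ge 0}$ taking values in a state space $S$, the Markov property requires

$$\Pr(X_{n+1} = x \mid X_n = x_n, X_{n-1} = x_{n-1}, \ldots, X_0 = x_0) = \Pr(X_{n+1} = x \mid X_n = x_n)$$

for all $n \ge 0$ and all states $x, x_0, \ldots, x_n \in S$; that is, the conditional distribution of the next state depends only on the present state, not on the earlier history.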