| NOUN | process | Markov process, Markoff process |
|---|---|---|
| Meaning | A simple stochastic process in which the distribution of future states depends only on the present state and not on how it arrived in the present state. | |
| Synonym | Markoff process | |
| Narrower | Markov chain, Markoff chain | A Markov process whose time parameter takes discrete values |
| Broader | stochastic process | A statistical process involving a number of random variables that depend on a variable parameter (usually time) |
| Adjectives | Markovian | Relating to or generated by a Markov process |
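
To make "depends only on the present state" concrete: for a discrete-time Markov chain, P(X_{n+1} = x | X_n, X_{n-1}, …, X_0) = P(X_{n+1} = x | X_n). Below is a minimal Python sketch of such a chain; the two weather states and their transition probabilities are illustrative assumptions, not part of the entry.

```python
import random

# Transition probabilities: TRANSITIONS[current][next]; each row sums to 1.
# The states and probabilities here are made up for illustration.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state: str) -> str:
    """Sample the next state; the distribution depends only on `state`,
    never on earlier history -- the Markov property."""
    states = list(TRANSITIONS[state])
    weights = list(TRANSITIONS[state].values())
    return random.choices(states, weights=weights)[0]

# Simulate a short path through the chain.
state = "sunny"
path = [state]
for _ in range(10):
    state = step(state)
    path.append(state)
print(" -> ".join(path))
```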