**Markoff process, Markov process** (noun; lexical domain: process)
| Relation | Term(s) | Gloss |
|---|---|---|
| Meaning | | A simple stochastic process in which the distribution of future states depends only on the present state, not on how the process arrived in that state |
| Synonym | Markov process | |
| Narrower | Markov chain, Markoff chain | A Markov process in which the parameter takes discrete time values |
| Broader | stochastic process | A statistical process involving a number of random variables that depend on a variable parameter (usually time) |
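
The defining condition in the Meaning row can be written out precisely for the discrete-time case listed under Narrower. The following is the standard textbook formulation of the Markov property, added here for illustration rather than taken from the entry itself:

```latex
% Markov property for a discrete-time process (X_n): the conditional
% distribution of the next state depends only on the current state.
\[
  \Pr\bigl(X_{n+1} = x \mid X_n = x_n,\, X_{n-1} = x_{n-1},\, \dots,\, X_0 = x_0\bigr)
  = \Pr\bigl(X_{n+1} = x \mid X_n = x_n\bigr)
\]
```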
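
To make the Markov chain entry concrete, here is a minimal simulation sketch; the two weather states and their transition probabilities are invented purely for illustration:

```python
import random

# Hypothetical two-state Markov chain: states and probabilities are
# illustrative assumptions, not part of the dictionary entry.
STATES = ["sunny", "rainy"]
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},  # next-state distribution given "sunny"
    "rainy": {"sunny": 0.4, "rainy": 0.6},  # next-state distribution given "rainy"
}

def simulate(start, steps):
    """Walk the chain: each step depends only on the current state."""
    path = [start]
    for _ in range(steps):
        current = path[-1]
        weights = [TRANSITIONS[current][s] for s in STATES]
        path.append(random.choices(STATES, weights=weights)[0])
    return path

print(simulate("sunny", 10))
```

Note that `simulate` never inspects anything but the last state in `path`, which is exactly the Markov property restricted to discrete time.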