| NOUN | process | Markov chain, Markoff chain | a Markov process for which the parameter is discrete time values |
|---|---|---|---|
| Meaning | A Markov process for which the parameter is discrete time values. | |
|---|---|---|
| Synonym | Markoff chain | |
| Broader | Markov process, Markoff process | A simple stochastic process in which the distribution of future states depends only on the present state and not on how it arrived in the present state. |
| Spanish | cadena de Márkov | |
| Catalan | cadena de Màrkov | |
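
The "Broader" entry above describes the Markov property: the distribution of future states depends only on the present state, not on how the process arrived there; a Markov chain is the discrete-time case, where the process moves step by step. The sketch below illustrates this with a hypothetical two-state weather model; the states, probabilities, and function names are illustrative assumptions, not part of the dictionary entry.

```python
import random

# A minimal sketch of a discrete-time Markov chain (illustrative states and
# probabilities only). The Markov property: the next state is drawn from a
# distribution that depends only on the current state, not on the path taken
# to reach it.

# Hypothetical transition matrix: keys are current states, values give the
# probability of moving to each possible next state.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state: str) -> str:
    """Draw the next state using only the current state's row of TRANSITIONS."""
    next_states = list(TRANSITIONS[state].keys())
    weights = list(TRANSITIONS[state].values())
    return random.choices(next_states, weights=weights, k=1)[0]

def simulate(start: str, n_steps: int) -> list[str]:
    """Generate a chain of states indexed by discrete time 0, 1, 2, ..."""
    chain = [start]
    for _ in range(n_steps):
        chain.append(step(chain[-1]))
    return chain

if __name__ == "__main__":
    print(simulate("sunny", 10))
```

Each call to `step()` reads only the current state's row of the transition matrix, which is exactly the "depends only on the present state" condition in the definition, with the step index playing the role of the discrete time parameter.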