| NOUN | process | Markoff chain, Markov chain | A Markov process for which the parameter is discrete time values. |
|---|---|---|---|
| Meaning | A Markov process for which the parameter is discrete time values. | | |
| Synonym | Markov chain | | |
| Broader | Markov process, Markoff process | A simple stochastic process in which the distribution of future states depends only on the present state and not on how it arrived in the present state. | |
| Spanish | cadena de Márkov | | |
| Catalan | cadena de Màrkov | | |
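The defining property above (future states depend only on the present state, at discrete time steps) can be illustrated with a minimal sketch. The two-state weather chain, its transition probabilities, and the `step` helper below are illustrative assumptions, not part of the entry.

```python
import random

# Hypothetical two-state chain used only for illustration.
# Each row gives the probabilities of the next state, conditioned
# solely on the current state (the Markov property), advanced in
# discrete time steps (making it a Markov chain).
transition = {
    "Sunny": {"Sunny": 0.8, "Rainy": 0.2},
    "Rainy": {"Sunny": 0.4, "Rainy": 0.6},
}

def step(current):
    """Draw the next state using only the current state's row."""
    r = random.random()
    cumulative = 0.0
    for nxt, p in transition[current].items():
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point rounding

state = "Sunny"
path = [state]
for _ in range(10):  # ten discrete time steps
    state = step(state)
    path.append(state)
print(" -> ".join(path))
```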