| Relation | Terms | Definition |
|---|---|---|
| Meaning | stochastic process | A statistical process involving a number of random variables depending on a variable parameter (which is usually time). |
| Narrower | Markov process, Markoff process | A simple stochastic process in which the distribution of future states depends only on the present state and not on how it arrived in the present state. |
| Narrower | random walk | A stochastic process consisting of a sequence of changes, each of whose characteristics (such as magnitude or direction) is determined by chance. |
| Narrower | stationary stochastic process | A stochastic process in which the distribution of the random variables is the same for any value of the variable parameter. |
| Broader | model, theoretical account, framework | A hypothetical description of a complex entity or process. |
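The terms above can be illustrated with a minimal sketch: a 1-D random walk is a stochastic process indexed by a time-step parameter, and it is also a Markov process, since each next position depends only on the current position, not on the path taken to reach it. The function name and parameters below are illustrative, not taken from the source.

```python
import random

def random_walk(n_steps, seed=0):
    """Simulate a 1-D random walk (illustrative sketch).

    Each step moves +1 or -1, determined by chance; the next position
    depends only on the current position (the Markov property).
    """
    rng = random.Random(seed)  # fixed seed for reproducibility
    position = 0
    path = [position]
    for _ in range(n_steps):
        position += rng.choice([-1, 1])  # step magnitude/direction set by chance
        path.append(position)
    return path

walk = random_walk(10)
```

Note that this walk is not a stationary process: the distribution of the position spreads out as the time step grows, unlike a stationary stochastic process whose distribution is the same at every value of the parameter.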