Markoff chain

Nouns

  • (n) a Markov process for which the parameter is discrete time values
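
The definition describes a discrete-time Markov chain: the process changes state at integer time steps, and the next state depends only on the current state. A minimal sketch in Python, assuming a hypothetical two-state weather model with illustrative transition probabilities (not part of the entry):

    import random

    # transition[s][t]: probability of moving from state s to state t
    transition = {
        "sunny": {"sunny": 0.8, "rainy": 0.2},
        "rainy": {"sunny": 0.4, "rainy": 0.6},
    }

    def step(state):
        """Sample the next state from the current state's transition row."""
        row = transition[state]
        return random.choices(list(row), weights=list(row.values()))[0]

    state = "sunny"
    for t in range(5):  # the time parameter takes discrete values 0, 1, 2, ...
        state = step(state)
        print(t, state)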

Synonyms

Markov chain

Words in close proximity