Markov chain

Nouns

  • (n) a Markov process for which the parameter takes discrete time values
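
As an illustration of this definition (not part of the original entry), the defining property of a discrete-time Markov chain X_0, X_1, X_2, ... can be written as

  P(X_{n+1} = x | X_n = x_n, ..., X_0 = x_0) = P(X_{n+1} = x | X_n = x_n)

i.e. the next state depends only on the current state, not on the earlier history.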

Synonyms

Markoff chain