Markov chain


English

Noun

Markov chain (plural Markov chains)

  1. (probability theory) A discrete-time stochastic process with the Markov property; see the formulation below.
    • 2004 July 27, F. Keith Barker et al., “Phylogeny and diversification of the largest avian radiation”, in PNAS, page 11040, column 2:
      The probability density of the Bayesian posterior was estimated by Metropolis-coupled Markov chain Monte Carlo, with multiple incrementally heated chains.
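
The Markov property referenced in the definition is standardly formulated as follows (a general statement of the property, not part of the cited quotation): for a discrete-time process $X_0, X_1, X_2, \ldots$,

$$\Pr(X_{n+1} = x \mid X_n = x_n, X_{n-1} = x_{n-1}, \ldots, X_0 = x_0) = \Pr(X_{n+1} = x \mid X_n = x_n)$$

That is, the conditional distribution of the next state depends only on the current state, not on the earlier history of the process.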

Hypernyms

  • Markov process
  • stochastic process

See also

  • Markov chain Monte Carlo