Markov chain definition

  • noun:
    • A discrete-time stochastic process with the Markov property.
    • A random process (Markov process) in which the probabilities of discrete states in a sequence depend only on the properties of the immediately preceding state (or of the next preceding state), and not on the path by which that state was reached. It differs from the more general Markov process in that the states of a Markov chain are discrete rather than continuous. Certain physical processes, such as the diffusion of a molecule in a fluid, are modelled as Markov chains. See also random walk.
    • a Markov process in which the time parameter takes discrete values
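The senses above share one idea: the next state is sampled from probabilities that depend only on the current state. A minimal sketch of that property in Python; the two-state "weather" model, the state names, and the transition probabilities here are illustrative assumptions, not part of the definitions:

```python
import random

# Hypothetical two-state model (illustrative only): each state maps to
# (next_state, probability) pairs. The next step depends solely on the
# current state -- this is the Markov property.
TRANSITIONS = {
    "sunny": [("sunny", 0.9), ("rainy", 0.1)],
    "rainy": [("sunny", 0.5), ("rainy", 0.5)],
}

def step(state, rng):
    """Sample the next state given only the current one."""
    states, weights = zip(*TRANSITIONS[state])
    return rng.choices(states, weights=weights)[0]

def simulate(start, n, seed=0):
    """Run the chain for n discrete time steps from a start state."""
    rng = random.Random(seed)
    chain = [start]
    for _ in range(n):
        chain.append(step(chain[-1], rng))
    return chain
```

Because time advances in discrete steps and the state set is discrete, this is a Markov chain rather than a general (continuous-state or continuous-time) Markov process.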

Related Sources

  • Definition for "Markov chain"
    • A discrete-time stochastic process with the Markov property.
  • Hypernym for "Markov chain"
  • Cross Reference for "Markov chain"
  • Variant for "Markov chain"
  • Agriculture Dictionary for "Markov chain"
  • Technology Dictionary for "Markov chain"
    • A Markov chain is a mathematical…
