markoff chain definition

  • noun:
    • a Markov process for which the parameter takes discrete time values
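To make the definition concrete, here is a minimal sketch of a discrete-time Markov chain in Python. The two states and the transition probabilities are illustrative values, not from the original entry; the point is that the next state depends only on the current state, and time advances in discrete steps.

```python
import random

# Illustrative two-state chain: each row maps the current state to
# (next_state, probability) pairs. Probabilities in a row sum to 1.
transitions = {
    "A": [("A", 0.7), ("B", 0.3)],
    "B": [("A", 0.4), ("B", 0.6)],
}

def step(state, rng):
    """Sample the next state from the current state's transition row.

    Only the current state is consulted (the Markov property)."""
    r = rng.random()
    cumulative = 0.0
    for nxt, p in transitions[state]:
        cumulative += p
        if r < cumulative:
            return nxt
    return transitions[state][-1][0]  # guard against float rounding

def simulate(start, n_steps, seed=0):
    """Run the chain for n_steps discrete time steps from `start`."""
    rng = random.Random(seed)
    path = [start]
    for _ in range(n_steps):
        path.append(step(path[-1], rng))
    return path

print(simulate("A", 5))
```

Because the parameter (time) is discrete, the whole history is just a list of states indexed 0, 1, 2, …, which is exactly what `simulate` returns.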