MARKOV CHAIN: "Between two stages, there may be a Markov chain, which consists of steps of equal probability."
In other words, it is a sequence of steps, each taken with equal probability, that governs the transition period between stages; the next step depends only on the current one.
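The idea above can be sketched in code. This is a minimal, hypothetical illustration (the stage and step names are invented for the example, not taken from the source): intermediate steps link two stages, and from each step every allowed next step is chosen with equal probability, with the choice depending only on the current step.

```python
import random

# Hypothetical transition table: from each state, every listed successor
# is equally likely (the "equal probability" property described above).
TRANSITIONS = {
    "stage A": ["step 1", "step 2"],
    "step 1": ["step 2", "stage B"],
    "step 2": ["step 1", "stage B"],
}

def run_chain(start="stage A", goal="stage B", rng=None):
    """Walk the chain from start until goal is reached, returning the path."""
    rng = rng or random.Random(0)
    path = [start]
    while path[-1] != goal:
        # Markov property: the next step depends only on the current step,
        # and rng.choice picks among the successors with equal probability.
        path.append(rng.choice(TRANSITIONS[path[-1]]))
    return path

print(run_chain())
```

Running it prints one random path from "stage A" to "stage B"; seeding the generator makes the walk reproducible.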