Markovian
/mɑrˈkoʊviən/

Definitions
1. adjective
Relating to or describing a stochastic process, named after the Russian mathematician Andrey Markov, in which the probability of the next state depends only on the current state and not on the sequence of states that preceded it.
“The random walk model exhibited Markovian properties, where the next step was determined solely by the current position.”
2. noun
A stochastic process that satisfies the Markov property.
“The researcher studied a Markovian process to analyze the behavior of a complex system.”
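The random-walk example in sense 1 can be sketched in code (a minimal illustration, not part of the entry; the function name and step choice are assumptions):

```python
import random

def random_walk(steps, start=0):
    """Simulate a 1-D random walk, a Markovian process: each new
    position depends only on the current position, not the path
    taken to reach it."""
    position = start
    path = [position]
    for _ in range(steps):
        # Transition uses only the current state (the Markov property).
        position += random.choice([-1, 1])
        path.append(position)
    return path

walk = random_walk(10)
```

Because each step reads only `position`, the full history `path` is recorded purely for inspection; the process itself is memoryless.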