MARKOV PROCESS
[mˈɑːkɒv pɹˈə͡ʊsɛs], [mˈɑːkɒv pɹˈəʊsɛs], [m_ˈɑː_k_ɒ_v p_ɹ_ˈəʊ_s_ɛ_s]
Definitions of MARKOV PROCESS
-
a simple stochastic process in which the distribution of future states depends only on the present state and not on how it arrived in the present state
By Princeton University
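The defining property above — that the distribution of future states depends only on the present state — can be sketched as a small discrete Markov chain simulation. The weather states and transition probabilities below are illustrative assumptions, not part of the definition.

```python
import random

# Hypothetical two-state chain: each row gives the next-state
# distribution conditioned ONLY on the current state.
transitions = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def next_state(current):
    """Sample the next state using only the current state (the Markov property)."""
    states, weights = zip(*transitions[current])
    return random.choices(states, weights=weights)[0]

def simulate(start, steps):
    """Generate a trajectory; no history beyond the last state is ever consulted."""
    chain = [start]
    for _ in range(steps):
        chain.append(next_state(chain[-1]))
    return chain

print(simulate("sunny", 5))
```

Note that `next_state` receives only the current state, never the path taken to reach it — that restriction is exactly the "memorylessness" the definition describes.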