Markov chain
A stochastic process with a finite number of states in which the probability of a future state depends only on the current state; past states are inconsequential. In meteorology, Markov chains have been used to describe a raindrop size distribution in which the state at time step n + 1 is determined only by collisions between pairs of drops comprising the size distribution at time step n.
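Formally, the memoryless property above can be written as P(X_{n+1} = x | X_n, X_{n-1}, ..., X_0) = P(X_{n+1} = x | X_n). The sketch below illustrates this with a hypothetical two-state weather chain; the states and transition probabilities are illustrative assumptions, not part of the definition.

```python
import random

# Hypothetical two-state chain (states and probabilities are illustrative).
# transition[i][j] = P(next state = j | current state = i); each row sums to 1.
transition = {
    "dry": {"dry": 0.8, "wet": 0.2},
    "wet": {"dry": 0.4, "wet": 0.6},
}

def step(current):
    """Draw the next state using only the current state (Markov property)."""
    r = random.random()
    cumulative = 0.0
    for nxt, p in transition[current].items():
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point rounding

# Simulate a short realization; note that past states never enter the update.
state = "dry"
path = [state]
for _ in range(10):
    state = step(state)
    path.append(state)
print(" -> ".join(path))
```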
- Part of speech: noun
- Industry/Domain: Weather
- Category: Meteorological
- Company: AMS
Creator
- Kevin Bowles