DY: Fachverband Dynamik und Statistische Physik
DY 51: Poster: Stat. Phys., Comp. Meth.
DY 51.6: Poster
Thursday, 4 April 2019, 15:00–18:00, Poster B2
Phase transition for parameter learning of Hidden Markov Models — •Nikita Rau1, Jörg Lücke2, and Alexander K. Hartmann1 — 1Institute of Physics, University of Oldenburg — 2Department of Medical Physics, University of Oldenburg
Using computer simulations [1], we study the learning process by which the parameters of Hidden Markov Models (HMMs) [2] are estimated from an artificially generated sequence of observed data. With the Baum-Welch algorithm [3], an Expectation-Maximization algorithm from the field of machine learning, we observe the learning process for HMMs of different sizes. By varying the amount of available learning data and its noise level, we observe a phase-transition-like change in the performance of the learning algorithm. For larger HMMs and more learning data, the learning behaviour improves dramatically once a certain threshold in the noise strength is crossed. Below the threshold, the parameter learning is prone to errors; beyond it, the learning process becomes nearly perfect. This behaviour depends strongly on the amount of learning data.
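The abstract does not include code; the following is a minimal sketch of the kind of setup described above, not the authors' implementation. It uses plain NumPy: a "true" discrete HMM generates an observation sequence, and the Baum-Welch EM algorithm re-estimates the transition and emission matrices from random starting values. The model size (2 states, 2 symbols), the noise parameterisation of the emission matrix, and all numerical values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

def sample_hmm(A, B, pi, T):
    """Generate a length-T observation sequence from an HMM
    (A: transition matrix, B: emission matrix, pi: initial distribution)."""
    states, obs = np.empty(T, dtype=int), np.empty(T, dtype=int)
    states[0] = rng.choice(len(pi), p=pi)
    obs[0] = rng.choice(B.shape[1], p=B[states[0]])
    for t in range(1, T):
        states[t] = rng.choice(len(pi), p=A[states[t - 1]])
        obs[t] = rng.choice(B.shape[1], p=B[states[t]])
    return obs

def baum_welch(obs, N, M, n_iter=100):
    """Estimate HMM parameters (A, B, pi) from one observation sequence
    via Baum-Welch EM with scaled forward/backward recursions."""
    T = len(obs)
    A = rng.dirichlet(np.ones(N), size=N)    # random initial transitions
    B = rng.dirichlet(np.ones(M), size=N)    # random initial emissions
    pi = rng.dirichlet(np.ones(N))
    for _ in range(n_iter):
        # E-step: scaled forward (alpha) and backward (beta) passes
        alpha = np.zeros((T, N)); beta = np.zeros((T, N)); c = np.zeros(T)
        alpha[0] = pi * B[:, obs[0]]; c[0] = alpha[0].sum(); alpha[0] /= c[0]
        for t in range(1, T):
            alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
            c[t] = alpha[t].sum(); alpha[t] /= c[t]
        beta[-1] = 1.0
        for t in range(T - 2, -1, -1):
            beta[t] = (A @ (B[:, obs[t + 1]] * beta[t + 1])) / c[t + 1]
        gamma = alpha * beta                       # state posteriors
        gamma /= gamma.sum(axis=1, keepdims=True)
        xi = (alpha[:-1, :, None] * A[None] *      # pairwise posteriors
              (B[:, obs[1:]].T * beta[1:])[:, None, :] / c[1:, None, None])
        # M-step: re-estimate parameters from expected counts
        pi = gamma[0]
        A = xi.sum(axis=0) / gamma[:-1].sum(axis=0)[:, None]
        B = np.zeros((N, M))
        for k in range(M):
            B[:, k] = gamma[obs == k].sum(axis=0)
        B /= gamma.sum(axis=0)[:, None]
    return A, B, pi

# "True" 2-state / 2-symbol HMM; the parameter `noise` controls how well the
# hidden states can be distinguished from the observations (assumed values).
A_true = np.array([[0.9, 0.1], [0.1, 0.9]])
noise = 0.2
B_true = np.array([[1 - noise, noise], [noise, 1 - noise]])
pi_true = np.array([0.5, 0.5])

obs = sample_hmm(A_true, B_true, pi_true, T=5000)
A_est, B_est, pi_est = baum_welch(obs, N=2, M=2)
print("estimated A:\n", A_est, "\nestimated B:\n", B_est)
```

Note that the estimated parameters are only defined up to a relabelling of the hidden states, so a comparison with the true matrices may require permuting the states; how closely the estimate matches the true model as the noise level and sequence length are varied is the kind of quantity the study examines.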
[1] A. K. Hartmann, Big Practical Guide to Computer Simulations (World Scientific, Singapore, 2015)
[2] R. Durbin, S. Eddy, A. Krogh, and G. Mitchison, Biological Sequence Analysis (Cambridge University Press, 2001)
[3] L. E. Baum, Ann. Math. Statist. 41, 164–171 (1970)