Section: Scientific Foundations
Mixture models
Participants: Lamiae Azizi, Senan James Doyle, Jean-Baptiste Durand, Florence Forbes, Gersende Fort, Stéphane Girard, Vasil Khalidov, Darren Wraith, Marie-José Martinez.
In a first approach, we consider statistical parametric models, $\theta$ being the parameter, possibly multi-dimensional, usually unknown and to be estimated. We consider cases where the data naturally divide into observed data $y = \{y_1, \ldots, y_n\}$ and unobserved or missing data $z = \{z_1, \ldots, z_n\}$. The missing data $z_i$ represents, for instance, the membership of $y_i$ in one of a set of $K$ alternative categories. The distribution of an observed $y_i$ can be written as a finite mixture of distributions,
$$
f(y_i \mid \theta) \;=\; \sum_{k=1}^{K} P(z_i = k \mid \theta)\, f(y_i \mid z_i = k, \theta).
$$
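As a concrete illustration, the sketch below (a minimal example, assuming Gaussian component densities and fixed, known parameters that are purely hypothetical) evaluates such a finite mixture density: the weights play the role of $P(z_i = k \mid \theta)$ and the Gaussian densities that of $f(y_i \mid z_i = k, \theta)$.

```python
import numpy as np
from scipy.stats import norm

# Hypothetical parameters of a K = 3 component Gaussian mixture:
# weights P(z_i = k), component means and standard deviations.
weights = np.array([0.5, 0.3, 0.2])
means = np.array([-2.0, 0.0, 3.0])
stds = np.array([0.8, 1.0, 1.5])

def mixture_density(y):
    """Finite mixture density: sum_k P(z = k) * f(y | z = k)."""
    y = np.atleast_1d(y)[:, None]                 # shape (n, 1)
    comp = norm.pdf(y, loc=means, scale=stds)     # shape (n, K)
    return comp @ weights                         # shape (n,)

print(mixture_density([-2.0, 0.5, 3.0]))
```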
These models are interesting in that they may point out a hidden variable responsible for most of the observed variability, such that, conditionally on this variable, the observed variables are independent. Their estimation is often difficult due to the missing data. The Expectation-Maximization (EM) algorithm is a general and now standard approach to maximization of the likelihood in missing data problems. It provides parameter estimates as well as values for the missing data.
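A minimal sketch of the EM iteration for a one-dimensional Gaussian mixture is given below (an illustrative implementation, not the team's code; the Gaussian components, the initialisation and the fixed number of iterations are assumptions). The E-step computes the posterior membership probabilities of the missing $z_i$, and the M-step re-estimates the parameters from these posteriors.

```python
import numpy as np
from scipy.stats import norm

def em_gaussian_mixture(y, K, n_iter=100, seed=0):
    """EM for a 1D Gaussian mixture with K components (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    n = len(y)
    # Crude initialisation (an assumption; k-means or random restarts are common).
    weights = np.full(K, 1.0 / K)
    means = rng.choice(y, size=K, replace=False)
    stds = np.full(K, np.std(y))
    for _ in range(n_iter):
        # E-step: posterior probability that y_i belongs to component k.
        comp = weights * norm.pdf(y[:, None], loc=means, scale=stds)  # (n, K)
        resp = comp / comp.sum(axis=1, keepdims=True)
        # M-step: maximise the expected complete-data log-likelihood.
        nk = resp.sum(axis=0)
        weights = nk / n
        means = (resp * y[:, None]).sum(axis=0) / nk
        stds = np.sqrt((resp * (y[:, None] - means) ** 2).sum(axis=0) / nk)
    return weights, means, stds, resp

# Example: recover two components from simulated data.
rng = np.random.default_rng(1)
y = np.concatenate([rng.normal(-2, 1, 300), rng.normal(3, 0.5, 200)])
w, m, s, resp = em_gaussian_mixture(y, K=2)
print(w, m, s)
```

The returned responsibilities `resp` are the posterior probabilities $P(z_i = k \mid y_i, \theta)$, i.e. the values imputed for the missing data at convergence.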
Mixture models correspond to independent $z_i$'s. They are increasingly used in statistical pattern recognition and allow a formal (model-based) approach to (unsupervised) clustering, as illustrated by the sketch below.
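For model-based clustering in practice, off-the-shelf implementations can be used; the snippet below is a usage sketch assuming scikit-learn is available (it is not a reference to the team's own software). A Gaussian mixture is fitted by EM and the cluster labels are read off the estimated posterior memberships.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Simulated 2D data with two groups (illustrative only).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (200, 2)), rng.normal(4, 1, (200, 2))])

gmm = GaussianMixture(n_components=2, random_state=0).fit(X)
labels = gmm.predict(X)            # hard clustering: most probable z_i
posteriors = gmm.predict_proba(X)  # soft memberships P(z_i = k | y_i)
print(gmm.weights_, gmm.means_)
```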