Author:
Jia Kexin, Xin Yuxia, Cheng Ting
Abstract
To improve the robustness of order selection and parameter learning for the Gaussian mixture model (GMM), this paper proposes a competitive-stop expectation-maximization (EM) algorithm based on two stopping conditions. The first is a Lilliefors-test-based multivariate (MV) normality criterion, used to decide whether a component should be split into two; the algorithm stops splitting once every component passes the MV normality test. The second uses the minimum description length (MDL) criterion, which competes with the first condition to keep the algorithm from over-splitting. Simulation experiments verify the effectiveness of the proposed algorithm.
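The two stop conditions can be illustrated with a minimal sketch: a one-dimensional Lilliefors statistic (the KS distance to a normal CDF whose mean and standard deviation are estimated from the same sample) and the MDL score for a GMM. This is not the paper's implementation; the function names, the full-covariance parameter count, and the MDL form -log L + (P/2) log N are assumptions for illustration only.

```python
import math

def normal_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def lilliefors_stat(sample):
    """KS distance between the empirical CDF and a normal CDF whose
    mean/std are estimated from the sample itself (Lilliefors setting).
    A large value suggests the component is not normal and may be split."""
    n = len(sample)
    mu = sum(sample) / n
    sd = math.sqrt(sum((x - mu) ** 2 for x in sample) / (n - 1))
    xs = sorted(sample)
    d = 0.0
    for i, x in enumerate(xs):
        z = normal_cdf((x - mu) / sd)
        # Compare against the empirical CDF just before and after the jump.
        d = max(d, abs((i + 1) / n - z), abs(i / n - z))
    return d

def gmm_free_params(k, d):
    """Free parameters of a k-component, d-dimensional GMM with full
    covariances: (k-1) weights + k*d means + k*d*(d+1)/2 covariance terms."""
    return (k - 1) + k * d + k * d * (d + 1) // 2

def mdl(log_likelihood, k, d, n):
    """MDL score: -log L + (P/2) * log N. Smaller is better, so growth of
    this score under further splitting would trigger the competing stop."""
    return -log_likelihood + 0.5 * gmm_free_params(k, d) * math.log(n)
```

In a competitive-stop loop, one would keep splitting the component with the largest Lilliefors statistic until all components pass the normality test, while rejecting any split that increases the MDL score.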
Subject
General Physics and Astronomy