The Gaussian mixture is a typical and widely used non-Gaussian probability density model. Efficient estimates of its parameters can in principle be obtained by maximum likelihood, but the likelihood equations are nonlinear and hard to solve directly, so the iterative expectation-maximization (EM) algorithm is the usual practical realization. The performance of the conventional EM algorithm, however, depends strongly on the initial values, and it cannot estimate the order of the mixture. An improved algorithm known as greedy EM overcomes both shortcomings by incrementally adding Gaussian components to the mixture and yields more accurate parameter estimates, but its computational cost is generally much higher than that of conventional EM. This paper studies the two EM algorithms together, examines the relationship between them, presents their concrete realizations side by side, and uses the same numerical simulation example to illustrate and compare their performance differences.
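To make the contrast between the two algorithms concrete, the following is a minimal sketch for a one-dimensional mixture, written under simplifying assumptions: the function names em_steps and greedy_em, the strategy of seeding each new component at the worst-explained sample, and the BIC-based order selection are illustrative choices, not the exact procedures studied in the paper.

```python
import numpy as np

def em_steps(x, w, mu, var, n_iter=200):
    """Plain EM iterations for a 1-D Gaussian mixture, starting from the
    given weights w, means mu and variances var (all length-k arrays)."""
    n = x.size
    for _ in range(n_iter):
        # E-step: responsibilities r[i, j] = P(component j | sample x_i)
        d = x[:, None] - mu[None, :]
        logp = -0.5 * d**2 / var - 0.5 * np.log(2 * np.pi * var) + np.log(w)
        logp -= logp.max(axis=1, keepdims=True)
        r = np.exp(logp)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: closed-form re-estimation from the responsibilities
        nk = r.sum(axis=0) + 1e-12
        w = nk / n
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu)**2).sum(axis=0) / nk + 1e-9
    return w, mu, var

def mixture_density(x, w, mu, var):
    d = x[:, None] - mu[None, :]
    return (w * np.exp(-0.5 * d**2 / var) / np.sqrt(2 * np.pi * var)).sum(axis=1)

def greedy_em(x, k_max=8):
    """Greedy EM sketch: grow the mixture one Gaussian at a time and keep the
    order with the best BIC, instead of fixing k and the initial values."""
    n = x.size
    w, mu, var = np.array([1.0]), np.array([x.mean()]), np.array([x.var()])
    w, mu, var = em_steps(x, w, mu, var)
    best = (None, -np.inf)
    for k in range(1, k_max + 1):
        if k > 1:
            # Seed the new component at the sample the current mixture explains
            # worst, then refine all parameters with full EM (a simplification
            # of the partial-search step used in the greedy EM literature).
            worst = x[np.argmin(mixture_density(x, w, mu, var))]
            mu = np.append(mu, worst)
            var = np.append(var, x.var() / k)
            w = np.append(w * (1 - 1 / k), 1 / k)
            w, mu, var = em_steps(x, w, mu, var)
        ll = np.log(mixture_density(x, w, mu, var) + 1e-300).sum()
        bic = ll - 0.5 * (3 * k - 1) * np.log(n)  # penalise the 3k-1 free parameters
        if bic > best[1]:
            best = ((w.copy(), mu.copy(), var.copy()), bic)
    return best[0]

# Toy data from a two-component mixture; greedy_em should settle near order 2.
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-2, 1, 600), rng.normal(3, 0.5, 400)])
w, mu, var = greedy_em(x)
print("weights:", w.round(3), "means:", mu.round(2), "variances:", var.round(2))
```

The sketch reflects the trade-off stated above: conventional EM fixes the order k and depends on the random starting point, whereas the greedy variant removes both choices at the price of running EM repeatedly as components are added.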