Negentropy is an important measure of non-Gaussianity; maximizing negentropy drives the non-Gaussianity of the random variables to its maximum, so that the output components become mutually independent. The negentropy-maximization algorithm takes negentropy as the objective function and the Newton iteration method as the optimizer. To address the Newton iteration's sensitivity to the choice of initial value, the Newton downhill method is used in its place: by adjusting the downhill factor, the objective function is kept on a decreasing trend, which reduces the algorithm's dependence on the initial value. Experimental results show that the improved algorithm separates mixed speech and music signals well under different initial values, alleviating the initial-value sensitivity problem.
Negentropy is an important measure of non-Gaussianity. Maximizing negentropy makes the non-Gaussianity of each output component as large as possible, so that the components become mutually independent. The negentropy-maximization algorithm takes negentropy as the objective function and the Newton iteration method as the optimization algorithm. To address the Newton iteration's sensitivity to the initial value, the Newton downhill method is used in place of the original method: by adjusting the downhill factor, it keeps the objective function on a decreasing trend and thereby reduces the dependence on the initial value. Simulation results show that the proposed method separates mixed speech and music signals well under different initial values, and thus effectively alleviates the initial-value sensitivity problem.
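To make the idea concrete, the following is a minimal sketch (not the paper's code) of a one-unit negentropy-based separation step with a damped, Newton-downhill-style update. It assumes the observations have already been whitened, uses the common log-cosh negentropy approximation (with the Gaussian reference constant ≈ 0.3746), and adopts the acceptance rule that a damped step is taken only when the negentropy proxy does not get worse; the function names, the halving schedule for the downhill factor, and the fallback to the undamped step are illustrative assumptions rather than the authors' exact algorithm.

```python
import numpy as np

def negentropy_proxy(w, X):
    """Approximate negentropy J(w) ~ (E[G(w^T x)] - E[G(v)])^2 with G(u) = log cosh(u)."""
    u = w @ X
    return (np.mean(np.log(np.cosh(u))) - 0.3746) ** 2  # 0.3746 ~ E[G(v)], v ~ N(0,1)

def newton_downhill_ica(X, w0, max_iter=200, tol=1e-6):
    """One-unit separation on whitened data X (features x samples), starting from w0."""
    w = w0 / np.linalg.norm(w0)
    for _ in range(max_iter):
        u = w @ X
        g, dg = np.tanh(u), 1.0 - np.tanh(u) ** 2
        # Standard fixed-point (approximate Newton) direction used in FastICA
        w_newton = (X * g).mean(axis=1) - dg.mean() * w
        w_newton /= np.linalg.norm(w_newton)

        # Newton downhill step: damp the update with factor lam, halving it until
        # the negentropy proxy does not decrease, so the iteration keeps improving
        # even from a poor initial vector.
        lam, w_new = 1.0, w_newton   # fall back to the undamped step if no damping helps
        while lam > 1e-4:
            cand = (1.0 - lam) * w + lam * w_newton
            cand /= np.linalg.norm(cand)
            if negentropy_proxy(cand, X) >= negentropy_proxy(w, X):
                w_new = cand
                break
            lam *= 0.5

        # Convergence test is sign-invariant, since w and -w give the same component.
        if abs(abs(w_new @ w) - 1.0) < tol:
            return w_new
        w = w_new
    return w
```

In a speech-and-music setting, each recovered component would be obtained as `w @ X` after centering and whitening the mixed recordings; repeating the unit with decorrelation between the extracted vectors yields the full separation, as in standard multi-unit FastICA.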