The stochastic dual-gradient algorithm is an important learning algorithm in independent component analysis (ICA), but it converges slowly and suffers from a large steady-state error, which hinders accurate and timely signal processing. This paper focuses on improving the stochastic dual-gradient algorithm and proposes a negentropy-based variant, in which negentropy is used to measure the non-Gaussianity of the random variables, thereby overcoming the non-robustness of kurtosis. Theoretical analysis and simulation experiments show that the improved stochastic dual-gradient algorithm achieves better separation performance and higher stability.
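To make the negentropy-versus-kurtosis contrast concrete, the following is a minimal Python sketch of the standard approximation J(y) ≈ (E[G(y)] − E[G(ν)])², with G(u) = (1/a)·log cosh(a·u) and ν a standard Gaussian reference variable (Hyvärinen's approximation). The function names, the choice of G, and the constant a are illustrative assumptions for exposition, not the authors' implementation; the sketch only shows why the negentropy contrast is less sensitive to outliers than the fourth-moment kurtosis estimate.

```python
import numpy as np

def excess_kurtosis(y):
    """Sample excess kurtosis of a standardized signal.
    Relies on a fourth moment, so a few outliers can dominate the estimate."""
    y = (y - y.mean()) / y.std()
    return np.mean(y**4) - 3.0

def negentropy_approx(y, a=1.0, n_ref=100_000, rng=None):
    """Approximate negentropy J(y) ~ (E[G(y)] - E[G(v)])^2,
    with G(u) = (1/a) * log cosh(a*u) and v ~ N(0, 1).
    Used here as a more robust non-Gaussianity measure than kurtosis."""
    rng = np.random.default_rng(0) if rng is None else rng
    y = (y - y.mean()) / y.std()
    G = lambda u: np.log(np.cosh(a * u)) / a
    v = rng.standard_normal(n_ref)      # Gaussian reference variable
    return (G(y).mean() - G(v).mean()) ** 2

# Toy check: a super-Gaussian (Laplacian) source scores clearly above
# Gaussian noise under both measures, but negentropy grows more slowly
# under heavy tails because log cosh saturates for large |u|.
rng = np.random.default_rng(1)
src = rng.laplace(size=10_000)
noise = rng.standard_normal(10_000)
print(negentropy_approx(src), negentropy_approx(noise))    # clearly positive vs. near zero
print(excess_kurtosis(src), excess_kurtosis(noise))        # ~3 vs. near zero
```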