GLA exhibits strong tolerance to noise, but the stability of its convergence and its learning speed are in conflict: guaranteeing stable convergence generally requires a sufficiently small step size, yet an overly small step size leads to excessively long training times. By incorporating the principle of adaptive step sizes, an improved algorithm, TDBDGLA, is proposed. Experimental results show that, compared with a GLA using the same reinforcement scheme, TDBDGLA achieves a lower misclassification rate, and on the given metric that jointly measures stability and learning speed, TDBDGLA improves on GLA by more than 11%.
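To make the adaptive step-size principle concrete, the sketch below shows a delta-bar-delta style per-parameter step-size update, which resolves the stability/speed trade-off by growing a step size while successive gradients agree in sign and shrinking it when they oscillate. This is only an assumed illustration of the general principle; the actual TDBDGLA update rule is not specified in this abstract, and the names `delta_bar_delta_step`, `kappa`, `phi`, and `theta` are hypothetical.

```python
import numpy as np

def delta_bar_delta_step(weights, grads, lr, grad_bar,
                         kappa=0.01, phi=0.1, theta=0.7):
    """One delta-bar-delta style update with per-weight adaptive step sizes.

    Illustrative only: the concrete TDBDGLA rule is not given in the abstract;
    kappa, phi and theta are assumed hyperparameters for this sketch.
    """
    # Increase a step size additively when the current gradient agrees in sign
    # with a decayed average of past gradients (steady descent direction);
    # decrease it multiplicatively when they disagree (oscillation), which is
    # what lets large steps coexist with stable convergence.
    agree = grads * grad_bar > 0
    disagree = grads * grad_bar < 0
    lr = lr + kappa * agree                        # additive increase
    lr = lr * np.where(disagree, 1.0 - phi, 1.0)   # multiplicative decrease

    weights = weights - lr * grads                 # per-weight gradient step
    grad_bar = (1.0 - theta) * grads + theta * grad_bar  # update gradient trace
    return weights, lr, grad_bar
```

With a fixed global step size, the same loop would need a value small enough for the worst-behaved weight, slowing all the others; the per-weight adaptation above is one common way to avoid that compromise.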