The gradient method for training Elman networks on a finite training sample set is considered. The monotone decrease of the error function during the iteration is shown. A weak and a strong convergence result are proved, indicating that the gradient of the error function tends to zero and that the weight sequence converges to a fixed point, respectively. A numerical example is given to support the theoretical findings.
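To illustrate the setting, the following is a minimal sketch of gradient training for a small Elman network on a finite sample set, monitoring the error function and the gradient norm. The data, network sizes, learning rate, and the treatment of the context units as fixed inputs when forming the gradient (no backpropagation through time) are assumptions of the sketch, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative finite sample set: scalar inputs and targets (assumed, for demonstration only).
X = rng.uniform(-1.0, 1.0, size=(20, 1))      # 20 samples, 1 input each
D = np.sin(np.sum(X, axis=1, keepdims=True))  # 20 scalar targets

n_in, n_hid, n_out = 1, 5, 1
W_in  = 0.5 * rng.standard_normal((n_hid, n_in))   # input  -> hidden
W_rec = 0.5 * rng.standard_normal((n_hid, n_hid))  # context -> hidden (recurrent)
W_out = 0.5 * rng.standard_normal((n_out, n_hid))  # hidden -> output
eta = 0.05                                         # learning rate (assumed)

def forward(X):
    """Run the Elman network over the sample sequence.
    The context units hold the previous hidden state."""
    h_prev = np.zeros(n_hid)
    Hs, Ys = [], []
    for x in X:
        h = np.tanh(W_in @ x + W_rec @ h_prev)
        y = W_out @ h
        Hs.append(h)
        Ys.append(y)
        h_prev = h
    return np.array(Hs), np.array(Ys)

def error(Ys, D):
    """Quadratic error function over the finite sample set."""
    return 0.5 * np.sum((Ys - D) ** 2)

# Gradient training loop.  The context units are treated as constant
# inputs when the gradient is formed (an assumption of this sketch).
for epoch in range(2001):
    Hs, Ys = forward(X)
    E = error(Ys, D)
    err = Ys - D                               # (N, n_out)
    g_out = err.T @ Hs                         # dE/dW_out
    delta = (err @ W_out) * (1.0 - Hs ** 2)    # back through tanh
    g_in  = delta.T @ X                        # dE/dW_in
    H_prev = np.vstack([np.zeros(n_hid), Hs[:-1]])
    g_rec = delta.T @ H_prev                   # dE/dW_rec (context held fixed)
    W_out -= eta * g_out
    W_in  -= eta * g_in
    W_rec -= eta * g_rec
    if epoch % 500 == 0:
        gnorm = np.sqrt((g_out**2).sum() + (g_in**2).sum() + (g_rec**2).sum())
        print(f"epoch {epoch:4d}  E = {E:.6f}  ||grad|| = {gnorm:.6f}")
```

With a sufficiently small learning rate, the printed error values decrease monotonically and the gradient norm shrinks toward zero, mirroring the qualitative behavior stated in the abstract.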