We consider the convergence properties of the descent methods with errors proposed by Solodov and Svaiter. An important feature of this paper is that the Hölder continuity of the gradient function is not used in the proof of the convergence theorem. Consequently, we obtain convergence results for these methods under weaker conditions.
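To illustrate the setting, the following is a minimal sketch of a descent method whose gradient evaluations carry a bounded error term. This is an assumption-laden toy example, not the Solodov–Svaiter scheme itself: the function names, the fixed step size, and the uniform error model are all choices made for demonstration only.

```python
import numpy as np

def descent_with_errors(grad, x0, step=0.1, err_bound=1e-3, iters=200, rng=None):
    """Gradient descent where each gradient evaluation is perturbed.

    Illustrative sketch only (not the Solodov-Svaiter method): `step`,
    `err_bound`, and the uniform noise model are assumptions for this demo.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        e = rng.uniform(-err_bound, err_bound, size=x.shape)  # error term e_k
        g = grad(x) + e                                       # inexact gradient
        x = x - step * g                                      # descent step
    return x

# Minimize f(x) = ||x||^2 / 2, whose exact gradient is x; the iterates
# approach the minimizer up to a neighborhood determined by the error bound.
x_approx = descent_with_errors(lambda x: x, x0=[1.0, -2.0])
```

Because the gradient errors are only bounded, not vanishing, such an iteration in general converges to a neighborhood of a stationary point whose size depends on the error bound, which is why the convergence analysis of methods with errors requires care.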