Conjugate gradient methods are among the most widely used methods in optimization and are commonly applied to large-scale optimization problems; different choices of the parameter β_k give rise to different conjugate gradient methods. This paper presents a new three-parameter family of conjugate gradient methods. It is proved that, with a proper choice of the parameters in β_k, the methods produce a descent search direction at every iteration, and that the algorithm is globally convergent under the strong Wolfe line search conditions.
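As a point of reference for the framework described above, the sketch below runs a generic nonlinear conjugate gradient iteration x_{k+1} = x_k + α_k d_k with d_{k+1} = -g_{k+1} + β_k d_k under a strong Wolfe line search. Since the abstract does not state the paper's three-parameter formula for β_k, the sketch substitutes the classical Fletcher-Reeves choice β_k = ‖g_{k+1}‖²/‖g_k‖² (which also yields descent directions when the Wolfe parameter c₂ < 1/2); the function cg_sketch and its arguments are illustrative names, not the paper's algorithm.

import numpy as np
from scipy.optimize import line_search, rosen, rosen_der

def cg_sketch(f, grad, x0, tol=1e-6, max_iter=500):
    # Nonlinear conjugate gradient with the Fletcher-Reeves beta_k
    # (a stand-in for the paper's three-parameter beta_k) and a
    # strong Wolfe line search with c2 < 1/2.
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # first step: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # scipy's line_search enforces the strong Wolfe conditions
        alpha = line_search(f, grad, x, d, gfk=g, c1=1e-4, c2=0.1)[0]
        if alpha is None:                    # line search failed: restart
            d = -g
            continue
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = (g_new @ g_new) / (g @ g)     # Fletcher-Reeves choice of beta_k
        d = -g_new + beta * d                # new search direction
        x, g = x_new, g_new
    return x

print(cg_sketch(rosen, rosen_der, [-1.2, 1.0]))   # converges near [1., 1.]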