Different choices of the parameter βk yield different conjugate gradient methods. This paper presents a class of modified conjugate gradient algorithms for solving unconstrained optimization problems. Under fairly weak conditions, the selected βk can be shown to produce a descent direction at every iteration, and the algorithm is globally convergent under the Wolfe line search. Moreover, under another variant of the Wolfe search conditions, the algorithm is also globally convergent whenever the search direction is a descent direction.
The conjugate gradient method is an effective method for solving large-scale optimization problems and is widely used in practice. Different choices of the parameter βk give rise to different conjugate gradient methods. This paper proposes a class of new conjugate gradient methods for solving unconstrained optimization problems. Under weak conditions, the selected βk is shown to produce a descent search direction at every iteration, and the methods are globally convergent under the Wolfe line search conditions.
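To illustrate the general scheme the abstract describes, the following is a minimal sketch of a nonlinear conjugate gradient iteration with a Wolfe-type line search. It is not the paper's modified method: the specific βk proposed in the paper is not given here, so the sketch substitutes the classical Fletcher-Reeves choice βk = ‖g_{k+1}‖²/‖g_k‖² as a stand-in, and the bisection line search for the weak Wolfe conditions is likewise an assumed, generic implementation.

```python
import numpy as np

def wolfe_line_search(f, grad, x, d, c1=1e-4, c2=0.4, max_iter=50):
    """Bisection search for a step length satisfying the weak Wolfe
    conditions (sufficient decrease + curvature). Generic sketch, not
    the line search from the paper."""
    alpha, lo, hi = 1.0, 0.0, np.inf
    f0, slope0 = f(x), grad(x).dot(d)
    for _ in range(max_iter):
        if f(x + alpha * d) > f0 + c1 * alpha * slope0:
            hi = alpha                      # Armijo fails: shrink the step
            alpha = 0.5 * (lo + hi)
        elif grad(x + alpha * d).dot(d) < c2 * slope0:
            lo = alpha                      # curvature fails: grow the step
            alpha = 2.0 * lo if hi == np.inf else 0.5 * (lo + hi)
        else:
            return alpha                    # both Wolfe conditions hold
    return alpha

def cg_descent(f, grad, x0, tol=1e-8, max_iter=200):
    """Generic nonlinear CG loop: d_{k+1} = -g_{k+1} + beta_k * d_k.
    Uses the Fletcher-Reeves beta_k as a placeholder choice."""
    x = x0.astype(float)
    g = grad(x)
    d = -g                                  # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = wolfe_line_search(f, grad, x, d)
        x = x + alpha * d
        g_new = grad(x)
        beta = g_new.dot(g_new) / g.dot(g)  # Fletcher-Reeves beta_k (stand-in)
        d = -g_new + beta * d
        g = g_new
    return x

# Usage: minimize the convex quadratic f(x) = 0.5 x'Ax - b'x,
# whose minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = cg_descent(f, grad, np.zeros(2))
```

For a convex quadratic, the computed `x_star` should agree with the linear-system solution `np.linalg.solve(A, b)`; the paper's contribution concerns choices of βk for which the direction `d` stays a descent direction and the iteration converges globally under Wolfe-type searches.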