The conjugate gradient method uses only first-order derivative information, overcoming the slow convergence of the steepest descent method while avoiding the Newton method's need to store, compute, and invert the Hessian matrix. It combines conjugacy with steepest descent, using the gradients at known points to construct a set of conjugate directions and searching along these directions to find the minimizer of the objective function. It is one of the most useful methods for solving large-scale unconstrained optimization problems, but its main limitation is dependence on the initial point: convergence to the global minimum cannot be guaranteed. This article therefore proposes a new hybrid conjugate gradient method that satisfies the descent condition independently of the line search; the new method is a hybrid of the Fletcher-Reeves (β_k^FR) and Polak-Ribière-Polyak (β_k^PRP) formulas. A convergence analysis of the new method is presented. Numerical results show that the proposed hybrid algorithm is efficient and competitive with other hybrid conjugate gradient methods, requires little storage, exhibits step convergence and high stability, and does not require any external parameters.
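To make the hybridization concrete, the following is a minimal sketch of a nonlinear conjugate gradient iteration that mixes the FR and PRP coefficients. The abstract does not give the paper's exact hybrid rule, so the rule shown here, β = max(0, min(β_PRP, β_FR)), is one classical FR/PRP combination used for illustration only; the backtracking (Armijo) line search, the steepest-descent restart safeguard, and the quadratic test problem are likewise assumptions, not the paper's method.

```python
import numpy as np

def hybrid_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Nonlinear CG with an illustrative hybrid FR/PRP beta.

    Hybrid rule (assumption, not necessarily the paper's formula):
        beta = max(0, min(beta_PRP, beta_FR))
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                              # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g.dot(d) >= 0:               # safeguard: restart if not a descent direction
            d = -g
        # Backtracking (Armijo) line search along d
        alpha, c, rho = 1.0, 1e-4, 0.5
        fx = f(x)
        while f(x + alpha * d) > fx + c * alpha * g.dot(d):
            alpha *= rho
            if alpha < 1e-16:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta_fr = g_new.dot(g_new) / g.dot(g)        # Fletcher-Reeves
        beta_prp = g_new.dot(g_new - g) / g.dot(g)   # Polak-Ribiere-Polyak
        beta = max(0.0, min(beta_prp, beta_fr))      # hybrid coefficient
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Usage on a simple convex quadratic with minimizer (1, 1)
f = lambda x: x[0]**2 + 4 * x[1]**2 - 2 * x[0] - 8 * x[1]
grad = lambda x: np.array([2 * x[0] - 2, 8 * x[1] - 8])
x_star = hybrid_cg(f, grad, np.array([0.0, 0.0]))
```

Because β_FR caps the coefficient from above and the max with zero effectively restarts the iteration when β_PRP turns negative, this style of hybrid inherits the strong convergence behavior of FR while keeping the practical efficiency of PRP.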