In this paper, we study the memory gradient method for unconstrained optimization problems. A new class of unconstrained optimization algorithms is obtained by using information from the current and previous iterates to generate a descent direction, and its global convergence is proved under the Wolfe line search. The new algorithm has a simple structure and requires no matrix computation or storage, so it is suitable for solving large-scale optimization problems. Numerical experiments show that the new algorithm is both feasible and effective.
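To illustrate the general idea, the sketch below is a hypothetical memory gradient iteration, not the paper's exact method: the direction combines the current negative gradient with the previous direction, the memory weight `beta` (scaled by an illustrative parameter `rho`) is chosen so the direction is provably a descent direction, and the step size is found by a simple bisection search for the Wolfe conditions. The quadratic test function is an assumed example.

```python
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def axpy(a, x, y):
    # Returns a*x + y componentwise.
    return [a * xi + yi for xi, yi in zip(x, y)]

def f(x):
    # Illustrative test problem: a simple convex quadratic.
    return x[0] ** 2 + 10.0 * x[1] ** 2

def grad(x):
    return [2.0 * x[0], 20.0 * x[1]]

def wolfe_line_search(x, d, c1=1e-4, c2=0.9, max_iter=50):
    # Bisection search for a step length satisfying the Wolfe conditions.
    lo, hi = 0.0, math.inf
    alpha = 1.0
    fx, gd = f(x), dot(grad(x), d)  # gd < 0 for a descent direction
    for _ in range(max_iter):
        x_new = axpy(alpha, d, x)
        if f(x_new) > fx + c1 * alpha * gd:
            hi = alpha                      # Armijo fails: shrink the step
            alpha = 0.5 * (lo + hi)
        elif dot(grad(x_new), d) < c2 * gd:
            lo = alpha                      # curvature fails: enlarge the step
            alpha = 2.0 * lo if math.isinf(hi) else 0.5 * (lo + hi)
        else:
            break
    return alpha

def memory_gradient(x0, rho=0.5, tol=1e-8, max_iter=500):
    x, d_prev = list(x0), None
    for _ in range(max_iter):
        g = grad(x)
        gnorm = math.sqrt(dot(g, g))
        if gnorm < tol:
            break
        if d_prev is None:
            d = [-gi for gi in g]           # first step: steepest descent
        else:
            # With beta = rho*||g||/||d_prev||, one checks
            # g^T d <= -(1 - rho)*||g||^2 < 0, so d is a descent direction.
            beta = rho * gnorm / math.sqrt(dot(d_prev, d_prev))
            d = axpy(beta, d_prev, [-gi for gi in g])
        alpha = wolfe_line_search(x, d)
        x = axpy(alpha, d, x)
        d_prev = d
    return x

x_star = memory_gradient([10.0, 1.0])
```

Note that only the previous direction vector is stored between iterations, so no matrices are computed or kept, which is the property that makes methods of this type attractive for large-scale problems.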