1 Introduction

Consider the unconstrained optimization problem (UP): min f(x), x ∈ R^n,
In this paper, we present a new memory gradient method for unconstrained optimization problems. Global convergence and a linear convergence rate are proved under mild conditions. At each iteration, the new iterate is determined by a curve search rule that resembles Armijo's line search rule. A distinctive feature is that the search direction and the step-size are determined simultaneously at each iteration. Like conjugate gradient methods, the method avoids computing and storing matrices associated with the Hessian of the objective function, which makes it suitable for large-scale minimization problems. Numerical results show that the new method is efficient in practical computation.
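To illustrate the general flavor of a memory gradient method, the following is a minimal sketch, not the paper's exact rule: the direction reuses the previous direction (the "memory" term), d_k = -g_k + β d_{k-1}, and the step-size is chosen by an Armijo-type backtracking search. All parameter names and values (β, σ, ρ) here are illustrative assumptions, and the descent safeguard is a standard device, not taken from the paper.

```python
import numpy as np

def memory_gradient(f, grad, x0, beta=0.4, sigma=1e-4, rho=0.5,
                    tol=1e-6, max_iter=1000):
    """Illustrative memory gradient sketch (assumed parameters, not the
    paper's curve search rule)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                  # first step: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo-type backtracking: shrink alpha until sufficient decrease
        alpha, fx = 1.0, f(x)
        while f(x + alpha * d) > fx + sigma * alpha * (g @ d):
            alpha *= rho
        x = x + alpha * d
        g_new = grad(x)
        # memory term: blend the new negative gradient with the old direction
        d_new = -g_new + beta * d
        if g_new @ d_new >= 0:              # safeguard: restore descent
            d_new = -g_new
        d, g = d_new, g_new
    return x

# usage on a simple quadratic with minimizer (1, 2)
f = lambda x: (x[0] - 1) ** 2 + 2 * (x[1] - 2) ** 2
grad = lambda x: np.array([2 * (x[0] - 1), 4 * (x[1] - 2)])
x_star = memory_gradient(f, grad, np.zeros(2))
```

Note that, unlike the paper's method, this sketch determines the direction first and the step-size afterwards; it only conveys why no Hessian-related matrices need to be stored.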