In this article, a new descent memory gradient method without restarts is proposed for solving large-scale unconstrained optimization problems. The method has the following attractive properties: 1) the search direction is always a sufficient descent direction at every iteration, without any line search; 2) the search direction always satisfies the angle property, independently of the convexity of the objective function. Under mild conditions, the authors prove that the proposed method is globally convergent, and its convergence rate is also investigated. Numerical results show that the new descent memory gradient method is efficient on the given test problems.
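To make the general idea concrete, the following is a minimal sketch of a *generic* memory gradient iteration, not the authors' exact update rule. It combines the current negative gradient with the previous search direction, d_k = -g_k + β_k d_{k-1}, where the memory coefficient β_k (the specific formula below is an illustrative assumption) is bounded so that g_kᵀd_k ≤ -(1-η)‖g_k‖², i.e. a sufficient-descent condition holds without any line search:

```python
import numpy as np

def memory_gradient(f_grad, x0, alpha=0.05, eta=0.3, tol=1e-8, max_iter=10000):
    """Generic memory gradient iteration (illustrative sketch only).

    Direction update: d_k = -g_k + beta_k * d_{k-1}, with beta_k scaled so
    that g_k^T d_k <= -(1 - eta) * ||g_k||^2  (sufficient descent).
    A fixed step size alpha is used here for simplicity.
    """
    x = np.asarray(x0, dtype=float)
    g = f_grad(x)
    d = -g                      # first step: steepest descent
    for _ in range(max_iter):
        gnorm2 = g @ g
        if np.sqrt(gnorm2) < tol:
            break
        x = x + alpha * d
        g_new = f_grad(x)
        # Hypothetical memory coefficient: the denominator dominates
        # |g_new^T d|, so |beta * g_new^T d| <= eta * ||g_new||^2 and the
        # new direction is guaranteed to be a sufficient descent direction.
        beta = eta * (g_new @ g_new) / (gnorm2 + abs(g_new @ d) + 1e-16)
        d = -g_new + beta * d
        g = g_new
    return x
```

On a simple strictly convex quadratic, e.g. f(x) = x₁² + 2x₂² with gradient (2x₁, 4x₂), the iteration drives the gradient norm below the tolerance; convergence rates for such schemes are what analyses like the one summarized above establish under mild assumptions.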