The conjugate gradient method is a classical approach to large-scale optimization problems. Based on the relations between the complex gradient and the real gradient, and between the complex Hessian and the real Hessian, the conjugate gradient method is extended to the complex domain to solve optimization problems over complex variables. The standard method has two drawbacks: each iteration relies on a line search to determine the step size, and the stationary point it finds is not necessarily a minimizer. To address these issues, the proposed algorithm uses a negative curvature direction as the search direction when the Hessian matrix is indefinite; by exploiting a simplification of the second-order derivative in the real domain, a descent direction of negative curvature can be found easily. The step size is also adjusted according to the information of the objective function so that the function value decreases monotonically. Numerical simulations of complex-domain optimization show that, compared with the complex-domain SCG and Quasi-Newton algorithms, the proposed algorithm is computationally simpler and achieves better optimization performance.
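The abstract describes the algorithm only at a high level. The following is a minimal sketch, in Python with NumPy, of a complex-domain conjugate gradient iteration of this general kind: it uses Wirtinger gradients, falls back to a negative-curvature direction (the eigenvector of the most negative Hessian eigenvalue) when the Hessian is indefinite along the current direction, and backtracks the step size until the function value decreases. The function names and the quadratic test problem are illustrative assumptions, not the paper's exact formulation.

# A minimal sketch (not the paper's exact algorithm) of a complex-domain
# conjugate gradient step with a negative-curvature fallback. It assumes a
# real-valued cost f(z) of a complex vector z, its Wirtinger gradient
# g(z) = df/dconj(z) (so -g is a descent direction), and a Hermitian Hessian.
# All names and the quadratic test problem below are illustrative.
import numpy as np

def complex_cg(f, grad, hess, z0, max_iter=200, tol=1e-8):
    """Polak-Ribiere-type CG over C^n; switches to a negative-curvature
    direction whenever the curvature along the current direction is <= 0."""
    z = z0.astype(complex)
    g = grad(z)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        H = hess(z)
        curvature = np.real(np.vdot(d, H @ d))      # d^H H d along d
        if curvature <= 0:
            # Hessian indefinite: take the eigenvector of the most negative
            # eigenvalue, oriented so that it remains a descent direction.
            eigvals, eigvecs = np.linalg.eigh(H)
            d = eigvecs[:, 0]
            if np.real(np.vdot(g, d)) > 0:
                d = -d
        # Backtracking step size: halve until the function value decreases,
        # which keeps the objective monotonically decreasing.
        t, fz = 1.0, f(z)
        while f(z + t * d) >= fz and t > 1e-12:
            t *= 0.5
        z_new = z + t * d
        g_new = grad(z_new)
        beta = max(0.0, np.real(np.vdot(g_new, g_new - g)) /
                        np.real(np.vdot(g, g)))     # PR+ coefficient
        d = -g_new + beta * d
        z, g = z_new, g_new
    return z

# Usage on a convex complex quadratic f(z) = z^H A z - 2 Re(b^H z),
# whose minimizer is A^{-1} b.
A = np.array([[3.0, 1.0j], [-1.0j, 2.0]])
b = np.array([1.0 + 1.0j, 2.0 - 1.0j])
f = lambda z: np.real(np.vdot(z, A @ z) - 2 * np.vdot(b, z))
grad = lambda z: A @ z - b                          # Wirtinger gradient
hess = lambda z: A
print(complex_cg(f, grad, hess, np.zeros(2, dtype=complex)))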