To address a key difficulty of global optimization algorithms, namely their tendency to become trapped in local minima, a smoothing function is constructed. This function eliminates some of the local minima while remaining identical to the original function on the region containing the global optimum. Consequently, by optimizing this smoothing function, the number of local minima is greatly reduced as the iterations proceed, making it easier for the algorithm to locate the global minimum. Based on the properties of the smoothing function, a new crossover operator is designed that can adaptively generate high-quality offspring. Exploiting these same properties, a one-dimensional (line) search technique is incorporated into the algorithm design, which substantially accelerates the algorithm. On this basis, a new and efficient evolutionary algorithm for global optimization is proposed, and its global convergence is proved. Finally, numerical experiments show that the new algorithm is highly effective.
A common difficulty of existing global optimization methods is that they do not easily escape from local optimal solutions and therefore often fail to find the global optimum. To make escaping from local optima and finding the global optimum easier, the authors first construct a smoothing function. It eliminates all local optimal solutions that are worse than the best solution found so far, while keeping the original function unchanged in the region where the values of the original function are not worse than its value at that best solution. Thus, by optimizing this smoothing function instead of the original objective, the number of local optimal solutions is greatly reduced as the iterations proceed, and it becomes much easier for an algorithm to find a global optimal solution. Second, a new crossover operator is designed based on the properties of the smoothing function; it can adaptively generate high-quality offspring in any situation. Third, by exploiting the properties of the smoothing function, a line search technique is incorporated into the algorithm design, which makes the proposed algorithm converge much faster. Based on all of this, a novel and efficient evolutionary algorithm for global optimization is proposed, and its global convergence is proved. Finally, numerical simulations on several standard benchmark problems show that the proposed algorithm is very effective.
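For illustration only (the paper's exact construction is not given in the abstract, so the following is an assumed, minimal example of such a smoothing), one function with the stated properties simply clips the objective $f$ at its value at the best point found so far, $x_{\text{best}}$:

$$
\tilde{f}(x) \;=\; \min\bigl\{\, f(x),\; f(x_{\text{best}}) \,\bigr\},
$$

so that every region where $f(x) > f(x_{\text{best}})$ is flattened onto a plateau, removing the strict local minima there, while $\tilde{f}$ coincides with $f$ wherever $f(x) \le f(x_{\text{best}})$.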