To overcome the drawbacks of intelligent optimization algorithms, namely their complex principles and difficult parameter settings, a quasi-Newton method is applied to automatically optimize the multiple kernel parameters of kernel Fisher discriminant analysis. The objective function is constructed according to the empirical risk minimization criterion. To ensure that the objective function is continuous and differentiable, a continuous sigmoid function transforms the discrete binary output into a continuous probability output. The initial kernel parameter points are selected with an orthogonal array. Experimental results show that the proposed algorithm achieves classification performance comparable to that of the genetic algorithm, converges faster, and rests on a simpler principle, making it well suited to optimizing the multiple kernel parameters of kernel Fisher discriminant analysis.
Because the principles of intelligent optimization algorithms are complex and their parameters are difficult to set, it is hard to use them to optimize the kernel parameters of kernel Fisher discriminant analysis (KFDA). A quasi-Newton algorithm is proposed to automatically optimize the multiple kernel parameters of KFDA. The objective function is constructed according to the empirical risk minimization principle. To make the objective function continuous and differentiable, a sigmoid function is introduced to transform the discrete binary output of KFDA into a continuous probability output. The initial kernel parameters are selected by an orthogonal array. Experimental results indicate that the classification performance of the proposed algorithm is close to that of the genetic algorithm, while the proposed algorithm converges faster and is based on a simpler principle. It can therefore be effectively used to optimize the multiple kernel parameters of KFDA.
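To make the described procedure concrete, the following is a minimal sketch, not the authors' implementation: it assumes an RBF kernel with one width parameter per feature (`theta`), builds a two-class KFDA discriminant, smooths its binary decision with a sigmoid to obtain a differentiable empirical-risk objective, and optimizes the kernel parameters with SciPy's BFGS quasi-Newton routine. The orthogonal-array initialization from the abstract is replaced here by a single plausible starting point, and all function names and data are illustrative.

```python
import numpy as np
from scipy.optimize import minimize

def rbf_kernel(X1, X2, theta):
    # k(x, z) = exp(-sum_d theta_d * (x_d - z_d)^2), one width per feature
    diff = X1[:, None, :] - X2[None, :, :]
    return np.exp(-np.einsum('ijd,d->ij', diff ** 2, theta))

def kfda_direction(K, y, reg=1e-3):
    # Two-class KFDA in the kernel-induced space: alpha = N^{-1} (m1 - m0),
    # with a small ridge term on the within-class scatter N for stability.
    n = K.shape[0]
    m0 = K[:, y == 0].mean(axis=1)
    m1 = K[:, y == 1].mean(axis=1)
    N = reg * np.eye(n)
    for c in (0, 1):
        Kc = K[:, y == c]
        nc = Kc.shape[1]
        N += Kc @ (np.eye(nc) - np.full((nc, nc), 1.0 / nc)) @ Kc.T
    return np.linalg.solve(N, m1 - m0), m0, m1

def empirical_risk(log_theta, X, y):
    # Smoothed empirical risk: a sigmoid of the centred discriminant score
    # replaces the hard 0/1 decision so the objective is differentiable.
    theta = np.exp(log_theta)                  # keep kernel widths positive
    K = rbf_kernel(X, X, theta)
    alpha, m0, m1 = kfda_direction(K, y)
    scores = K @ alpha
    thresh = 0.5 * (alpha @ m0 + alpha @ m1)   # midpoint of projected class means
    p = 1.0 / (1.0 + np.exp(-(scores - thresh)))
    eps = 1e-12
    return -np.mean(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps))

# Toy two-class data; theta is initialized to all ones (log_theta = 0) instead
# of the orthogonal-array design used in the paper.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1.0, 1.0, (40, 3)), rng.normal(1.0, 1.0, (40, 3))])
y = np.repeat([0, 1], 40)
x0 = np.zeros(X.shape[1])
res = minimize(empirical_risk, x0, args=(X, y), method='BFGS')
print('optimized kernel widths:', np.exp(res.x))
```

BFGS here uses SciPy's default finite-difference gradients; because the sigmoid makes the objective smooth in the kernel parameters, an analytic gradient could also be supplied to speed up convergence.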