An algorithm that applies a genetic algorithm to AdaBoost with SVM-based weak classifiers, referred to here as GA-AdaBoostSVM, is proposed; it produces a strong classifier with a higher recognition rate and better generalization performance. The algorithm first trains several support vector machines as weak classifiers, then uses AdaBoost to combine them into a strong classifier, while a genetic algorithm performs a global search over the weak classifiers' combination weights. Its characteristics are as follows: (1) The traditional AdaBoost algorithm cannot provide an optimal combination of the weak classifier weights; GA-AdaBoostSVM uses a genetic algorithm to optimize these weights globally, so the resulting strong classifier achieves higher recognition accuracy. (2) To improve the generalization ability of the strong classifier, the RBF kernel parameter is adjusted appropriately when training the weak classifiers, so that each weak classifier strikes a balance between accuracy and diversity, which in turn improves the generalization of the combined strong classifier. Finally, experiments comparing GA-AdaBoostSVM with the traditional AdaBoostSVM demonstrate the advantages of the proposed method: better generalization performance and a higher recognition rate.
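The procedure described above (RBF-SVM weak classifiers trained under AdaBoost-style sample re-weighting, with a genetic algorithm then searching globally for the combination weights) can be illustrated with the following minimal sketch. It is not the paper's implementation: the use of scikit-learn's SVC, the synthetic dataset, the gamma schedule, and all genetic-algorithm settings (population size, one-point crossover, mutation rate) are assumptions made purely for the example.

    # Minimal sketch of the GA-AdaBoostSVM idea; all parameter values are illustrative assumptions.
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    X, y = make_classification(n_samples=400, n_features=10, random_state=0)
    y = 2 * y - 1  # AdaBoost convention: labels in {-1, +1}
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

    # 1) Train RBF-SVM weak classifiers under AdaBoost-style sample weights,
    #    varying gamma over rounds to trade accuracy against diversity.
    n_rounds = 5
    gammas = np.linspace(2.0, 0.1, n_rounds)   # assumed schedule, not the paper's
    sample_w = np.full(len(X_tr), 1.0 / len(X_tr))
    weak = []
    for gamma in gammas:
        clf = SVC(kernel="rbf", gamma=gamma, C=1.0)
        clf.fit(X_tr, y_tr, sample_weight=sample_w)
        pred = clf.predict(X_tr)
        err = np.clip(np.sum(sample_w * (pred != y_tr)), 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)      # standard AdaBoost classifier weight
        sample_w *= np.exp(-alpha * y_tr * pred)   # re-weight training samples
        sample_w /= sample_w.sum()
        weak.append(clf)

    H_tr = np.column_stack([c.predict(X_tr) for c in weak])  # weak outputs, shape (n, T)

    def fitness(w):
        # Training accuracy of the weighted vote; the GA maximizes this.
        return np.mean(np.sign(H_tr @ w) == y_tr)

    # 2) Genetic algorithm over the T combination weights (replacing the
    #    closed-form AdaBoost alphas with a globally searched weight vector).
    pop = rng.random((30, n_rounds))
    for _ in range(50):
        scores = np.array([fitness(w) for w in pop])
        parents = pop[np.argsort(scores)[-10:]]        # selection: keep the 10 best
        children = []
        while len(children) < 20:
            a, b = parents[rng.integers(10, size=2)]
            cut = rng.integers(1, n_rounds)
            child = np.concatenate([a[:cut], b[cut:]])  # one-point crossover
            child += rng.normal(0, 0.05, n_rounds) * (rng.random(n_rounds) < 0.1)  # mutation
            children.append(np.clip(child, 0, None))    # keep weights non-negative
        pop = np.vstack([parents, children])

    best_w = pop[np.argmax([fitness(w) for w in pop])]
    H_te = np.column_stack([c.predict(X_te) for c in weak])
    print("test accuracy:", np.mean(np.sign(H_te @ best_w) == y_te))

In this sketch the GA's fitness is plain training accuracy of the weighted vote; the actual criterion and GA operators used in the paper may differ.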