AdaBoost is an effective ensemble learning method that can markedly improve the classification accuracy of unstable learning algorithms, yet it yields little improvement for the stable Naive Bayesian classifier. To address this, a revised AdaBoost algorithm with weighted Naive Bayesian (WNB) base classifiers, named BoostMV-WNB, is proposed. At each boosting iteration, multiple feature views are constructed on the same training set using different term evaluation functions, and diverse WNB base classifiers are generated from these views. In addition, the sample weights are embedded into the parameters of the WNB base classifiers in several different ways, perturbing WNB and further increasing the instability of the base classifiers. Experimental results show that, compared with standard AdaBoost, the proposed BoostMV-WNB algorithm significantly improves the performance of the WNB text classifier.
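To make the weight-embedding step concrete, the following is a minimal sketch of one plausible scheme, not the paper's exact formulation: a multinomial WNB whose class priors and term conditionals are estimated from boosting-weighted counts with Laplace smoothing. All function names and the specific weighting rule are illustrative assumptions.

```python
import numpy as np

def fit_weighted_nb(X, y, sample_weight, alpha=1.0):
    """Multinomial WNB where each document's term counts are scaled by
    its boosting weight -- one plausible way to embed sample weights
    into the WNB parameters (the paper explores several variants)."""
    classes = np.unique(y)
    n_terms = X.shape[1]
    log_prior = np.empty(len(classes))
    log_cond = np.empty((len(classes), n_terms))
    for i, c in enumerate(classes):
        mask = (y == c)
        w = sample_weight[mask]
        # Class prior from the total boosting weight of class c.
        log_prior[i] = np.log(w.sum() / sample_weight.sum())
        # Term counts scaled by per-document weights, Laplace-smoothed.
        counts = (X[mask] * w[:, None]).sum(axis=0)
        log_cond[i] = np.log((counts + alpha) / (counts.sum() + alpha * n_terms))
    return classes, log_prior, log_cond

def predict(X, classes, log_prior, log_cond):
    # MAP decision: argmax over log P(c) + sum_t x_t * log P(t | c).
    return classes[np.argmax(X @ log_cond.T + log_prior, axis=1)]

# Toy usage: 4 documents over 3 terms, with hypothetical boosting weights.
X = np.array([[2, 0, 1], [0, 3, 0], [1, 1, 0], [0, 2, 1]], dtype=float)
y = np.array([0, 1, 0, 1])
w = np.array([0.4, 0.1, 0.3, 0.2])
print(predict(X, *fit_weighted_nb(X, y, w)))
```

Because the weighted counts change whenever AdaBoost reweights the training set, each boosting round yields different WNB parameters, which is the source of the perturbation the abstract describes.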