Neural network classifier ensembles for small-sample data sets are studied, and an ensemble method suited to small-sample data sets, Novel_NNE, is proposed. The method generates difference data to increase the diversity of the individual networks in the ensemble and thereby improve the generalization performance of ensemble learning. Finally, experiments with different fusion techniques are carried out on UCI benchmark data sets. The results show that, within the Novel_NNE ensemble algorithm, the relative majority vote and Bayes fusion methods outperform the behavior knowledge space fusion method.
Ensemble learning has become a hot topic in machine learning because it can dramatically improve the generalization performance of a classifier. In this paper, neural network ensembles for small data sets are studied and a neural network ensemble approach, Novel_NNE, is presented. To increase ensemble diversity, a diverse data set is generated and used as part of the training set so that diverse neural network classifiers are created. Moreover, different combination methods are studied for Novel_NNE. Experimental results show that Novel_NNE achieves higher predictive accuracy with both the relative majority vote method and the Bayes combination method than with the behavior knowledge space method.
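To make the combination step concrete, the sketch below builds a small ensemble of neural networks and fuses their outputs by relative majority vote. The diversity mechanism shown here (bootstrap resampling of the training set) is only a stand-in for the paper's difference-data generation, which the abstract does not specify; the data set, member count, and network sizes are likewise illustrative choices, not the paper's settings.

```python
# Minimal sketch of a neural network ensemble fused by relative majority vote.
# Bootstrap resampling is an assumed stand-in for Novel_NNE's difference-data step.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

rng = np.random.default_rng(0)
members = []
for i in range(7):  # 7 individual networks (illustrative count)
    # Resample the training set so each member sees slightly different data.
    idx = rng.choice(len(X_tr), size=len(X_tr), replace=True)
    net = MLPClassifier(hidden_layer_sizes=(10,), max_iter=1000, random_state=i)
    members.append(net.fit(X_tr[idx], y_tr[idx]))

# Relative majority (plurality) vote: each member casts one vote per sample,
# and the class receiving the most votes is the ensemble prediction.
votes = np.stack([m.predict(X_te) for m in members])   # shape: (n_members, n_samples)
pred = np.array([np.bincount(col).argmax() for col in votes.T])
print("ensemble accuracy:", (pred == y_te).mean())
```

The Bayes combination method mentioned in the abstract would replace the vote with a fusion of the members' class-posterior estimates, while the behavior knowledge space method would look up the joint pattern of member decisions in a table built from the training data.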