Selective ensemble methods offer fast prediction, low storage requirements, and strong generalization ability. Ordinal classification is an effective approach to problems such as graded decision making and bidding decisions. For the selective ensemble of ordinal classifiers, this work defines a weighted kappa gain that fully accounts for the ordinal distance between labels and the ensemble voting mechanism, and uses it to selectively ensemble the ordinal base learners generated by bagging. The proposed weighted-kappa-gain selective ensemble was compared with the bagging random ensemble, the unweighted-kappa selective ensemble, and the reduce-error selective ensemble on four artificial and six real data sets. The results show that the test error of the proposed method reaches its minimum after only a small number of base learners have been ensembled, and that it achieves the lowest test error on three of the artificial and five of the real data sets, reducing the error by up to 7.54%, 1.68%, and 1.10% relative to the bagging, unweighted-kappa, and reduce-error methods, respectively.
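The abstract does not give the exact form of the weighted kappa gain, so the sketch below only illustrates the standard linearly weighted Cohen's kappa between two ordinal predictors, the kind of agreement measure such a gain presumably builds on. The function name, the linear weighting scheme, and the NumPy-based interface are illustrative assumptions, not the paper's definition.

```python
import numpy as np

def weighted_kappa(y1, y2, n_classes, weighting="linear"):
    """Weighted Cohen's kappa between two ordinal label vectors.

    Labels are assumed to be integers in 0..n_classes-1. Disagreements
    between distant labels are penalised more than disagreements between
    adjacent labels, which is what makes the measure suitable for
    ordinal (graded) predictions.
    """
    y1, y2 = np.asarray(y1), np.asarray(y2)
    k = n_classes

    # Observed confusion matrix O[i, j]: fraction of samples where the
    # first predictor outputs class i and the second outputs class j.
    observed = np.zeros((k, k))
    for a, b in zip(y1, y2):
        observed[a, b] += 1
    observed /= len(y1)

    # Expected agreement by chance: outer product of the two marginals.
    expected = np.outer(observed.sum(axis=1), observed.sum(axis=0))

    # Disagreement weights: 0 on the diagonal, growing with label distance.
    i, j = np.indices((k, k))
    if weighting == "linear":
        weights = np.abs(i - j) / (k - 1)
    else:  # quadratic weighting
        weights = ((i - j) / (k - 1)) ** 2

    return 1.0 - np.sum(weights * observed) / np.sum(weights * expected)
```

In a greedy selection loop over bagging-generated base learners, one would presumably score each candidate by how adding it changes this weighted agreement with the current ensemble vote and keep the candidate with the best gain; the precise scoring rule is defined in the paper itself, not here.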