Numerous models have been proposed to reduce the classification error of Naive Bayes by weakening its attribute independence assumption, and some have demonstrated remarkable error performance. Considering that ensemble learning is an effective method of reducing the classification error of a classifier, this paper proposes a double-layer Bayesian classifier ensembles (DLBCE) algorithm based on frequent itemsets. DLBCE constructs a double-layer Bayesian classifier (DLBC) for each frequent itemset contained in the new instance, and finally combines all the classifiers by assigning different weights to different classifiers according to conditional mutual information. The experimental results show that the proposed algorithm outperforms other outstanding algorithms.
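To make the combination step concrete, the following is a minimal sketch of a weighted ensemble of classifiers, assuming (as the abstract states) that each per-itemset classifier produces class probabilities and that a weight derived from conditional mutual information has already been computed for it. All names here are hypothetical illustrations, not the paper's actual implementation:

```python
def weighted_ensemble_predict(probs_per_clf, weights):
    """Combine class-probability distributions from several classifiers
    via a weighted sum and return the highest-scoring class.

    probs_per_clf: list of dicts mapping class label -> probability,
                   one dict per classifier (e.g., one per frequent itemset).
    weights: list of non-negative weights, one per classifier
             (hypothetically derived from conditional mutual information).
    """
    scores = {}
    for probs, w in zip(probs_per_clf, weights):
        for cls, p in probs.items():
            # Accumulate the weighted vote for each class.
            scores[cls] = scores.get(cls, 0.0) + w * p
    return max(scores, key=scores.get)


# Toy usage: two classifiers disagree; the higher-weighted one prevails.
probs = [{"pos": 0.6, "neg": 0.4}, {"pos": 0.3, "neg": 0.7}]
w = [0.8, 0.2]
print(weighted_ensemble_predict(probs, w))  # -> pos
```

The weighted sum is one common way to realize "assigning different weights to different classifiers"; the paper's DLBC construction and its exact weighting formula are defined in the full text.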