Extreme Learning Machine (ELM) is a fast algorithm with strong generalization ability for training single-hidden layer feed-forward neural networks (Single-hidden Layer Feed-forward Neural-network, SLFN). However, when applying ELM to practical problems, an appropriate SLFN architecture must be determined in advance, and for a given problem determining such an architecture is very difficult. To address this issue, this paper proposes an ensemble learning method that does not require the SLFN architecture to be determined beforehand. The proposed method consists of three steps: (1) initialize a relatively large SLFN; (2) use ELM to repeatedly train several SLFNs, each with a number of hidden nodes dropped out; (3) integrate the trained SLFNs by majority voting and use the ensemble to classify testing instances. Experiments on 10 data sets compare the proposed method with the traditional ELM, and the results show that the proposed method outperforms the traditional ELM in classification performance.
Extreme learning machine (ELM) is an algorithm for training single-hidden layer feed-forward neural-networks (SLFNs) with fast speed and good generalization. When applying ELM to solve practical problems, an appropriate SLFN architecture must first be selected. However, for a given problem, selecting an appropriate SLFN architecture is very difficult. To deal with this problem, an ensemble learning method is proposed in this paper; with the proposed method, it is unnecessary to select the SLFN architecture in advance. The proposed method consists of three steps: (1) initialize a relatively large SLFN; (2) repeatedly train several SLFNs with ELM, each with a number of hidden nodes dropped out; (3) integrate the trained SLFNs by majority voting and use the ensemble to classify testing instances. We experimentally compared the proposed approach with the traditional ELM on 10 data sets, and the results confirm that the proposed approach outperforms the traditional ELM in classification performance.
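To make the three steps concrete, the following is a minimal sketch of the described Dropout-ELM ensemble, assuming a sigmoid hidden layer, one-hot class targets, and Moore-Penrose pseudoinverse computation of the output weights; the class and parameter names (e.g. DropoutELMEnsemble, drop_rate) are illustrative and not taken from the paper.

```python
import numpy as np

class DropoutELMEnsemble:
    def __init__(self, n_hidden=200, n_members=10, drop_rate=0.5, seed=0):
        self.n_hidden = n_hidden      # size of the initial "big" SLFN
        self.n_members = n_members    # number of SLFNs trained with ELM
        self.drop_rate = drop_rate    # fraction of hidden nodes dropped per member
        self.rng = np.random.default_rng(seed)
        self.members = []             # (input weights, biases, output weights) per SLFN

    def _hidden(self, X, W, b):
        # Sigmoid activation of the hidden layer
        return 1.0 / (1.0 + np.exp(-(X @ W + b)))

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        T = (y[:, None] == self.classes_[None, :]).astype(float)  # one-hot targets
        # Step 1: initialize one big SLFN (random input weights and biases)
        W = self.rng.standard_normal((X.shape[1], self.n_hidden))
        b = self.rng.standard_normal(self.n_hidden)
        n_keep = max(1, int(round(self.n_hidden * (1.0 - self.drop_rate))))
        for _ in range(self.n_members):
            # Step 2: drop a random subset of hidden nodes, then train with ELM
            keep = self.rng.choice(self.n_hidden, size=n_keep, replace=False)
            H = self._hidden(X, W[:, keep], b[keep])
            beta = np.linalg.pinv(H) @ T   # ELM output weights via pseudoinverse
            self.members.append((W[:, keep], b[keep], beta))
        return self

    def predict(self, X):
        # Step 3: majority voting over the trained SLFNs
        votes = np.zeros((X.shape[0], len(self.classes_)))
        for Wk, bk, beta in self.members:
            pred = np.argmax(self._hidden(X, Wk, bk) @ beta, axis=1)
            votes[np.arange(X.shape[0]), pred] += 1
        return self.classes_[np.argmax(votes, axis=1)]
```

In this reading, all ensemble members share the input weights of the initial large SLFN and differ only in which hidden nodes survive dropout; only the output weights are retrained by ELM for each member, which keeps each training pass a single linear solve.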