In extreme learning machine (ELM) networks, the sigmoid function is usually chosen as the activation function for additive hidden neurons. This paper first replaces it with Softplus, a smooth approximation of the rectified linear function. Because the Softplus activation is closer to the biological activation model and exhibits a degree of sparsity, it can further improve network performance. Second, to give ELM-trained networks better classification performance, constraints on the within-class and between-class distances are introduced, and an ELM algorithm based on an improved Fisher discriminant constraint is proposed, so that the analytically computed output weights are more favorable for classification and recognition performance is further improved. Finally, experiments on a handwritten digit database and a face database demonstrate the feasibility and superiority of the improved ELM algorithm.
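To make the setup concrete, the following is a minimal sketch of a standard ELM with a Softplus hidden layer: input weights and biases are drawn at random and left fixed, and the output weights are obtained analytically by regularized least squares. The function names, the ridge parameter `reg`, and the Gaussian initialization are illustrative assumptions, not details from the paper.

```python
import numpy as np

def softplus(x):
    # Softplus ln(1 + e^x), computed stably as logaddexp(0, x)
    return np.logaddexp(0.0, x)

def elm_train(X, T, n_hidden, reg=1e-3, seed=None):
    """Basic ELM: random fixed input weights, Softplus hidden layer,
    output weights solved analytically via ridge-regularized least squares.
    X: (n_samples, n_features) inputs; T: (n_samples, n_classes) one-hot targets."""
    rng = np.random.default_rng(seed)
    n_features = X.shape[1]
    W = rng.standard_normal((n_features, n_hidden))  # random input weights (never trained)
    b = rng.standard_normal(n_hidden)                # random hidden biases
    H = softplus(X @ W + b)                          # hidden-layer output matrix
    # Closed-form output weights: (H^T H + reg*I)^{-1} H^T T
    beta = np.linalg.solve(H.T @ H + reg * np.eye(n_hidden), H.T @ T)
    return W, b, beta

def elm_predict(X, W, b, beta):
    return softplus(X @ W + b) @ beta
```

The Fisher-constrained variant described in the abstract would add penalty terms built from the within-class and between-class scatter of the training data to this same least-squares objective, so the output weights remain solvable in closed form; the exact formulation is given in the paper itself.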