To reduce the size of the conventional extreme learning machine (ELM) network and improve its generalization performance, an improved ELM construction algorithm is proposed: the Akaike information criterion (AIC) serves as the stopping criterion for selecting a suitable number of hidden nodes, and the modified Gram-Schmidt (MGS) method is used to automatically adjust the network parameters. Experimental comparisons with the conventional ELM on commonly used regression benchmark problems show that the improved ELM achieves a more compact network structure and a much faster training speed while maintaining satisfactory learning accuracy.
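The construction procedure described above can be sketched as follows. This is a minimal illustration only, assuming a single-output regression task, sigmoid hidden nodes, and a small patience window for the AIC stopping test; the function and variable names (aic, mgs_elm_train, max_hidden, etc.) are illustrative and not taken from the paper.

```python
import numpy as np

def aic(sse, n_samples, n_params):
    """Akaike information criterion for a least-squares fit."""
    return n_samples * np.log(sse / n_samples) + 2 * n_params

def mgs_elm_train(X, y, max_hidden=100, patience=5, seed=0):
    """Grow an ELM one hidden node at a time; orthogonalise hidden outputs
    with modified Gram-Schmidt and stop when AIC stops improving."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    Q = np.zeros((n, max_hidden))        # orthonormalised hidden outputs
    R = np.zeros((max_hidden, max_hidden))
    g = np.zeros(max_hidden)             # projections of y onto Q columns
    W, b = [], []                        # accepted random hidden parameters
    residual = y.astype(float).copy()
    best_aic, best_k = np.inf, 0

    for k in range(max_hidden):
        # 1. Generate one random hidden node (standard ELM step).
        w = rng.standard_normal(d)
        bias = rng.standard_normal()
        h = 1.0 / (1.0 + np.exp(-(X @ w + bias)))   # sigmoid activation

        # 2. Modified Gram-Schmidt: orthogonalise h against earlier columns.
        v = h.copy()
        for j in range(k):
            R[j, k] = Q[:, j] @ v
            v -= R[j, k] * Q[:, j]
        R[k, k] = np.linalg.norm(v)
        if R[k, k] < 1e-10:              # node adds no new information
            break
        Q[:, k] = v / R[k, k]
        W.append(w)
        b.append(bias)

        # 3. Update the fit and the residual sum of squares.
        g[k] = Q[:, k] @ y
        residual -= g[k] * Q[:, k]
        sse = residual @ residual

        # 4. AIC as the stopping criterion: remember the node count that
        #    minimises AIC; stop once AIC has not improved for a while
        #    (patience window is an assumption, not from the paper).
        score = aic(sse, n, k + 1)
        if score < best_aic:
            best_aic, best_k = score, k + 1
        elif (k + 1) - best_k >= patience:
            break

    # Recover the output weights of the first best_k nodes from R beta = g.
    beta = np.linalg.solve(R[:best_k, :best_k], g[:best_k])
    return np.array(W[:best_k]), np.array(b[:best_k]), beta
```

Because the hidden outputs are kept orthogonal by MGS, adding or discarding a node only touches one column of R, so each candidate network size can be scored with AIC without refitting the whole output layer.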