To address the lack of an efficient and practical learning algorithm for Binary Feedforward Neural Networks (BFNNs), a novel learning algorithm that fuses self-adaptation of both architecture and weights is proposed. By adopting and improving the methodology of the Extreme Learning Machine (ELM), the algorithm can efficiently train single-hidden-layer BFNNs for classification problems. To meet the training accuracy, the algorithm automatically adds hidden neurons and adjusts the weights of the hidden-layer and output-layer neurons with the Perceptron Learning Rule. To improve the generalization accuracy, it establishes the binary neuron sensitivity as a measure of the relevance of each hidden neuron, automatically prunes the least relevant neurons, and compensates for the information lost in pruning. Experimental results verify the feasibility and effectiveness of the proposed algorithm on discrete classification problems.
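The abstract describes a constructive-then-pruning training loop; the sketch below only illustrates that overall flow, not the authors' actual algorithm. Everything not stated in the abstract is an assumption here: hidden weights are generated randomly in ELM style, the binary neuron sensitivity is approximated by how much zeroing a neuron's output weight changes the network's predictions, compensation for pruning is reduced to retraining the output weights, and names such as `train_bfnn` and `BinaryFFN` are hypothetical.

```python
# Illustrative sketch of a constructive-pruning BFNN training loop.
# Stand-ins (clearly NOT from the paper): random ELM-style hidden weights,
# an output-change proxy for the binary neuron sensitivity, and output-weight
# retraining as the compensation step.
import numpy as np


def sign(z):
    """Bipolar binary activation: +1 or -1."""
    return np.where(z >= 0, 1, -1)


class BinaryFFN:
    def __init__(self, n_in, rng=None):
        self.rng = rng if rng is not None else np.random.default_rng(0)
        self.W = np.empty((0, n_in + 1))   # hidden weights (bias column included)
        self.v = np.empty(0)               # output weights

    def hidden(self, X):
        Xb = np.hstack([X, np.ones((len(X), 1))])   # append bias input
        return sign(Xb @ self.W.T)

    def predict(self, X):
        return sign(self.hidden(X) @ self.v)

    def add_neuron(self, n_in):
        # Assumption: new hidden neurons get random weights (ELM-style).
        self.W = np.vstack([self.W, self.rng.normal(size=(1, n_in + 1))])
        self.v = np.append(self.v, 0.0)

    def train_output(self, X, y, epochs=50, lr=0.1):
        # Perceptron Learning Rule on the output neuron, as the abstract mentions.
        H = self.hidden(X)
        for _ in range(epochs):
            for h, t in zip(H, y):
                if sign(h @ self.v) != t:
                    self.v += lr * t * h


def train_bfnn(X, y, target_acc=1.0, max_hidden=20):
    net = BinaryFFN(X.shape[1])
    # Constructive phase: grow the hidden layer until training accuracy is met.
    while len(net.v) < max_hidden:
        net.add_neuron(X.shape[1])
        net.train_output(X, y)
        if np.mean(net.predict(X) == y) >= target_acc:
            break
    # Pruning phase: remove the hidden neuron whose removal changes the outputs
    # least (a crude sensitivity proxy), then retrain the output weights as a
    # simple form of compensation; stop when accuracy would drop below target.
    while len(net.v) > 1:
        base = net.predict(X)
        H = net.hidden(X)
        sens = []
        for i in range(len(net.v)):
            v_masked = net.v.copy()
            v_masked[i] = 0.0
            sens.append(np.mean(sign(H @ v_masked) != base))
        i = int(np.argmin(sens))
        trial = BinaryFFN(X.shape[1])
        trial.W, trial.v = np.delete(net.W, i, 0), np.delete(net.v, i)
        trial.train_output(X, y)
        if np.mean(trial.predict(X) == y) >= target_acc:
            net = trial
        else:
            break
    return net


if __name__ == "__main__":
    # Toy two-class problem with +/-1 labels.
    rng = np.random.default_rng(1)
    X = rng.normal(size=(100, 2))
    y = sign(X[:, 0] + X[:, 1])
    net = train_bfnn(X, y, target_acc=0.95)
    print("hidden neurons:", len(net.v), "train acc:", np.mean(net.predict(X) == y))
```

The sketch keeps the two phases the abstract names (growing the hidden layer for training accuracy, then sensitivity-guided pruning for generalization) but the concrete choices inside each phase are placeholders under the stated assumptions.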