In order to improve the learning accuracy of the Auto-Encoder (AE) algorithm and to further reduce the classification error rate, the Sparse marginalized Denoising Auto-Encoder (SmDAE) was proposed, combining the Sparse Auto-Encoder (SAE) and the marginalized Denoising Auto-Encoder (mDAE). SmDAE imposes the constraint conditions of both SAE and mDAE on a single auto-encoder, so that it simultaneously carries the sparsity constraint of the SAE and the marginalized denoising constraint of the mDAE, enhancing the learning capability of the auto-encoder. Experimental results show that SmDAE outperforms both SAE and mDAE on the given classification tasks; comparative experiments with a Convolutional Neural Network (CNN) further show that SmDAE, which incorporates the marginalized denoising constraint and is more robust, also achieves better classification accuracy than the CNN.
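The combined objective described above — an auto-encoder trained under both a sparsity constraint and a denoising constraint — can be sketched as a single loss function. The sketch below is illustrative only: the function name `smdae_loss` and the hyperparameters (`rho`, `beta`, `p`) are assumptions, and the true mDAE marginalizes the corruption noise analytically in closed form, which is approximated here by sampled masking noise for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def kl_sparsity(rho, rho_hat):
    # SAE-style penalty: KL divergence between the target average
    # activation rho and the observed average activations rho_hat.
    eps = 1e-8
    rho_hat = np.clip(rho_hat, eps, 1.0 - eps)
    return np.sum(rho * np.log(rho / rho_hat)
                  + (1 - rho) * np.log((1 - rho) / (1 - rho_hat)))

def smdae_loss(X, W, b, W2, b2, rho=0.05, beta=0.1, p=0.3):
    # Denoising constraint: corrupt inputs with masking noise of
    # probability p (a sampled stand-in for mDAE's closed-form
    # marginalization over corruptions).
    mask = rng.random(X.shape) >= p
    H = sigmoid((X * mask) @ W + b)        # encoder on corrupted input
    X_hat = sigmoid(H @ W2 + b2)           # decoder
    recon = np.mean((X_hat - X) ** 2)      # reconstruct the *clean* input
    sparsity = kl_sparsity(rho, H.mean(axis=0))
    return recon + beta * sparsity         # both constraints on one AE
```

Minimizing this loss with any gradient-based optimizer trains a single auto-encoder under both constraints at once, which is the core idea of SmDAE as stated in the abstract.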