Since traditional collaborative filtering recommendation algorithms perform poorly on cold-start and data-sparsity problems, a hybrid SDAE recommendation model is proposed that combines stacked denoising autoencoders (SDAE) with a nearest-neighbor recommendation method. Using the idea of layer-by-layer autoencoding, extreme learning machines are stacked with denoising autoencoders to form a deep stacked denoising autoencoder whose computation is based on the extreme learning machine (ELM); the abstract features extracted by this model are then applied to a nearest-neighbor algorithm to predict ratings. Experiments with various models on multiple data sets show that, when the sparsity is below 8%, the hybrid SDAE recommendation model improves on the cosine similarity model and the Pearson similarity model by 11.3% and 21.1% respectively; compared with the latent matrix factorization model, the hybrid SDAE model requires nearly 30% fewer iterations to converge; and in the three groups of comparison experiments against the similarity models and the matrix factorization model, the hybrid SDAE model is also the most stable. The proposed hybrid SDAE model therefore converges quickly and effectively alleviates the data sparsity and cold-start problems.
To address the poor performance of traditional collaborative filtering recommendation algorithms on cold-start and sparse data, this paper proposes a hybrid recommendation model that combines a deep stacked denoising autoencoder (SDAE) with the nearest-neighbor recommendation method. Treating the autoencoder as the basic unit, the training process is cast as an unsupervised learning problem that combines sparse coding, the denoising criterion, the extreme learning machine (ELM), and the idea of layer-by-layer autoencoding. The abstract features extracted by the final model are applied to the nearest-neighbor algorithm to predict ratings. Experimental results on several data sets show that when the sparsity is below 8%, the hybrid recommendation model improves on the cosine similarity model and the Pearson similarity model by 11.3% and 21.1% respectively, and that compared with the latent matrix factorization model, the hybrid model requires nearly 30% fewer iterations to converge. In the three groups of comparative experiments, the hybrid model is also the most stable, which validates its faster convergence and its effectiveness in dealing with data sparsity and cold start.
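To make the described pipeline concrete, the following is a minimal, illustrative sketch rather than the authors' implementation: each layer is an ELM-style denoising autoencoder whose output weights are solved in closed form, the layers are stacked greedily layer by layer, and the learned user features feed a k-nearest-neighbor rating predictor. All function names (elm_dae_layer, sdae_features, knn_predict), layer sizes, and hyperparameters are hypothetical choices for demonstration only.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors


def elm_dae_layer(X, n_hidden, noise_std=0.1, reg=1e-3, rng=None):
    """One ELM-style denoising autoencoder layer (illustrative sketch).

    A random projection maps the corrupted input to the hidden space; the
    output weights are then solved in closed form (regularized least squares)
    so the hidden activations reconstruct the clean input, with no
    back-propagation.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    X_noisy = X + rng.normal(0.0, noise_std, X.shape)   # denoising criterion
    W = rng.normal(size=(X.shape[1], n_hidden))         # random input weights
    b = rng.normal(size=n_hidden)
    H = np.tanh(X_noisy @ W + b)                         # hidden activations
    beta = np.linalg.solve(H.T @ H + reg * np.eye(n_hidden), H.T @ X)
    return np.tanh(X @ beta.T)                           # encoding for next layer


def sdae_features(X, layer_sizes=(256, 64)):
    """Greedy layer-by-layer stacking of ELM denoising autoencoder layers."""
    feats = X
    for n_hidden in layer_sizes:
        feats = elm_dae_layer(feats, n_hidden)
    return feats


def knn_predict(ratings, features, k=10):
    """Predict missing ratings as the mean rating of the k nearest users
    in the learned feature space (0 denotes an unrated item)."""
    nn = NearestNeighbors(n_neighbors=k + 1).fit(features)
    _, idx = nn.kneighbors(features)
    preds = np.zeros_like(ratings, dtype=float)
    for u in range(ratings.shape[0]):
        neigh = ratings[idx[u, 1:]]                      # skip the user itself
        rated = neigh > 0
        counts = rated.sum(axis=0)
        sums = (neigh * rated).sum(axis=0).astype(float)
        preds[u] = np.divide(sums, counts,
                             out=np.zeros_like(sums), where=counts > 0)
    return preds


if __name__ == "__main__":
    # Toy user-item rating matrix (rows: users, columns: items, 0 = unrated).
    R = np.random.default_rng(0).integers(0, 6, size=(200, 400)).astype(float)
    Z = sdae_features(R)          # abstract user features from the stacked layers
    print(knn_predict(R, Z, k=10).shape)
```

In this sketch the closed-form solution of the output weights is what distinguishes the ELM-based layer from a conventional back-propagation-trained autoencoder and is the reason the stacked model can converge in far fewer iterations.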