The least squares support vector regression machine (LSSVRM) lacks both robustness and sparseness. To address this problem, a robust sparse algorithm is proposed that adopts a bottom-up learning strategy within a recursive sample-elimination framework. To enhance robustness, a robust three-sigma ("3σ") rule based on the leave-one-out (LOO) error is used to detect and eliminate outlier samples. To improve sparseness, a pruning strategy based on the smallest absolute LOO error is adopted to eliminate unimportant samples. To reduce the computational burden, the model is updated with fast LOO error computation and decremental learning. Experimental results show that the new algorithm is highly robust, and that the number of support vectors decreases substantially at the cost of only a slight loss in generalization performance.
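The two core ingredients above — the fast LOO error for LS-SVR and the robust "3σ" outlier check on those errors — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the RBF kernel, the hyperparameters `gamma` and `sigma`, and the synthetic data are assumptions for demonstration. The fast LOO residual uses the well-known closed form e_i = α_i / (A⁻¹)_ii on the bordered LS-SVR system matrix, which avoids retraining n separate models.

```python
import numpy as np

def rbf_kernel(X, Y, sigma=1.0):
    # Gaussian (RBF) kernel; an assumed choice, not specified by the abstract.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvr_loo_errors(X, y, gamma=10.0, sigma=1.0):
    """Solve the LS-SVR dual system A @ [b; alpha] = [0; y] and return the
    fast LOO residuals e_i = alpha_i / (A^{-1})_{ii} (no model refits)."""
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = A[1:, 0] = 1.0
    A[1:, 1:] = rbf_kernel(X, X, sigma) + np.eye(n) / gamma
    A_inv = np.linalg.inv(A)
    sol = A_inv @ np.concatenate(([0.0], y))
    b, alpha = sol[0], sol[1:]
    loo = alpha / np.diag(A_inv)[1:]  # fast leave-one-out residuals
    return alpha, b, loo

def three_sigma_outliers(loo):
    """Robust '3-sigma' check: flag samples whose LOO error deviates
    from the mean LOO error by more than three standard deviations."""
    mu, sd = loo.mean(), loo.std()
    return np.abs(loo - mu) > 3.0 * sd
```

In the full framework, flagged outliers would be removed first (with the model updated decrementally rather than retrained), after which the sample with the smallest absolute LOO error is pruned one at a time until the generalization estimate degrades.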