A Multiple Kernel Learning Method Based on Generalization Error Bounds of Support Vector Machines
  • Journal: 武汉大学学报:理学版
  • Date: 2012-02-10
  • Pages: 149-156
  • Classification: TP181 [Automation and Computer Technology - Control Science and Engineering; Automation and Computer Technology - Control Theory and Control Engineering]
  • Author affiliation: [1] School of Computer Science and Technology, Tianjin University, Tianjin 300072, China
  • Funding: National Natural Science Foundation of China (61070044); Natural Science Foundation of Tianjin (11JCYBJC00700)
  • Related project: Kernel matrix approximation analysis methods for model selection and combination of kernel methods in machine learning
Authors: 刘勇 (Liu Yong), 廖士中 (Liao Shizhong)
Chinese Abstract:

Based on the generalization error bound of support vector machines (SVM), an accurate and effective multiple kernel learning method is proposed. First, the multiple kernel learning optimization formulation is derived from the SVM generalization error bound, and a formula for computing the differential of its objective function is given. Then, an efficient iterative algorithm is designed to solve the optimization problem. Finally, the time complexity of the algorithm is analyzed, and a generalization error bound for the algorithm is derived based on Rademacher complexity; this bound remains valid even when the number of basis kernels is very large. Experiments on standard data sets show that the proposed method achieves higher accuracy than the uniform combination method and currently popular single-kernel and multiple kernel learning methods.
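
The abstract describes an iterative scheme that starts from a set of basis kernels and learns a weighted combination. The Python sketch below illustrates this general multiple kernel learning loop under stated assumptions; it is not the authors' bound-based algorithm. The RBF bandwidths `gammas`, the reweighting rule (each kernel's contribution to the SVM dual quadratic term), and the fixed iteration count are illustrative choices, and binary labels are assumed.

```python
# Minimal multiple kernel learning sketch (illustrative only, NOT the paper's
# bound-based algorithm): combine RBF basis kernels with nonnegative weights,
# train an SVM on the combined Gram matrix, and reweight each kernel by its
# contribution to the SVM dual quadratic term.
import numpy as np
from sklearn.svm import SVC

def rbf_gram(X, gamma):
    # Gram matrix of the RBF basis kernel k(x, x') = exp(-gamma * ||x - x'||^2).
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)
    return np.exp(-gamma * np.maximum(d2, 0.0))

def mkl_sketch(X, y, gammas=(0.1, 1.0, 10.0), iters=10, C=1.0):
    """Iteratively reweight basis kernels, starting from the uniform combination.
    Assumes binary labels y; gammas and iters are hypothetical settings."""
    Ks = [rbf_gram(X, g) for g in gammas]
    mu = np.full(len(Ks), 1.0 / len(Ks))           # uniform combination baseline
    for _ in range(iters):
        K = sum(w * Km for w, Km in zip(mu, Ks))   # combined Gram matrix
        svm = SVC(C=C, kernel="precomputed").fit(K, y)
        sv = svm.support_                          # indices of support vectors
        alpha_y = svm.dual_coef_.ravel()           # y_i * alpha_i on support vectors
        # Contribution of each basis kernel to the dual quadratic term
        # alpha^T diag(y) K_m diag(y) alpha; reweight proportionally.
        contrib = np.array([alpha_y @ Km[np.ix_(sv, sv)] @ alpha_y for Km in Ks])
        mu = np.maximum(contrib, 1e-12)
        mu /= mu.sum()                             # project back onto the simplex
    return mu
```

Training a final SVC with kernel="precomputed" on the Gram matrix built from the returned weights yields the classifier; the paper's method instead drives the weight update from an SVM generalization error bound and the differential of its objective.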

English Abstract:

Kernel selection is a key issue in both support vector machine (SVM) research and applications. Existing methods mostly select a single kernel from a set of basis kernels based on the data. However, recent applications have shown that using multiple kernels not only increases the flexibility of SVM but also enhances the interpretability of the decision function and improves performance. Here, an accurate and efficient multiple kernel learning method based on upper bounds of the generalization error of SVM is presented. The form of the solution of the optimization problem is given first. Then, the solution is computed with an efficient iterative algorithm. Finally, a learning bound for the proposed method is derived based on Rademacher complexity. The bound has only a logarithmic dependency on the total number of basis kernels, which indicates that it remains valid for a very large number of basis kernels. Experiments show that the method achieves better accuracy than both SVM with a uniform combination of basis kernels and other state-of-the-art kernel learning approaches.
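
For reference, the "uniform combination of basis kernels" mentioned above corresponds to equal weights in the standard convex-combination parameterization of the combined kernel. The formulation below is this common parameterization, stated as an assumption rather than taken from the paper:

```latex
% Standard convex combination of M basis kernels (assumed parameterization):
K_{\mu}(x, x') = \sum_{m=1}^{M} \mu_m K_m(x, x'),
\qquad \mu_m \ge 0, \quad \sum_{m=1}^{M} \mu_m = 1,
\qquad \text{uniform baseline: } \mu_m = \frac{1}{M}.
```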
