Based on the generalization error bound of support vector machines (SVM), an accurate and efficient multiple kernel learning method is proposed. First, the optimization formulation for multiple kernel learning is derived from the SVM generalization error bound, and a formula for computing the derivative of its objective function is given. Then, an efficient iterative algorithm is designed to solve this optimization problem. Finally, the time complexity of the algorithm is analyzed, and a generalization error bound for the algorithm is derived based on the Rademacher complexity; this bound remains valid even when the number of basic kernels is very large. Experiments on benchmark datasets show that the proposed method achieves higher accuracy than the uniform combination method and popular single-kernel and multiple kernel learning methods.
Kernel selection is the key issue in both support vector machine (SVM) research and applications. Most existing methods select a single kernel from a set of basic kernels according to the data. However, recent applications have shown that using multiple kernels not only increases the flexibility of SVM but also enhances the interpretability of the decision function and improves performance. Here, an accurate and efficient multiple kernel learning method based on upper bounds of the generalization error of SVM is presented. The form of the solution of the optimization problem is derived first. Then, the solution is computed with an efficient iterative algorithm. Finally, a learning bound of the proposed method is given based on the Rademacher complexity. The bound depends only logarithmically on the total number of basic kernels, which indicates that it remains valid even for a very large number of basic kernels. Experiments show that the method achieves higher accuracy than both SVM with the uniform combination of basic kernels and other state-of-the-art kernel learning approaches.
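The uniform-combination baseline mentioned above can be sketched with scikit-learn: several Gram matrices are averaged with equal weights and the combined kernel is passed to an SVM as a precomputed kernel. The choice of RBF basic kernels and their bandwidths here is an illustrative assumption, not the experimental setup of the paper.

```python
# Sketch of SVM with a uniform combination of basic kernels
# (the baseline the proposed method is compared against).
# The three RBF bandwidths below are assumed for illustration.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_classification(n_samples=200, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

gammas = [0.01, 0.1, 1.0]  # hypothetical set of basic RBF kernels

# Uniform combination: average the Gram matrices with equal weights.
K_tr = np.mean([rbf_kernel(X_tr, X_tr, gamma=g) for g in gammas], axis=0)
K_te = np.mean([rbf_kernel(X_te, X_tr, gamma=g) for g in gammas], axis=0)

clf = SVC(kernel="precomputed").fit(K_tr, y_tr)
acc = clf.score(K_te, y_te)
```

A multiple kernel learning method replaces the equal weights with learned ones; the method in the abstract obtains them by iteratively minimizing an SVM generalization error bound.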