The generalization performance of a Support Vector Machine (SVM) depends on the choice of its hyperparameters, so a criterion is needed to guide hyperparameter optimization. For the case of the RBF kernel, a new criterion for SVM hyperparameter optimization is proposed, called the sum-of-squared-derivatives criterion. Compared with well-known hyperparameter optimization methods such as k-fold cross-validation and the Radius/Margin Bound, the class-separating surface obtained with the proposed criterion divides the sample set evenly in the original input space, which better reflects the structural risk minimization principle of the SVM classifier. Moreover, the resulting algorithm is simple, computationally inexpensive, and easy to implement.
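The abstract names the criterion but does not define it, so the following is only an illustrative sketch of one plausible reading: summing, over the training samples, the squared norm of the gradient of an RBF decision function f(x) = Σᵢ cᵢ exp(-γ‖x - xᵢ‖²) + b. All function names (`rbf_gradient`, `sum_squared_derivatives`) and the exact form of the score are assumptions for illustration, not the paper's actual method.

```python
import numpy as np

def rbf_gradient(x, sv, coef, gamma):
    # Analytic gradient of f(x) = sum_i coef_i * exp(-gamma * ||x - sv_i||^2) + b
    # at a single point x; the bias b has zero gradient and is omitted.
    diff = x - sv                                  # shape (n_sv, dim)
    k = np.exp(-gamma * np.sum(diff ** 2, axis=1))  # RBF kernel values
    # d/dx exp(-gamma * ||x - sv_i||^2) = -2*gamma*(x - sv_i)*k_i
    return -2.0 * gamma * np.dot(coef * k, diff)

def sum_squared_derivatives(X, sv, coef, gamma):
    # Hypothetical criterion: sum over the samples in X of the squared
    # gradient norm of the decision function (smaller could mean a
    # flatter, less overfitted separating surface).
    return float(sum(np.dot(g, g)
                     for g in (rbf_gradient(x, sv, coef, gamma) for x in X)))

# Toy usage: one "support vector" at the origin, evaluate at x = (1, 0).
sv = np.array([[0.0, 0.0]])
coef = np.array([1.0])
score = sum_squared_derivatives(np.array([[1.0, 0.0]]), sv, coef, gamma=1.0)
```

In a hyperparameter search, such a score would be computed for each candidate γ (and regularization constant C) after training, and the candidate with the preferred score kept; the actual selection rule used in the paper is not specified in this abstract.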