In the high-dimensional feature space, the decision hyperplane of a learning machine of the support vector machine type tends to pass through the origin, so a bias term is not needed. However, a bias is present in ν-support vector regression (ν-SVR). To study the role of the bias in ν-SVR, an optimization formulation of ν-SVR without bias is proposed, together with a method for solving the corresponding dual optimization problem. Experiments on benchmark data sets show that the generalization performance of the bias-free ν-SVR is better than that of ν-SVR. An analysis of the solution space of the dual optimization problem indicates that the bias should not be included in the ν-SVR optimization problem, and that the decision hyperplane of ν-SVR in the high-dimensional feature space should pass through the origin.
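For reference, a minimal sketch of the standard ν-SVR primal problem (in the sense of Schölkopf et al.; the notation w, b, ξ_i, ξ_i^*, ε, C, ν, ℓ, φ is assumed here and may differ from the paper's own formulation) is

\begin{aligned}
\min_{w,\,b,\,\xi,\,\xi^{*},\,\varepsilon}\quad & \frac{1}{2}\|w\|^{2} + C\Bigl(\nu\varepsilon + \frac{1}{\ell}\sum_{i=1}^{\ell}\bigl(\xi_i + \xi_i^{*}\bigr)\Bigr) \\
\text{s.t.}\quad & \bigl(\langle w,\phi(x_i)\rangle + b\bigr) - y_i \le \varepsilon + \xi_i, \\
& y_i - \bigl(\langle w,\phi(x_i)\rangle + b\bigr) \le \varepsilon + \xi_i^{*}, \\
& \xi_i,\ \xi_i^{*} \ge 0, \quad \varepsilon \ge 0 .
\end{aligned}

The bias-free variant referred to in the abstract simply omits b, so the regression function becomes f(x) = \langle w, \phi(x)\rangle. In the corresponding dual, dropping b removes the equality constraint \sum_{i=1}^{\ell}(\alpha_i - \alpha_i^{*}) = 0 that b induces, which enlarges the feasible region of the dual problem; this is presumably the solution-space difference the abstract alludes to.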