We study strategies for feature selection with the sparse support vector machine (SVM). Recently, the so-called Lp-SVM (0 < p < 1) has attracted much attention because it encourages better sparsity than the widely used L1-SVM. However, the Lp-SVM is a non-convex and non-Lipschitz optimization problem, and solving it numerically is challenging. In this paper, we reformulate the Lp-SVM as an optimization model with a linear objective function and smooth constraints (LOSC-SVM), so that it can be solved by numerical methods for smooth constrained optimization. Our numerical experiments on artificial datasets show that the LOSC-SVM (0 < p < 1) can improve performance in both feature selection and classification by choosing a suitable parameter p. We also apply it to some real-life datasets, and the experimental results show that it is superior to the L1-SVM.
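The abstract does not spell out the reformulation itself, but the general idea can be illustrated with a minimal sketch. The sketch below assumes a standard Lp-SVM model, min Σ_j |w_j|^p + C Σ_i ξ_i subject to the usual soft-margin constraints, and applies one common smoothing device: introduce s_j ≥ |w_j|^p, written equivalently as the smooth constraints s_j^(1/p) ± w_j ≥ 0 (smooth for s_j ≥ 0 since 1/p > 1), which leaves a linear objective Σ_j s_j + C Σ_i ξ_i. This is an illustrative assumption, not necessarily the paper's exact LOSC-SVM model or algorithm; the function name losc_svm_sketch, the solver choice (SciPy's SLSQP), and all parameter values are hypothetical.

```python
# Sketch of an Lp-SVM smoothed into a linear-objective, smooth-constraint model.
# Variables: z = [w (d), b (1), xi (n), s (d)], with s_j standing in for |w_j|^p.
import numpy as np
from scipy.optimize import minimize

def losc_svm_sketch(X, y, p=0.5, C=1.0):
    """Solve the smoothed Lp-SVM with a general-purpose SLSQP solver."""
    n, d = X.shape
    q = 1.0 / p  # q > 1, so s**q is continuously differentiable for s >= 0

    def split(z):
        return z[:d], z[d], z[d + 1:d + 1 + n], z[d + 1 + n:]

    def objective(z):                  # linear objective: sum(s) + C * sum(xi)
        _, _, xi, s = split(z)
        return np.sum(s) + C * np.sum(xi)

    def margin(z):                     # y_i (w . x_i + b) - 1 + xi_i >= 0
        w, b, xi, _ = split(z)
        return y * (X @ w + b) - 1.0 + xi

    def upper(z):                      # s_j**q - w_j >= 0   (w_j <= s_j**q)
        w, _, _, s = split(z)
        return s**q - w

    def lower(z):                      # s_j**q + w_j >= 0   (-w_j <= s_j**q)
        w, _, _, s = split(z)
        return s**q + w

    cons = [{"type": "ineq", "fun": f} for f in (margin, upper, lower)]
    bounds = ([(None, None)] * (d + 1)   # w, b free
              + [(0, None)] * n          # xi >= 0
              + [(0, None)] * d)         # s >= 0
    z0 = np.full(d + 1 + n + d, 0.1)     # strictly positive start for s
    res = minimize(objective, z0, method="SLSQP",
                   bounds=bounds, constraints=cons,
                   options={"maxiter": 500})
    w, b, _, _ = split(res.x)
    return w, b

# Toy example: only the first two of ten features are informative.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 10))
y = np.sign(X[:, 0] + X[:, 1] + 0.1 * rng.normal(size=60))
w, b = losc_svm_sketch(X, y, p=0.5, C=10.0)
print(np.round(w, 3))  # weights on uninformative features should shrink toward 0
```

At any optimum the minimization drives s_j down to |w_j|^p, so the linear objective recovers the original Lp penalty Σ_j |w_j|^p while every constraint stays smooth, which is what makes off-the-shelf smooth constrained-optimization methods applicable.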