Lasso is a feature selection method based on the L1 norm. Compared with existing feature selection methods, Lasso not only accurately selects the variables strongly correlated with the class label, but also performs feature selection stably, and it has therefore attracted considerable research attention. However, like other feature selection methods, Lasso incurs excessive computational cost or overfitting when selecting features on high-dimensional massive datasets or high-dimensional small-sample datasets. To address this problem, this paper proposes an improved Lasso method, called K-part Lasso. The K-part Lasso method evenly divides the feature set into K parts, performs feature selection on each feature subset, merges the features selected from all parts into one feature set, and then performs feature selection once more on the merged set. Experimental results show that the K-part Lasso method handles high-dimensional massive and high-dimensional small-sample datasets well, and is an effective feature selection method.
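The two-stage procedure described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes scikit-learn's `Lasso` estimator, and the function name `k_part_lasso`, the regularization strength `alpha`, and the synthetic data are all choices made here for demonstration.

```python
import numpy as np
from sklearn.linear_model import Lasso

def k_part_lasso(X, y, k=5, alpha=0.05):
    """Sketch of K-part Lasso feature selection.

    Stage 1: split the feature indices into k roughly equal parts and
    run Lasso on each part, keeping features with nonzero coefficients.
    Stage 2: merge the kept features and run Lasso once more on them.
    Returns the indices of the finally selected features.
    """
    n_features = X.shape[1]
    parts = np.array_split(np.arange(n_features), k)

    # Stage 1: per-part Lasso, collect surviving feature indices.
    merged = []
    for idx in parts:
        lasso = Lasso(alpha=alpha, max_iter=10000)
        lasso.fit(X[:, idx], y)
        merged.extend(idx[np.abs(lasso.coef_) > 1e-8])
    merged = np.array(sorted(merged))
    if merged.size == 0:
        return merged

    # Stage 2: one more Lasso pass over the merged subset.
    final = Lasso(alpha=alpha, max_iter=10000)
    final.fit(X[:, merged], y)
    return merged[np.abs(final.coef_) > 1e-8]

# Synthetic demo: 200 samples, 50 features, only the first 3 informative.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 50))
y = 3 * X[:, 0] - 2 * X[:, 1] + X[:, 2] + 0.1 * rng.normal(size=200)
selected = k_part_lasso(X, y, k=5, alpha=0.05)
print(selected)
```

Because each stage-1 fit sees only `n/k` features, the per-fit cost drops and weakly relevant noise features are filtered early; the stage-2 fit then resolves redundancy among the survivors.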