The dual problem of the standard minimum enclosing ball (MEB) model can be viewed as an MEB problem and therefore trained quickly on large datasets with the core vector machine (CVM). For the generalized MEB model, however, the inequality constraints of the dual problem change, so it can no longer be viewed as an MEB problem and CVM cannot be conveniently applied for fast training on large datasets. To address this, a fast learning method for the generalized MEB (Fast learning of generalized MEB, FL-GMEB) is proposed. First, the inequality constraints of the dual problem are relaxed so that it becomes equivalent to a center-constrained MEB problem, whose core set (Core set, CS) is then obtained with CVM. Next, the CS is expanded into an extended core set (Extended core set, ECS) using the inverse idea of locally linear embedding (Locally linear embedding, LLE). Finally, the ECS and its corresponding optimized weights are taken as the approximate solution of the generalized MEB model. Experimental results on the UCI and USPS datasets show that FL-GMEB offers favorable performance for fast training on large datasets.
Recent research has indicated that the standard minimum enclosing ball (MEB) can be trained effectively on large datasets by employing the core vector machine (CVM). However, the generalized MEB cannot be treated as an MEB problem because the inequality constraints of its dual problem differ, so CVM cannot be directly used to train the generalized MEB on large datasets. In this paper, a fast learning approach called fast learning of generalized MEB (FL-GMEB) is presented for large datasets. First, FL-GMEB slightly relaxes the constraints in the generalized MEB so that it becomes equivalent to a center-constrained MEB problem, which can be solved by CVM to obtain the corresponding core set (CS). Then, FL-GMEB expands the CS into an extended core set (ECS) by adding neighbors of the core samples according to the inverse idea of locally linear embedding (LLE). Finally, FL-GMEB takes the ECS and its optimized weights as the approximate solution of the generalized MEB. Experimental results on the UCI and USPS datasets demonstrate that FL-GMEB achieves favorable performance for fast training on large datasets.
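As a rough illustration of the pipeline described above, the following Python sketch combines a simplified Bădoiu–Clarkson-style core-set iteration (standing in for the full CVM solver) with LLE-style local reconstruction weights to grow the core set into an extended core set. All function names are hypothetical, an RBF kernel is assumed, core-set weights are kept uniform for brevity, and the final weight assignment over the ECS is only a heuristic placeholder for the weight optimization performed in the paper.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Gaussian RBF kernel matrix between rows of X and Y."""
    d2 = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
    return np.exp(-gamma * d2)

def meb_core_set(X, eps=0.1, gamma=1.0, max_iter=100):
    """Greedy (1+eps)-approximate MEB in kernel feature space.

    Simplified core-set iteration in the spirit of CVM: repeatedly add
    the point farthest from the current ball center. Weights over the
    core set are kept uniform here for simplicity."""
    n = X.shape[0]
    cs = [np.random.randint(n)]
    for _ in range(max_iter):
        w = np.full(len(cs), 1.0 / len(cs))
        Kcc = rbf_kernel(X[cs], X[cs], gamma)
        Kxc = rbf_kernel(X, X[cs], gamma)
        # squared distance of every point to the center c = sum_i w_i phi(x_i); k(x,x)=1 for RBF
        d2 = 1.0 - 2.0 * Kxc @ w + w @ Kcc @ w
        r2 = np.max(d2[cs])                      # current squared radius estimate
        far = int(np.argmax(d2))
        if d2[far] <= (1.0 + eps) ** 2 * r2:     # every point lies inside the (1+eps)-ball
            break
        cs.append(far)
    return cs, np.full(len(cs), 1.0 / len(cs))

def extend_core_set(X, cs, k=5):
    """Expand the core set (CS) into an extended core set (ECS) by pulling in
    the k nearest neighbors of each core point and computing LLE-style
    reconstruction weights via a regularized local least-squares problem."""
    ecs, weights = list(cs), list(np.full(len(cs), 1.0 / len(cs)))
    for i in cs:
        d = np.linalg.norm(X - X[i], axis=1)
        nbrs = np.argsort(d)[1:k + 1]            # k nearest neighbors of core point i
        Z = X[nbrs] - X[i]
        G = Z @ Z.T + 1e-3 * np.eye(k)           # regularized local Gram matrix
        w = np.linalg.solve(G, np.ones(k))
        w /= w.sum()                             # LLE reconstruction weights
        for j, wj in zip(nbrs, w):
            if j not in ecs:
                ecs.append(int(j))
                # heuristic: spread the core point's weight over its neighbors;
                # the paper instead optimizes the weights over the whole ECS
                weights.append(float(wj) / len(cs))
    return ecs, np.asarray(weights)

if __name__ == "__main__":
    X = np.random.randn(2000, 10)
    cs, _ = meb_core_set(X, eps=0.1)
    ecs, w = extend_core_set(X, cs, k=5)
    print(f"core set: {len(cs)} points, extended core set: {len(ecs)} points")
```

The sketch only conveys the two-stage structure (core set via a greedy MEB approximation, then neighborhood expansion with LLE reconstruction weights); it does not reproduce the constraint relaxation of the generalized MEB dual or the final weight optimization described in the paper.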