Multiple kernel learning (MKL) combines multiple sub-kernels in a convex optimization framework and seeks the best linear combination of them. MKL generally achieves better classification performance than single-kernel learning, but its heavy computational burden makes it impractical for large problems. Inspired by the extreme learning machine (ELM), a fast MKL method based on random kernels is proposed. When the theoretical framework of ELM is satisfied, the kernel parameters can be assigned randomly during kernel construction, which produces a random kernel. This greatly reduces the sub-kernel scale, accelerating MKL training and saving memory, so that MKL can handle larger-scale problems. Furthermore, analyzing the generalization error of MKL via the empirical Rademacher complexity shows that the reduced kernel scale tightens the error bound, giving a theoretical guarantee that the proposed method attains higher classification accuracy than traditional MKL methods. Experiments indicate that, compared with several classical fast MKL algorithms, the proposed method trains faster, uses less memory, and achieves better classification accuracy.
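As a rough illustration of the random-kernel idea (not the paper's exact construction), the sketch below builds a convex combination of RBF sub-kernels whose bandwidth parameters are drawn at random. The function names (`rbf_gram`, `make_random_multi_kernel`), the bandwidth range, and the random mixing weights are all illustrative assumptions; in the actual method the combination weights would be learned by the MKL optimization rather than fixed at random.

```python
import numpy as np

def rbf_gram(X, Y, gamma):
    """Gram matrix of an RBF kernel k(x, y) = exp(-gamma * ||x - y||^2)."""
    sq = (
        np.sum(X ** 2, axis=1)[:, None]
        + np.sum(Y ** 2, axis=1)[None, :]
        - 2.0 * X @ Y.T
    )
    return np.exp(-gamma * sq)

def make_random_multi_kernel(n_kernels=10, seed=0):
    """Draw the sub-kernel parameters once, at random, and return a
    kernel function k(X, Y) for the combined Gram matrix. Fixing the
    draws up front ensures train and test matrices use the same
    random sub-kernels."""
    rng = np.random.default_rng(seed)
    gammas = 10.0 ** rng.uniform(-3, 1, size=n_kernels)  # random bandwidths (illustrative range)
    weights = rng.random(n_kernels)
    weights /= weights.sum()  # convex combination: non-negative weights summing to 1

    def kernel(X, Y):
        K = np.zeros((X.shape[0], Y.shape[0]))
        for w, g in zip(weights, gammas):
            K += w * rbf_gram(X, Y, g)
        return K

    return kernel
```

The combined Gram matrix can then be fed to any kernel classifier that accepts a precomputed kernel, e.g. `sklearn.svm.SVC(kernel='precomputed')` trained on `k(X_train, X_train)` and evaluated on `k(X_test, X_train)`.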