To address the high computational cost and slow feature extraction of kernel Fisher discriminant analysis (KFDA) when the number of training samples is large, this paper proposes a fast KFDA algorithm. Based on linear correlation theory, the algorithm first designs an optimization method that quickly finds a basis of the subspace spanned by the training samples in the feature space; the optimal projection direction is then expressed as a linear combination of this basis and, combined with the Fisher criterion in the feature space, a new formula for computing the optimal projection direction is derived. Solving it requires only the eigenvalue decomposition of a matrix whose order equals the number of basis vectors, and extracting the features of a sample requires computing only the kernel functions between that sample and the basis. Experiments on several datasets verify the effectiveness of the proposed algorithm.
The standard kernel Fisher discriminant analysis (KFDA) suffers from high computational complexity and slow feature extraction when the number of training samples is large. To tackle these problems, a fast KFDA algorithm is presented. Based on the theory of linear correlation, an optimization method is first designed that quickly finds a basis of the subspace spanned by the training samples mapped into the feature space while avoiding matrix inversion. The optimal projection vectors are then expressed as linear combinations of this basis and, combined with the Fisher criterion in the feature space, a new formula for computing the optimal projection vectors is derived; solving it requires only the eigenvalue decomposition of a matrix whose order equals the number of basis vectors. In addition, extracting the features of a sample requires computing only the kernel functions between the basis and that sample. Experimental results on several datasets demonstrate the validity of the presented algorithm.
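To make the three steps of the abstract concrete, the following is a minimal sketch, assuming an RBF kernel and a two-class problem. The greedy, residual-based basis selection shown here solves a small linear system at each step and is only a generic stand-in for the paper's inversion-free basis-finding optimization; all function names and parameters (select_basis, fast_kfda_fit, gamma, tol, reg) are illustrative, not the authors' own.

```python
import numpy as np
from scipy.linalg import eigh, solve


def rbf_kernel(A, B, gamma=0.5):
    """k(a, b) = exp(-gamma * ||a - b||^2) for all pairs of rows in A and B."""
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * d2)


def select_basis(X, gamma=0.5, tol=1e-6):
    """Greedily pick training samples whose feature-space images span the subspace:
    a sample joins the basis if the residual of its projection onto the span of the
    current basis exceeds tol (i.e. it is not linearly dependent on the basis).
    (Stand-in step: the paper's own scheme avoids the linear solve used here.)"""
    idx = [0]
    for i in range(1, len(X)):
        B = X[idx]
        Kbb = rbf_kernel(B, B, gamma)
        kbx = rbf_kernel(B, X[i:i + 1], gamma)[:, 0]
        resid = rbf_kernel(X[i:i + 1], X[i:i + 1], gamma)[0, 0] - kbx @ solve(Kbb, kbx)
        if resid > tol:
            idx.append(i)
    return np.array(idx)


def fast_kfda_fit(X, y, gamma=0.5, reg=1e-8):
    """Express the projection direction as a combination of the basis images and
    solve the Fisher criterion as an r x r generalized eigenproblem, where r is
    the number of basis vectors."""
    idx = select_basis(X, gamma)
    B = X[idx]                                   # r basis samples
    K = rbf_kernel(B, X, gamma)                  # r x n kernel matrix
    m = [K[:, y == c].mean(axis=1) for c in (0, 1)]
    Sb = np.outer(m[0] - m[1], m[0] - m[1])      # between-class scatter (r x r)
    Sw = np.zeros_like(Sb)                       # within-class scatter (r x r)
    for c in (0, 1):
        Kc = K[:, y == c]
        Sw += (Kc - m[c][:, None]) @ (Kc - m[c][:, None]).T
    Sw += reg * np.eye(len(B))                   # small ridge for numerical stability
    vals, vecs = eigh(Sb, Sw)                    # eigendecomposition of an r x r problem
    alpha = vecs[:, -1]                          # direction maximizing the Fisher ratio
    return B, alpha


def fast_kfda_transform(B, alpha, X_new, gamma=0.5):
    """Feature extraction: only kernels between each sample and the basis are needed."""
    return rbf_kernel(X_new, B, gamma) @ alpha
```

Under these assumptions, the cost of training is dominated by the r x n kernel matrix and an r x r eigendecomposition rather than the n x n problem of standard KFDA, and projecting a new sample touches only the r basis samples, which mirrors the speedups claimed in the abstract.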