To effectively improve the discriminative efficiency and address the out-of-sample problem in nonlinear feature extraction, to preserve the observed information as fully as possible, and to further enhance the dimensionality-reduction performance of related methods, kernel learning is applied to discriminative stochastic neighbor embedding (DSNE), yielding a kernel-based discriminative stochastic neighbor embedding (KDSNE) method. By introducing a kernel function, KDSNE maps the samples from the original space into a high-dimensional kernel space and constructs joint probability expressions that model the pairwise similarities between same-class and different-class data. On this basis, a linear projection matrix generates the corresponding low-dimensional subspace representations, and the Kullback-Leibler (KL) divergence is chosen to quantify the proximity of the two probability distributions, building the objective functional under the criterion of minimizing the within-class KL divergence while maximizing the between-class KL divergence. KDSNE accentuates the feature differences between samples of different classes and makes the samples linearly separable, thereby improving classification performance. Experiments on the COIL-20 image database and the classic ORL and Yale face databases verify the discriminative ability of the proposed method.
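The abstract does not give the concrete probability expressions or the optimization procedure, so the following is only a minimal, illustrative Python sketch of a KDSNE-style objective. It assumes an RBF kernel, symmetric SNE-style joint probabilities, a low-dimensional map of the form Y = KA (a linear projection of the kernel matrix), and a mask-based split of the KL terms into within-class and between-class parts; all function names and parameters (`gamma`, `sigma`, `eps`) are assumptions for demonstration, not the authors' implementation.

```python
# Illustrative KDSNE-style objective: a sketch, not the paper's code.
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # K_ij = exp(-gamma * ||x_i - x_j||^2): implicit high-dimensional mapping.
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def joint_probabilities(D2, sigma=1.0):
    # Symmetric SNE-style joint probabilities from squared distances.
    P = np.exp(-D2 / (2.0 * sigma**2))
    np.fill_diagonal(P, 0.0)
    return P / P.sum()

def kdsne_loss(A, K, labels, eps=1e-12):
    # Feature-space squared distances via the kernel trick:
    # ||phi(x_i) - phi(x_j)||^2 = K_ii + K_jj - 2 K_ij.
    diag = np.diag(K)
    D2_high = diag[:, None] + diag[None, :] - 2.0 * K
    P = joint_probabilities(D2_high)

    # Low-dimensional representations through a linear projection
    # of the kernel matrix: y_i = A^T k_i, i.e. Y = K A.
    Y = K @ A
    sq = np.sum(Y**2, axis=1)
    D2_low = sq[:, None] + sq[None, :] - 2.0 * Y @ Y.T
    Q = joint_probabilities(D2_low)

    # Masks for within-class and between-class pairs.
    same = labels[:, None] == labels[None, :]
    np.fill_diagonal(same, False)
    diff = labels[:, None] != labels[None, :]

    # Criterion from the abstract: minimize within-class KL divergence
    # and maximize between-class KL divergence, here combined as
    # loss = KL_within(P||Q) - KL_between(P||Q).
    kl = P * np.log((P + eps) / (Q + eps))
    return kl[same].sum() - kl[diff].sum()

# Usage example on random data (30 samples, 3 classes, 2-D projection).
rng = np.random.default_rng(0)
X = rng.normal(size=(30, 5))
labels = np.repeat(np.arange(3), 10)
K = rbf_kernel(X)
A = rng.normal(size=(30, 2))
print(kdsne_loss(A, K, labels))
```

In a full implementation, `A` would be optimized (e.g., by gradient descent) rather than drawn at random; the sketch only shows how the kernel trick, the linear projection, and the two KL terms described in the abstract fit together.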