Locality preserving projections (LPP) considers only the locality of the projected data and ignores nonlocality. To address this problem, a nonlocal scatter matrix is introduced and an unsupervised difference discriminant feature extraction algorithm is proposed. The algorithm finds the optimal transformation matrix by maximizing the difference between the nonlocal and local scatter, and is successfully applied to face recognition. By incorporating both nonlocal and local information, the algorithm reveals the nonlinear structure hidden in the high-dimensional image space; by solving for the optimal transformation matrix in difference form, it avoids the small-sample-size problem; and by modifying the adjacency matrix of LPP, it describes the neighborhood relationship between samples more accurately. Experimental results on the Yale and AR face databases demonstrate the effectiveness of the proposed algorithm.
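The difference criterion described above can be illustrated with a minimal NumPy sketch. The concrete choices below (a k-nearest-neighbour adjacency with heat-kernel weights, the particular normalization of the scatter matrices, and the parameters k, t, d) are assumptions made for illustration and may differ from the paper's own definitions; the sketch only shows why maximizing w^T(S_N - S_L)w requires no matrix inversion and therefore sidesteps the small-sample-size problem.

```python
import numpy as np

def difference_discriminant_projection(X, k=5, t=1.0, d=10):
    """Sketch of a difference-based discriminant projection (assumed details).

    X : (n_samples, n_features) data matrix
    k : neighbourhood size for the adjacency matrix (assumed parameter)
    t : heat-kernel width for the edge weights (assumed parameter)
    d : number of projection directions to keep
    """
    n, f = X.shape

    # Pairwise squared Euclidean distances.
    sq = np.sum(X**2, axis=1)
    D2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    np.fill_diagonal(D2, np.inf)

    # Adjacency matrix H: heat-kernel weights on k-NN edges (assumed form).
    H = np.zeros((n, n))
    nn = np.argsort(D2, axis=1)[:, :k]
    for i in range(n):
        for j in nn[i]:
            w = np.exp(-D2[i, j] / t)
            H[i, j] = H[j, i] = w

    # Local scatter S_L over neighbouring pairs, nonlocal scatter S_N over the rest.
    S_L = np.zeros((f, f))
    S_N = np.zeros((f, f))
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            diff = (X[i] - X[j])[:, None]
            if H[i, j] > 0:
                S_L += H[i, j] * (diff @ diff.T)
            else:
                S_N += diff @ diff.T
    S_L /= 2.0 * n * n
    S_N /= 2.0 * n * n

    # Difference criterion: maximize w^T (S_N - S_L) w.  Only an
    # eigendecomposition is needed, no inverse of a possibly singular
    # matrix, so the small-sample-size problem does not arise.
    eigvals, eigvecs = np.linalg.eigh(S_N - S_L)
    W = eigvecs[:, np.argsort(eigvals)[::-1][:d]]  # top-d eigenvectors
    return W  # projected features: Y = X @ W
```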