A Face Recognition Method Based on Local Adjustment of Within-Class Training Samples
  • ISSN: 0469-5097
  • Journal: 《南京大学学报:自然科学版》 (Journal of Nanjing University: Natural Science Edition)
  • Date:
  • Classification: TP181 [Automation and Computer Technology - Control Science and Engineering; Automation and Computer Technology - Control Theory and Control Engineering]
  • Author affiliations: [1] School of Internet of Things Engineering, Jiangnan University (江南大学), Wuxi 214122; [2] School of Mathematics and Computational Science, Anqing Normal University (安庆师范学院), Anqing 246133
  • Funding: National Natural Science Foundation of China (61373055, 61103128); 111 Project (B12018)
Chinese Abstract:

To address the problem that the PCA and 2DPCA face recognition methods consider only the total scatter and ignore the within-class scatter during feature extraction, a face recognition method based on local adjustment of within-class training samples is proposed. First, for each class of training samples, within-class virtual samples are generated by linear interpolation and used as the new training samples; second, features are extracted from the new training samples and the test samples with PCA or 2DPCA; finally, a nearest neighbor classifier performs the recognition. Experiments on the ORL, YALE and XM2VTS face databases demonstrate the effectiveness of the proposed algorithm.
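The three steps above are described only at a high level. The following is a minimal NumPy sketch of such a pipeline, assuming flattened grayscale face images; the function names, the pairwise interpolation scheme, and the coefficient alpha = 0.5 are illustrative assumptions rather than details taken from the paper.

```python
import numpy as np

def generate_virtual_samples(X, y, alpha=0.5):
    """Generate within-class virtual samples by linear interpolation between
    pairs of training samples that share a class label.
    X: (n_samples, n_features) flattened face images, y: integer labels.
    The pairing scheme and coefficient alpha are illustrative choices,
    not values specified in the abstract."""
    X_v, y_v = [], []
    for c in np.unique(y):
        Xc = X[y == c]
        for i in range(len(Xc)):
            for j in range(i + 1, len(Xc)):
                X_v.append(alpha * Xc[i] + (1.0 - alpha) * Xc[j])
                y_v.append(c)
    return np.asarray(X_v), np.asarray(y_v)

def pca_fit(X, n_components):
    """Plain PCA: eigenvectors of the total scatter (covariance) matrix."""
    mean = X.mean(axis=0)
    Xc = X - mean
    cov = Xc.T @ Xc / len(X)
    _, eigvecs = np.linalg.eigh(cov)          # eigenvalues in ascending order
    W = eigvecs[:, ::-1][:, :n_components]    # keep the leading components
    return mean, W

def nearest_neighbor_predict(X_train, y_train, X_test):
    """1-nearest-neighbor classification with Euclidean distance."""
    d = np.linalg.norm(X_test[:, None, :] - X_train[None, :, :], axis=2)
    return y_train[d.argmin(axis=1)]

# Usage sketch: the virtual samples serve as the new training set, features are
# extracted with PCA, and a 1-NN classifier labels the test images.
# X_train, y_train, X_test are assumed to be provided by the caller.
# X_new, y_new = generate_virtual_samples(X_train, y_train)
# mean, W = pca_fit(X_new, n_components=40)
# pred = nearest_neighbor_predict((X_new - mean) @ W, y_new, (X_test - mean) @ W)
```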

English Abstract:

Dimensionality reduction is a popular technique in the areas of computer vision and pattern recognition. Among the numerous methods, principal component analysis (PCA) and two-dimensional principal component analysis (2DPCA) have been widely investigated and have become the most successful approaches for feature extraction and data representation. PCA aims to extract a subspace in which the variance is maximized (or the reconstruction error is minimized). However, prior to performing PCA, the 2D samples must be transformed into 1D vectors. This procedure often results in a high-dimensional image vector space, which leads to the "curse of dimensionality". It is therefore difficult to evaluate the covariance matrix accurately because of the high dimensionality of the data and the limited number of training samples. In order to extract features from facial images efficiently, a straightforward technique called 2DPCA was proposed for feature extraction. In contrast to conventional PCA, 2DPCA is based on 2D matrices rather than 1D vectors. Consequently, the covariance matrix can be evaluated more accurately and less time is required for feature extraction. Nevertheless, PCA and 2DPCA consider only the total scatter and tend to ignore the within-class scatter in the process of feature extraction. In this paper, to remedy this problem to some extent, a new method for face recognition based on local adjustment of within-class training samples is proposed. The proposed method generates within-class virtual samples and finds low-dimensional, compact representations of the high-dimensional data; hence it can exploit the within-class scatter and the total scatter simultaneously. Our method mainly consists of the following steps. Firstly, a linear interpolation method is used to generate within-class virtual samples, and the original training samples are replaced by these virtual samples for the subsequent feature extraction. Secondly, features are extracted from the generated virtual samples and the test samples using PCA or 2DPCA. Finally, the nearest neighbor classifier is used for recognition. Experimental results on the ORL, YALE and XM2VTS face databases demonstrate the effectiveness of the proposed method.
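The abstract's description of 2DPCA (building the covariance matrix directly from the 2D image matrices instead of from flattened vectors) can be illustrated with the short sketch below, assuming face images stored in an array of shape (n_samples, height, width); the function names and the choice of n_components are assumptions made for illustration only.

```python
import numpy as np

def two_dpca_fit(images, n_components):
    """2DPCA: build the image covariance (scatter) matrix directly from the
    2D image matrices, so no flattening into 1D vectors is needed.
    images: array of shape (n_samples, height, width)."""
    mean_img = images.mean(axis=0)
    width = images.shape[2]
    G = np.zeros((width, width))
    for A in images:
        D = A - mean_img
        G += D.T @ D                           # accumulate image scatter
    G /= len(images)
    _, eigvecs = np.linalg.eigh(G)             # eigenvalues in ascending order
    return eigvecs[:, ::-1][:, :n_components]  # (width, n_components) projection axes

def two_dpca_features(images, axes):
    """Feature matrix of each image: Y = A @ axes, of shape (height, n_components)."""
    return np.stack([A @ axes for A in images])
```

Because the scatter matrix G in this sketch is only width by width, it can be estimated far more reliably from a small training set than the full covariance matrix of the flattened images, which is the advantage the abstract attributes to 2DPCA over conventional PCA.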

Journal Information
  • 《南京大学学报:自然科学版》 (Journal of Nanjing University: Natural Science Edition)
  • Chinese science and technology core journal
  • Supervising authority: Ministry of Education of the People's Republic of China
  • Sponsor: Nanjing University
  • Editor-in-chief: 龚昌德 (Gong Changde)
  • Address: Editorial Office of the Journal of Nanjing University (Natural Science Edition), 22 Hankou Road, Nanjing
  • Postal code: 210093
  • Email: xbnse@netra.nju.edu.cn
  • Telephone: 025-83592704
  • International standard serial number: ISSN 0469-5097
  • Domestic unified serial number: CN 32-1169/N
  • Postal distribution code: 28-25
  • Awards: Chinese core journal of natural science; "double-effect" journal of the China Journal Phalanx
  • Indexed in domestic and international databases: Chemical Abstracts (online), Mathematical Reviews (online), Zentralblatt MATH, China science and technology core journals, Peking University core journal list (2000, 2004, 2008, 2011, and 2014 editions)
  • Citations: 9316