To improve the invariance of the scale-invariant feature transform (SIFT) algorithm and to reduce the mismatch rate when an image contains multiple similar regions, a local invariant feature description algorithm is proposed that fuses a center-symmetric improved local ternary patterns (CS-ILTP) descriptor, derived from local binary patterns, with a global distribution of intensity (GDI) descriptor. Through an iterative transformation, the initial feature points obtained by SIFT converge to affine-invariant points, from which affine-invariant regions are derived; the CS-ILTP and GDI descriptors are then extracted separately to form the local invariant feature description of the image. Experimental results show that the proposed algorithm is highly robust and distinctive: the number of correctly matched features exceeds that of SIFT by more than 100% in similar-region matching and by more than 86% in artificial-landmark matching.
In order to improve the invariance of the scale-invariant feature transform (SIFT) algorithm and to reduce mismatches when an image contains multiple similar regions, a novel algorithm for local invariant feature description is proposed. An improved texture descriptor, center-symmetric improved local ternary patterns (CS-ILTP), a modified version of local binary patterns, is introduced and fused with the global distribution of intensity (GDI) descriptor to serve as the local feature descriptor within the SIFT framework. Through an iterative transformation, the initial feature points derived from SIFT converge to affine-invariant points, and affine-invariant regions are extracted. The local feature descriptor of the image is then obtained by extracting the CS-ILTP and GDI descriptors, respectively. Matching experiments show that the proposed algorithm has high robustness and distinctiveness. Compared with SIFT, the number of correctly matched feature points increases by more than 100% in similar-region matching and by more than 86% in artificial-landmark matching.
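For readers unfamiliar with the underlying texture operator, the sketch below illustrates the general idea of a center-symmetric local ternary pattern, the family of descriptors on which CS-ILTP builds. This is a minimal illustration rather than the paper's actual CS-ILTP: the 3x3 sampling layout, the fixed threshold, the split into upper/lower binary codes, and the helper names cs_ltp_code and cs_ltp_histogram are all assumptions made for the example.

    import numpy as np

    def cs_ltp_code(patch, threshold=5):
        """Center-symmetric local ternary pattern for the 8-neighborhood of the
        central pixel of a 3x3 patch (hypothetical simplification of CS-ILTP;
        the paper's exact sampling and thresholding scheme may differ).
        Returns the (upper, lower) binary codes of the ternary pattern."""
        # 8 neighbors in clockwise order; opposite pairs are (0,4), (1,5), (2,6), (3,7)
        n = [patch[0, 0], patch[0, 1], patch[0, 2], patch[1, 2],
             patch[2, 2], patch[2, 1], patch[2, 0], patch[1, 0]]
        upper, lower = 0, 0
        for i in range(4):
            diff = int(n[i]) - int(n[i + 4])      # center-symmetric pair difference
            if diff > threshold:                  # ternary value +1
                upper |= 1 << i
            elif diff < -threshold:               # ternary value -1
                lower |= 1 << i
            # |diff| <= threshold -> ternary value 0, contributes to neither code
        return upper, lower

    def cs_ltp_histogram(image, threshold=5):
        """Normalized histogram of CS-LTP codes over a grayscale region,
        usable as a texture descriptor (16 upper-code bins + 16 lower-code bins)."""
        hist = np.zeros(32)
        h, w = image.shape
        for y in range(1, h - 1):
            for x in range(1, w - 1):
                up, lo = cs_ltp_code(image[y - 1:y + 2, x - 1:x + 2], threshold)
                hist[up] += 1
                hist[16 + lo] += 1
        return hist / max(hist.sum(), 1)          # normalize so regions of different size are comparable

Comparing center-symmetric pixel pairs instead of each neighbor against the center halves the code length and makes the pattern less sensitive to a uniform brightness offset, which is the usual motivation for the center-symmetric variants of LBP/LTP.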