The Scale-Invariant Feature Transform (SIFT) descriptor uses only the local neighborhood intensity information of a feature point, so feature points lying in regions of an image with similar intensity distributions are prone to false matches. To address this problem, a method for removing SIFT mismatches based on Canonical Correlation Analysis (CCA) is presented. First, the SIFT algorithm is applied to obtain the initial matching pairs. Then, a line is fitted according to the linear relation between the canonical correlation components, and most of the mismatched pairs are removed by thresholding the distance from each point to this line. Finally, each remaining pair is analyzed one by one for its influence on the collinearity of the canonical correlation components, and pairs with a large influence are discarded. Experimental results show that the proposed method removes mismatches while retaining more correct matches, improving the accuracy of image registration.