The core of image feature matching is similarity search in a high-dimensional vector space through a distance function; this paper therefore focuses on extracting good feature points and on finding the nearest neighbors of a query point quickly and accurately. First, numerous, distinctive and robust speeded-up robust feature (SURF) points are extracted from the image, and the convex hull of these points is partitioned into a Delaunay triangulation. Next, the Delaunay edges are sampled, clustered and quantized, and an index is built over them. Finally, a voting algorithm maps whether each point pair matches into a matrix, which addresses two weaknesses of plain distance-based retrieval: it ignores the structural information contained in the data set itself, and its search efficiency is relatively low. Combining the SURF algorithm with Delaunay triangular meshes thus yields a new feature-matching method. Experiments on standard image sets verify that, at essentially the same time cost, the method extracts more feature points and achieves a higher correct-match rate.
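The pipeline outlined above (SURF detection, Delaunay triangulation of the keypoints' convex hull, quantization of the triangulation edges into an inverted index, and a point-pair vote matrix) could look roughly like the following Python sketch. It is an illustration only, not the paper's exact formulation: the edge descriptor (mean of the two endpoint SURF descriptors), the k-means codebook size, the vote threshold, and the reliance on OpenCV's contrib SURF and SciPy's Delaunay are all assumptions introduced here.

```python
# Minimal sketch of a SURF + Delaunay voting matcher (assumptions noted inline).
# Requires an opencv-contrib build with the nonfree SURF module enabled.
import cv2
import numpy as np
from scipy.spatial import Delaunay
from scipy.cluster.vq import kmeans2, vq


def surf_delaunay(img, hessian=400):
    """SURF keypoints plus the edges of the Delaunay triangulation of their convex hull."""
    surf = cv2.xfeatures2d.SURF_create(hessianThreshold=hessian)
    kps, desc = surf.detectAndCompute(img, None)
    pts = np.array([kp.pt for kp in kps], dtype=np.float64)
    tri = Delaunay(pts)  # triangulates the convex hull of the keypoints
    edges = {tuple(sorted((s[i], s[j])))
             for s in tri.simplices for i, j in ((0, 1), (1, 2), (0, 2))}
    return desc, sorted(edges)


def edge_features(desc, edges):
    """Describe each edge by the mean of its endpoint descriptors (an assumption)."""
    return np.array([(desc[i] + desc[j]) / 2.0 for i, j in edges])


def match(img1, img2, k=64, min_votes=3):
    d1, e1 = surf_delaunay(img1)
    d2, e2 = surf_delaunay(img2)
    f1, f2 = edge_features(d1, e1), edge_features(d2, e2)

    # Quantize edge features into k visual words (codebook built from image 1).
    codebook, _ = kmeans2(f1, k, minit='points')
    w1, _ = vq(f1, codebook)
    w2, _ = vq(f2, codebook)

    # Inverted index: visual word -> edges of image 2 carrying that word.
    index = {}
    for e, w in zip(e2, w2):
        index.setdefault(int(w), []).append(e)

    # Voting: edges that fall into the same word vote for their endpoint pairings
    # (both orderings, since edge orientation correspondence is unknown).
    votes = np.zeros((len(d1), len(d2)), dtype=np.int32)
    for (a, b), w in zip(e1, w1):
        for (c, d) in index.get(int(w), ()):
            votes[a, c] += 1
            votes[b, d] += 1
            votes[a, d] += 1
            votes[b, c] += 1

    # Keep point pairs whose vote count is row-maximal and clears the threshold.
    best = votes.argmax(axis=1)
    return [(i, int(j)) for i, j in enumerate(best) if votes[i, j] >= min_votes]


if __name__ == "__main__":
    # Hypothetical input images used only to show the call pattern.
    img_a = cv2.imread("scene_a.png", cv2.IMREAD_GRAYSCALE)
    img_b = cv2.imread("scene_b.png", cv2.IMREAD_GRAYSCALE)
    print(match(img_a, img_b)[:10])
```

The design intent mirrored here is that edge-level (geometric) agreement reinforces correct point correspondences: each pair of similarly quantized Delaunay edges casts votes for its endpoint pairings, so the vote matrix encodes which point pairs match, rather than relying on descriptor distance alone.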