In 3D reconstruction from uncalibrated views, the initial matching of the classical SIFT descriptor produces a high proportion of outliers, and the traditional RANSAC algorithm removes these outliers poorly. To address this problem, this paper proposes an outlier-rejection algorithm based on an improved RANSAC and slope statistics. First, the rejection behaviour of RANSAC is made more controllable by improving the traditional algorithm from the standpoint of strengthening the epipolar constraint. Second, to reduce the computational cost of RANSAC and speed up its convergence, an iterative slope-consistency algorithm is presented that removes part of the outliers before RANSAC is applied. Because this algorithm fails on image pairs with large rotations or inverted images, a slope-statistics algorithm based on coordinate transformation is further proposed, which quickly removes part of the outliers for such pairs in advance. Experiments on test images from a professional benchmark website and on several sets of self-captured image pairs of different scenes, with rich repetitive texture and heavy pedestrian occlusion, yield an average symmetric epipolar distance below 0.7 pixels; comparative results confirm that the proposed algorithm is clearly more robust and has better real-time performance than comparable methods. Finally, a depth-consistency method is introduced to automatically remove matched points that lie outside the intended reconstruction region; after the 3D coordinates of the feature points are obtained, texture mapping with OpenGL reproduces the details of the scene clearly and accurately.
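The abstract outlines a matching pipeline of SIFT matching, slope-based pre-filtering, and RANSAC under an epipolar constraint. The sketch below only illustrates that general style of pipeline, not the paper's algorithm: the slope gate is a simplified single-pass median filter rather than the iterative slope-consistency procedure described above, and the OpenCV calls, thresholds, and helper names are assumptions.

```python
import cv2
import numpy as np

def slope_prefilter(pts1, pts2, img_width, tol=0.05):
    # Simplified illustration of a slope gate: place the two images side by
    # side, compute the slope of the line joining each matched pair, and keep
    # matches whose slope stays close to the median slope.
    dx = (pts2[:, 0] + img_width) - pts1[:, 0]   # always positive
    dy = pts2[:, 1] - pts1[:, 1]
    slopes = dy / np.maximum(dx, 1e-6)
    return np.abs(slopes - np.median(slopes)) < tol

def match_and_filter(img1, img2):
    # SIFT detection and description on both grayscale images.
    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(img1, None)
    kp2, des2 = sift.detectAndCompute(img2, None)

    # Nearest-neighbour matching with Lowe's ratio test.
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    good = []
    for pair in matcher.knnMatch(des1, des2, k=2):
        if len(pair) == 2 and pair[0].distance < 0.75 * pair[1].distance:
            good.append(pair[0])

    pts1 = np.float32([kp1[m.queryIdx].pt for m in good])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in good])

    # Slope pre-filter lowers the outlier ratio before RANSAC is run.
    keep = slope_prefilter(pts1, pts2, img1.shape[1])
    pts1, pts2 = pts1[keep], pts2[keep]

    # RANSAC estimation of the fundamental matrix with a pixel-level
    # epipolar-distance threshold (assumed values).
    F, mask = cv2.findFundamentalMat(
        pts1, pts2, cv2.FM_RANSAC, ransacReprojThreshold=1.0, confidence=0.999)
    inliers = mask.ravel().astype(bool)
    return pts1[inliers], pts2[inliers], F
```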
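The reported accuracy is an average symmetric epipolar distance below 0.7 pixels. A minimal sketch of how such a metric can be computed from an estimated fundamental matrix is given below; conventions differ on whether the two point-to-epipolar-line distances are summed or averaged, and the summed form here is an assumption.

```python
import numpy as np

def symmetric_epipolar_distance(F, pts1, pts2):
    # Homogeneous coordinates for both point sets (N x 3).
    ones = np.ones((len(pts1), 1))
    x1 = np.hstack([pts1, ones])
    x2 = np.hstack([pts2, ones])

    # Epipolar lines: l2 = F x1 in image 2, l1 = F^T x2 in image 1.
    l2 = x1 @ F.T
    l1 = x2 @ F

    # Point-to-line distances in pixels, summed per correspondence.
    d2 = np.abs(np.sum(l2 * x2, axis=1)) / np.linalg.norm(l2[:, :2], axis=1)
    d1 = np.abs(np.sum(l1 * x1, axis=1)) / np.linalg.norm(l1[:, :2], axis=1)
    return d1 + d2

# Usage (hypothetical): average over the retained correspondences.
# avg = symmetric_epipolar_distance(F, pts1, pts2).mean()
```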