A stereo visual odometry method based on disparity space is proposed for real, complex dynamic scenes. By exploiting the scale and rotation invariance of SIFT feature points together with several reasonable constraints, feature points are matched between the left and right images and tracked across consecutive frames. A least-squares estimator combined with RANSAC removes disturbing feature points lying on moving objects and yields a fairly accurate initial estimate of the motion parameters. A mathematical model of visual odometry estimation is then derived in disparity space, and the final motion estimate is obtained by minimizing an error function. Experimental results show that, in both indoor and outdoor complex dynamic scenes containing moving objects, the algorithm achieves higher accuracy than traditional methods.
A novel stereo visual odometry algorithm based on disparity space was proposed for real, complex dynamic environments. Successive frames of stereo images were used. Accurate feature matching between the left and right images and feature tracking across consecutive frames were achieved by using the scale and rotation invariance of scale-invariant feature transform (SIFT) feature points together with some reasonable constraints. A least-squares estimator combined with random sample consensus (RANSAC) was used to remove disturbing feature points on moving objects and to obtain an initial estimate of the motion parameters. Then a mathematical model for accurate and robust visual odometry estimation was derived in disparity space, and the final motion estimate was obtained by minimizing an error function. Experimental results show that the algorithm achieves better accuracy than traditional methods in both indoor and outdoor environments with independently moving objects.
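For orientation, the following is a minimal sketch of what a disparity-space formulation typically looks like for a rectified stereo rig with focal length f, baseline b, and principal point (u_0, v_0); it is the standard parametrization, not the paper's exact derivation, and the error function shown is only one plausible form of the minimization the abstract refers to.

```latex
% Standard disparity-space parametrization for a rectified stereo rig
% (a sketch of the kind of model the abstract refers to, not the paper's exact derivation).
\begin{align}
  \mathbf{w} = \begin{bmatrix} u \\ v \\ d \end{bmatrix}
             = \frac{1}{Z}\begin{bmatrix} fX + u_0 Z \\ fY + v_0 Z \\ fb \end{bmatrix},
  \qquad
  \begin{bmatrix} X \\ Y \\ Z \end{bmatrix}
             = \frac{b}{d}\begin{bmatrix} u - u_0 \\ v - v_0 \\ f \end{bmatrix}.
\end{align}
% A rigid motion (R, t) between two frames can then be estimated by minimizing
% a reprojection error expressed directly in disparity space:
\begin{align}
  E(R, \mathbf{t}) = \sum_{i} \left\lVert \mathbf{w}\!\left(R\,\mathbf{X}_i + \mathbf{t}\right) - \mathbf{w}_i' \right\rVert^{2}.
\end{align}
```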
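The front end described in the abstract (SIFT matching with a ratio test, stereo triangulation, and RANSAC-based rejection of features on moving objects) can be sketched with OpenCV as below. This is an illustrative sketch under assumed inputs, not the paper's implementation: the helper names (match_sift, triangulate, estimate_motion), the intrinsics K, the baseline, and the threshold values are assumptions; cv2.solvePnPRansac stands in for the combined least-squares/RANSAC step, and the paper's specific matching constraints, the circular association of stereo pairs with frame-to-frame tracks, and the disparity-space refinement are omitted.

```python
# Minimal sketch (not the paper's implementation) of a stereo visual odometry
# front end: SIFT matching, stereo triangulation, and RANSAC-based motion
# estimation that discards outlier features, e.g. points on moving objects.
import cv2
import numpy as np

def match_sift(img_a, img_b, ratio=0.8):
    """Match SIFT keypoints between two images using Lowe's ratio test."""
    sift = cv2.SIFT_create()
    kp_a, des_a = sift.detectAndCompute(img_a, None)
    kp_b, des_b = sift.detectAndCompute(img_b, None)
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    pairs = matcher.knnMatch(des_a, des_b, k=2)
    good = [m for m, n in pairs if m.distance < ratio * n.distance]
    pts_a = np.float32([kp_a[m.queryIdx].pt for m in good])
    pts_b = np.float32([kp_b[m.trainIdx].pt for m in good])
    return pts_a, pts_b

def triangulate(pts_l, pts_r, K, baseline):
    """Triangulate rectified left/right correspondences into 3-D points."""
    P_l = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
    P_r = K @ np.hstack([np.eye(3), np.array([[-baseline], [0.0], [0.0]])])
    X_h = cv2.triangulatePoints(P_l, P_r, pts_l.T, pts_r.T)  # 4xN homogeneous
    return (X_h[:3] / X_h[3]).T

def estimate_motion(X_prev, pts_cur, K):
    """RANSAC PnP between 3-D points from the previous stereo pair and their
    2-D tracks in the current left image; the inlier mask rejects features
    on independently moving objects and gives an initial (R, t)."""
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(
        X_prev, pts_cur, K, None, reprojectionError=2.0, confidence=0.99)
    R, _ = cv2.Rodrigues(rvec)
    return R, tvec, inliers
```

In a full pipeline the initial (R, t) returned here would only seed the subsequent refinement, which the paper performs by minimizing an error function in disparity space over the RANSAC inliers.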