A local feature extraction algorithm that takes the feature distribution into account is proposed: local features satisfying an isotropic distribution in unstructured environments are extracted as natural landmarks, enabling a behavior-based robot to achieve high-precision visual homing with them. Building on the SIFT (scale-invariant feature transform) algorithm, the UD-SIFT (uniform-distribution SIFT) algorithm is obtained by improving the spatial uniformity of the extracted features, and a novel criterion is proposed to evaluate that uniformity. Visual homing experiments are carried out in indoor, corridor, and outdoor environments using the ADV (average displacement vector) and ALV (average landmark vector) methods, both based on panoramic vision. Compared with the original SIFT, UD-SIFT reduces the average angular homing error by more than 25.01%. The results show that the proposed algorithm effectively improves the uniformity of the feature distribution and the homing precision of the robot.
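As background for the ALV method named above, the standard average-landmark-vector model computes, at each position, the mean of unit vectors pointing toward the detected landmarks; the homing direction is then the difference between the current ALV and the one stored at home. The sketch below assumes landmarks are given only by their bearing angles (radians) in a panoramic image; function names and the sign convention are illustrative, not taken from the paper.

```python
import math

def average_landmark_vector(bearings):
    """Mean of unit vectors pointing at each landmark bearing (radians)."""
    n = len(bearings)
    x = sum(math.cos(b) for b in bearings) / n
    y = sum(math.sin(b) for b in bearings) / n
    return (x, y)

def alv_home_vector(current_bearings, home_bearings):
    """Homing direction as the difference of the current and home ALVs.

    Sign convention is an assumption here; some formulations use the
    opposite difference depending on how bearings are measured.
    """
    cx, cy = average_landmark_vector(current_bearings)
    hx, hy = average_landmark_vector(home_bearings)
    return (cx - hx, cy - hy)
```

Note that the model needs no landmark correspondence or distance estimates, which is why uniformly distributed features (as UD-SIFT aims for) matter: a skewed feature distribution biases both averages and hence the homing direction.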