Paper Title

Robust localization for planar moving robot in changing environment: A perspective on density of correspondence and depth

Paper Authors

Yanmei Jiao, Lilu Liu, Bo Fu, Xiaqing Ding, Minhang Wang, Yue Wang, Rong Xiong

Paper Abstract

Visual localization for planar moving robots is important for various indoor service robotic applications. To handle the textureless areas and frequent human activities in indoor environments, a novel robust visual localization algorithm is proposed that leverages dense correspondences and sparse depth for planar moving robots. The key component is a minimal solution that computes the absolute camera pose from one 3D-2D correspondence and one 2D-2D correspondence. The advantages are evident in two aspects. First, robustness is enhanced because the sample set for pose estimation is maximal, utilizing all correspondences with or without depth. Second, no extra effort for dense map construction is required to exploit dense correspondences for handling textureless and repetitive-texture scenes. This matters because building a dense map is computationally expensive, especially at large scale. Moreover, a probabilistic analysis of the different solutions is presented, and an automatic solution selection mechanism is designed to maximize the success rate by selecting the appropriate solution for different environmental characteristics. Finally, a complete visual localization pipeline that considers situations from the perspective of correspondence and depth density is summarized and validated on both simulation and a public real-world indoor localization dataset. The code is released on GitHub.
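
To make the solution-selection idea concrete, the sketch below scores candidate minimal solvers by the standard RANSAC success probability, 1 - (1 - w^n)^k, and picks the solver most likely to draw an all-inlier sample given the inlier ratios of correspondences with and without depth. This is only an illustrative reading of the abstract's probabilistic analysis: the solver set, sample sizes, inlier ratios, and function names are assumptions for demonstration, not the authors' released implementation.

```python
# Minimal sketch (not the authors' code) of probabilistic solver selection:
# estimate, for each candidate minimal solver, the probability that RANSAC
# draws at least one all-inlier sample, then pick the solver maximizing it.
# Solver names and sample sizes below are illustrative assumptions.

SOLVERS = {
    # the paper's minimal solution: one 3D-2D plus one 2D-2D correspondence
    "1x3D-2D + 1x2D-2D": {"n_3d2d": 1, "n_2d2d": 1},
    # a hypothetical depth-only alternative needing two 3D-2D correspondences
    "2x3D-2D": {"n_3d2d": 2, "n_2d2d": 0},
}


def ransac_success_prob(solver, w_3d2d, w_2d2d, iterations):
    """Probability of drawing at least one all-inlier minimal sample.

    w_3d2d / w_2d2d are the inlier ratios of the correspondence pools with
    and without depth; samples are assumed independent (standard RANSAC model).
    """
    p_sample = (w_3d2d ** solver["n_3d2d"]) * (w_2d2d ** solver["n_2d2d"])
    return 1.0 - (1.0 - p_sample) ** iterations


def select_solver(w_3d2d, w_2d2d, iterations=50):
    """Return the (name, solver) pair with the highest success probability."""
    return max(
        SOLVERS.items(),
        key=lambda kv: ransac_success_prob(kv[1], w_3d2d, w_2d2d, iterations),
    )


if __name__ == "__main__":
    # Textureless scene with sparse, unreliable depth: few 3D-2D inliers,
    # but dense 2D-2D matches remain usable, so the mixed solver scores higher.
    name, solver = select_solver(w_3d2d=0.05, w_2d2d=0.4)
    print(name, round(ransac_success_prob(solver, 0.05, 0.4, 50), 3))
```

Under these assumed inlier ratios, the mixed one-plus-one solver outscores the depth-only alternative, which mirrors the abstract's claim that exploiting correspondences without depth enlarges the usable sample set and improves robustness in depth-sparse, textureless scenes.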
