Conference paper, 2013

Lidar-based teach-and-repeat of mobile robot trajectories

Christoph Sprunk, Gian Diego Tipaldi, Andrea Cherubini, Wolfram Burgard
Abstract

Automating logistics tasks for small lot sizes and flexible production processes requires intuitive, easy-to-use systems that allow non-expert shop floor workers to naturally instruct transportation systems in changing environments. To this end, we present a novel laser-based scheme for teach-and-repeat of mobile robot trajectories that relies on scan matching to localize the robot relative to a taught trajectory, represented as a sequence of raw odometry and 2D laser data. This approach has two advantages. First, it does not require building a globally consistent metrical map of the environment, which reduces setup time. Second, the direct use of raw sensor data avoids the additional errors that grid maps can introduce, since they only approximate the environment. Real-world experiments carried out with a holonomic and a differential drive platform demonstrate that our approach repeats trajectories with an accuracy of a few millimeters. A comparison with a standard Monte Carlo localization approach on grid maps furthermore reveals that our method yields lower tracking errors for teach-and-repeat tasks.
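The core idea described in the abstract, storing raw (odometry, scan) keyframes during teaching and using scan matching to localize against them during repetition, can be illustrated with a minimal sketch. This is not the authors' implementation: the function names are hypothetical, the matcher below is a toy translation-only ICP (the paper's scan matcher also estimates rotation and works on real lidar data), and scans are simplified to small 2D point lists.

```python
def record_keyframe(trajectory, odom_pose, scan):
    """Teach phase: append the raw odometry pose and raw 2D scan.

    No global map is built; the trajectory itself is the reference.
    """
    trajectory.append((odom_pose, scan))

def align_scan(ref_scan, cur_scan, iters=10):
    """Repeat phase: estimate the 2D translation that aligns cur_scan
    onto ref_scan (toy translation-only ICP with brute-force
    nearest-neighbour association)."""
    tx, ty = 0.0, 0.0
    for _ in range(iters):
        dxs, dys = [], []
        for (x, y) in cur_scan:
            px, py = x + tx, y + ty
            # nearest neighbour of the shifted point in the reference scan
            qx, qy = min(ref_scan,
                         key=lambda q: (q[0] - px) ** 2 + (q[1] - py) ** 2)
            dxs.append(qx - px)
            dys.append(qy - py)
        # shift by the mean residual and iterate
        tx += sum(dxs) / len(dxs)
        ty += sum(dys) / len(dys)
    return tx, ty
```

In a repeat run, the estimated offset between the live scan and the scan stored at the nearest keyframe would feed a trajectory-tracking controller as a localization correction; the sketch omits that control loop.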
Main file: C23-sprunk13iros.pdf (2.01 MB). Origin: files produced by the author(s).

Dates and versions

lirmm-01247139 , version 1 (21-12-2015)

Cite

Christoph Sprunk, Gian Diego Tipaldi, Andrea Cherubini, Wolfram Burgard. Lidar-based teach-and-repeat of mobile robot trajectories. IROS: IEEE/RSJ International Conference on Intelligent Robots and Systems, Nov 2013, Tokyo, Japan. pp.3144-3149, ⟨10.1109/IROS.2013.6696803⟩. ⟨lirmm-01247139⟩