Journal article in IEEE Transactions on Robotics, 2008

Fusing Monocular Information in Multicamera SLAM

Abstract

This paper explores the possibilities of using monocular simultaneous localization and mapping (SLAM) algorithms in systems with more than one camera. The idea is to combine in a single system the advantages of both monocular vision (bearings-only, infinite-range observations but no instantaneous 3-D information) and stereovision (3-D information up to a limited range). Such a system should be able to instantaneously map nearby objects while still exploiting the bearing information provided by the observation of remote ones. We do this by considering each camera as an independent sensor rather than the entire set as a monolithic supersensor. The visual data are treated by monocular methods and fused by the SLAM filter. Several interesting possibilities naturally arise, such as the desynchronization of the firing of the sensors, the use of several unequal cameras, self-calibration, and cooperative SLAM with several independently moving cameras. We validate the approach with two different applications: a stereovision SLAM system with automatic self-calibration of the rig's main extrinsic parameters, and a cooperative SLAM system with two independent free-moving cameras in an outdoor setting.
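The core idea of the abstract, treating each camera as an independent bearing-only sensor whose observations are fused sequentially by a single SLAM filter, can be illustrated with a minimal sketch. The Python example below is not the authors' implementation: it assumes a simplified 2-D setting with one landmark, two cameras at known fixed poses, and a plain EKF update per observation. All names, poses, and noise values are illustrative.

```python
# Minimal 2-D sketch (illustrative only, not the paper's implementation):
# each camera yields an independent bearing-only observation of a landmark,
# and one EKF fuses the observations sequentially, one update per camera.
import numpy as np

def ekf_update(x, P, y, H, R):
    """Standard EKF measurement update given innovation y and Jacobian H."""
    S = H @ P @ H.T + R              # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)   # Kalman gain
    x = x + K @ y
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

def bearing(cam, lm):
    """Bearing from a camera at (x, y, yaw) to a 2-D landmark (lx, ly)."""
    return np.arctan2(lm[1] - cam[1], lm[0] - cam[0]) - cam[2]

# State: one landmark position; two cameras at known, unequal poses
# (the values below are arbitrary demo numbers).
x = np.array([5.0, 2.0])                # initial landmark estimate
P = np.eye(2) * 4.0                     # large initial uncertainty
cams = [np.array([0.0, 0.0, 0.0]),      # camera 1 pose (x, y, yaw)
        np.array([1.0, 0.0, 0.1])]      # camera 2 pose
true_lm = np.array([6.0, 2.5])
R = np.array([[np.radians(0.5) ** 2]])  # bearing measurement noise

# Each camera fires independently (possibly desynchronized); the filter
# simply applies one monocular, bearing-only update per observation.
for cam in cams:
    z = bearing(cam, true_lm)           # noiseless measurement for the demo
    h = bearing(cam, x)                 # predicted measurement
    y = np.array([np.arctan2(np.sin(z - h), np.cos(z - h))])  # wrapped innovation
    dx, dy = x[0] - cam[0], x[1] - cam[1]
    r2 = dx ** 2 + dy ** 2
    H = np.array([[-dy / r2, dx / r2]]) # d(bearing)/d(landmark)
    x, P = ekf_update(x, P, y, H, R)

print("fused landmark estimate:", x)
```

Because each update consumes a single camera's bearing, nothing in this loop requires the cameras to be synchronized, identical, or rigidly mounted, which is what opens the door to the self-calibration and cooperative-SLAM applications described above.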

Dates and versions

hal-02915727, version 1 (04-11-2020)


Cite

Joan Solà, André Monin, Michel Devy, Teresa Vidal-Calleja. Fusing Monocular Information in Multicamera SLAM. IEEE Transactions on Robotics, 2008, 24 (5), pp. 958-968. ⟨10.1109/TRO.2008.2004640⟩. ⟨hal-02915727⟩

