
Fusing Monocular Information in Multicamera SLAM

Abstract: This paper explores the possibilities of using monocular simultaneous localization and mapping (SLAM) algorithms in systems with more than one camera. The idea is to combine in a single system the advantages of both monocular vision (bearings-only, infinite-range observations, but no instantaneous 3-D information) and stereovision (3-D information up to a limited range). Such a system should be able to instantaneously map nearby objects while still considering the bearing information provided by the observation of remote ones. We do this by considering each camera as an independent sensor rather than the entire set as a monolithic supersensor. The visual data are treated by monocular methods and fused by the SLAM filter. Several advantages naturally arise as interesting possibilities, such as the desynchronization of the firing of the sensors, the use of several unequal cameras, self-calibration, and cooperative SLAM with several independently moving cameras. We validate the approach with two different applications: a stereovision SLAM system with automatic self-calibration of the rig's main extrinsic parameters, and a cooperative SLAM system with two independent free-moving cameras in an outdoor setting.
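The core idea of the abstract, treating each camera as an independent bearing-only sensor whose observations are fused by a single SLAM filter, can be illustrated with a minimal sketch. The following is not the paper's algorithm (which performs full monocular SLAM with each camera); it is a hypothetical 2-D toy example in which two cameras at known positions each supply one bearing observation of a landmark, and a single EKF fuses the two updates sequentially. All names and the simplified geometry are illustrative assumptions.

```python
import numpy as np

def bearing(cam, lm):
    # Bearing angle (2-D) from a camera position to a landmark.
    d = lm - cam
    return np.arctan2(d[1], d[0])

def ekf_bearing_update(x, P, cam, z, R):
    # One EKF update using a single bearing-only observation from
    # one camera, treated as an independent sensor.
    d = x - cam
    q = d @ d
    h = np.arctan2(d[1], d[0])                     # predicted bearing
    H = np.array([-d[1] / q, d[0] / q])            # Jacobian wrt landmark
    y = np.arctan2(np.sin(z - h), np.cos(z - h))   # angle-wrapped innovation
    S = H @ P @ H + R                              # innovation variance
    K = P @ H / S                                  # Kalman gain
    x = x + K * y
    P = P - np.outer(K, H @ P)
    return x, P

# True landmark and two cameras at different known positions.
lm_true = np.array([4.0, 3.0])
cams = [np.array([0.0, 0.0]), np.array([6.0, 0.0])]

# Coarse initial estimate with large uncertainty; each camera's
# bearing is fused independently by the same filter.
x = np.array([3.5, 2.5])
P = np.eye(2) * 25.0
for cam in cams:
    z = bearing(cam, lm_true)                      # noiseless bearing here
    x, P = ekf_bearing_update(x, P, cam, z, 1e-4)
```

After fusing the two bearings, the estimate triangulates close to the true landmark, which mirrors the point made in the abstract: no single monocular observation gives 3-D information, but fusing bearings from separate cameras in one filter does.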
Document type: Journal articles

Contributor: André Monin
Submitted on: Wednesday, November 4, 2020 - 3:06:50 PM
Last modification on: Thursday, June 10, 2021 - 3:05:40 AM



Joan Solà, André Monin, Michel Devy, Teresa Vidal-Calleja. Fusing Monocular Information in Multicamera SLAM. IEEE Transactions on Robotics, IEEE, 2008, 24 (5), pp.958-968. ⟨10.1109/TRO.2008.2004640⟩. ⟨hal-02915727⟩


