ReSLAM

Abstract

State-of-the-art SLAM methods are designed to work only with the type of camera employed to create the map, and little attention has been paid to the reusability of the generated maps: they can only be reused with the same camera that created them. This paper presents a novel SLAM approach that allows maps generated with one camera to be reused by cameras with different resolutions and optics. Our system allows, for instance, a highly detailed map processed off-line on a high-end computer to be reused later by a low-powered device (e.g., a drone or robot) with a different camera. The first map, called the base map, can be reused with other cameras and dynamically adapted by creating an augmented map. The principal idea of our method is a bottom-up pyramidal representation of the images that allows us to seamlessly match keypoints between different camera types. The experiments conducted validate our proposal, showing that it outperforms state-of-the-art approaches, namely ORB-SLAM, OpenVSLAM and UcoSLAM.
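The core idea of a bottom-up pyramid can be illustrated with a minimal sketch (this is not the paper's implementation; the function names and the way a level is selected are assumptions for illustration only): each level halves the image resolution, and a map built from a high-resolution camera can be matched against a lower-resolution camera at the pyramid level whose scale is closest to that camera.

```python
import numpy as np

def build_pyramid(img, levels):
    """Bottom-up pyramid: each level halves resolution via 2x2 averaging.

    `img` is a 2-D grayscale array; odd rows/columns are cropped before
    downsampling so every level has an integer size.
    """
    pyr = [img]
    for _ in range(levels - 1):
        prev = pyr[-1]
        h, w = prev.shape
        prev = prev[: h - h % 2, : w - w % 2]          # crop to even size
        pyr.append(prev.reshape(h // 2, 2, w // 2, 2)  # group 2x2 blocks
                       .mean(axis=(1, 3)))             # average each block
    return pyr

def matching_level(base_width, other_width, levels):
    """Pyramid level of the base map whose scale best fits the other camera.

    A camera with half the base resolution matches level 1, a quarter
    matches level 2, and so on; the result is clamped to valid levels.
    """
    ratio = base_width / other_width
    return int(round(min(max(np.log2(ratio), 0), levels - 1)))
```

For example, keypoints from a 480-pixel-wide camera would be matched against level 2 of a pyramid built from a 1920-pixel-wide base map, since the resolutions differ by a factor of four.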

Cite us

ReSLAM: Reusable SLAM with heterogeneous cameras. Romero-Ramirez, F.J.; Muñoz-Salinas, R.; Marín-Jiménez, M.J.; Carmona-Poyato, A.; Medina-Carnicer, R. Neurocomputing, vol. 563, 2024, Article 126940.

@article{ROMERORAMIREZ2024126940, 
title = {ReSLAM: Reusable SLAM with heterogeneous cameras}, 
journal = {Neurocomputing}, 
volume = {563}, 
pages = {126940}, 
year = {2024}, 
issn = {0925-2312}, 
doi = {10.1016/j.neucom.2023.126940}, 
author = {Francisco J. Romero-Ramirez and Rafael Muñoz-Salinas and Manuel J. Marín-Jiménez and Angel Carmona-Poyato and Rafael Medina-Carnicer}
}

The original published version can be obtained here, and a free version of the paper is available here.

Code and datasets

From an engineering point of view, ReSLAM is an evolution of UcoSLAM. As such, it includes all UcoSLAM features and adds the ability to use different cameras. The source code can be downloaded at the following link:

ReSLAM code

The datasets used for the experimental section of this paper can be downloaded from the following links:

Documentation

Please visit the ReSLAM manual for more information on its use.