NeoSLAM: Neural Object SLAM for Loop Closure and Navigation

Proceedings of the 31st International Conference on Artificial Neural Networks (ICANN 2022), pages 443--455, doi: 10.1007/978-3-031-15934-3_37, Sep 2022
Simultaneous Localization and Mapping (SLAM) with fixed landmark objects creates topological maps by extracting semantic information from the environment. In this paper, we propose a new mapping method, Neural Object SLAM (NeoSLAM), which uses objects seen in stereo images to learn associations between the pose of the robot and the observed landmark objects. We perform mapping with a biologically inspired approach based on creating activity patterns that memorize places in a network of grid cells and head direction cells. Our model is inspired by the object vector cells recently discovered by neuroscientists studying mammalian navigation. We model the firing field of these cells with a feed-forward neural network and create keyframes of objects with their 3D pose in a world-centered frame of reference. We train a Hebbian network connecting keyframe templates to the grid cells to memorize familiar places. We use the NeuroSLAM algorithm to train the grid cells and the head direction cells with the 4 Degrees of Freedom (DoF) poses of the robot. Then, we detect loops in the trajectory by matching objects in the keyframes. Finally, we create an object experience map and correct the cumulative error when loop closure candidates are detected. Thus, our system performs object-based place recognition with a brain-inspired approach and produces 2D/3D object topological maps.
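The following is a minimal sketch, not the authors' implementation, of the two ideas the abstract combines: a Hebbian association binding keyframe object templates to the current grid-cell / head-direction activity, and loop-closure candidate detection by matching a new template against stored ones. All names, dimensions, and the similarity threshold are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

N_CELLS = 256          # flattened grid-cell / head-direction activity size (assumed)
TEMPLATE_DIM = 64      # object keyframe descriptor size (assumed)
MATCH_THRESHOLD = 0.9  # cosine-similarity threshold for a loop-closure candidate (assumed)

W = np.zeros((N_CELLS, TEMPLATE_DIM))   # Hebbian weights: template units -> cell units
stored_templates = []                   # keyframe templates seen so far


def hebbian_update(W, cell_activity, template, lr=0.1):
    """Outer-product Hebbian rule: strengthen links between co-active units."""
    return W + lr * np.outer(cell_activity, template)


def loop_closure_candidate(template, stored, threshold=MATCH_THRESHOLD):
    """Return the index of the best-matching stored template, or None."""
    best_idx, best_sim = None, threshold
    for i, t in enumerate(stored):
        sim = template @ t / (np.linalg.norm(template) * np.linalg.norm(t) + 1e-9)
        if sim > best_sim:
            best_idx, best_sim = i, sim
    return best_idx


# Simulated run: each step yields a cell-activity pattern and an object template.
for step in range(5):
    cell_activity = rng.random(N_CELLS)
    template = rng.random(TEMPLATE_DIM)
    match = loop_closure_candidate(template, stored_templates)
    if match is not None:
        # In the paper's pipeline, such a match would trigger correction of the
        # cumulative error in the object experience map.
        print(f"step {step}: loop-closure candidate against keyframe {match}")
    else:
        W = hebbian_update(W, cell_activity, template)
        stored_templates.append(template)
```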

 

@InProceedings{RWW22,
  author    = {Raoui, Younès and Weber, Cornelius and Wermter, Stefan},
  title     = {NeoSLAM: Neural Object SLAM for Loop Closure and Navigation},
  booktitle = {Proceedings of the 31st International Conference on Artificial Neural Networks (ICANN 2022)},
  pages     = {443--455},
  year      = {2022},
  month     = {Sep},
  publisher = {Springer International Publishing},
  doi       = {10.1007/978-3-031-15934-3_37},
}