Abstract
Perception systems are fundamental in outdoor robotics, as their correct functionality is essential for tasks such as terrain identification, localization, navigation, and analysis of objects of interest. This is particularly relevant in search and rescue (SAR) robotics, where current research focuses on the mobility and traversal of unstructured terrains (commonly resulting from natural disasters or attacks) using quadruped robots. 3D sensory systems, such as those based on 360-degree LiDAR, tend to create dead zones within a considerable radius of their placement (typically on the upper part of the robot), leaving the locomotion system without terrain information in those areas. This paper addresses the problem of eliminating these dead zones in the robot's direction of movement during environment reconstruction with point clouds. To this end, a ROS-based method has been implemented to integrate an arbitrary number n of point clouds from different sensory sources into a single point cloud. The applicability of the method has been tested in the generation of elevation maps of the environment at different resolutions, using the quadruped robot ARTU-R (A1 Rescue Task UPM Robot) and short- and long-range RGB-D sensors strategically placed on its lower front part. The method has also demonstrated real-time performance and robustness to the frame-association problem that arises when fusing information from decentralized sources. The code is available to the community in the authors' GitHub repository: https://github.com/Robcib-GIT/pcl_fusion.
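To make the described pipeline concrete, the following is a minimal sketch (not the authors' implementation from the pcl_fusion repository) of a ROS node that fuses two PointCloud2 streams into a single cloud: each cloud is transformed into a common robot frame via tf2, the two are approximately time-synchronized, and the merged result is republished. The topic names, the base_link target frame, the two-sensor setup, and all timing parameters are illustrative assumptions.

```cpp
// Hypothetical two-source point cloud fusion node (sketch, C++/ROS 1).
#include <ros/ros.h>
#include <sensor_msgs/PointCloud2.h>
#include <message_filters/subscriber.h>
#include <message_filters/synchronizer.h>
#include <message_filters/sync_policies/approximate_time.h>
#include <tf2_ros/buffer.h>
#include <tf2_ros/transform_listener.h>
#include <tf2_sensor_msgs/tf2_sensor_msgs.h>
#include <pcl/point_cloud.h>
#include <pcl/point_types.h>
#include <pcl_conversions/pcl_conversions.h>
#include <boost/bind.hpp>

using sensor_msgs::PointCloud2;

class PclFusionSketch {
public:
  explicit PclFusionSketch(ros::NodeHandle& nh)
      : tf_listener_(tf_buffer_),
        sub_short_(nh, "camera_short/depth/points", 1),  // assumed topic
        sub_long_(nh, "camera_long/depth/points", 1),    // assumed topic
        sync_(SyncPolicy(10), sub_short_, sub_long_) {
    pub_ = nh.advertise<PointCloud2>("fused_points", 1);
    sync_.registerCallback(
        boost::bind(&PclFusionSketch::callback, this, _1, _2));
  }

private:
  void callback(const PointCloud2::ConstPtr& a, const PointCloud2::ConstPtr& b) {
    PointCloud2 a_base, b_base;
    try {
      // Express both clouds in a common robot frame before merging;
      // this is the frame-association step.
      tf2::doTransform(*a, a_base, tf_buffer_.lookupTransform(
          "base_link", a->header.frame_id, a->header.stamp, ros::Duration(0.1)));
      tf2::doTransform(*b, b_base, tf_buffer_.lookupTransform(
          "base_link", b->header.frame_id, b->header.stamp, ros::Duration(0.1)));
    } catch (const tf2::TransformException& ex) {
      ROS_WARN_THROTTLE(1.0, "TF lookup failed: %s", ex.what());
      return;
    }
    // Concatenate the point sets with PCL and republish.
    pcl::PointCloud<pcl::PointXYZ> pa, pb;
    pcl::fromROSMsg(a_base, pa);
    pcl::fromROSMsg(b_base, pb);
    pa += pb;
    PointCloud2 fused;
    pcl::toROSMsg(pa, fused);
    fused.header.frame_id = "base_link";
    fused.header.stamp = a->header.stamp;
    pub_.publish(fused);
  }

  using SyncPolicy =
      message_filters::sync_policies::ApproximateTime<PointCloud2, PointCloud2>;
  tf2_ros::Buffer tf_buffer_;
  tf2_ros::TransformListener tf_listener_;
  message_filters::Subscriber<PointCloud2> sub_short_, sub_long_;
  message_filters::Synchronizer<SyncPolicy> sync_;
  ros::Publisher pub_;
};

int main(int argc, char** argv) {
  ros::init(argc, argv, "pcl_fusion_sketch");
  ros::NodeHandle nh;
  PclFusionSketch node(nh);
  ros::spin();
  return 0;
}
```

Extending this sketch from two sources to n would amount to keeping a vector of subscribers and accumulating each transformed cloud into the output before publishing; consult the linked repository for the authors' actual design.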