Perception sensor integration for improved environmental reconstruction in quadruped robotics
Main article content
Abstract
Perception systems are fundamental in outdoor robotics, as their correct functionality is essential for tasks such as terrain identification, localization, navigation, and the analysis of objects of interest. This is particularly relevant in search and rescue (SAR) robotics, where a current line of research focuses on the mobility and traversal of unstructured terrains (commonly resulting from natural disasters or attacks) using quadruped robots. 3D sensory systems, such as those based on 360-degree LiDAR, tend to create dead zones within a considerable radius relative to their placement (typically on the upper part of the robot), leaving the locomotion system without terrain information in those areas. This paper addresses the problem of eliminating these dead zones in the robot's direction of movement during environment reconstruction with point clouds. To achieve this, a ROS-based method has been implemented to integrate "n" point clouds from different sensory sources into a single point cloud. The applicability of this method has been tested in generating elevation maps of the environment at different resolutions, using the quadruped robot ARTU-R (A1 Rescue Task UPM Robot) and short- and long-range RGB-D sensors strategically placed on its lower front part. Additionally, the method has demonstrated real-time performance and robustness with respect to the problem of frame association when fusing information from decentralized sources. The code is available to the community in the authors' GitHub repository: https://github.com/Robcib-GIT/pcl_fusion.
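The core operation described in the abstract, integrating "n" point clouds from different sensors into a single cloud, requires expressing every cloud in a common reference frame before concatenating. The sketch below is a minimal, hypothetical illustration of that step using NumPy arrays in place of ROS messages; the function name, the representation of clouds as Mx3 arrays, and the 4x4 homogeneous transforms per sensor are assumptions for illustration, not the authors' implementation (in ROS, the transforms would come from tf2 and the clouds from `sensor_msgs/PointCloud2` topics).

```python
import numpy as np

def fuse_point_clouds(clouds, transforms):
    """Fuse n point clouds into a single cloud in a common frame.

    clouds     : list of (M_i x 3) arrays, each in its own sensor frame.
    transforms : list of 4x4 homogeneous matrices mapping each sensor
                 frame into the common (e.g. robot base) frame.
    Returns a single (sum(M_i) x 3) array in the common frame.
    """
    fused = []
    for points, T in zip(clouds, transforms):
        # Promote the points to homogeneous coordinates [x, y, z, 1].
        homog = np.hstack([points, np.ones((points.shape[0], 1))])
        # Apply the sensor-to-base transform and drop the homogeneous column.
        fused.append((homog @ T.T)[:, :3])
    # Concatenate all transformed clouds into one cloud.
    return np.vstack(fused)

# Example: two sensors, the second offset 1 m forward along x.
front = np.array([[0.0, 0.0, 0.0], [1.0, 2.0, 3.0]])
rear = np.array([[5.0, 5.0, 5.0]])
T_front = np.eye(4)
T_rear = np.eye(4)
T_rear[0, 3] = 1.0  # translation of the rear sensor frame
merged = fuse_point_clouds([front, rear], [T_front, T_rear])
```

In a ROS pipeline, the frame-association problem the paper mentions would additionally require time-synchronizing the clouds before looking up the transforms; this sketch assumes the transforms are already consistent.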
Keywords:
Article details
References
Benedek, C., Majdik, A., Nagy, B., Rozsa, Z., Sziranyi, T., 2021. Positioning and perception in LiDAR point clouds. Digital Signal Processing 119, 103193. DOI: https://doi.org/10.1016/j.dsp.2021.103193
Blackburn, M. R., Everett, H. R., Laird, R. T., 2002. After action report to the joint program office: Center for the Robot-Assisted Search and Rescue (CRASAR) related efforts at the World Trade Center. Tech. rep., Space and Naval Warfare Systems Center, San Diego, CA. DOI: https://doi.org/10.21236/ADA495121
Cruz Ulloa, C., March 2024. Quadrupedal robots in search and rescue: Perception and teleoperation. Doctoral thesis (unpublished). DOI: https://doi.org/10.20868/UPM.thesis.81769
Cruz Ulloa, C., Garcia, M., del Cerro, J., Barrientos, A., 2023a. Deep learning for victims detection from virtual and real search and rescue environments. In: Tardioli, D., Matellán, V., Heredia, G., Silva, M. F., Marques, L. (Eds.), ROBOT2022: Fifth Iberian Robotics Conference. Springer International Publishing, Cham, pp. 3–13. DOI: https://doi.org/10.1007/978-3-031-21062-4_1
Cruz Ulloa, C., Prieto Sánchez, G., Barrientos, A., Del Cerro, J., 2021. Autonomous thermal vision robotic system for victims recognition in search and rescue missions. Sensors 21 (21). DOI: https://doi.org/10.3390/s21217346
Cruz Ulloa, C., Sánchez, L., Del Cerro, J., Barrientos, A., 2023b. Deep learning vision system for quadruped robot gait pattern regulation. Biomimetics 8 (3). DOI: https://doi.org/10.3390/biomimetics8030289
Cruz Ulloa, C., Álvarez, J., del Cerro, J., Barrientos, A., 2024. Vision-based collaborative robots for exploration in uneven terrains. Mechatronics 100, 103184. DOI: https://doi.org/10.1016/j.mechatronics.2024.103184
Eguchi, R., Elwood, K., Lee, E. K., Greene, M., 2012. The 2010 Canterbury and 2011 Christchurch (New Zealand) earthquakes and the 2011 Tohoku (Japan) earthquake. Tech. rep., Earthquake Engineering Research Institute.
Fankhauser, P., Bloesch, M., Hutter, M., 2018. Probabilistic terrain mapping for mobile robots with uncertain localization. IEEE Robotics and Automation Letters (RA-L) 3 (4), 3019–3026. DOI: https://doi.org/10.1109/LRA.2018.2849506
Kruijff, I., Freda, L., Gianni, M., Ntouskos, V., Hlavac, V., Kubelka, V., Zimmermann, E., Surmann, H., Dulic, K., Rottner, W., Gissi, E., Oct 2016. Deployment of ground and aerial robots in earthquake-struck Amatrice in Italy (brief report). In: 2016 IEEE International Symposium on Safety, Security, and Rescue Robotics (SSRR). pp. 278–279. DOI: https://doi.org/10.1109/SSRR.2016.7784314
Li, Y., Ibanez-Guzman, J., 2020. LiDAR for autonomous driving: The principles, challenges, and trends for automotive LiDAR and perception systems. IEEE Signal Processing Magazine 37 (4), 50–61. DOI: https://doi.org/10.1109/MSP.2020.2973615
Li, Y., Kong, L., Hu, H., Xu, X., Huang, X., 2024. Optimizing LiDAR placements for robust driving perception in adverse conditions.
Ulloa, C. C., Llerena, G. T., Barrientos, A., del Cerro, J., 2023. Autonomous 3D thermal mapping of disaster environments for victims detection. In: Robot Operating System (ROS): The Complete Reference (Volume 7). Springer, pp. 83–117. DOI: https://doi.org/10.1007/978-3-031-09062-2_3
Wannous, C., Velasquez, G., 2017. United Nations Office for Disaster Risk Reduction (UNISDR): UNISDR's contribution to science and technology for disaster risk reduction and the role of the International Consortium on Landslides (ICL). In: Sassa, K., Mikos, M., Yin, Y. (Eds.), Advancing Culture of Living with Landslides. Springer International Publishing, Cham, pp. 109–115. DOI: https://doi.org/10.1007/978-3-319-59469-9_6
Whitman, J., Zevallos, N., Travers, M., Choset, H., 2018. Snake robot urban search after the 2017 Mexico City earthquake. In: 2018 IEEE International Symposium on Safety, Security, and Rescue Robotics (SSRR). pp. 1–6. DOI: https://doi.org/10.1109/SSRR.2018.8468633