Localization in structured environments based on corner detection
Abstract
The high accuracy and resolution of measurements obtained with LiDAR (Light Detection And Ranging) sensors make them a common choice in SLAM (Simultaneous Localization And Mapping) systems. The large volume of data these sensors produce can be reduced to a set of characteristic points that define the environment. This data reduction simplifies the mapping and positioning process and thereby lowers the computational load of the SLAM pipeline. This work proposes a system for estimating the trajectory followed by a robotic platform based solely on 2D LiDAR information. The point cloud provided by the sensor is analyzed to extract a set of characteristic corners of the navigation environment, which allow the robot's motion to be estimated through PLGO (Pose-Landmark Graph Optimization). Experimental results show that the proposed system achieves a robot localization accuracy comparable to that obtained with ICP (Iterative Closest Point) techniques.
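The abstract describes a pipeline in which corners extracted from 2D LiDAR scans act as landmarks in a pose-landmark graph. As a rough illustration of that idea only (not the authors' implementation), the sketch below builds a minimal 2D pose-landmark graph with the GTSAM library, assuming hypothetical odometry and corner observations expressed as bearing-range measurements.

```python
# Minimal pose-landmark graph sketch (hypothetical values, GTSAM Python API).
# Two robot poses and one corner landmark observed from both poses.
import numpy as np
import gtsam
from gtsam.symbol_shorthand import X, L  # X(i): robot poses, L(j): corner landmarks

graph = gtsam.NonlinearFactorGraph()

# Anchor the first pose at the origin.
prior_noise = gtsam.noiseModel.Diagonal.Sigmas(np.array([0.1, 0.1, 0.05]))
graph.add(gtsam.PriorFactorPose2(X(0), gtsam.Pose2(0.0, 0.0, 0.0), prior_noise))

# Odometry constraint between consecutive poses (assumed 2 m forward motion).
odom_noise = gtsam.noiseModel.Diagonal.Sigmas(np.array([0.2, 0.2, 0.1]))
graph.add(gtsam.BetweenFactorPose2(X(0), X(1), gtsam.Pose2(2.0, 0.0, 0.0), odom_noise))

# Corner landmark observed from both poses as bearing-range measurements.
meas_noise = gtsam.noiseModel.Diagonal.Sigmas(np.array([0.05, 0.1]))
graph.add(gtsam.BearingRangeFactor2D(X(0), L(0),
                                     gtsam.Rot2.fromDegrees(45.0), np.sqrt(8.0), meas_noise))
graph.add(gtsam.BearingRangeFactor2D(X(1), L(0),
                                     gtsam.Rot2.fromDegrees(90.0), 2.0, meas_noise))

# Rough initial guesses, then joint optimization of poses and landmark.
initial = gtsam.Values()
initial.insert(X(0), gtsam.Pose2(0.1, -0.1, 0.05))
initial.insert(X(1), gtsam.Pose2(1.9, 0.1, -0.05))
initial.insert(L(0), gtsam.Point2(1.8, 2.1))

result = gtsam.LevenbergMarquardtOptimizer(graph, initial).optimize()
print(result.atPose2(X(1)))   # refined robot pose
print(result.atPoint2(L(0)))  # refined corner position
```

In a full system, each extracted corner would be associated with an existing landmark (or instantiated as a new one) before adding its factor, and the graph would be re-optimized as the trajectory grows.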