Global Map Generation and SLAM Using LiDAR and Stereo Camera for Tracking the Motion of a Mobile Robot

  • Edwin Leonel Álvarez-Gutiérrez, Universidad Pedagógica y Tecnológica de Colombia
  • Fabián Rolando Jiménez-López, Universidad Pedagógica y Tecnológica de Colombia
Keywords: LiDAR, global map, motion tracking, SLAM, mobile robot, stereo vision

Abstract

Two of the topics that receive the greatest attention in mobile robotics are the localization and mapping of a robot in a given environment, and the selection of the devices or sensors needed to acquire as much information as possible from the surroundings for the generation of a global map. The purpose of this article is to propose the integration of a tracked (caterpillar-type) ground mobile robot, SLAM tasks performed with a LiDAR device, and stereo vision provided by the ZED camera to generate a 2D global map and track the motion of the mobile robot using MATLAB® software. The experiment consists of performing different detection tests to measure distances and track the position of the mobile robot in a structured indoor environment, in order to observe the behavior of the mobile platform and determine the error in the measurements. The results obtained show that the integrated devices satisfactorily fulfill the established tasks under controlled conditions in indoor environments, with error percentages below 1% for the LiDAR and below 4% for the ZED camera. The proposed alternative solves one of the most common problems in mobile robotics of recent years and, additionally, allows the fusion of other types of sensors, such as inertial systems, encoders, and GPS, in order to improve applications in the area and the quality of the information acquired from the environment.
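As an illustration of the 2D map-building and trajectory-tracking steps described in the abstract, the following is a minimal MATLAB sketch assuming the lidarSLAM, lidarScan, scansAndPoses, and buildMap APIs of MATLAB's Navigation Toolbox; the scanData variable, parameter values, and thresholds are illustrative placeholders, not the authors' implementation.

    % Minimal 2D LiDAR SLAM sketch (assumes MATLAB Navigation Toolbox).
    % All parameter values below are illustrative, not the authors' settings.
    maxLidarRange = 8;    % meters; on the order of an RPLIDAR-A2 (assumed)
    mapResolution = 20;   % occupancy-grid cells per meter (assumed)

    slamAlg = lidarSLAM(mapResolution, maxLidarRange);
    slamAlg.LoopClosureThreshold = 200;   % tune for the test environment
    slamAlg.LoopClosureSearchRadius = 3;

    % scanData is a hypothetical cell array of structs with 'ranges' and
    % 'angles' fields, as delivered by the LiDAR driver for each scan.
    for k = 1:numel(scanData)
        scan = lidarScan(scanData{k}.ranges, scanData{k}.angles);
        addScan(slamAlg, scan);   % scan matching + pose-graph update
    end

    % Recover the optimized poses and build the global 2D occupancy map.
    [scans, poses] = scansAndPoses(slamAlg);
    map = buildMap(scans, poses, mapResolution, maxLidarRange);

    figure; show(map); hold on;
    plot(poses(:,1), poses(:,2), 'r-');   % estimated robot trajectory
    title('Global 2D map and estimated trajectory (sketch)');

In a pipeline of this kind, the pose list returned by scansAndPoses would correspond to the tracked trajectory of the robot, which is the quantity compared against the reference distances in the indoor tests.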


Published
2019-12-16
How to Cite
Álvarez-Gutiérrez, E., & Jiménez-López, F. (2019). Global Map Generation and SLAM Using LiDAR and Stereo Camera for Tracking the Motion of a Mobile Robot. ITECKNE, 16(2), 144-156. https://doi.org/10.15332/iteckne.v16i2.2357
Section
Research and Innovation Articles