Sensory Fusion and integration of LIDAR with stereo vision and Kinect for analysis of depth

Authors

  • Sandra Carolina Imbachi Universidad del Cauca
  • Jeison Eduardo Vivas García Universidad del Cauca
  • Juan Fernando Florez Marulanda Universidad del Cauca

DOI:

https://doi.org/10.14482/inde.36.1.10937

Abstract

Sensory integration and fusion of a LIDAR system with two stereo vision cameras and a Kinect for depth measurement is presented, with maximum likelihood and Kalman filtering methods used for data processing. A hardware fusion and, later, a software integration were developed in order to carry out tests at 0.68, 1.5, 2, 2.5, and 4 m. The results show that the hardware LIDAR-Kinect fusion performs best, with an average error of 1.08%, and that the Kalman filter provides better support for processing and accuracy of the data for both fusions, with an average error of 1.35%.
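To illustrate the kind of Kalman-based fusion the abstract describes, the sketch below fuses a LIDAR and a Kinect depth reading with a one-dimensional Kalman update. The sensor variances and readings are illustrative assumptions, not values from the paper, and the paper's actual filter may differ in structure.

```python
# Minimal 1-D Kalman filter sketch for fusing two depth sensors,
# e.g., LIDAR and Kinect. Variances below are assumed for illustration.

def kalman_update(x, p, z, r):
    """Fuse estimate (x, p) with a measurement z of variance r."""
    k = p / (p + r)          # Kalman gain: weight of the new measurement
    x = x + k * (z - x)      # corrected depth estimate
    p = (1.0 - k) * p        # reduced estimate variance
    return x, p

def fuse_depth(lidar_z, kinect_z, lidar_r=0.02, kinect_r=0.05):
    """Initialize from the LIDAR reading, then fold in the Kinect reading."""
    x, p = lidar_z, lidar_r
    return kalman_update(x, p, kinect_z, kinect_r)

# Hypothetical readings near the paper's 1.5 m test distance.
depth, var = fuse_depth(1.50, 1.48)
print(depth, var)  # fused depth lies between the two readings
```

The fused estimate is pulled toward the noisier Kinect reading only slightly, and its variance is lower than either sensor's alone, which is the property that motivates Kalman filtering over using one sensor in isolation.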

Published

2017-12-24

How to Cite

[1]
S. C. Imbachi, J. E. Vivas García, and J. F. Florez Marulanda, “Sensory Fusion and integration of LIDAR with stereo vision and Kinect for analysis of depth”, Ing. y Des., vol. 36, no. 1, pp. 40–58, Dec. 2017.