
Fusion of Depth and Color Images for Dense Simultaneous Localization and Mapping

Dylan T. Conway and John L. Junkins
Department of Aerospace Engineering, Texas A&M University, College Station, TX, United States

Abstract—This paper presents a system for performing dense mapping of surface geometry and texture properties. Co-registered depth and grayscale images are incrementally fused into a global map in real time as they are collected. The resulting dense map can be rendered into a virtual depth and grayscale image from an arbitrary pose. Comparison of the rendered and observed images provides a direct means of computing the sensor pose relative to the map, allowing new data to be fused into the model. This frame-to-map tracking scheme, as opposed to frame-to-frame tracking, improves system accuracy and robustness. Additionally, the use of both surface geometry and color texture better constrains the pose solution and reduces the risk of tracking failures. This paper describes an implementation of the proposed algorithm and provides experimental results.
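The core idea of the abstract can be illustrated with a minimal sketch: render the map into virtual depth and grayscale images at a candidate pose, then score that pose by a combined geometric and photometric residual against the observed images. All names and the toy data below are illustrative assumptions, not the authors' implementation (which operates on full images with a proper pose optimizer).

```python
# Hypothetical sketch of frame-to-map tracking via rendered-image comparison.
# A candidate pose is scored by the sum of squared differences between the
# observed and rendered depth and grayscale images; using both channels
# disambiguates poses that depth alone cannot distinguish (e.g. sliding
# along a flat, textured wall).

def combined_cost(obs_depth, obs_gray, ren_depth, ren_gray,
                  w_depth=1.0, w_gray=1.0):
    """Weighted SSD over pixels covered by the rendered map."""
    cost = 0.0
    for od, og, rd, rg in zip(obs_depth, obs_gray, ren_depth, ren_gray):
        if rd is None:  # pixel not covered by the rendered map
            continue
        cost += w_depth * (od - rd) ** 2 + w_gray * (og - rg) ** 2
    return cost

# Toy 4-pixel example: a flat wall (constant depth) with varying texture.
obs_d = [2.0, 2.0, 2.0, 2.0]
obs_g = [10.0, 20.0, 30.0, 40.0]
# Renderings of the map from two candidate poses:
cand_a = ([2.0, 2.0, 2.0, 2.0], [10.0, 20.0, 30.0, 40.0])  # correct pose
cand_b = ([2.0, 2.0, 2.0, 2.0], [20.0, 30.0, 40.0, 10.0])  # shifted along wall

best = min([cand_a, cand_b],
           key=lambda c: combined_cost(obs_d, obs_g, c[0], c[1]))
assert best is cand_a  # depth is identical for both; texture breaks the tie
```

Note that the two candidates have identical depth residuals, so a geometry-only tracker would find them equally good; the grayscale term is what constrains the pose, which is the robustness argument the abstract makes for fusing both modalities.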

Index Terms—SLAM, data fusion, GPU, dense mapping

Cite: Dylan T. Conway and John L. Junkins, "Fusion of Depth and Color Images for Dense Simultaneous Localization and Mapping," Journal of Image and Graphics, Vol. 2, No. 1, pp. 64-69, June 2014. doi: 10.12720/joig.2.1.64-69