Abstract
Scene reconstruction with consumer-level RGB-D cameras has gained considerable momentum in both the robotics and vision communities. In the robotics literature, high-quality camera tracking, the key to accurate reconstruction, is challenging in geometrically featureless scenes or under large lighting variation. In this letter, we propose an accurate RGB-D reconstruction approach that incorporates 3D line features into key steps of the pipeline to significantly improve tracking robustness and reduce error accumulation. Our reconstruction approach is characterized by three key designs. First, we propose a robust 3D line extractor to efficiently extract accurate and consistent 3D lines from RGB-D data. Second, we design a joint objective for frame-to-model scan alignment that considers both dense points and 3D lines under a robust error metric. With the extra constraints from 3D lines and the robust error metric, scan alignment is further enhanced in challenging situations, even without any heuristics for pruning spurious correspondences. Finally, we propose an efficient submap-based hierarchical optimization over both point and 3D line correspondences to significantly reduce the accumulated error in camera tracking and obtain an accurate global 3D model. Extensive experiments on synthetic and real-world datasets demonstrate that our method outperforms state-of-the-art RGB-D reconstruction approaches for common indoor and outdoor scenes in challenging practical scanning situations.
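The joint point-and-line alignment objective described above is not given explicitly here; a plausible form, under the assumption that the point term is a standard point-to-plane ICP residual and the line term penalizes point-to-line distances of sampled line points, with a robust kernel $\rho(\cdot)$ (e.g., Huber) applied to both, is:

$$
E(\mathbf{T}) \;=\; \sum_{i} \rho\!\left( \mathbf{n}_i^{\top} \left( \mathbf{T}\,\mathbf{p}_i - \mathbf{q}_i \right) \right)
\;+\; \lambda \sum_{j} \rho\!\left( d\!\left( \mathbf{T}\,\mathbf{x}_j,\; \mathbf{L}_j \right) \right),
$$

where $\mathbf{T}$ is the rigid camera pose, $(\mathbf{p}_i, \mathbf{q}_i, \mathbf{n}_i)$ are matched source points, model points, and model normals, $\mathbf{x}_j$ is a sample on an extracted 3D line matched to model line $\mathbf{L}_j$, $d(\cdot,\cdot)$ is the point-to-line distance, and $\lambda$ balances the two terms. All symbols and the specific residual choices are illustrative assumptions, not the authors' stated formulation.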
| Original language | English |
|---|---|
| Article number | 9468706 |
| Pages (from-to) | 6561-6568 |
| Number of pages | 8 |
| Journal | IEEE Robotics and Automation Letters |
| Volume | 6 |
| Issue number | 4 |
| Early online date | 30 Jun 2021 |
| DOIs | |
| Publication status | Published - Oct 2021 |
| Externally published | Yes |
Bibliographical note
Publisher Copyright: © 2016 IEEE.
Funding
This work was supported in part by the National Key Research and Development Program of China under Grant 2019YFB1707501, in part by the National Natural Science Foundation of China under Grant 61772267, in part by the Aeronautical Science Foundation of China under Grant 2019ZE052008, and in part by the Natural Science Foundation of Jiangsu Province under Grant BK20190016.
Keywords
- Localization
- Mapping
- SLAM