Abstract
In this paper, we propose a normal estimation method for unstructured point clouds. We observe that geometric estimators commonly focus on feature preservation but are hard to tune and sensitive to noise, while learning-based approaches pursue overall normal estimation accuracy but cannot handle challenging regions such as surface edges well. This paper presents a novel normal estimation method under the co-support of a geometric estimator and deep learning. To lower the learning difficulty, we first propose to compute a suboptimal initial normal at each point by searching for the best-fitting patch. Based on the computed normal field, we design a normal-based height map network (NH-Net) that fine-tunes the suboptimal normals. Qualitative and quantitative evaluations demonstrate clear improvements of our results over both traditional and learning-based methods, in terms of estimation accuracy and feature recovery.
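To make the patch-search idea concrete, the following is a minimal sketch of one plausible reading of the first stage: for each point, fit planes to several candidate neighborhoods and keep the normal of the patch with the smallest fitting residual. This is an illustrative interpretation only, not the authors' implementation; the function name `initial_normals`, the candidate sizes in `candidate_ks`, and the residual criterion are all assumptions, and the NH-Net refinement stage is not reproduced here.

```python
# Hypothetical sketch of a "best-fitting patch" initial normal estimate.
# Not the paper's method: the real patch-selection criterion and the
# NH-Net refinement differ; this only illustrates the general idea.
import numpy as np
from scipy.spatial import cKDTree

def initial_normals(points, candidate_ks=(10, 20, 40)):
    """For each point, fit a PCA plane to several k-NN candidate patches
    and return the normal of the patch with the smallest fit residual."""
    tree = cKDTree(points)
    normals = np.zeros_like(points)
    for i, p in enumerate(points):
        best_residual, best_normal = np.inf, None
        for k in candidate_ks:
            _, idx = tree.query(p, k=k)
            patch = points[idx] - points[idx].mean(axis=0)
            # PCA of the centered patch: the eigenvector with the smallest
            # eigenvalue is the normal of the least-squares plane.
            cov = patch.T @ patch / k
            eigvals, eigvecs = np.linalg.eigh(cov)  # ascending eigenvalues
            # Smallest eigenvalue = mean squared distance to the fitted plane.
            if eigvals[0] < best_residual:
                best_residual, best_normal = eigvals[0], eigvecs[:, 0]
        normals[i] = best_normal  # sign/orientation left unresolved
    return normals

# Usage on a toy cloud: normals = initial_normals(np.random.rand(500, 3))
```

Note that comparing raw residuals across patch sizes biases the choice toward small neighborhoods; a practical criterion would normalize for scale, which is one of the details this sketch deliberately leaves out.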
Original language | English |
---|---|
Title of host publication | The Proceedings of the Conference on Computer Vision and Pattern Recognition 2020 (CVPR 2020) |
Pages | 13235-13244 |
Number of pages | 10 |
DOIs | |
Publication status | Published - 14 Jun 2020 |
Event | The IEEE/CVF Conference on Computer Vision and Pattern Recognition 2020 (Online), 14 Jun 2020 → 19 Jun 2020, http://cvpr2020.thecvf.com/ |
Publication series
Name | Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition |
---|---|
Publisher | IEEE Computer Society |
ISSN (Print) | 1063-6919 |
Conference
Conference | The IEEE/CVF Conference on Computer Vision and Pattern Recognition 2020 |
---|---|
Abbreviated title | CVPR2020 |
Period | 14/06/20 → 19/06/20 |
Internet address | http://cvpr2020.thecvf.com/ |
Funding
This work was supported by the National Natural Science Foundation of China (No. 61502137), the Hong Kong Research Grants Council (No. PolyU 152035/17E), the HKIBS Research Seed Fund 2019/20 (No. 190-009), and the Research Seed Fund (No. 102367) of Lingnan University, Hong Kong.