Abstract
Self-attention networks have revolutionized natural language processing and have also made impressive progress in image analysis tasks. Corrnet3D introduced the idea of first estimating point-to-point correspondences for point cloud registration. Inspired by these successes, we propose an unsupervised network for non-rigid point cloud registration, named NrtNet, which is the first network to use a transformer for unsupervised non-rigid registration of point clouds under large deformations. Specifically, NrtNet consists of a feature extraction module, a correspondence matrix generation module, and a reconstruction module. Given a pair of point clouds, our model first learns point-wise features and feeds them to the transformer-based correspondence matrix generation module, which uses the transformer to learn the correspondence probabilities between the two point sets; the correspondence probability matrix is then normalized to obtain the correct point-set correspondence matrix. We then permute the point clouds and learn the relative drift of the point pairs to reconstruct the registered point clouds. Extensive experiments on synthetic and real datasets of non-rigid 3D shapes show that NrtNet outperforms state-of-the-art methods, including methods that use grids as input and methods that directly compute point drift.
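The pipeline described in the abstract can be illustrated with a minimal PyTorch-style sketch. This is not the authors' implementation: the module names, feature dimensions, cross-attention layout, and the softmax-based normalization of the correspondence probabilities are assumptions made purely for illustration.

```python
# Minimal sketch of the abstract's pipeline: per-point features, a transformer
# layer relating the two point sets, a normalized soft correspondence matrix,
# and permutation of one cloud toward the other. All design details here are
# illustrative assumptions, not the paper's actual architecture.
import torch
import torch.nn as nn

class CorrespondenceSketch(nn.Module):
    def __init__(self, feat_dim=128, n_heads=4):
        super().__init__()
        # Stand-in for the feature extraction module: a per-point MLP.
        self.point_mlp = nn.Sequential(
            nn.Linear(3, 64), nn.ReLU(),
            nn.Linear(64, feat_dim),
        )
        # Cross-attention between the two point sets (transformer component).
        self.attn = nn.MultiheadAttention(feat_dim, n_heads, batch_first=True)

    def forward(self, src, tgt):
        # src, tgt: (B, N, 3) point clouds.
        f_src = self.point_mlp(src)                      # (B, N, C)
        f_tgt = self.point_mlp(tgt)                      # (B, N, C)
        # Source points attend to target points.
        f_src, _ = self.attn(f_src, f_tgt, f_tgt)
        # Correspondence probabilities from feature similarity.
        scores = torch.bmm(f_src, f_tgt.transpose(1, 2)) # (B, N, N)
        # Row softmax plus column normalization (one common choice for turning
        # scores into an approximately doubly-stochastic correspondence matrix).
        perm = torch.softmax(scores, dim=-1)
        perm = perm / (perm.sum(dim=1, keepdim=True) + 1e-8)
        # Permute the target with the soft correspondence matrix; a learned
        # per-point drift (the reconstruction module) would refine this further.
        tgt_permuted = torch.bmm(perm, tgt)              # (B, N, 3)
        return perm, tgt_permuted

if __name__ == "__main__":
    src = torch.rand(2, 1024, 3)
    tgt = torch.rand(2, 1024, 3)
    perm, recon = CorrespondenceSketch()(src, tgt)
    print(perm.shape, recon.shape)  # (2, 1024, 1024) and (2, 1024, 3)
```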
Funder
National Science Foundation of China
Hubei Key Laboratory of Intelligent Robot
Subject
Electrical and Electronic Engineering; Biochemistry; Instrumentation; Atomic and Molecular Physics, and Optics; Analytical Chemistry
Cited by
4 articles.