Abstract
Low-end LiDAR sensors provide an alternative means of depth measurement and object recognition for lightweight devices. However, their limited computing capacity makes complex algorithms impractical to run on the device, and the sparse measurements further restrict the features available for extraction. A classification method is therefore required that accepts sparse input yet gives the classifier enough information to differentiate objects accurately within the available computing budget. To achieve reliable feature extraction from a sparse LiDAR point cloud, this paper proposes a novel Clustered Extraction and Centroid-Based Clustered Extraction (CE-CBCE) method for feature extraction, followed by a convolutional neural network (CNN) object classifier. Integrating CE-CBCE with the CNN lets us use input from a lightweight actuated LiDAR and classify objects at low computational cost while maintaining accurate detection. On genuine LiDAR data, the proposed method achieves a reliable accuracy of 97%.
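The abstract describes a two-stage pipeline: cluster-based feature extraction from a sparse point cloud, followed by a CNN classifier. The Python sketch below illustrates one way such a pipeline could be wired together. It is a minimal sketch under stated assumptions: the euclidean_cluster routine, its radius and min_points parameters, the eight-value centroid descriptor, and the ClusterCNN architecture are all hypothetical stand-ins, since the abstract does not give the actual CE-CBCE details.

```python
# Illustrative sketch only; the clustering rule, descriptor layout, and network
# shape are assumptions, not the paper's CE-CBCE specification.
import numpy as np
import torch
import torch.nn as nn


def euclidean_cluster(points, radius=0.5, min_points=5):
    """Greedy region-growing clustering of an (N, 3) point cloud."""
    n = len(points)
    assigned = np.full(n, -1)        # cluster id per point; -1 = unassigned, -2 = noise
    cluster_id = 0
    for i in range(n):
        if assigned[i] != -1:
            continue
        assigned[i] = cluster_id
        queue, members = [i], [i]
        while queue:
            j = queue.pop()
            near = np.where((assigned == -1) &
                            (np.linalg.norm(points - points[j], axis=1) < radius))[0]
            assigned[near] = cluster_id
            queue.extend(near.tolist())
            members.extend(near.tolist())
        if len(members) < min_points:
            assigned[members] = -2   # too small: treat as noise
        else:
            cluster_id += 1
    return assigned, cluster_id


def centroid_features(points, assigned, n_clusters):
    """Per-cluster descriptor: centroid plus simple spread statistics (8 values)."""
    feats = []
    for c in range(n_clusters):
        pts = points[assigned == c]
        centroid = pts.mean(axis=0)
        offsets = pts - centroid
        feats.append(np.concatenate([
            centroid,                                   # 3: cluster position
            offsets.std(axis=0),                        # 3: spread per axis
            [np.linalg.norm(offsets, axis=1).max(),     # 1: radial extent
             float(len(pts))],                          # 1: point count
        ]))
    return np.asarray(feats, dtype=np.float32)


class ClusterCNN(nn.Module):
    """Small 1D CNN that classifies each cluster from its 8-value descriptor."""
    def __init__(self, n_classes=4):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv1d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten(),
            nn.Linear(32, n_classes),
        )

    def forward(self, x):                 # x: (batch, 8)
        return self.net(x.unsqueeze(1))   # add channel dim -> (batch, 1, 8)


if __name__ == "__main__":
    # Stand-in for a sparse actuated-LiDAR scan: two synthetic blobs.
    rng = np.random.default_rng(0)
    cloud = np.vstack([
        rng.normal([0.0, 0.0, 0.0], 0.2, size=(60, 3)),
        rng.normal([3.0, 1.0, 0.0], 0.2, size=(60, 3)),
    ]).astype(np.float32)

    assigned, k = euclidean_cluster(cloud)
    if k > 0:
        feats = centroid_features(cloud, assigned, k)
        model = ClusterCNN(n_classes=4)            # untrained, for shape checking only
        logits = model(torch.from_numpy(feats))
        print(logits.argmax(dim=1))                # one predicted class per cluster
```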
Funder
Ministry of Education Malaysia
Publisher
Public Library of Science (PLoS)
References
52 articles.
Cited by
3 articles.