Ackerman Unmanned Mobile Vehicle Based on Heterogeneous Sensor in Navigation Control Application
Author:
Shih Chi-Huang 1, Lin Cheng-Jian 1, Jhang Jyun-Yu 2
Affiliation:
1. Department of Computer Science and Information Engineering, National Chin-Yi University of Technology, Taichung 411, Taiwan
2. Department of Computer Science and Information Engineering, National Taichung University of Science and Technology, Taichung 404, Taiwan
Abstract
With the advancement of science and technology, the development and application of unmanned mobile vehicles (UMVs) have become topics of crucial concern for industry worldwide. The development goals and directions of UMVs vary with their industrial uses, which include navigation, autonomous driving, and environmental recognition. UMVs employ sensors to collect environmental data for environmental analysis and path planning. However, a single sensor is generally affected by natural environmental factors, resulting in poor identification results. Therefore, this study introduces heterogeneous sensor fusion technology into an Ackerman UMV, leveraging the advantages of each sensor to enhance the accuracy and stability of environmental detection and identification. This study proposes a fusion technique involving heterogeneous imaging and LiDAR (laser imaging, detection, and ranging) sensors in an Ackerman UMV. A camera obtains real-time images, and YOLOv4-tiny and simple online and real-time tracking (SORT) are then employed to locate, classify, and track objects. LiDAR simultaneously provides real-time distance information for the detected objects. An inertial measurement unit gathers odometry information to determine the position of the Ackerman UMV, and static maps are created using simultaneous localization and mapping (SLAM). When the user commands the Ackerman UMV to move to a target point, the vehicle control center, built on the robot operating system (ROS), activates the navigation function through the navigation control module. The Ackerman UMV can reach its destination and instantly identify obstacles and pedestrians while in motion.
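The fusion step described in the abstract can be sketched as follows. This is a minimal, hypothetical illustration, not the paper's ROS implementation: it assumes LiDAR points have already been projected into the camera image plane, associates them with detected bounding boxes, and assigns each object the median range of the points falling inside its box. All function names, box formats, and data values here are illustrative assumptions.

```python
# Hedged sketch of camera-LiDAR fusion: assign each detected object the
# median range of the projected LiDAR points inside its bounding box.
# (Assumed data layout; the paper's actual implementation is not shown.)
from statistics import median

def fuse_detections(boxes, lidar_points):
    """boxes: list of (label, x_min, y_min, x_max, y_max) in pixels.
    lidar_points: list of (u, v, range_m) -- projected pixel coords + range.
    Returns a list of (label, distance_m or None) per detected object."""
    results = []
    for label, x0, y0, x1, y1 in boxes:
        # Collect ranges of all LiDAR returns that project into this box.
        ranges = [r for u, v, r in lidar_points
                  if x0 <= u <= x1 and y0 <= v <= y1]
        # Median is robust to stray returns from the background.
        results.append((label, median(ranges) if ranges else None))
    return results

# Illustrative detections and projected LiDAR returns.
boxes = [("pedestrian", 100, 50, 160, 200), ("car", 300, 80, 420, 220)]
points = [(120, 100, 4.9), (130, 150, 5.1), (350, 120, 12.0), (10, 10, 30.0)]
print(fuse_detections(boxes, points))  # -> [('pedestrian', 5.0), ('car', 12.0)]
```

The stray point at (10, 10) lies outside both boxes and is ignored, which is the practical benefit of gating LiDAR returns by the detector's bounding boxes before estimating object distance.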
Funder
Taiwan Ministry of Science and Technology
Subject
Electrical and Electronic Engineering,Biochemistry,Instrumentation,Atomic and Molecular Physics, and Optics,Analytical Chemistry
References (34 articles)
1. Liu, Z., and Gao, B. (2020, January 15–17). Radar Sensors in Automatic Driving Cars. Proceedings of the 2020 5th International Conference on Electromechanical Control Technology and Transportation (ICECTT), Nanchang, China.
2. Zang, S., et al. (2019). The Impact of Adverse Weather Conditions on Autonomous Vehicles: How Rain, Snow, Fog, and Hail Affect the Performance of a Self-Driving Car. IEEE Veh. Technol. Mag.
3. Girshick, R., et al. (2016). Region-Based Convolutional Networks for Accurate Object Detection and Segmentation. IEEE Trans. Pattern Anal. Mach. Intell.
4. Yan, J., Lei, Z., Wen, L., and Li, S.Z. (2014, June 23–28). The Fastest Deformable Part Model for Object Detection. Proceedings of the 2014 IEEE Conference on Computer Vision and Pattern Recognition, Columbus, OH, USA.
5. Liu, W., Anguelov, D., Erhan, D., Szegedy, C., Reed, S., Fu, C., and Berg, A.C. (2016, October 11–14). SSD: Single Shot MultiBox Detector. Proceedings of the European Conference on Computer Vision, Amsterdam, The Netherlands.
Cited by 3 articles.