Real-Time Path Planning for Obstacle Avoidance in Intelligent Driving Sightseeing Cars Using Spatial Perception

Authors:

Yang Xu 1, Wu Feiyang 1, Li Ruchuan 2, Yao Dong 2, Meng Lei 3, He Ankai 4

Affiliations:

1. School of Automation, Wuhan University of Technology, Wuhan 430070, China

2. Aerospace Information Research Institute, CAS Qilu, Jinan 250132, China

3. Aerospace Information Research Institute, Chinese Academy of Sciences, Beijing 100080, China

4. National Engineering Research Center of Geographic Information System, China University of Geosciences, Wuhan 430074, China

Abstract

The growing use of intelligent driving sightseeing vehicles in the tourism industry makes real-time planning of local obstacle avoidance paths essential whenever these vehicles encounter obstacles during operation. Meeting this requirement depends on real-time dynamic perception as the foundation. This paper therefore introduces a local path planning algorithm based on spatial perception that enables sightseeing vehicles to perform safe and comfortable obstacle avoidance maneuvers across road environments with varying spatial features. The proposed approach uses a high-precision positioning module and a real-time dynamic perception module to acquire real-time spatial information about the sightseeing vehicle and the road environment. It integrates spatiotemporal safety constraints and obstacle avoidance curvature constraints to derive control points for the obstacle avoidance path. Selected control points are then optimized and adjusted, and the obstacle avoidance spatiotemporal path is generated by discrete interpolation with B-spline curves. The resulting local paths are compared with local obstacle avoidance paths generated with Bezier curves. The proposed algorithm is validated through a combination of simulation analysis and real-vehicle testing. The results confirm that the algorithm produces smoother local obstacle avoidance paths with smaller front-wheel steering angle and yaw angle variations, which substantially improves the stability of sightseeing vehicles during obstacle avoidance maneuvers.
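As the abstract notes, the avoidance path is obtained by discrete interpolation of control points with a B-spline curve. The following Python sketch is not the authors' implementation; the control points, spline order, and sampling density are assumptions for illustration. It shows how a set of avoidance control points could be interpolated into a smooth local path with a cubic B-spline via scipy.interpolate, and how the resulting path curvature could be inspected against an avoidance-curvature limit.

```python
# Minimal sketch (assumed control points, not the paper's implementation):
# interpolate obstacle-avoidance control points with a cubic B-spline and
# evaluate the path curvature.
import numpy as np
from scipy.interpolate import splprep, splev

# Hypothetical control points (x, y) in metres describing a lane-change-style
# avoidance maneuver around an obstacle near x = 10 m.
ctrl_x = np.array([0.0, 4.0, 8.0, 10.0, 12.0, 16.0, 20.0])
ctrl_y = np.array([0.0, 0.0, 1.2,  1.5,  1.2,  0.0,  0.0])

# Fit a cubic (k=3) parametric B-spline; s=0 forces exact interpolation
# through every control point.
tck, _ = splprep([ctrl_x, ctrl_y], k=3, s=0)

# Discretely sample the spline to obtain the local avoidance path.
u = np.linspace(0.0, 1.0, 200)
x, y = splev(u, tck)

# First and second derivatives w.r.t. the spline parameter, used to compute
# curvature: kappa = (x' y'' - y' x'') / (x'^2 + y'^2)^(3/2).
dx, dy = splev(u, tck, der=1)
ddx, ddy = splev(u, tck, der=2)
kappa = (dx * ddy - dy * ddx) / np.power(dx**2 + dy**2, 1.5)

print(f"max |curvature| along path: {np.max(np.abs(kappa)):.4f} 1/m")
```

In this sketch, s=0 makes the spline pass through every control point; a small positive smoothing factor in splprep would instead trade exact interpolation for a lower-curvature path, which is one way a curvature constraint could be respected in practice.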

Funder

National Key R&D Program of China

Publisher

MDPI AG

Subject

Fluid Flow and Transfer Processes, Computer Science Applications, Process Chemistry and Technology, General Engineering, Instrumentation, General Materials Science

Cited by 3 articles.