Affiliation:
1. School of Electrical and Information Engineering, Beijing University of Civil Engineering and Architecture, Beijing 100044, China
2. Beijing Key Laboratory of Robot Bionics and Function Research, Beijing 100044, China
Abstract
As autonomous driving technology advances, ensuring pedestrian traffic safety becomes an increasingly prominent requirement in the design of autonomous driving systems. Pedestrian trajectory prediction is a pivotal technology for addressing this challenge: by accurately forecasting pedestrians' future trajectories, it enables autonomous driving systems to make timely and correct decisions. However, prevailing state-of-the-art models often rely on intricate structures and large numbers of parameters, making it difficult to satisfy the demand for lightweight models in autonomous driving systems. To address these challenges, we introduce Social Spatio-Temporal Graph Multi-Layer Perceptron (Social-STGMLP), a novel approach built solely from fully connected layers and layer normalization. Social-STGMLP abstracts pedestrian trajectories into a spatio-temporal graph, modeling both the spatial social interaction among pedestrians and the temporal motion tendency of each pedestrian. Our evaluation shows that Social-STGMLP outperforms the reference method, reducing average displacement error (ADE) by 5% and final displacement error (FDE) by 17%.
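The architecture described in the abstract (only fully connected layers with layer normalization, applied over the spatial and temporal axes of a pedestrian-trajectory graph) can be illustrated with a minimal sketch. The code below is not the authors' released implementation: the module names (MLPBlock, SpatioTemporalMLP), hidden sizes, fixed pedestrian count, and the (T, N, 2) tensor layout are all illustrative assumptions, while the ade_fde helper simply implements the standard displacement-error definitions.

import torch
import torch.nn as nn


class MLPBlock(nn.Module):
    """Fully connected layers plus LayerNorm, mixing along the last axis."""
    def __init__(self, dim, hidden):
        super().__init__()
        self.norm = nn.LayerNorm(dim)
        self.fc = nn.Sequential(nn.Linear(dim, hidden), nn.GELU(), nn.Linear(hidden, dim))

    def forward(self, x):
        return x + self.fc(self.norm(x))        # residual connection keeps the block lightweight


class SpatioTemporalMLP(nn.Module):
    """Alternates mixing over the pedestrian (spatial) and time (temporal) axes."""
    def __init__(self, obs_len=8, pred_len=12, num_peds=5, feat=2):
        super().__init__()
        # Fixing the pedestrian count at construction is a simplification for this sketch.
        self.spatial = MLPBlock(num_peds * feat, 64)    # social interaction across pedestrians
        self.temporal = MLPBlock(obs_len, 64)           # motion tendency along the time axis
        self.head = nn.Linear(obs_len, pred_len)        # map observed steps to predicted steps

    def forward(self, traj):                            # traj: (T_obs, N, 2) observed positions
        t, n, f = traj.shape
        x = self.spatial(traj.reshape(t, n * f))        # mix across pedestrians at each timestep
        x = self.temporal(x.transpose(0, 1))            # mix across timesteps for each feature
        out = self.head(x)                              # (N*2, T_pred)
        return out.transpose(0, 1).reshape(-1, n, f)    # (T_pred, N, 2) predicted positions


def ade_fde(pred, gt):
    """Standard metrics: mean per-step L2 error (ADE) and final-step L2 error (FDE)."""
    dist = torch.linalg.norm(pred - gt, dim=-1)         # (T_pred, N) per-step displacement
    return dist.mean().item(), dist[-1].mean().item()


obs = torch.randn(8, 5, 2)                              # 8 observed steps, 5 pedestrians, (x, y)
future = torch.randn(12, 5, 2)                          # ground-truth future, for the metric demo
pred = SpatioTemporalMLP()(obs)                         # (12, 5, 2) predicted trajectory
ade, fde = ade_fde(pred, future)

The residual connections and pre-normalization are one plausible way to keep such an MLP-only block stable to train while remaining lightweight; the paper itself may arrange the spatial and temporal mixing differently.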
Funder
National Natural Science Foundation of China
Beijing University of Civil Engineering and Architecture Research Capacity Promotion Program for Young Scholars