Abstract
Social distancing protocols have been strongly recommended by the World Health Organization (WHO) to curb the spread of COVID-19. However, a major challenge in enforcing social distancing in public areas is perceiving people in three dimensions. This paper proposes an innovative pedestrian 3D localization method that combines monocular images with terrestrial point clouds. In the proposed approach, the camera is calibrated from correspondences between 2D image points and 3D world points. The vertical coordinates of the ground plane on which pedestrians stand are extracted from the point clouds. Then, under the assumption that a pedestrian always stands perpendicular to the ground, the 3D coordinates of the pedestrian's feet and head are computed iteratively from the collinearity equations. This enables three-dimensional localization and height determination of pedestrians using monocular cameras, which are widely deployed in many major cities. The performance of the proposed method was evaluated on two different datasets. Experimental results show that the proposed approach achieved localization errors of less than one meter at ranges of tens of meters and outperformed other localization techniques. The approach is computationally simple and efficient, produces accurate locations, and can be used to implement social distancing rules. Moreover, since it also yields accurate height values, exemptions to social distancing protocols, particularly the parent-child exemption, can be incorporated into the framework.
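To make the geometry concrete, the sketch below is a minimal illustration, not the authors' implementation: it assumes a pinhole camera with known intrinsics, lets OpenCV's solvePnP stand in for the calibration from 2D-3D correspondences, and replaces the paper's iterative collinearity-equation solution with an equivalent closed-form ray-plane intersection. All function names and the least-squares head step are illustrative.

```python
# Minimal sketch (assumptions noted above, not the authors' code):
# calibrate from 2D-3D correspondences, then place a pedestrian on a known ground plane.
import numpy as np
import cv2

def calibrate_pose(world_pts, image_pts, K, dist=None):
    """Recover camera rotation R and translation t from 2D-3D correspondences (PnP)."""
    ok, rvec, tvec = cv2.solvePnP(world_pts.astype(np.float64),
                                  image_pts.astype(np.float64), K, dist)
    assert ok, "PnP failed"
    R, _ = cv2.Rodrigues(rvec)
    return R, tvec.reshape(3)

def pixel_ray(u, v, K, R, t):
    """World-frame camera centre and unit direction of the ray through pixel (u, v)."""
    centre = -R.T @ t                               # camera position in the world frame
    d = R.T @ (np.linalg.inv(K) @ np.array([u, v, 1.0]))
    return centre, d / np.linalg.norm(d)

def locate_feet(u, v, z_ground, K, R, t):
    """Intersect the foot-pixel ray with the horizontal plane Z = z_ground
    (z_ground would come from the terrestrial point cloud in the paper's pipeline)."""
    c, d = pixel_ray(u, v, K, R, t)
    s = (z_ground - c[2]) / d[2]
    return c + s * d                                # 3D foot position

def estimate_height(u_head, v_head, foot_xyz, z_ground, K, R, t):
    """Assume the person is vertical: the head lies on the line X = foot_x, Y = foot_y.
    Take the point on the head-pixel ray horizontally closest to that vertical line."""
    c, d = pixel_ray(u_head, v_head, K, R, t)
    s = ((foot_xyz[0] - c[0]) * d[0] + (foot_xyz[1] - c[1]) * d[1]) / (d[0]**2 + d[1]**2)
    head_z = c[2] + s * d[2]
    return head_z - z_ground                        # pedestrian height above the ground
```

In a pipeline of this kind, the foot and head pixels would typically come from a 2D pedestrian detector, and z_ground from querying the point cloud beneath the detected foot location; pairwise distances between the recovered foot positions can then be checked against a social distancing threshold.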
Funder
National Natural Science Foundation of China
State Key Laboratory of Coal Resources and Safe Mining
State Key Laboratory of Geohazard Prevention and Geoenvironment Protection
China Fundamental Research Funds for Central University
Subject
Electrical and Electronic Engineering; Biochemistry; Instrumentation; Atomic and Molecular Physics, and Optics; Analytical Chemistry
Cited by
7 articles.