Instantaneous Extraction of Indoor Environment from Radar Sensor-Based Mapping
Published: 2024-02-02
Issue: 3
Volume: 16
Page: 574
ISSN: 2072-4292
Container-title: Remote Sensing
Language: en
Short-container-title: Remote Sensing
Author:
Cho Seonmin 1, Kwak Seungheon 2, Lee Seongwook 2
Affiliation:
1. School of Electronics and Information Engineering, College of Engineering, Korea Aerospace University, Goyang-si 10540, Republic of Korea
2. School of Electrical and Electronics Engineering, College of ICT Engineering, Chung-Ang University, Seoul 06974, Republic of Korea
Abstract
In this paper, we propose a method for extracting the structure of an indoor environment using radar. When radar is used indoors, ghost targets appear because of multipath propagation of the radio waves. These ghost targets obstruct accurate mapping of the indoor environment and consequently hinder the extraction of its structure. Therefore, we propose a deep learning-based method that uses image-to-image translation to extract the structure of the indoor environment by removing ghost targets from the indoor environment map. The proposed method employs a conditional generative adversarial network (CGAN) consisting of a U-Net-based generator and a patch-based GAN (PatchGAN) discriminator. By repeatedly judging whether the generated indoor structure is real or fake, the CGAN ultimately produces a structure close to the real environment. First, we generate a map of the indoor environment using radar, which includes ghost targets. Next, the structure of the indoor environment is extracted from the map using the proposed method. Then, using the structural similarity index and structural content as metrics, we compare the proposed method with environment extraction methods based on the k-nearest neighbors algorithm, the Hough transform, and density-based spatial clustering of applications with noise. In this comparison, the proposed method extracts a more accurate environment without requiring parameter adjustments, even when the environment changes.
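The abstract describes a pix2pix-style image-to-image translation setup: a U-Net generator that maps a ghost-contaminated radar map to a clean structure map, and a PatchGAN discriminator that scores local patches of (input map, candidate structure) pairs. The following is a minimal PyTorch sketch of that pairing, assuming single-channel occupancy-grid inputs and outputs; the network depths, channel widths, loss terms, and the L1 weight of 100 are illustrative assumptions (the standard pix2pix defaults), not the authors' exact model or training configuration.

```python
# Minimal sketch of a conditional GAN for radar-map-to-structure translation.
# Assumptions (not from the paper): 1-channel 128x128 maps, 3-level U-Net,
# 3-level PatchGAN, adversarial + L1 objective with weight 100.
import torch
import torch.nn as nn

def down(in_ch, out_ch):
    """Encoder block: strided conv halves the spatial resolution."""
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, kernel_size=4, stride=2, padding=1),
        nn.BatchNorm2d(out_ch),
        nn.LeakyReLU(0.2),
    )

def up(in_ch, out_ch):
    """Decoder block: transposed conv doubles the spatial resolution."""
    return nn.Sequential(
        nn.ConvTranspose2d(in_ch, out_ch, kernel_size=4, stride=2, padding=1),
        nn.BatchNorm2d(out_ch),
        nn.ReLU(),
    )

class UNetGenerator(nn.Module):
    """Translates a radar map (with ghost targets) into a clean structure map."""
    def __init__(self, in_ch=1, out_ch=1):
        super().__init__()
        self.d1 = down(in_ch, 64)    # H/2
        self.d2 = down(64, 128)      # H/4
        self.d3 = down(128, 256)     # H/8
        self.u1 = up(256, 128)       # H/4
        self.u2 = up(128 + 128, 64)  # H/2, skip connection from d2
        self.u3 = up(64 + 64, 32)    # H,   skip connection from d1
        self.out = nn.Sequential(nn.Conv2d(32, out_ch, 3, padding=1), nn.Tanh())

    def forward(self, x):
        e1 = self.d1(x)
        e2 = self.d2(e1)
        e3 = self.d3(e2)
        y = self.u1(e3)
        y = self.u2(torch.cat([y, e2], dim=1))
        y = self.u3(torch.cat([y, e1], dim=1))
        return self.out(y)

class PatchDiscriminator(nn.Module):
    """Scores overlapping patches of (radar map, candidate structure) as real/fake."""
    def __init__(self, in_ch=2):
        super().__init__()
        self.net = nn.Sequential(
            down(in_ch, 64),
            down(64, 128),
            down(128, 256),
            nn.Conv2d(256, 1, kernel_size=4, padding=1),  # one logit per patch
        )

    def forward(self, radar_map, structure):
        return self.net(torch.cat([radar_map, structure], dim=1))

# One illustrative loss computation (optimizer steps omitted), with dummy data.
G, D = UNetGenerator(), PatchDiscriminator()
bce, l1 = nn.BCEWithLogitsLoss(), nn.L1Loss()
radar_map = torch.rand(1, 1, 128, 128)       # map containing ghost targets (dummy)
true_structure = torch.rand(1, 1, 128, 128)  # ground-truth indoor structure (dummy)

fake = G(radar_map)
d_fake = D(radar_map, fake)
g_loss = bce(d_fake, torch.ones_like(d_fake)) + 100.0 * l1(fake, true_structure)
d_loss = bce(D(radar_map, true_structure), torch.ones_like(d_fake)) + \
         bce(D(radar_map, fake.detach()), torch.zeros_like(d_fake))
```

The PatchGAN returns a grid of patch logits rather than a single real/fake score, which pushes the generator toward locally sharp wall structure; the L1 term keeps the output globally consistent with the ground-truth map.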
Funder
Ministry of SMEs and Startups
Chung-Ang University Research