China Building Rooftop Area: the first multi-annual (2016–2021) and high-resolution (2.5 m) building rooftop area dataset in China derived with super-resolution segmentation from Sentinel-2 imagery
Published: 2023-08-09
Issue: 8
Volume: 15
Pages: 3547–3572
ISSN: 1866-3516
Container-title: Earth System Science Data
Language: en
Short-container-title: Earth Syst. Sci. Data
Author: Liu Zeping, Tang Hong, Feng Lin, Lyu Siqing
Abstract
Large-scale and multi-annual maps of building rooftop area (BRA) are crucial for informing policy decisions and sustainable development. In addition, as a fine-grained indicator of human activities, BRA can contribute to urban planning and energy modeling and thereby benefit human well-being. However, producing large-scale BRA maps is still challenging because of the very small size of individual buildings. From the viewpoint of classification methods, conventional approaches rely on high-resolution aerial images (metric or submetric resolution) to map BRA; unfortunately, such imagery is both infrequently captured and expensive to purchase, which makes BRA mapping costly and inconsistent over space and time. From the viewpoint of learning strategies, a nontrivial gap persists between the limited training references and applications across geospatial variations. Moreover, existing large-scale BRA datasets, such as those from Microsoft or Google, do not cover China, so no full-coverage BRA maps of China exist yet. In this paper, we first propose a deep-learning method, the Spatio-Temporal aware Super-Resolution Segmentation framework (STSR-Seg), to achieve robust super-resolution BRA extraction from relatively low-resolution imagery over a large geographic space. We then produce the multi-annual China Building Rooftop Area (CBRA) dataset at 2.5 m resolution from 2016–2021 Sentinel-2 images. CBRA is the first full-coverage and multi-annual BRA dataset for China. With the designed training-sample-generation algorithms and spatiotemporally aware learning strategies, CBRA achieves good performance, with an F1 score of 62.55 % (+10.61 % compared with previous BRA data in China) on 250 000 testing samples in urban areas and a recall of 78.94 % on 30 000 testing samples in rural areas. Temporal analysis shows good performance consistency across years and good agreement with other multi-annual impervious surface area datasets. STSR-Seg will enable low-cost, dynamic, and large-scale BRA mapping (https://github.com/zpl99/STSR-Seg, last access: 12 July 2023), and CBRA will foster the development of BRA mapping and provide basic data for sustainability research (Liu et al., 2023; https://doi.org/10.5281/zenodo.7500612).
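To make the reported quantities concrete, the following minimal Python sketch shows how a single CBRA-style raster tile could be turned into a rooftop-area estimate and scored against a reference mask using the recall and F1 metrics quoted above. The file names, the single-band 0/1 mask layout, and the rasterio-based loading are illustrative assumptions and are not taken from the CBRA documentation; only the 2.5 m pixel size comes from the abstract.

# Hedged sketch: estimate building rooftop area from a CBRA-style binary raster
# and score it against a reference mask. File names and the single-band 0/1
# layout are assumptions for illustration; the 2.5 m pixel size is from the paper.
import numpy as np
import rasterio

PIXEL_SIZE_M = 2.5                 # nominal CBRA resolution (2.5 m)
PIXEL_AREA_M2 = PIXEL_SIZE_M ** 2  # 6.25 m^2 per pixel

def load_mask(path):
    """Read band 1 of a GeoTIFF and binarize it (rooftop = 1, background = 0)."""
    with rasterio.open(path) as src:
        return (src.read(1) > 0).astype(np.uint8)

def rooftop_area_km2(mask):
    """Total rooftop area implied by the binary mask, in km^2."""
    return mask.sum() * PIXEL_AREA_M2 / 1e6

def precision_recall_f1(pred, ref):
    """Pixel-wise precision, recall, and F1 between prediction and reference."""
    tp = np.logical_and(pred == 1, ref == 1).sum()
    fp = np.logical_and(pred == 1, ref == 0).sum()
    fn = np.logical_and(pred == 0, ref == 1).sum()
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

if __name__ == "__main__":
    pred = load_mask("cbra_2021_tile.tif")     # hypothetical CBRA tile
    ref = load_mask("reference_rooftops.tif")  # hypothetical reference labels
    print(f"Rooftop area: {rooftop_area_km2(pred):.2f} km^2")
    p, r, f1 = precision_recall_f1(pred, ref)
    print(f"Precision {p:.4f}  Recall {r:.4f}  F1 {f1:.4f}")

At 2.5 m resolution each rooftop pixel contributes 6.25 m^2, so a tile's BRA is simply the rooftop pixel count multiplied by that factor.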
Funder
National Natural Science Foundation of China; Beijing Normal University
Publisher
Copernicus GmbH
Subject
General Earth and Planetary Sciences
Cited by
15 articles.