Abstract
High-latitude areas are highly sensitive to global warming, which has significant impacts on soil temperatures and the associated processes governing permafrost evolution. This study aims to improve retrievals of first-layer soil temperature during winter, a key surface state variable that is strongly affected by the geophysical properties of snow and their associated uncertainties (e.g., thermal conductivity) in land surface climate models. We used infrared MODIS land-surface temperatures (LST) and Advanced Microwave Scanning Radiometer for EOS (AMSR-E) brightness temperatures (Tb) at 10.7 and 18.7 GHz to constrain the Canadian Land Surface Scheme (CLASS), driven by meteorological reanalysis data and coupled with a simple radiative transfer model. The Tb polarization ratio (horizontal/vertical) at 10.7 GHz was selected to improve the representation of snowpack density, which is linked to thermal conductivity in the model. Using soil temperature measurements from meteorological stations as a reference, we validated the approach at four sites in the North American tundra over periods of up to 8 years. Results show that constraining the model with remote sensing (RS) data improves simulations of soil temperature under snow (Tg) by 64% compared to model outputs without satellite information. The root mean square error (RMSE) between measured and simulated Tg under snow ranges from 1.8 to 3.5 K when RS data are used. Improved temporal monitoring of the soil thermal state, together with changes in snow properties, will advance our understanding of soil biological and hydrological processes and of permafrost evolution.
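For illustration, the sketch below shows the two diagnostics the abstract relies on: the 10.7 GHz brightness-temperature polarization ratio (H/V) used to constrain snowpack density, and the RMSE used to validate simulated soil temperature against station measurements. This is a minimal example with hypothetical variable names and sample values, not code from the study.

```python
import numpy as np

# Hypothetical AMSR-E brightness temperatures at 10.7 GHz (K);
# values are illustrative only.
tb_h_10 = np.array([228.4, 231.0, 226.7])   # horizontal polarization
tb_v_10 = np.array([252.1, 253.8, 250.9])   # vertical polarization

# Polarization ratio (horizontal/vertical), the quantity used here
# to constrain snowpack density in the land surface scheme.
pr_10 = tb_h_10 / tb_v_10

# Hypothetical simulated and station-measured soil temperatures under snow (K).
tg_sim = np.array([262.3, 263.1, 261.8])
tg_obs = np.array([263.9, 264.2, 263.0])

# Root mean square error between simulated and measured Tg.
rmse = np.sqrt(np.mean((tg_sim - tg_obs) ** 2))

print(f"PR(10.7 GHz): {pr_10}")
print(f"RMSE: {rmse:.2f} K")
```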
Subject
General Earth and Planetary Sciences
Cited by
10 articles.