Abstract
Ground-based microwave radiometers are increasingly used in operational meteorology and nowcasting. These instruments continuously measure the spectrum of downwelling atmospheric radiation in the 20–60 GHz range, which is used to retrieve tropospheric temperature and water vapor profiles. Spectroscopic uncertainty is an important part of the retrieval error budget, as it leads to systematic bias. In this study, we analyze the difference between observed and simulated microwave spectra obtained from more than four years of microwave and radiosonde observations over Nizhny Novgorod (56.2° N, 44° E). We focus on zenith-pointing and elevation-scanning observations in clear-sky conditions. The simulated spectra are calculated with a radiative transfer model using radiosonde profiles and several absorption models that reflect the latest spectroscopic research. For zenith measurements, we find a systematic bias (up to ~2 K) in the simulated spectra at 51–54 GHz. The sign of the bias depends on the absorption model. A thorough investigation of the error budget points to a spectroscopic origin of the observed differences. The dependence of the results on the elevation angle and absorption model can be explained by the basic properties of radiative transfer and by cloud contamination at off-zenith elevation angles.
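The abstract refers to simulated spectra obtained with a radiative transfer model driven by radiosonde profiles, and to the dependence of the observation-minus-simulation difference on elevation angle. The sketch below is a minimal, hypothetical illustration of such a calculation for a non-scattering, plane-parallel atmosphere; it is not the authors' model, and the absorption coefficients (which in the study come from the spectroscopic models under comparison) are taken here as a plain input array. All function and variable names are illustrative only.

```python
import numpy as np

def simulated_tb(heights_m, temps_k, absorption_1_per_m, elevation_deg, t_cosmic=2.73):
    """Downwelling brightness temperature [K] at one frequency for a
    non-scattering, plane-parallel atmosphere (hypothetical sketch).

    heights_m          -- radiosonde level heights, ascending [m]
    temps_k            -- level temperatures [K]
    absorption_1_per_m -- level absorption coefficients [1/m] at this frequency
    elevation_deg      -- antenna elevation angle (90 = zenith)
    """
    airmass = 1.0 / np.sin(np.radians(elevation_deg))   # slant-path factor; grows at low elevation
    tb = t_cosmic                                        # cosmic background entering at the top
    # March downward layer by layer: attenuate radiation from above, add layer emission.
    for i in range(len(heights_m) - 1, 0, -1):
        dz = heights_m[i] - heights_m[i - 1]
        k_mean = 0.5 * (absorption_1_per_m[i] + absorption_1_per_m[i - 1])
        t_mean = 0.5 * (temps_k[i] + temps_k[i - 1])
        trans = np.exp(-k_mean * dz * airmass)           # layer transmittance along the slant path
        tb = tb * trans + t_mean * (1.0 - trans)
    return tb
```

Because the layer opacity scales with the airmass factor, channels that are already optically thick at zenith saturate quickly as the elevation angle decreases, whereas more transparent channels remain sensitive to the absorption model along the longer path; this is the "basic property of radiative transfer" invoked in the abstract to explain the elevation-angle dependence.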
Funder
Russian Science Foundation
Subject
General Earth and Planetary Sciences
Cited by
6 articles.