Affiliation:
1. Department of Atmospheric and Oceanic Sciences, University of Wisconsin–Madison, Madison, Wisconsin
Abstract
Shannon entropy has long been accepted as a primary basis for assessing the information content of sensor channels used for the remote sensing of atmospheric variables. It is not widely appreciated, however, that Shannon information content (SIC) can be misleading in retrieval problems involving nonlinear mappings between direct observations and retrieved variables and/or non-Gaussian prior and posterior PDFs. The potentially severe shortcomings of SIC are illustrated with simple experiments that reveal, for example, that a measurement can be judged to provide negative information even in cases in which the postretrieval PDF is undeniably improved over an informed prior based on climatology. Following previous authors writing mainly in the data assimilation and climate analysis literature, the Kullback–Leibler (KL) divergence, also commonly known as relative entropy, is shown to suffer from fewer obvious defects in this particular context. Yet even KL divergence is blind to the expected magnitude of errors as typically measured by the error variance or root-mean-square error. Thus, neither information metric can necessarily be counted on to respond in a predictable way to changes in the precision or quality of a retrieved quantity.
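The contrast described in the abstract can be illustrated with a minimal numerical sketch (not taken from the paper, and using hypothetical numbers) for one-dimensional Gaussian prior and posterior PDFs: SIC reduces to the entropy difference and goes negative whenever the posterior is broader than the prior, even if the measurement has re-centered the PDF on the truth, whereas the KL divergence of the posterior from the prior remains non-negative because it also responds to the shift in the mean.

import numpy as np

def shannon_info_content(sigma_prior, sigma_post):
    # Entropy reduction (bits) for 1-D Gaussian prior/posterior.
    # Negative whenever the posterior is broader than the prior.
    return 0.5 * np.log2(sigma_prior**2 / sigma_post**2)

def kl_divergence(mu_post, sigma_post, mu_prior, sigma_prior):
    # D(posterior || prior) in bits for 1-D Gaussians; always >= 0
    # and sensitive to the shift in the mean as well as the width.
    nats = (np.log(sigma_prior / sigma_post)
            + (sigma_post**2 + (mu_post - mu_prior)**2) / (2.0 * sigma_prior**2)
            - 0.5)
    return nats / np.log(2.0)

# Hypothetical illustration: the retrieval moves the PDF onto the truth
# but leaves it slightly broader than the climatological prior.
mu_prior, sigma_prior = 0.0, 1.0
mu_post,  sigma_post  = 2.0, 1.2

print("SIC (bits):", shannon_info_content(sigma_prior, sigma_post))          # about -0.26
print("KL  (bits):", kl_divergence(mu_post, sigma_post, mu_prior, sigma_prior))  # about +2.9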
Funder
National Aeronautics and Space Administration
Publisher
American Meteorological Society
Subject
Atmospheric Science, Ocean Engineering
Cited by
8 articles.