Abstract
Existing uncertainty assessments and mathematical models used for error estimation of global average temperature anomalies are examined. The error assessment model of Brohan et al. 2006 [1] was found not to describe reality comprehensively and precisely enough. This was already shown for some types of error by Frank [2];[3], hereinafter referred to as F 10 and F 11. In addition to the findings in both papers by Frank, a very common but previously undescribed systematic error is isolated and defined here, named the "algorithm error". This error has so far been regarded as self-cancelling, or as corrected by some unknown and unnamed homogenization processes, but this is not the case. It therefore adds a minimum additional systematic uncertainty of +0.3 °C and −0.23 °C, respectively, to any global mean anomaly calculation. This result is obtained by comparing only the most commonly used algorithms against a "true" algorithm that measures the daily temperature continuously. Because the real distribution of all applied algorithms over time and space (of which, as Griffith [4] showed, there are more than 100) is not known, neither for land-based temperature data nor for SST (sea surface temperatures), the minimum value of said error is chosen here.
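To make the idea of the "algorithm error" concrete, the sketch below (not code from the paper; the diurnal-cycle model, parameter values, and function names are illustrative assumptions) compares a widely used daily-mean algorithm, (Tmax + Tmin)/2, with a "true" daily mean approximated by averaging hourly samples of a synthetic temperature curve.

import numpy as np

# Illustrative sketch only (not the paper's code): it demonstrates the notion
# of an "algorithm error" by comparing the widely used daily mean
# (Tmax + Tmin) / 2 against a "true" daily mean obtained from a
# (near-)continuous record, here approximated by 24 hourly samples of a
# synthetic diurnal temperature curve. The curve shape, parameters, and
# function names are assumptions chosen for the demonstration.

def synthetic_diurnal_cycle(hours, mean=15.0, amplitude=6.0, skew=0.2):
    """Hourly temperatures (°C) for one day: a skewed sinusoid standing in
    for a real station record, warmest around 14:00 local time."""
    phase = 2.0 * np.pi * (hours - 14) / 24.0
    return mean + amplitude * (np.cos(phase) + skew * np.cos(2.0 * phase))

hours = np.arange(24)
temps = synthetic_diurnal_cycle(hours)

true_mean = temps.mean()                         # proxy for continuous averaging
minmax_mean = (temps.max() + temps.min()) / 2.0  # common (Tmax+Tmin)/2 algorithm

print(f"continuous-style daily mean : {true_mean:6.2f} °C")
print(f"(Tmax+Tmin)/2 daily mean    : {minmax_mean:6.2f} °C")
print(f"algorithm difference        : {minmax_mean - true_mean:+6.2f} °C")

The size and sign of the difference in this sketch depend entirely on the assumed shape of the diurnal cycle; it only demonstrates that different averaging algorithms applied to the same day can yield systematically different daily means, which is the kind of bias the abstract calls the algorithm error.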
Subject
Energy (miscellaneous), Energy Engineering and Power Technology, Renewable Energy, Sustainability and the Environment, Environmental Engineering
Cited by
1 article.