Affiliation:
1. Cheriton School of Computer Science, University of Waterloo, Waterloo, Ont., N2L 3G1, Canada
2. Cheriton School of Computer Science, University of Waterloo, Waterloo, Ont., N2L 3G1, Canada
Abstract
As computer science has progressed, numerous models and measures have been developed over the years. Among the most commonly used in theoretical computer science are the RAM model, the I/O model, worst-case analysis, space (memory) usage, average-case analysis, amortized analysis, adaptive analysis and the competitive ratio. New models are added to this list every few years to reflect varying constraints imposed by novel applications or advances in computer architectures. Examples of alternative models are the transdichotomous RAM or word-RAM, the data stream model, the MapReduce model, the cache-oblivious model and the smoothed analysis model. New models and measures, when successful, expand our understanding of computation and open new avenues of inquiry. As is to be expected, relatively few models and paradigms are introduced every year, and even fewer are eventually proven successful. In this paper we first discuss certain shortcomings of the online competitive analysis model, particularly as it concerns paging; we then discuss existing solutions in the literature and present recent progress in developing models and measures that better reflect actual practice for the case of paging. From there we proceed to a more general discussion of how to measure and evaluate new models within theoretical computer science and how to contrast them, when appropriate, with existing models. Lastly, we highlight certain "natural" choices and assumptions of the standard worst-case model which are often unstated and rarely explicitly justified. We contrast these choices with those made in the formalization of probability theory.
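For context, the competitive ratio mentioned in the abstract has a standard textbook formulation (this statement is the conventional definition from the online-algorithms literature, not a formula quoted from this paper):

```latex
% An online algorithm ALG is c-competitive if there exists a constant b
% such that, for every request sequence \sigma,
%   cost incurred online is at most c times the optimal offline cost (plus b):
\[
  \mathrm{ALG}(\sigma) \;\le\; c \cdot \mathrm{OPT}(\sigma) + b ,
\]
% and the competitive ratio of ALG is the infimum of such c:
\[
  \inf \{\, c : \mathrm{ALG} \text{ is } c\text{-competitive} \,\}.
\]
```

For paging with a cache of size k, classical results give a competitive ratio of k for deterministic algorithms such as LRU, which is often cited as an example of the gap between worst-case guarantees and observed practice that the paper addresses.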
Publisher
Association for Computing Machinery (ACM)
Cited by
2 articles.