Affiliation:
1. ERIC, University of Lyon 2, France
Abstract
Performance measurement tools are important both to the designers and to the users of Database Management Systems (DBMSs). Performance evaluation helps designers determine elements of architecture and, more generally, validate or refute hypotheses about the actual behavior of a DBMS. Thus, performance evaluation is an essential component in the development process of well-designed and efficient systems. Users may also employ performance evaluation, either to compare the efficiency of different technologies before selecting a DBMS, or to tune a system. Performance evaluation by experimentation on a real system is generally referred to as benchmarking. It consists of performing a series of tests on a given DBMS to estimate its performance in a given setting. Typically, a benchmark consists of two main elements: a database model (conceptual schema and extension) and a workload model (a set of read and write operations) to apply to this database, following a predefined protocol. Most benchmarks also include a set of simple or composite performance metrics, such as response time, throughput, number of input/output operations, and disk or memory usage. The aim of this article is to present an overview of the major families of state-of-the-art database benchmarks, namely relational benchmarks, object and object-relational benchmarks, XML benchmarks, and decision-support benchmarks, and to discuss the issues, tradeoffs, and future trends in database benchmarking. We particularly focus on XML and decision-support benchmarks, which are currently the most innovative tools developed in this area.
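As a rough illustration of these two elements and of simple metrics such as response time and throughput, the sketch below builds a toy database model, runs a small read-only workload against it, and reports both metrics. This is only a minimal sketch, not any standard benchmark: the schema, workload, and row counts are arbitrary assumptions, and SQLite merely stands in for the system under test.

```python
import sqlite3
import statistics
import time

def setup_database(conn: sqlite3.Connection, n_rows: int = 10_000) -> None:
    """Create a toy schema and populate it (the 'database model')."""
    conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL)")
    conn.executemany(
        "INSERT INTO orders (id, amount) VALUES (?, ?)",
        ((i, float(i % 97)) for i in range(n_rows)),
    )
    conn.commit()

def run_workload(conn: sqlite3.Connection, n_queries: int = 100) -> list[float]:
    """Run a fixed read workload and record per-query response times (seconds)."""
    times = []
    for i in range(n_queries):
        start = time.perf_counter()
        conn.execute(
            "SELECT COUNT(*), AVG(amount) FROM orders WHERE amount > ?", (i % 50,)
        ).fetchone()
        times.append(time.perf_counter() - start)
    return times

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")  # stand-in DBMS for illustration only
    setup_database(conn)
    response_times = run_workload(conn)
    total = sum(response_times)
    print(f"mean response time: {statistics.mean(response_times) * 1000:.3f} ms")
    print(f"throughput: {len(response_times) / total:.1f} queries/s")
```

A real benchmark would additionally fix the scale of the database, the mix of read and write operations, and the measurement protocol, so that results are comparable across systems.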