Affiliation:
1. Stanford University, Stanford, CA
2. Bell Labs, Lucent Technologies, Murray Hill, NJ
Abstract
While a variety of lossy compression schemes have been developed for certain forms of digital data (e.g., images, audio, video), the area of lossy compression techniques for arbitrary data tables has been left relatively unexplored. Nevertheless, such techniques are clearly motivated by the ever-increasing data collection rates of modern enterprises and the need for effective, guaranteed-quality approximate answers to queries over massive relational data sets. In this paper, we propose SPARTAN, a system that takes advantage of attribute semantics and data-mining models to perform lossy compression of massive data tables. SPARTAN is based on the novel idea of exploiting predictive data correlations and prescribed error-tolerance constraints for individual attributes to construct concise and accurate Classification and Regression Tree (CaRT) models for entire columns of a table. More precisely, SPARTAN selects a certain subset of attributes (referred to as predicted attributes) for which no values are explicitly stored in the compressed table; instead, concise error-constrained CaRTs that predict these values (within the prescribed error tolerances) are maintained. To restrict the huge search space of possible CaRT predictors, SPARTAN uses a Bayesian network structure to guide the selection of CaRT models that minimize the overall storage requirement, based on the prediction and materialization costs for each attribute. SPARTAN's CaRT-building algorithms employ novel integrated pruning strategies that take advantage of the given error constraints on individual attributes to minimize the computational effort involved. Our experimentation with several real-life data sets offers convincing evidence of the effectiveness of SPARTAN's model-based approach: SPARTAN is able to consistently yield substantially better compression ratios than existing semantic or syntactic compression tools (e.g., gzip) while utilizing only small samples of the data for model inference.
Publisher
Association for Computing Machinery (ACM)