Affiliation: Computer Science and Engineering, 352350, University of Washington, Seattle, WA
Abstract
We present a technique for identifying repetitive information transfers and use it to analyze the redundancy of network traffic. Our insight is that dynamic content, streaming media and other traffic that is not caught by today's Web caches is nonetheless likely to derive from similar information. We have therefore adapted similarity detection techniques to the problem of designing a system to eliminate redundant transfers. We identify repeated byte ranges between packets to avoid retransmitting the redundant data.
We find a high level of redundancy and are able to detect repetition that Web proxy caches cannot. In our traces, after Web proxy caching has been applied, an additional 39% of the original volume of Web traffic is found to be redundant. Moreover, because our technique makes no assumptions about HTTP protocol syntax or caching semantics, it provides immediate benefits for other types of content, such as streaming media, FTP traffic, news and mail.
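The approach described above — fingerprinting packet payloads with similarity detection techniques, then matching and expanding repeated byte ranges against recently seen packets — can be sketched as follows. This is a minimal illustration, not the system itself: the paper's implementation uses Rabin fingerprints and a packet cache with a replacement policy, whereas this sketch substitutes a simple polynomial rolling hash, an unbounded in-memory cache, and illustrative constants (`WINDOW`, `SELECT_MASK`).

```python
WINDOW = 8           # bytes per fingerprint window (illustrative)
BASE = 257           # polynomial base for the rolling hash (illustrative)
MOD = 1 << 61        # hash modulus
DEFAULT_MASK = 0x3   # keep only fingerprints whose low bits are zero (~1 in 4)

def fingerprints(payload: bytes):
    """Yield (offset, fp) for every WINDOW-byte substring via a rolling hash."""
    if len(payload) < WINDOW:
        return
    fp = 0
    for b in payload[:WINDOW]:
        fp = (fp * BASE + b) % MOD
    top = pow(BASE, WINDOW - 1, MOD)  # weight of the byte rolling out
    yield 0, fp
    for i in range(1, len(payload) - WINDOW + 1):
        fp = ((fp - payload[i - 1] * top) * BASE + payload[i + WINDOW - 1]) % MOD
        yield i, fp

class RedundancyDetector:
    """Index a sampled subset of fingerprints; report repeated byte ranges."""

    def __init__(self, select_mask: int = DEFAULT_MASK):
        self.select_mask = select_mask
        self.index = {}    # fingerprint -> (packet_id, offset)
        self.packets = {}  # packet_id -> payload

    def process(self, pkt_id: int, payload: bytes):
        """Return (prev_id, prev_off, off, length) tuples for matched ranges."""
        matches = []
        for off, fp in fingerprints(payload):
            if fp & self.select_mask:
                continue  # deterministic sampling: only "anchor" fingerprints
            hit = self.index.get(fp)
            if hit is not None and hit[0] != pkt_id:
                prev_id, prev_off = hit
                prev = self.packets[prev_id]
                # guard against hash collisions: verify the window bytes match
                if prev[prev_off:prev_off + WINDOW] != payload[off:off + WINDOW]:
                    continue
                # expand the match left and right to the maximal byte range,
                # which is the region that need not be retransmitted
                l = 0
                while prev_off - l > 0 and off - l > 0 and \
                        prev[prev_off - l - 1] == payload[off - l - 1]:
                    l += 1
                r = WINDOW
                while prev_off + r < len(prev) and off + r < len(payload) and \
                        prev[prev_off + r] == payload[off + r]:
                    r += 1
                matches.append((prev_id, prev_off - l, off - l, l + r))
            elif hit is None:
                self.index[fp] = (pkt_id, off)
        self.packets[pkt_id] = payload
        return matches
```

For example, two HTTP requests that differ only in a few bytes would yield a match covering their long common run, letting a sender transmit a short (packet id, offset, length) token in place of the repeated bytes. The sampling mask keeps the index small at the cost of possibly missing short repeats, mirroring the fingerprint-selection trade-off the technique relies on.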
Publisher
Association for Computing Machinery (ACM)
Subject
Computer Networks and Communications, Software
Cited by 61 articles.