Affiliation:
1. Vanderbilt University, Nashville, TN, USA
2. RTI, Sunnyvale, CA, USA
Abstract
The Object Management Group's (OMG) Data Distribution Service (DDS) provides many configurable policies that determine the end-to-end quality of service (QoS) of applications. Because different combinations of QoS configurations influence application QoS in different ways, it is challenging to predict a system's performance in terms of latency, throughput, and resource usage. Design-time formal methods have been applied to this problem with mixed success, but insufficient prediction accuracy, limited tool support, and the difficulty of mastering the formalisms have prevented their wider adoption. A promising alternative is to emulate system behavior and gather data on the QoS parameters of interest through experimentation. To realize this approach, we have developed a model-based automatic performance testing framework with generative capabilities that reduces the manual effort of producing the large number of relevant QoS configurations to be deployed and tested on a cloud platform. This paper describes our initial efforts in developing and using this technology.
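The combinatorial explosion the abstract alludes to can be illustrated with a minimal sketch: enumerating the cross product of candidate values for a few DDS QoS policies. The policy names and value sets below are an assumed, simplified subset chosen for illustration; the paper's actual framework derives configurations from models of the full OMG DDS policy space rather than from a hard-coded dictionary like this.

```python
from itertools import product

# Hypothetical subset of DDS QoS policies and candidate values
# (illustrative only; the real DDS policy space is far larger).
qos_space = {
    "reliability": ["BEST_EFFORT", "RELIABLE"],
    "durability": ["VOLATILE", "TRANSIENT_LOCAL", "PERSISTENT"],
    "history_depth": [1, 10, 100],
}

def generate_configurations(space):
    """Yield every combination of policy values as a dict."""
    keys = list(space)
    for values in product(*(space[k] for k in keys)):
        yield dict(zip(keys, values))

configs = list(generate_configurations(qos_space))
print(len(configs))  # 2 * 3 * 3 = 18 configurations to deploy and test
```

Even three policies with two or three candidate values each yield 18 distinct configurations to test; a realistic policy space makes manual test generation impractical, which motivates the generative, automated approach described above.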
Publisher
Association for Computing Machinery (ACM)
Subject
Computer Graphics and Computer-Aided Design, Software
References (4 articles)
1. The many faces of publish/subscribe
2. A QoS policy configuration modeling language for publish/subscribe middleware platforms
3. Expertus: A Generator Approach to Automate Performance Testing in IaaS Clouds
4. Object Management Group. Data Distribution Service for Real-time Systems Specification, version 1.2, January 2007.
Cited by 4 articles.