Affiliation:
1. Kalinga Institute of Industrial Technology, India
2. Kalinga Institute of Industrial Technology, India
Abstract
Data centers are cost-effective infrastructures for storing large volumes of data and hosting large-scale service applications. Cloud computing service providers are rapidly deploying data centers across the world, each with a huge number of servers and switches. These data centers consume significant amounts of energy, contributing to high operational costs. In this chapter, we study energy savings in data centers achieved by consolidating workloads and switching off virtual machines that are not in use. Under this policy, c virtual machines serve customers until the number of idle servers reaches a threshold d; at that point, the d idle servers take a synchronous vacation simultaneously, and on returning from vacation they resume serving customers. Numerical results demonstrate the applicability of the proposed model to data center management, in particular by quantifying theoretically the tradeoff between the conflicting aims of energy efficiency and the Quality of Service (QoS) requirements specified by cloud tenants.
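The threshold policy sketched in the abstract can be illustrated with a toy discrete-event simulation. This is a minimal sketch, not the chapter's model: it assumes Poisson arrivals (rate `lam`), exponential service (rate `mu`), and exponentially distributed synchronous vacations (rate `theta`); all parameter names and the simplification that the vacation check happens only at departure epochs are illustrative assumptions.

```python
import heapq
import random

def simulate(c=8, d=3, lam=5.0, mu=1.0, theta=0.5, horizon=1000.0, seed=42):
    """Toy simulation of c servers with a d-threshold synchronous vacation:
    when d servers are idle (and no one waits), those d servers leave
    together; they resume serving customers when the vacation ends.
    Returns (time-average number of busy servers, vacations taken)."""
    rng = random.Random(seed)
    t = 0.0
    busy = 0           # servers currently serving a customer
    vac = 0            # servers currently away on a synchronous vacation
    queue = 0          # customers waiting for a server
    vacations = 0      # number of synchronous vacations taken
    busy_area = 0.0    # time-integral of busy servers (for the average)
    seq = 0            # heap tie-breaker
    events = [(rng.expovariate(lam), seq, "arrival")]
    while events and events[0][0] <= horizon:
        now, _, kind = heapq.heappop(events)
        busy_area += busy * (now - t)
        t = now
        seq += 1
        if kind == "arrival":
            heapq.heappush(events, (t + rng.expovariate(lam), seq, "arrival"))
            if busy + vac < c:            # an idle server is available
                busy += 1
                seq += 1
                heapq.heappush(events, (t + rng.expovariate(mu), seq, "departure"))
            else:
                queue += 1
        elif kind == "departure":
            busy -= 1
            if queue > 0:                 # hand the freed server the next customer
                queue -= 1
                busy += 1
                heapq.heappush(events, (t + rng.expovariate(mu), seq, "departure"))
            elif c - busy - vac >= d:     # threshold reached: d idle servers leave together
                vac += d
                vacations += 1
                heapq.heappush(events, (t + rng.expovariate(theta), seq, "vacation_end"))
        else:                             # vacation_end: the d servers return
            vac -= d
            while queue > 0 and busy + vac < c:
                queue -= 1
                busy += 1
                seq += 1
                heapq.heappush(events, (t + rng.expovariate(mu), seq, "departure"))
    return (busy_area / t if t > 0 else 0.0), vacations
```

Lowering the vacation rate `theta` (longer vacations) in this sketch trades higher waiting times for fewer active servers, which is exactly the energy-versus-QoS tradeoff the chapter quantifies analytically.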