Abstract
Cloud computing workloads exhibit highly variable time-series behavior, which makes accurate provisioning difficult and leads to wasted network, computing, and storage resources. To overcome this issue, the proposed work uses an integrated deep-learning approach: accurate prediction of workload and resource allocation over time enhances network performance. First, a logarithmic operation is applied to reduce the standard deviation of the raw data, and filters are then adopted to remove extreme points and noise interference. The time series is then predicted by the integrated deep-learning method, which accurately forecasts the workload and the sequence of resources over time. The resulting data are standardized with a Min-Max scaler, and the quality of the network is preserved by incorporating a network model. Finally, the proposed method is compared with other currently used methods and the results are reported.
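The preprocessing steps outlined in the abstract (logarithmic transform, extreme-point filtering, Min-Max scaling) can be sketched as follows. This is an illustrative example only, not the authors' exact pipeline: the percentile-based clipping stands in for the paper's unspecified filters, and the function name and thresholds are hypothetical.

```python
import numpy as np

def preprocess(workload):
    """Illustrative preprocessing sketch (not the authors' exact method):
    1. log transform to reduce the standard deviation,
    2. clip extreme points (a simple stand-in for the paper's filters),
    3. Min-Max scaling to the range [0, 1]."""
    x = np.log1p(np.asarray(workload, dtype=float))  # logarithmic operation
    lo, hi = np.percentile(x, [1, 99])               # hypothetical outlier bounds
    x = np.clip(x, lo, hi)                           # remove extreme points
    return (x - x.min()) / (x.max() - x.min())       # Min-Max scaling

# Example: a workload trace with one extreme spike
scaled = preprocess([10, 20, 15, 5000, 30, 25])
print(scaled)  # all values lie in [0, 1]
```

The log transform compresses large spikes before clipping, so a single outlier does not dominate the Min-Max range.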
Publisher
Inventive Research Organization
Cited by
15 articles.
1. Infrastructure as Code (IaC): Insights on Various Platforms;Advances in Intelligent Systems and Computing;2023
2. Non-Intrusive Load Monitoring for Energy Consumption Disaggregation;2022 3rd International Conference on Smart Electronics and Communication (ICOSEC);2022-10-20
3. Methods of Application Control in Cloud Computing Systems;Advances in Intelligent Systems and Computing;2022-09-30
4. Energy Efficient Cloud Task Scheduling Policy Using Virtual Machine Concept and VMRRU Technique;2022 International Conference on Inventive Computation Technologies (ICICT);2022-07-20
5. Exploration on Task Scheduling using Optimization Algorithm in Cloud computing;2022 6th International Conference on Trends in Electronics and Informatics (ICOEI);2022-04-28