Affiliation:
1. Rao Bahadhur Y. Mahabaleshwarappa Engineering College, Bellary, Karnataka, India
Abstract
The amount of time a process requires on the CPU is called its CPU burst time, and the implementation of CPU-scheduling algorithms such as Shortest Job First (SJF) and Shortest Remaining Time First (SRTF) relies on it. Among the many methods for predicting CPU burst length is a machine-learning-based approach that estimates the burst times of the processes in the ready queue. This approach selects the most appropriate attributes of a process using attribute-selection techniques, and from those attributes computes the CPU burst for the process in the grid. The ML algorithms used are Linear Regression, K-Nearest Neighbors (KNN), and Decision Tree (DT). When tested and evaluated on the workload dataset, the approach achieved high efficiency because it captured a linear relationship between process attributes and CPU burst time; KNN delivered the best results in terms of correlation coefficient (CC) and relative absolute error (RAE), while attribute-selection techniques reduced the cost in time, space, and burst estimation.
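To make the KNN-based burst prediction concrete, the following is a minimal sketch (not the paper's actual implementation or dataset): each historical process is represented by a vector of selected attributes paired with its observed burst time, and a new process's burst is predicted as the mean burst of its k nearest neighbours. The attributes and values here are hypothetical, chosen purely for illustration.

```python
import math

def knn_predict_burst(history, query, k=3):
    """Predict a process's CPU burst time as the mean observed burst of
    its k nearest neighbours under Euclidean distance over the selected
    process attributes."""
    dists = sorted((math.dist(attrs, query), burst) for attrs, burst in history)
    neighbours = dists[:k]
    return sum(burst for _, burst in neighbours) / len(neighbours)

# Hypothetical history: (selected attributes, observed burst time in ms).
# The two attributes stand in for features such as program size and the
# previous average burst -- illustrative values only.
history = [
    ((120.0, 8.0), 9.0),
    ((118.0, 7.5), 8.5),
    ((300.0, 20.0), 22.0),
    ((310.0, 21.0), 23.0),
    ((125.0, 8.2), 9.2),
]

print(knn_predict_burst(history, (122.0, 8.1), k=3))  # ~8.9 ms
```

In practice the same interface generalises to the other two regressors the abstract names (Linear Regression and Decision Tree), with attribute selection applied beforehand to keep the feature vectors small.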