Abstract
Kernel Ridge Regression (KRR) is a supervised machine-learning method that combines ridge regression with the kernel trick. It is particularly suited to regression problems in which the relationship between the input and output variables is nonlinear: by applying the kernel trick, KRR can learn a nonlinear function in a higher-dimensional feature space while retaining the regularization of ridge regression. However, KRR's effectiveness depends on the hyperparameter settings that define the kernel, and existing approaches for selecting these hyperparameter values suffer from high computational cost, large memory overhead, and low accuracy. This study introduces an enhancement to the Golden Eagle Optimization method: elite opposition-based learning (EOBL) is incorporated to increase population diversity in the search space, enabling more effective selection of the optimal hyperparameters. We validated the proposed KRR enhancement on ten publicly available multi-class datasets. Across several evaluation criteria, the results show that the proposed enhancement outperforms the baseline techniques in classification performance.
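As a minimal sketch of the KRR formulation the abstract describes, the following NumPy example fits the closed-form kernel ridge solution with an RBF kernel on a synthetic nonlinear target. The kernel choice and the `gamma` and `lam` values here are illustrative assumptions, not the hyperparameter settings selected by the paper's method.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=0.5):
    # Gaussian (RBF) kernel on pairwise squared Euclidean distances
    sq = (np.sum(X**2, axis=1)[:, None]
          + np.sum(Y**2, axis=1)[None, :]
          - 2.0 * X @ Y.T)
    return np.exp(-gamma * sq)

def fit_krr(X, y, lam=0.1, gamma=0.5):
    # Closed-form KRR: alpha = (K + lam * I)^{-1} y
    K = rbf_kernel(X, X, gamma)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def predict_krr(X_train, alpha, X_new, gamma=0.5):
    # Prediction is a kernel-weighted sum over training points
    return rbf_kernel(X_new, X_train, gamma) @ alpha

# Synthetic regression task with a nonlinear input-output relationship
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)

alpha = fit_krr(X, y, lam=0.1, gamma=0.5)
X_test = np.linspace(-3, 3, 50)[:, None]
y_pred = predict_krr(X, alpha, X_test, gamma=0.5)
```

The quality of the fit depends strongly on `gamma` and `lam`; tuning these is exactly the hyperparameter-selection problem the proposed EOBL-enhanced optimizer is designed to address.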