Abstract
Gaunt factors are fundamental to describing the interaction of free electrons with photons, playing a crucial role in astrophysical processes such as radiation transport and emission spectra. Traditional methods for computing Gaunt factors involve complex integrations and intricate mathematical formulations and are often computationally expensive and time-consuming. This study explores an alternative approach that uses machine learning models to predict free-free Gaunt factors. Three models were employed: an Artificial Neural Network (ANN), Support Vector Regression (SVR), and Gradient Boosting Regression (GBR). The results demonstrate high performance, with R² scores ranging from 0.98 to 0.99, indicating the potential of machine learning models to accurately predict Gaunt factors.