Authors:
Pratheek S, Ashwin R S, Balaji C
Abstract
High-heat-flux-density data centres, particularly those serving 5G and AI applications, face significant thermal management challenges. Elevated processor temperatures lead to frequent thermal throttling and non-uniform thermal stresses, degrading server performance and reliability and underscoring the need for effective thermal management strategies. Machine-learning-based temperature prediction algorithms have shown promise in proactively managing the thermal state of such systems. In this study, a Raspberry Pi 4 Model B+ was immersed in mineral oil and subjected to workloads based on cryptography, the Fast Fourier Transform (FFT) and data analytics. A discrete wavelet transform is used to mitigate the impact of noise in the experimental data. The results show that the attention-based LSTM encoder-decoder model (LSTM-ED-Attention), coupled with an entropy minimisation strategy, outperformed the baseline LSTM encoder-decoder model by 22%, 15% and 20% for prediction horizons of 10, 30 and 60 seconds, respectively.
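To make the wavelet-based denoising step concrete, the following is a minimal sketch (not the authors' code) of discrete-wavelet-transform denoising applied to a sampled processor temperature trace. It assumes the PyWavelets library; the Daubechies-4 wavelet, the decomposition level and the universal soft-threshold rule are illustrative assumptions, not choices reported in the paper.

import numpy as np
import pywt

def denoise_temperature(signal, wavelet="db4", level=3):
    """Denoise a 1-D temperature trace with a discrete wavelet transform.

    The trace is decomposed into approximation and detail coefficients,
    the detail coefficients are soft-thresholded (threshold estimated
    from the finest-scale details), and the trace is reconstructed.
    """
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    # Estimate the noise standard deviation from the finest detail band.
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    threshold = sigma * np.sqrt(2.0 * np.log(len(signal)))
    # Keep the approximation coefficients, shrink the detail coefficients.
    denoised = [coeffs[0]] + [
        pywt.threshold(c, threshold, mode="soft") for c in coeffs[1:]
    ]
    return pywt.waverec(denoised, wavelet)[: len(signal)]

if __name__ == "__main__":
    # Synthetic example only: a slowly varying temperature ramp with sensor noise.
    t = np.linspace(0.0, 600.0, 600)                     # 10 minutes at 1 Hz
    clean = 45.0 + 10.0 * np.sin(2.0 * np.pi * t / 300.0)
    noisy = clean + np.random.normal(0.0, 0.5, t.size)
    smoothed = denoise_temperature(noisy)
    print(f"noise std before: {np.std(noisy - clean):.3f}, "
          f"after: {np.std(smoothed - clean):.3f}")

The denoised trace would then serve as the input sequence to the temperature prediction model (here, the LSTM encoder-decoder with attention), rather than the raw sensor readings.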