Abstract
The focus of this work is to computationally obtain an optimized neural network (NN) model to predict battery average Nusselt number (Nuavg) data using four activation functions. As reported in the literature, the battery Nuavg is highly nonlinear and depends mainly on flow velocity, coolant type, heat generation, thermal conductivity, battery length-to-width ratio, and the spacing between parallel battery packs. Nuavg is first modeled using a network with a single hidden layer (NN1). The number of neurons in NN1 is varied from 1 to 10 with four activation functions, Sigmoidal, Gaussian, Tanh, and Linear, to obtain the optimized NN1. Similarly, a deep NN (NND) is analyzed with varying neurons and activation functions to find the optimized number of hidden layers for predicting Nuavg. Root mean square error (RMSE) and the coefficient of determination (R2) are assessed to identify the optimized NN model. From this computational experiment, it is found that both NN1 and NND accurately predict the battery data. Six neurons in the hidden layer give the best predictions for NN1. The Sigmoidal and Gaussian functions provided the best results for the NN1 model, outperforming the Tanh and Linear functions; the Linear function, on the other hand, was unable to predict the battery data adequately. In NND, the optimized model is obtained with a different number of hidden layers and neurons for each activation function, and the Gaussian and Linear functions outperformed the other two activation functions. Overall, the deep NN (NND) model predicted better than the single-hidden-layer NN (NN1) model for each activation function.
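The following is a minimal sketch of the kind of sweep described above: a single-hidden-layer regressor whose neuron count and activation function are varied, scored by RMSE and R2. The data arrays, the Gaussian activation form exp(-x^2), and all hyperparameters are illustrative assumptions and are not taken from the paper.

```python
# Hypothetical sketch: sweep hidden-neuron counts and activation functions for a
# single-hidden-layer NN (NN1) and score each fit with RMSE and R^2.
import numpy as np
from sklearn.metrics import mean_squared_error, r2_score
from tensorflow import keras


def gaussian(x):
    # Gaussian activation, assumed here as exp(-x^2); not a built-in Keras activation.
    return keras.backend.exp(-keras.backend.square(x))


ACTIVATIONS = {"sigmoid": "sigmoid", "tanh": "tanh", "linear": "linear", "gaussian": gaussian}


def evaluate_nn1(X_train, y_train, X_test, y_test, neurons, activation):
    """Train a one-hidden-layer regressor and return (RMSE, R2) on the test split."""
    model = keras.Sequential([
        keras.layers.Input(shape=(X_train.shape[1],)),      # six inputs in the paper's setup
        keras.layers.Dense(neurons, activation=activation),  # 1..10 neurons swept
        keras.layers.Dense(1, activation="linear"),          # Nuavg output
    ])
    model.compile(optimizer="adam", loss="mse")
    model.fit(X_train, y_train, epochs=500, verbose=0)
    y_pred = model.predict(X_test, verbose=0).ravel()
    rmse = np.sqrt(mean_squared_error(y_test, y_pred))
    return rmse, r2_score(y_test, y_pred)


# Usage: loop over 1..10 neurons and the four activations, keep the lowest-RMSE model.
# for name, act in ACTIVATIONS.items():
#     for n in range(1, 11):
#         rmse, r2 = evaluate_nn1(X_train, y_train, X_test, y_test, n, act)
```

A deep variant (NND) would follow the same pattern with additional `Dense` hidden layers, with the layer count swept per activation function as the abstract describes.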
Subject
Energy (miscellaneous), Energy Engineering and Power Technology, Renewable Energy, Sustainability and the Environment, Electrical and Electronic Engineering, Control and Optimization, Engineering (miscellaneous)
Cited by
35 articles.