Affiliation:
1. School of Power and Energy, Northwestern Polytechnical University, Xi’an 710072, China
Abstract
This research introduces the Enhanced Scale-Aware efficient Transformer (ESAE-Transformer), a novel model dedicated to predicting Exhaust Gas Temperature (EGT). The ESAE-Transformer merges the Multi-Head ProbSparse Attention mechanism with the established Transformer architecture, significantly improving computational efficiency and effectively capturing key temporal patterns. The incorporation of the Multi-Scale Feature Aggregation Module (MSFAM) further refines feature extraction over the 2 s input and output timeframes. A detailed investigation into the feature dimensionality led to an optimized model configuration, improving overall performance. The efficacy of the ESAE-Transformer was rigorously evaluated through an ablation study focusing on the contribution of each constituent module. The findings show a mean absolute prediction error of 3.47 °R, demonstrating strong alignment with real-world operating conditions and confirming the model’s accuracy and relevance. The ESAE-Transformer not only excels in predictive accuracy but also sheds light on the underlying physical processes, enhancing its practical applicability. The model stands out as a robust tool for critical parameter prediction in aero-engine systems, paving the way for future advances in engine prognostics and diagnostics.
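The Multi-Head ProbSparse Attention mechanism named in the abstract originates in the Informer architecture: rather than scoring every query against every key, it keeps only the queries whose score distribution deviates most from uniform and lets the rest fall back to a mean of the values. A minimal single-head NumPy sketch of that idea is shown below; the function name, the top-`u` selection heuristic, and all array shapes are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def probsparse_attention(Q, K, V, u):
    """Minimal single-head ProbSparse attention sketch (Informer-style).

    Q, K, V: (L, d) arrays; u: number of "active" queries retained.
    Queries judged uninformative fall back to the mean of V.
    """
    L, d = Q.shape
    scores = Q @ K.T / np.sqrt(d)             # (L, L) scaled dot-product scores
    # Sparsity measure: max score minus mean score, per query row.
    M = scores.max(axis=1) - scores.mean(axis=1)
    top = np.argsort(M)[-u:]                  # indices of the u most informative queries
    out = np.tile(V.mean(axis=0), (L, 1))     # lazy queries -> mean of values
    w = scores[top]
    w = np.exp(w - w.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)         # softmax over keys for active queries
    out[top] = w @ V                          # full attention only for the top-u queries
    return out
```

With `u = L` the sketch reduces to ordinary softmax attention; the computational saving comes from choosing `u` proportional to `ln L`, which is what makes the mechanism attractive for long EGT time series.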
Cited by
1 article.