Affiliations:
1. Department of Mathematics, National Central University, Taoyuan, Taiwan
2. Department of Space Science and Engineering, National Central University, Taoyuan, Taiwan
3. Center for Astronautical Physics and Engineering, National Central University, Taoyuan, Taiwan
4. Department of Computer Science and Information Engineering, National Central University, Taoyuan, Taiwan
Abstract
Ionospheric total electron content (TEC) is a key indicator of the space environment. Geophysical forcing from above and below drives its spatial and temporal variations. For physical models to reproduce simulations that agree with observations, a full understanding of the physical and chemical principles, well-represented driving inputs, and sufficient computational power are all required, which can be challenging. Data-driven approaches such as deep learning have therefore recently surged as a means of TEC prediction. Because the geophysical world is sequential in both time and space, this study proposes and evaluates Transformer architectures for sequence-to-sequence TEC prediction. We discuss the impact of the time lengths chosen during training and analyze what the neural network has learned about the data sets. Our results suggest that a 12-layer, 128-hidden-unit Transformer architecture provides multi-step global TEC predictions for 48 hr with an overall root-mean-square error (RMSE) of ∼1.8 TECU. The hourly RMSE increases from 0.6 TECU to about 2.0 TECU over the prediction window.
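The RMSE metric quoted in the abstract can be sketched in a few lines of standard Python. The function and the toy TEC values below are illustrative assumptions, not the paper's data or code:

```python
import math

def rmse(predicted, observed):
    """Root-mean-square error over paired TEC values (in TECU).

    This mirrors the evaluation metric named in the abstract; the
    inputs here would be flattened global TEC maps in practice.
    """
    assert len(predicted) == len(observed)
    squared_error = sum((p - o) ** 2 for p, o in zip(predicted, observed))
    return math.sqrt(squared_error / len(predicted))

# Toy example with made-up hourly TEC values (TECU):
pred = [10.2, 11.5, 12.1, 13.0]
obs = [10.0, 11.0, 12.5, 13.4]
print(round(rmse(pred, obs), 3))  # → 0.391
```

In the study's setting, the overall RMSE would aggregate such errors over all grid points and all 48 prediction hours, while the hourly RMSE curve reported (0.6 to ∼2.0 TECU) comes from evaluating each prediction step separately.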
Funder
National Science and Technology Council
Publisher
American Geophysical Union (AGU)
Cited by
3 articles.