Author:
Li Wenzhong, Liu Chengshuai, Hu Caihong, Niu Chaojie, Li Runxi, Li Ming, Xu Yingying, Tian Lu
Abstract
Flood forecasting with traditional physical hydrology models requires consideration of multiple complex physical processes, including the spatio-temporal distribution of rainfall, the spatial heterogeneity of watershed sub-surface characteristics, and runoff generation and routing behaviours. Data-driven models offer novel solutions to these challenges, though they are hindered by difficulties in hyperparameter selection and a decline in prediction stability as the lead time extends. This study introduces a hybrid model, the RS-LSTM-Transformer, which combines Random Search (RS), Long Short-Term Memory networks (LSTM), and the Transformer architecture. Applied to the typical Jingle watershed in the middle reaches of the Yellow River, the model uses rainfall and runoff data from stations in the basin to simulate flood processes, and its outcomes are compared against those of RS-LSTM, RS-Transformer, RS-BP, and RS-MLP models using the Nash–Sutcliffe Efficiency coefficient (NSE), Root Mean Square Error (RMSE), Mean Absolute Error (MAE), and percentage Bias as metrics. At a 1-h lead time, the RS-LSTM-Transformer model achieved NSE, RMSE, MAE, and Bias values of 0.970, 14.001 m³/s, 5.304 m³/s, and 0.501% during calibration, and 0.953, 14.124 m³/s, 6.365 m³/s, and 0.523% during validation, respectively. These results demonstrate the model's superior simulation capability and robustness, providing more accurate peak flow forecasts as the lead time increases. The study highlights the RS-LSTM-Transformer model's potential in flood forecasting and the advantages of integrating various data-driven approaches for innovative modelling.
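For reference, the evaluation metrics quoted above are assumed here to follow their standard hydrological definitions (a sketch of the conventional formulas; the paper may use slight variants), where $Q_o^t$ is the observed discharge and $Q_s^t$ the simulated discharge at time step $t$, $\bar{Q}_o$ is the mean observed discharge, and $T$ is the number of time steps:

$$\mathrm{NSE} = 1 - \frac{\sum_{t=1}^{T}\left(Q_o^t - Q_s^t\right)^2}{\sum_{t=1}^{T}\left(Q_o^t - \bar{Q}_o\right)^2}, \qquad \mathrm{RMSE} = \sqrt{\frac{1}{T}\sum_{t=1}^{T}\left(Q_o^t - Q_s^t\right)^2}$$

$$\mathrm{MAE} = \frac{1}{T}\sum_{t=1}^{T}\left|Q_o^t - Q_s^t\right|, \qquad \mathrm{Bias} = \frac{\sum_{t=1}^{T}\left(Q_s^t - Q_o^t\right)}{\sum_{t=1}^{T} Q_o^t}\times 100\%$$

Under these definitions, NSE values closer to 1, and RMSE, MAE, and |Bias| values closer to 0, indicate better agreement between simulated and observed flows.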
Funder
National Key Research Priorities Program of China
National Natural Science Foundation of China
Publisher
Springer Science and Business Media LLC
Cited by
2 articles.