Data-Centric Benchmarking of Neural Network Architectures for the Univariate Time Series Forecasting Task
Published: 2024-08-26
Volume: 6
Issue: 3
Pages: 718-747
ISSN: 2571-9394
Container-title: Forecasting
Short-container-title: Forecasting
Language: en
Authors:
Philipp Schlieper (1), Mischa Dombrowski (1), An Nguyen (1), Dario Zanca (1), Bjoern Eskofier (1,2)
Affiliations:
1. Department of Artificial Intelligence in Biomedical Engineering, Friedrich-Alexander-University, 91052 Erlangen, Germany
2. Institute of AI for Health, Helmholtz Center Munich, German Research Center for Environmental Health, 85764 Neuherberg, Germany
Abstract
Time series forecasting has witnessed a rapid proliferation of novel neural network approaches in recent years. However, reported benchmark results are often inconsistent, making it difficult to determine in which cases one approach fits better than another. Therefore, we propose adopting a data-centric perspective for benchmarking neural network architectures on time series forecasting by generating ad hoc synthetic datasets. In particular, we combine sinusoidal functions to synthesize univariate time series data for multi-input-multi-output prediction tasks. We compare the most popular architectures for time series, namely long short-term memory (LSTM) networks, convolutional neural networks (CNNs), and transformers, and directly connect their performance with controlled data characteristics such as sequence length, noise and frequency, and delay length. Our findings suggest that transformers are the best architecture for dealing with different delay lengths. In contrast, for different noise and frequency levels and different sequence lengths, LSTM is the best-performing architecture by a significant margin. Based on our insights, we derive recommendations that allow machine learning (ML) practitioners to decide which architecture to apply, given the dataset’s characteristics.
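The abstract describes synthesizing univariate series by combining sinusoidal functions with controlled noise, frequency, and delay, then framing them as multi-input-multi-output prediction tasks. A minimal sketch of how such data could be generated is shown below; all function names, parameter values, and the windowing scheme are illustrative assumptions, not the paper's actual generation procedure.

```python
import numpy as np

def synthesize_series(length=1000, freqs=(0.01, 0.05), noise_std=0.1, seed=0):
    """Illustrative generator: a sum of sinusoids plus Gaussian noise.

    `freqs` and `noise_std` stand in for the paper's controlled
    frequency and noise characteristics (assumed values).
    """
    rng = np.random.default_rng(seed)
    t = np.arange(length)
    signal = sum(np.sin(2 * np.pi * f * t) for f in freqs)
    return signal + rng.normal(0.0, noise_std, size=length)

def make_windows(series, input_len=64, output_len=16, delay=0):
    """Slice a series into (input, output) pairs for a
    multi-input-multi-output task; the target window starts
    `delay` steps after the input window ends."""
    X, Y = [], []
    n_pairs = len(series) - input_len - delay - output_len + 1
    for start in range(n_pairs):
        X.append(series[start:start + input_len])
        target_start = start + input_len + delay
        Y.append(series[target_start:target_start + output_len])
    return np.array(X), np.array(Y)

series = synthesize_series()
X, Y = make_windows(series, input_len=64, output_len=16, delay=8)
```

Varying `input_len`, `noise_std`, `freqs`, and `delay` independently mirrors the kind of controlled sweep over data characteristics that the benchmark associates with each architecture's performance.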