Data-Centric Benchmarking of Neural Network Architectures for the Univariate Time Series Forecasting Task

Authors:

Philipp Schlieper 1, Mischa Dombrowski 1, An Nguyen 1, Dario Zanca 1, Björn Eskofier 1,2

Affiliation:

1. Department of Artificial Intelligence in Biomedical Engineering, Friedrich-Alexander-Universität Erlangen-Nürnberg, 91052 Erlangen, Germany

2. Institute of AI for Health, Helmholtz Center Munich German Research Center for Environmental Health, 85764 Neuherberg, Germany

Abstract

Time series forecasting has seen a rapid proliferation of novel neural network approaches in recent years. However, benchmarking results are often inconsistent, and it is difficult to determine in which cases one approach fits better than another. We therefore propose adopting a data-centric perspective for benchmarking neural network architectures on time series forecasting by generating ad hoc synthetic datasets. In particular, we combine sinusoidal functions to synthesize univariate time series data for multi-input-multi-output prediction tasks. We compare the most popular architectures for time series, namely long short-term memory (LSTM) networks, convolutional neural networks (CNNs), and transformers, and directly relate their performance to controlled data characteristics such as sequence length, noise and frequency levels, and delay length. Our findings suggest that transformers are the best architecture for handling different delay lengths, whereas LSTM outperforms the other architectures by a significant margin across different noise and frequency levels and different sequence lengths. Based on these insights, we derive recommendations that allow machine learning (ML) practitioners to decide which architecture to apply given a dataset's characteristics.
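The abstract describes synthesizing univariate series by combining sinusoidal functions with controlled noise, then framing forecasting as a multi-input-multi-output (MIMO) prediction task. A minimal sketch of this idea is shown below; all function names, parameter names, and default values (series length, frequencies, noise level, window sizes) are illustrative assumptions, not the paper's actual configuration.

```python
import numpy as np

def synth_series(length=512, freqs=(1.0, 3.0), noise_std=0.1, seed=0):
    """Generate a univariate series as a sum of sinusoids plus Gaussian noise.

    Hypothetical parameters: `freqs` controls the frequency content and
    `noise_std` the noise level, two of the data characteristics the
    benchmark varies.
    """
    rng = np.random.default_rng(seed)
    t = np.linspace(0.0, 1.0, length)
    signal = sum(np.sin(2.0 * np.pi * f * t) for f in freqs)
    return signal + rng.normal(0.0, noise_std, size=length)

def make_mimo_windows(series, input_len=64, output_len=16):
    """Slice a series into (input window, output window) pairs for
    multi-input-multi-output forecasting."""
    X, Y = [], []
    for i in range(len(series) - input_len - output_len + 1):
        X.append(series[i : i + input_len])
        Y.append(series[i + input_len : i + input_len + output_len])
    return np.array(X), np.array(Y)

series = synth_series()
X, Y = make_mimo_windows(series)
```

With the illustrative defaults above, a series of 512 samples yields 512 − 64 − 16 + 1 = 433 overlapping window pairs, each mapping 64 input steps to 16 output steps.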

Publisher

MDPI AG

