Abstract
With the latest advances in deep learning-based generative models, it has not taken long for their remarkable performance to be exploited in the time series domain. Deep neural networks applied to time series depend heavily on the size and consistency of the training datasets, yet real-world data are usually limited and often subject to constraints that must be guaranteed. An effective way to increase the amount of available data is therefore data augmentation, either by applying transformations such as noise injection and permutation or by generating new synthetic data. This work systematically reviews the current state of the art in the area, provides an overview of the available algorithms, and proposes a taxonomy of the most relevant research. The efficiency of the different variants is evaluated as a central part of the process, along with the metrics used to assess their performance and the main problems affecting each model. The ultimate aim of this study is to summarise the evolution and performance of the approaches that produce the best results, in order to guide future researchers in this field.
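As a brief illustration of the transformation-based augmentations mentioned above (noise injection and permutation), the following minimal Python sketch shows how a univariate series might be jittered and segment-permuted. The function names and parameter values (sigma, n_segments) are illustrative assumptions, not definitions taken from the surveyed works.

```python
# Minimal sketch of two classical time-series augmentations: jittering
# (additive Gaussian noise) and window permutation. Parameter values are
# illustrative only.
import numpy as np

def jitter(x: np.ndarray, sigma: float = 0.03) -> np.ndarray:
    """Add zero-mean Gaussian noise to every time step."""
    return x + np.random.normal(loc=0.0, scale=sigma, size=x.shape)

def permute(x: np.ndarray, n_segments: int = 4) -> np.ndarray:
    """Split the series into segments and shuffle their order."""
    segments = np.array_split(x, n_segments)
    np.random.shuffle(segments)
    return np.concatenate(segments)

if __name__ == "__main__":
    series = np.sin(np.linspace(0, 4 * np.pi, 100))  # toy univariate series
    augmented = permute(jitter(series))
    print(series.shape, augmented.shape)  # (100,) (100,)
```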
Funder
Universidad Politécnica de Madrid
Publisher
Springer Science and Business Media LLC
Subject
Artificial Intelligence, Software
References
128 articles.
Cited by
62 articles.