Abstract
Quantum Recurrent Neural Networks are receiving increased attention thanks to their enhanced generalization capabilities in time series analysis. However, their performance has been bottlenecked by long training times and unscalable architectures. In this paper, we propose a novel Quantum Recurrent Neural Network model based on Quantum Gated Recurrent Units. It uses a learnable Variational Quantum Layer to process temporal data, interspersed with two classical layers that match the dimensionality of the input and output vectors. This architecture has fewer quantum parameters than existing Quantum Long Short-Term Memory models. Both quantum networks were evaluated on periodic and real-world time series datasets, together with their classical counterparts. The quantum models exhibited superior performance compared to the classical ones in all test cases. The Quantum Gated Recurrent Units network outperformed the Quantum Long Short-Term Memory network despite its simpler internal configuration. Moreover, it proved to be about 25% faster in both training and inference than the Quantum Long Short-Term Memory network. This speed-up comes from having one fewer quantum circuit to execute, suggesting that our model may offer a more efficient alternative for implementing Quantum Recurrent Neural Networks on both simulated and real quantum hardware.
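The hybrid structure described above (a classical input layer that compresses features into rotation angles, a variational quantum circuit, and a classical output layer that restores the desired dimensionality) can be illustrated with a minimal NumPy state-vector sketch. This is not the paper's implementation: the circuit layout (RY angle encoding, a chain of CNOTs, trainable RY rotations, Pauli-Z readout), the helper names (`vql`, `z_expectations`), and all dimensions are illustrative assumptions.

```python
import numpy as np

def ry(theta):
    # Single-qubit RY rotation matrix.
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def apply_single(state, gate, qubit, n_qubits):
    # Build the full operator as a Kronecker product with `gate`
    # placed at position `qubit` (qubit 0 = most significant bit).
    U = np.array([[1.0]])
    for q in range(n_qubits):
        U = np.kron(U, gate if q == qubit else np.eye(2))
    return U @ state

def apply_cnot(state, ctrl, tgt, n_qubits):
    # Flip the target bit on every basis state whose control bit is 1.
    new = state.copy()
    for i in range(len(state)):
        if (i >> (n_qubits - 1 - ctrl)) & 1:
            new[i] = state[i ^ (1 << (n_qubits - 1 - tgt))]
    return new

def z_expectations(state, n_qubits):
    # Pauli-Z expectation value of each qubit, all in [-1, 1].
    probs = np.abs(state) ** 2
    exps = []
    for q in range(n_qubits):
        sign = np.array([1.0 if ((i >> (n_qubits - 1 - q)) & 1) == 0 else -1.0
                         for i in range(len(state))])
        exps.append(float(probs @ sign))
    return np.array(exps)

def vql(angles, params):
    # Variational quantum layer: encode, entangle, rotate, measure.
    n = len(angles)
    state = np.zeros(2 ** n)
    state[0] = 1.0                                   # start in |0...0>
    for q in range(n):                               # angle encoding
        state = apply_single(state, ry(angles[q]), q, n)
    for q in range(n - 1):                           # entangling layer
        state = apply_cnot(state, q, q + 1, n)
    for q in range(n):                               # trainable rotations
        state = apply_single(state, ry(params[q]), q, n)
    return z_expectations(state, n)

# Hybrid pipeline: classical layer -> quantum layer -> classical layer.
rng = np.random.default_rng(0)
n_qubits, d_in, d_out = 4, 8, 1                      # illustrative sizes
W_in = 0.1 * rng.normal(size=(n_qubits, d_in))       # classical input layer
theta = rng.normal(size=n_qubits)                    # trainable VQL params
W_out = 0.1 * rng.normal(size=(d_out, n_qubits))     # classical output layer

x = rng.normal(size=d_in)                            # one input feature vector
angles = np.pi * np.tanh(W_in @ x)                   # squash into valid angles
z = vql(angles, theta)                               # quantum features in [-1, 1]
y = W_out @ z                                        # matched output dimension
```

In a trainable model, `W_in`, `theta`, and `W_out` would all be optimized jointly; at each time step a gated recurrent cell would feed the concatenated input and hidden state through such a layer.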
Funder
European Union - NextGenerationEU