Quantum Convolutional Long Short-Term Memory Based on Variational Quantum Algorithms in the Era of NISQ
Published: 2024-03-22
Volume: 15
Issue: 4
Page: 175
ISSN: 2078-2489
Container-title: Information
Short-container-title: Information
Language: en
Author:
Xu Zeyu 1,2; Yu Wenbin 1,2,3,4; Zhang Chengjun 2,3; Chen Yadang 2
Affiliation:
1. School of Software, Nanjing University of Information Science and Technology, Nanjing 210044, China
2. School of Computer Science, Nanjing University of Information Science and Technology, Nanjing 210044, China
3. Wuxi Institute of Technology, Nanjing University of Information Science & Technology, Wuxi 214000, China
4. Jiangsu Collaborative Innovation Center of Atmospheric Environment and Equipment Technology (CICAEET), Nanjing University of Information Science and Technology, Nanjing 210044, China
Abstract
In the era of noisy intermediate-scale quantum (NISQ) computing, synergistic collaboration between quantum and classical computing models has emerged as a promising approach to complex computational challenges. Long short-term memory (LSTM), a popular network for modeling sequential data, is widely acknowledged for its effectiveness. However, as the demands of data volume and spatial feature extraction grow, the training cost of LSTM grows exponentially. In this study, we propose the quantum convolutional long short-term memory (QConvLSTM) model. By integrating classical convolutional LSTM (ConvLSTM) networks with variational quantum algorithms, we leverage variational quantum properties and the accelerating characteristics of quantum states to optimize the model training process. Experimental validation demonstrates that the proposed QConvLSTM model outperforms various LSTM variants. Additionally, we adopt a hierarchical, tree-like circuit design to enhance the model's parallel computing capabilities while reducing its dependence on qubit counts and circuit depth. Moreover, the inherent noise resilience of variational quantum algorithms makes this model well suited to spatiotemporal sequence modeling tasks on NISQ devices.
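The core building block the abstract refers to, a variational quantum circuit whose rotation angles serve as trainable parameters, can be illustrated with a minimal state-vector simulation. This sketch is not the paper's architecture: the two-qubit layer, the angle-encoding scheme, and the function names are illustrative assumptions showing how classical inputs enter a parameterized circuit and how Pauli-Z expectation values come back out as a layer's output.

```python
import numpy as np

def ry(theta):
    """Single-qubit RY rotation gate (real-valued, so amplitudes stay real)."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

# CNOT with qubit 0 as control, qubit 1 as target, in |q0 q1> ordering.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

def variational_layer(x, params):
    """Encode classical inputs x as RY angles, entangle the qubits,
    apply a trainable RY layer, and return <Z> on each qubit."""
    state = np.zeros(4)
    state[0] = 1.0                                # start in |00>
    state = np.kron(ry(x[0]), ry(x[1])) @ state   # angle encoding of inputs
    state = CNOT @ state                          # entangling gate
    state = np.kron(ry(params[0]), ry(params[1])) @ state  # trainable rotations
    probs = state ** 2                            # Born-rule probabilities
    z0 = probs[0] + probs[1] - probs[2] - probs[3]  # <Z> on qubit 0
    z1 = probs[0] - probs[1] + probs[2] - probs[3]  # <Z> on qubit 1
    return np.array([z0, z1])

out = variational_layer(x=np.array([0.3, -0.7]), params=np.array([0.1, 0.5]))
print(out)  # two expectation values, each in [-1, 1]
```

In a hybrid model of this kind, a classical optimizer updates `params` from a loss computed on the expectation values, so the circuit plays the role of a trainable gate inside the recurrent cell.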
Funder:
Natural Science Foundation of China; Natural Science Foundation of Jiangsu Province