Abstract
Distributed training across several quantum computers could significantly reduce training time, and sharing the learned model rather than the data could improve data privacy, since training happens where the data are located. One scheme that achieves these properties is federated learning (FL), in which several clients (local nodes) train on their own data and a central node aggregates the models collected from them. However, to the best of our knowledge, no work has yet been done on quantum machine learning (QML) in a federated setting. In this work, we present federated training of hybrid quantum-classical machine learning models, although our framework could be generalized to pure quantum machine learning models. Specifically, we consider a quantum neural network (QNN) coupled with a classical pre-trained convolutional model. Our distributed federated learning scheme achieves nearly the same trained-model accuracy while training significantly faster, demonstrating a promising research direction for both scaling and privacy.
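To make the scheme concrete, below is a minimal sketch of one federated round for such a hybrid model. It assumes PennyLane's TorchLayer wraps the QNN, clients receive 512-dimensional features from a frozen pre-trained convolutional backbone, and the central node uses plain weight averaging (FedAvg-style); all names, sizes, and hyperparameters are illustrative assumptions, not the authors' implementation.

```python
# Sketch of federated training for a hybrid quantum-classical model.
# Assumptions (not from the paper's code): a PennyLane QNN as a Torch
# layer, 512-dim features from a frozen pre-trained CNN, FedAvg at the
# central node.
import copy
import torch
import pennylane as qml

n_qubits, n_layers = 4, 2
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev, interface="torch")
def qnn(inputs, weights):
    # Encode classical features as rotation angles, then apply a
    # trainable entangling circuit and read out Pauli-Z expectations.
    qml.AngleEmbedding(inputs, wires=range(n_qubits))
    qml.BasicEntanglerLayers(weights, wires=range(n_qubits))
    return [qml.expval(qml.PauliZ(w)) for w in range(n_qubits)]

def make_client_model():
    # Hybrid head: compress CNN features to n_qubits inputs, pass them
    # through the QNN, and map the expectations to a binary logit.
    q_layer = qml.qnn.TorchLayer(qnn, {"weights": (n_layers, n_qubits)})
    return torch.nn.Sequential(
        torch.nn.Linear(512, n_qubits), q_layer, torch.nn.Linear(n_qubits, 1)
    )

def local_update(model, loader, epochs=1):
    # One client's round: train on private data that never leaves the node.
    opt = torch.optim.SGD(model.parameters(), lr=0.01)
    loss_fn = torch.nn.BCEWithLogitsLoss()
    for _ in range(epochs):
        for features, labels in loader:
            opt.zero_grad()
            loss_fn(model(features), labels).backward()
            opt.step()
    return model.state_dict()

def fed_avg(client_states):
    # Central node: average every trainable tensor across clients.
    avg = copy.deepcopy(client_states[0])
    for key in avg:
        avg[key] = torch.stack([s[key] for s in client_states]).mean(dim=0)
    return avg

def federated_round(global_model, client_loaders):
    # Broadcast global weights, collect local updates, then aggregate.
    states = []
    for loader in client_loaders:
        local = make_client_model()
        local.load_state_dict(global_model.state_dict())
        states.append(local_update(local, loader))
    global_model.load_state_dict(fed_avg(states))
```

In federated_round, only state_dict parameters cross node boundaries; each client's data loader stays local, which is exactly the privacy property the abstract describes.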
Funder
Office of Science
Brookhaven National Laboratory
Subject
General Physics and Astronomy
Cited by
70 articles.