Abstract
Training large neural networks on big datasets requires significant computational resources and time. Transfer learning reduces training time by pre-training a base model on one dataset and transferring its knowledge to a new model for another dataset. However, current transfer learning algorithms are limited: the transferred model must adhere to the dimensions of the base model and cannot easily modify its neural architecture to solve other datasets. Biological neural networks (BNNs), by contrast, are adept at rearranging themselves to tackle completely different problems through transfer learning. Taking advantage of this property of BNNs, we design a dynamic neural network that is transferable to any other network architecture and can accommodate many datasets. Our approach uses raytracing to connect neurons in a three-dimensional space, allowing the network to grow into any shape or size. On the Alcala dataset, our transfer learning algorithm trains the fastest across changing environments and input sizes. We also show that our algorithm outperforms the state of the art on an EEG dataset. In the future, this network may be considered for implementation on real biological neural networks to decrease power consumption.
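The abstract does not detail how raytracing connects neurons, so the following is only a minimal sketch of one plausible interpretation: each neuron is a point in 3D space, a ray is cast from a source neuron along a growth direction, and any neuron lying within a cylinder around that ray becomes a connection target. The function name `connect_by_ray` and the parameters `radius` and `max_dist` are hypothetical, not taken from the paper.

```python
import math

def connect_by_ray(neurons, origin_idx, direction, radius=1.0, max_dist=10.0):
    """Return indices of neurons whose positions lie within `radius`
    of a ray cast from neurons[origin_idx] along `direction`,
    no farther than `max_dist` along the ray."""
    ox, oy, oz = neurons[origin_idx]
    # Normalize the ray direction.
    norm = math.sqrt(sum(d * d for d in direction))
    dx, dy, dz = (d / norm for d in direction)
    targets = []
    for i, (x, y, z) in enumerate(neurons):
        if i == origin_idx:
            continue
        vx, vy, vz = x - ox, y - oy, z - oz
        # Scalar projection of the neuron offset onto the ray.
        t = vx * dx + vy * dy + vz * dz
        if t < 0 or t > max_dist:
            continue  # behind the origin, or beyond the ray's reach
        # Perpendicular distance from the neuron to the ray.
        px, py, pz = vx - t * dx, vy - t * dy, vz - t * dz
        if math.sqrt(px * px + py * py + pz * pz) <= radius:
            targets.append(i)
    return targets

# Example: only the neuron at (5, 0, 0) lies on the +x ray from the origin.
neurons = [(0.0, 0.0, 0.0), (5.0, 0.0, 0.0), (5.0, 3.0, 0.0), (-2.0, 0.0, 0.0)]
print(connect_by_ray(neurons, 0, (1.0, 0.0, 0.0)))  # → [1]
```

Because connectivity is computed geometrically rather than from fixed layer dimensions, neurons can in principle be added anywhere in space and picked up by subsequent ray casts, which is consistent with the abstract's claim that the network can grow into any shape or size.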
Funder
Gouvernement du Canada | Natural Sciences and Engineering Research Council of Canada
United States Department of Defense | Defense Threat Reduction Agency
Publisher
Springer Science and Business Media LLC
Cited by
1 article.