Abstract
This article proposes a novel method to optimise the Dynamic Architecture Neural Network (DAN2) adapted to a multi-task learning problem. The multi-task learning network adopts a multi-head, serial architecture with DAN2 layers as the basic building block. Following the dynamic-architecture principle, layers are added consecutively, starting from a minimal initial structure. The optimisation method is an iterative heuristic scheme that alternately optimises the shared layers and the task-specific layers until the solver converges to within a small tolerance. Experiments on simulated datasets demonstrate the applicability of the algorithm, with accuracy and speed comparable to those of conventional Artificial Neural Networks (ANNs).
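The alternating scheme can be illustrated with a short, self-contained sketch. The snippet below is a minimal NumPy illustration of the idea, not the authors' implementation: it grows a shared trunk of DAN2-style layers one at a time and, after each addition, refits one task-specific head per task, stopping once the change in total loss falls below a tolerance. The layer form F_k = a + b·F_{k-1} + c·cos(mu·alpha) + d·sin(mu·alpha), with alpha the angle between each sample and a reference vector, follows the original DAN2 formulation (Ghiassi and Saidane, 2005); the grid search over mu, the linear heads, and every function name here are illustrative assumptions.

```python
# Hedged sketch of the alternating shared/task-specific optimisation loop.
# All names and design choices below are illustrative, not the authors' API.
import numpy as np

def input_angle(X, ref):
    # Angle between each sample and a fixed reference vector (DAN2's "alpha").
    cos_a = X @ ref / (np.linalg.norm(X, axis=1) * np.linalg.norm(ref) + 1e-12)
    return np.arccos(np.clip(cos_a, -1.0, 1.0))

def fit_dan2_layer(alpha, F_prev, target, mus=np.linspace(0.1, 10.0, 100)):
    # Least-squares fit of one DAN2-style layer,
    #   F_k = a + b*F_{k-1} + c*cos(mu*alpha) + d*sin(mu*alpha),
    # with the frequency mu chosen by a simple grid search (an assumption).
    best = None
    for mu in mus:
        A = np.column_stack([np.ones_like(alpha), F_prev,
                             np.cos(mu * alpha), np.sin(mu * alpha)])
        w, *_ = np.linalg.lstsq(A, target, rcond=None)
        mse = np.mean((A @ w - target) ** 2)
        if best is None or mse < best[0]:
            best = (mse, mu, w)
    return best[1], best[2]

def layer_forward(alpha, F_prev, mu, w):
    A = np.column_stack([np.ones_like(alpha), F_prev,
                         np.cos(mu * alpha), np.sin(mu * alpha)])
    return A @ w

def train_multitask_dan2(X, Ys, tol=1e-6, max_layers=30):
    # Grow shared DAN2-style layers one at a time; after each addition,
    # refit the task-specific heads, and stop once the change in total
    # loss falls below the tolerance (the "small tolerance" in the abstract).
    alpha = input_angle(X, np.ones(X.shape[1]))
    F = np.linalg.norm(X, axis=1)          # minimal initial structure
    layers, heads, prev_loss = [], [], np.inf
    for _ in range(max_layers):
        # Shared step: fit the next shared layer against the mean task target.
        mu, w = fit_dan2_layer(alpha, F, np.mean(Ys, axis=0))
        F = layer_forward(alpha, F, mu, w)
        layers.append((mu, w))
        # Task-specific step: one small linear head per task on top of F.
        heads, loss = [], 0.0
        for y in Ys:
            A = np.column_stack([np.ones_like(F), F])
            h, *_ = np.linalg.lstsq(A, y, rcond=None)
            heads.append(h)
            loss += np.mean((A @ h - y) ** 2)
        if abs(prev_loss - loss) < tol:
            break
        prev_loss = loss
    return layers, heads

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 3))
    Ys = [np.sin(X[:, 0]) + 0.1 * rng.normal(size=200),   # task 1
          np.cos(X[:, 1]) + 0.1 * rng.normal(size=200)]   # task 2
    layers, heads = train_multitask_dan2(X, Ys)
    print(f"shared layers grown: {len(layers)}")
```

Fitting each layer by linear least squares given a grid-searched frequency keeps every subproblem convex, which is one plausible reading of why such a scheme can match gradient-trained ANNs on speed; the paper itself should be consulted for the authors' actual subroutine.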
Funder
China Scholarship Council
BASF Corporation
Publisher
Springer Science and Business Media LLC
Subject
Artificial Intelligence, Software
Cited by
1 article.