A Mutual Learning Framework for Pruned and Quantized Networks
-
Published:2023-04-03
Issue:1
Volume:23
Page:e01
-
ISSN:1666-6038
-
Container-title:Journal of Computer Science and Technology
-
language:English
-
Short-container-title:JCS&T
Author:
Li Xiaohai, Chen Yiqiang, Wang Jindong
Abstract
Model compression is an important topic in deep learning research. It can be broadly divided into two directions: model pruning and model quantization. However, both methods degrade the original accuracy of the model to some extent. In this paper, we propose a mutual learning framework for pruned and quantized networks. We regard the pruned network and the quantized network as two sets of features that are not parallel. The purpose of our mutual learning framework is to better integrate the two sets of features and achieve complementary advantages, which we call feature augmentation. To verify the effectiveness of our framework, we select pairwise combinations of 3 state-of-the-art pruning algorithms and 3 state-of-the-art quantization algorithms. Extensive experiments on CIFAR-10, CIFAR-100 and Tiny-ImageNet show the benefits of our framework: through the mutual learning of the two networks, we simultaneously obtain a pruned network and a quantized network with higher accuracy.
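The abstract does not spell out the training objective, but a common way to realize mutual learning between two networks is the deep-mutual-learning setup, where each network minimizes its own supervised loss plus a KL term that pulls it toward the other network's predictions. The PyTorch sketch below is a minimal illustration under that assumption only; the names pruned_net, quant_net, and alpha, and the single-step structure, are hypothetical and not taken from the paper.

# Hedged sketch: one mutual-learning step between a pruned network and a
# quantized network (deep-mutual-learning-style losses, assumed here).
import torch
import torch.nn.functional as F

def mutual_learning_step(pruned_net, quant_net, opt_p, opt_q, x, y, alpha=1.0):
    """Each network fits the labels (cross-entropy) and also matches the
    other network's detached prediction distribution (KL term)."""
    logits_p = pruned_net(x)
    logits_q = quant_net(x)

    # Supervised cross-entropy for each network.
    ce_p = F.cross_entropy(logits_p, y)
    ce_q = F.cross_entropy(logits_q, y)

    # Mutual KL terms: each network mimics the other's (detached) outputs.
    kl_p = F.kl_div(F.log_softmax(logits_p, dim=1),
                    F.softmax(logits_q.detach(), dim=1), reduction="batchmean")
    kl_q = F.kl_div(F.log_softmax(logits_q, dim=1),
                    F.softmax(logits_p.detach(), dim=1), reduction="batchmean")

    loss_p = ce_p + alpha * kl_p
    loss_q = ce_q + alpha * kl_q

    # Update the two networks independently.
    opt_p.zero_grad(); loss_p.backward(); opt_p.step()
    opt_q.zero_grad(); loss_q.backward(); opt_q.step()
    return loss_p.item(), loss_q.item()

In the setting described by the abstract, this step would be iterated over batches of CIFAR-10, CIFAR-100, or Tiny-ImageNet, with the pruning and quantization methods already applied to the respective networks.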
Publisher
Universidad Nacional de La Plata
Subject
Artificial Intelligence, Computer Science Applications, Computer Vision and Pattern Recognition, Hardware and Architecture, Computer Science (miscellaneous), Software