Abstract
In practical applications, the depth and parameter count of person re-identification (re-ID) models pose great challenges to their deployment on devices with limited computing capability, such as cloud desktops, mobile terminals, and embedded terminals. To address this problem, this paper proposes a model compression method based on a depthwise separable convolutional network, with knowledge distillation as its core idea. Knowledge distillation transfers knowledge from a complex model, called the teacher model, to a simple model, called the student model. This paper proposes ResNet18 based on depthwise separable convolution (ResNet18-DSC), using ResNet50 as the teacher network and ResNet18-DSC as the student network. To narrow the performance gap between the student and teacher networks, a KL divergence loss function is used to pull the soft-label distribution of the student network toward that of the teacher network. With only a slight decrease in recognition rate, this method reduces the number of parameters by about 20 times and increases the computation speed by about 20 times.
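As a brief illustration of why depthwise separable convolution shrinks the parameter count (the notation here is ours, not from the abstract): a standard $K \times K$ convolution with $C_{\mathrm{in}}$ input and $C_{\mathrm{out}}$ output channels has $K^2 C_{\mathrm{in}} C_{\mathrm{out}}$ weights, whereas the depthwise separable version factors it into a per-channel depthwise convolution followed by a $1 \times 1$ pointwise convolution:

$$
\underbrace{K^2 C_{\mathrm{in}}}_{\text{depthwise}} + \underbrace{C_{\mathrm{in}} C_{\mathrm{out}}}_{\text{pointwise}}, \qquad
\frac{K^2 C_{\mathrm{in}} + C_{\mathrm{in}} C_{\mathrm{out}}}{K^2 C_{\mathrm{in}} C_{\mathrm{out}}} = \frac{1}{C_{\mathrm{out}}} + \frac{1}{K^2}.
$$

For the $3 \times 3$ kernels used in ResNet, each replaced convolution therefore needs roughly $1/9$ of the original parameters when $C_{\mathrm{out}}$ is large, which is where the overall compression comes from.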
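A minimal PyTorch-style sketch of the distillation objective described above, assuming a standard temperature-scaled formulation (the function name, temperature T, and weighting alpha are illustrative assumptions; the abstract specifies only the KL divergence term on soft labels):

import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    # Soften both output distributions with temperature T before comparing.
    soft_student = F.log_softmax(student_logits / T, dim=1)
    soft_teacher = F.softmax(teacher_logits / T, dim=1)
    # KL divergence pulls the student's soft-label distribution toward the
    # teacher's; the T**2 factor keeps gradient magnitudes comparable
    # across temperatures.
    kd = F.kl_div(soft_student, soft_teacher, reduction="batchmean") * (T * T)
    # Standard cross-entropy on the ground-truth identity labels.
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1.0 - alpha) * ce

During training, teacher_logits would come from a frozen ResNet50 teacher and student_logits from the ResNet18-DSC student on the same batch of re-ID images.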