1. Sau et al. Deep model compression: Distilling knowledge from noisy teachers. arXiv preprint, 2016.
2. Hinton et al. Distilling the knowledge in a neural network. arXiv preprint, 2015.
3. Sandler et al. MobileNetV2: Inverted residuals and linear bottlenecks. CVPR, 2018.
4. Howard et al. MobileNets: Efficient convolutional neural networks for mobile vision applications. arXiv preprint, 2017.
5. Wu et al. BlockDrop: Dynamic inference paths in residual networks. CVPR, 2018.