Abstract
Generative adversarial networks (GANs) have shown promise in small-sample fault diagnosis. However, generating synthetic data with GANs is time-consuming, and synthetic data cannot fully replace real data. To accelerate GAN-based fault diagnosis, this paper proposes a hybrid lightweight method for compressing GAN parameters. First, following the knowledge distillation GAN (KD-GAN) approach, three modules are constructed: a teacher generator, a teacher discriminator, and a student generator. Distillation is applied between the teacher and student generators, while adversarial training is conducted between the teacher generator and the teacher discriminator. Furthermore, a joint loss function that combines the distillation loss and the adversarial loss is proposed to update the parameters of the student generator. Additionally, the proposed KD-GAN method is combined with deep transfer learning (DTL), which leverages real data to enhance the diagnostic model's performance. Two numerical experiments demonstrate that the proposed KD-GAN-DTL method outperforms other GAN-based fault diagnosis methods in both computational time and diagnostic accuracy.
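The joint loss described above can be sketched as a weighted sum of a distillation term (student generator output vs. teacher generator output) and an adversarial term (discriminator score on the student's samples). The following minimal sketch is illustrative only: the function names, the MSE/non-saturating loss choices, and the weight `alpha` are assumptions, not the paper's exact formulation.

```python
import math

def distillation_loss(teacher_out, student_out):
    """Mean squared error between teacher and student generator outputs."""
    return sum((t - s) ** 2 for t, s in zip(teacher_out, student_out)) / len(teacher_out)

def adversarial_loss(disc_scores):
    """Non-saturating generator loss, -log D(G_s(z)), averaged over a batch."""
    return -sum(math.log(p) for p in disc_scores) / len(disc_scores)

def joint_loss(teacher_out, student_out, disc_scores, alpha=0.5):
    """Weighted combination used to update the student generator.

    `alpha` balances imitating the teacher against fooling the
    discriminator; its value here is an illustrative assumption.
    """
    return alpha * distillation_loss(teacher_out, student_out) + \
           (1 - alpha) * adversarial_loss(disc_scores)

# Toy batch with scalar "samples" for demonstration
teacher = [0.9, 0.1, 0.5]
student = [0.8, 0.2, 0.4]
scores = [0.7, 0.6, 0.8]  # discriminator outputs for the student's samples
loss = joint_loss(teacher, student, scores, alpha=0.5)
```

In practice both terms would be computed on tensors within a deep learning framework; the scalar version above only shows how the two losses are combined into a single update signal for the student generator.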
Funder
Hubei Natural Science Foundation Youth Program
National Natural Science Foundation of China
Hubei Natural Science Foundation Innovation Development Joint Key Program
Wuhan Key Research and Development Plan Artificial Intelligence Innovation Special Program
Hubei Natural Science Foundation Innovation Group Program
Subject
Applied Mathematics, Instrumentation, Engineering (miscellaneous)
Cited by
1 article.