Publication | Closed Access
Deep Multilayer Sparse Regularization Time-Varying Transfer Learning Networks With Dynamic Kullback–Leibler Divergence Weights for Mechanical Fault Diagnosis
Citations: 31 · References: 27 · Year: 2024
Rotating machinery is widely used in industrial production, and its reliable operation is crucial for ensuring production safety and efficiency. Mechanical equipment often operates under variable speeds, yet existing research pays little attention to domain-adaptive and cross-device diagnostic tasks under time-varying conditions. To fill this research gap and address the severe domain shift that arises in cross-device fault diagnosis under time-varying speeds, this article proposes a deep multilayer sparse regularization time-varying transfer learning network (DMsrTTLN) with dynamic Kullback–Leibler divergence weights (DKLDW). The main contributions and innovations of the DMsrTTLN are as follows: first, a multilayer sparse regularization module that effectively suppresses the influence of speed fluctuations; second, an amplitude activation function that enhances the separability of data with different labels; third, a kurtosis maximum mean discrepancy, in which the Gaussian kernel adapts to the kurtosis of the data to strengthen domain adaptation; and finally, the DKLDW mechanism, which dynamically balances the distance and adversarial metrics to improve model convergence and stability. The DMsrTTLN model with DKLDW exhibits strong generalization performance in cross-device domain shift scenarios. Experimental validation in same-device and cross-device scenarios is performed on three mechanical machines under time-varying speeds, and the results are compared with those of six state-of-the-art approaches. The results show that the DMsrTTLN converges faster and achieves greater diagnostic accuracy.
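The core distance metric described above, a maximum mean discrepancy (MMD) whose Gaussian kernel is adjusted by the kurtosis of the data, can be illustrated with a minimal sketch. The paper itself is closed access, so the exact adjustment rule is not reproduced here; the bandwidth-scaling function below (`kurtosis_scaled_bandwidth`, widening the kernel for heavier-tailed data) is a hypothetical stand-in for the authors' kurtosis-adaptive mechanism, and all function names are illustrative.

```python
import numpy as np

def gaussian_kernel(x, y, sigma):
    """RBF kernel matrix between the rows of x and y."""
    d2 = (np.sum(x**2, axis=1)[:, None]
          + np.sum(y**2, axis=1)[None, :]
          - 2.0 * x @ y.T)
    return np.exp(-d2 / (2.0 * sigma**2))

def kurtosis_scaled_bandwidth(x, y, base_sigma=1.0):
    """Hypothetical kurtosis-adaptive bandwidth: widen the kernel when the
    pooled source/target samples are more heavy-tailed (higher excess
    kurtosis). This is an assumption, not the paper's exact rule."""
    z = np.vstack([x, y]).ravel()
    m = z.mean()
    excess_kurt = np.mean((z - m)**4) / (np.var(z)**2 + 1e-12) - 3.0
    return base_sigma * (1.0 + np.log1p(max(excess_kurt, 0.0)))

def mmd2(x, y, sigma):
    """Biased estimator of the squared maximum mean discrepancy."""
    return (gaussian_kernel(x, x, sigma).mean()
            + gaussian_kernel(y, y, sigma).mean()
            - 2.0 * gaussian_kernel(x, y, sigma).mean())

# Illustration: MMD is near zero for samples from the same distribution
# and grows when the target domain is shifted.
rng = np.random.default_rng(0)
source = rng.normal(0.0, 1.0, (200, 4))
target_same = rng.normal(0.0, 1.0, (200, 4))
target_shifted = rng.normal(3.0, 1.0, (200, 4))
sigma = kurtosis_scaled_bandwidth(source, target_same)
```

In a domain-adaptation network such as the one described, a term like `mmd2(features_source, features_target, sigma)` would typically be added to the training loss so that the feature extractor is pushed to align the two domains.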