Publication | Closed Access
Co-LDL: A Co-Training-Based Label Distribution Learning Method for Tackling Label Noise
Citations: 39 | References: 43 | Year: 2021
Structured Prediction · Engineering · Machine Learning · Autoencoders · Label Noise · Classification Method · Data Science · Data Mining · Pattern Recognition · Self-supervised Learning · Label Distribution · Semi-supervised Learning · Statistics · Supervised Learning · Automatic Classification · Noisy Data · Computer Science · Deep Learning · Deep Neural Networks · Statistical Inference
The performance of deep neural networks is easily degraded by label noise because of their strong capacity to fit the training data. Treating low-loss instances as clean data is one of the most promising strategies for tackling label noise and has been widely adopted by state-of-the-art methods. However, prior works tend to drop high-loss instances outright, neglecting the valuable information they carry. To address this issue, we propose an end-to-end framework named Co-LDL, which combines the low-loss sample selection strategy with label distribution learning. Specifically, we train two deep neural networks simultaneously and let them exchange useful knowledge by selecting low-loss and high-loss samples for each other. Low-loss samples are used conventionally to update network parameters. In contrast, high-loss samples are trained in a label distribution learning manner to update network parameters and label distributions concurrently. Moreover, we propose a self-supervised module that further boosts model performance by enhancing the learned representations. Comprehensive experiments on both synthetic and real-world noisy datasets demonstrate the superiority of our Co-LDL method over state-of-the-art approaches to learning with noisy labels. The source code and models are available at https://github.com/NUST-Machine-Intelligence-Laboratory/CoLDL.
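The core ideas described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation; the helper names (`split_by_loss`, `update_label_distribution`) and the EMA-style soft-label update are assumptions made for illustration. It shows (1) each network ranking a batch by per-sample loss and handing its low-loss (likely clean) and high-loss selections to the peer network, and (2) a generic label-distribution-learning-style update for high-loss samples instead of discarding them.

```python
# Hypothetical sketch of co-training with small-loss selection and
# label distribution learning; names and update rule are illustrative.

def split_by_loss(losses, keep_ratio):
    """Rank per-sample losses; return (low_loss_idx, high_loss_idx).

    Low-loss samples are treated as likely clean and used for ordinary
    supervised updates; high-loss samples are routed to label
    distribution learning rather than dropped.
    """
    order = sorted(range(len(losses)), key=lambda i: losses[i])
    n_keep = int(len(losses) * keep_ratio)
    return order[:n_keep], order[n_keep:]

def update_label_distribution(dist, prediction, momentum=0.9):
    """EMA-style update of one sample's soft label distribution toward
    the network's current prediction (a common LDL-style rule; the
    exact Co-LDL update may differ)."""
    return [momentum * d + (1 - momentum) * p
            for d, p in zip(dist, prediction)]

# Each network ranks the batch and hands its selection to the peer:
losses_net_a = [0.1, 2.3, 0.4, 1.9, 0.2]  # per-sample losses from network A
losses_net_b = [0.2, 2.1, 0.3, 0.5, 1.8]  # per-sample losses from network B
keep_ratio = 0.6                          # e.g. 1 - estimated noise rate

low_for_b, high_for_b = split_by_loss(losses_net_a, keep_ratio)  # A selects for B
low_for_a, high_for_a = split_by_loss(losses_net_b, keep_ratio)  # B selects for A
```

The high-loss indices would then have their stored label distributions refined with `update_label_distribution` using the peer network's softmax outputs, so both parameters and label distributions are updated concurrently, as the abstract describes.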