This is the official implementation of the paper ECLB: Efficient Contrastive Learning on Bi-level for Noisy Labels.

ECLB

ECLB: Efficient contrastive learning on bi-level for noisy labels, accepted by Knowledge-Based Systems (KBS)

[Paper](https://www.sciencedirect.com/science/article/pii/S0950705124007627)


Abstract: *Since contrastive learning has achieved remarkable success in learning informative representations in both self-supervised and supervised settings, applying contrastive learning to learning with noisy labels has become a growing research consensus. However, under noisy labels, it remains challenging to efficiently exploit informative representations at different levels and to effectively screen reliable positive pairs for optimizing the contrastive learning model. To address these issues, we propose a method named efficient contrastive learning on bi-level for noisy labels (ECLB), which is jointly implemented by both self-supervised and supervised contrastive learning. To exploit the accessible informative representations, we perform contrastive learning at two different levels: (1) the feature level, where feature representations are jointly optimized by supervised and self-supervised feature contrastive losses; and (2) the label level, where feature and label representations are optimized by a label-distribution supervised contrastive loss. Furthermore, to alleviate the impact of noisy labels on the selection of reliable positive pairs in supervised contrastive learning, and to reduce labor cost and computational complexity, we propose an efficient adaptive mask, dynamically generated from a label self-equality mask, a prediction self-equality mask, a label-prediction equality mask, and a feature similarity mask. Extensive experiments show that our proposed method outperforms other state-of-the-art methods in terms of robustness and generalization.*
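
To make the adaptive-mask idea concrete, here is a minimal PyTorch sketch of how the four cues named in the abstract could be combined. The function name, the similarity threshold, and the "all cues must agree" combination rule are illustrative assumptions, not the paper's exact formulation; see the code in this repo for the actual implementation.

```python
# Illustrative sketch only: the exact mask-combination rule in ECLB may differ.
import torch
import torch.nn.functional as F

def adaptive_mask(labels: torch.Tensor,   # (B,) possibly noisy labels
                  preds: torch.Tensor,    # (B,) argmax of model predictions
                  feats: torch.Tensor,    # (B, D) feature embeddings
                  sim_thresh: float = 0.8) -> torch.Tensor:
    # Label self-equality mask: pairs whose given labels agree.
    m_label = labels.unsqueeze(0) == labels.unsqueeze(1)
    # Prediction self-equality mask: pairs whose model predictions agree.
    m_pred = preds.unsqueeze(0) == preds.unsqueeze(1)
    # Label-prediction equality mask: keep samples whose own label and
    # prediction agree (a cheap reliability signal under label noise).
    agree = labels == preds
    m_lp = agree.unsqueeze(0) & agree.unsqueeze(1)
    # Feature similarity mask: pairs with high cosine similarity.
    z = F.normalize(feats, dim=1)
    m_sim = (z @ z.t()) > sim_thresh
    # Assumed rule: a pair counts as a reliable positive only when all cues agree.
    return (m_label & m_pred & m_lp & m_sim).float()
```

A mask like this would then gate the positive-pair terms of the supervised contrastive loss, so that pairs contaminated by noisy labels contribute less to optimization.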


Usage

Training Configuration

  • Change the arguments in train_*.py as needed, e.g. --root, --noise_type, --noise_ratio, --num_calss.
  • Then run the script (see the example below):

python train_*.py
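
A hypothetical invocation for CIFAR-10 with 50% symmetric noise might look like the following; the script name and argument values here are illustrative, and the accepted values are defined by the argparse options in the actual train_*.py:

python train_cifar.py --root ./data --noise_type symmetric --noise_ratio 0.5 --num_calss 10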

Citation

If you find our work useful in your research, please consider citing:

```bibtex
@article{GUAN2024112128,
  title    = {ECLB: Efficient contrastive learning on bi-level for noisy labels},
  journal  = {Knowledge-Based Systems},
  pages    = {112128},
  year     = {2024},
  issn     = {0950-7051},
  doi      = {10.1016/j.knosys.2024.112128},
  url      = {https://www.sciencedirect.com/science/article/pii/S0950705124007627},
  author   = {Juwei Guan and Jiaxiang Liu and Shuying Huang and Yong Yang},
  keywords = {Noisy labels, Label distribution contrastive learning, Feature contrastive learning, Adaptive mask}
}
```

Contact

If you have any questions, please feel free to contact me via email at [email protected] or [email protected].
