
Commit

Merge pull request #496 from yoshitomo-matsubara/dev
Update news
yoshitomo-matsubara authored Jan 10, 2025
2 parents c3c04bd + f752571 commit cafdae4
Showing 1 changed file with 2 additions and 1 deletion.
README.md: 3 changes (2 additions & 1 deletion)
@@ -7,7 +7,6 @@
 [![DOI:10.1007/978-3-030-76423-4_3](https://zenodo.org/badge/DOI/10.1007/978-3-030-76423-4_3.svg)](https://doi.org/10.1007/978-3-030-76423-4_3)
 [![DOI:10.18653/v1/2023.nlposs-1.18](https://zenodo.org/badge/DOI/10.18653/v1/2023.nlposs-1.18.svg)](https://doi.org/10.18653/v1/2023.nlposs-1.18)
 
-
 ***torchdistill*** (formerly *kdkit*) offers various state-of-the-art knowledge distillation methods
 and enables you to design (new) experiments simply by editing a declarative yaml config file instead of Python code.
 Even when you need to extract intermediate representations in teacher/student models,
@@ -19,6 +18,8 @@ In addition to knowledge distillation, this framework helps you design and perfo
 simply by excluding teacher entries from a declarative yaml config file.
 You can find such examples below and in [configs/sample/](https://github.com/yoshitomo-matsubara/torchdistill/tree/main/configs/sample/).
 
+In December 2023, ***torchdistill*** officially joined [PyTorch Ecosystem](https://pytorch.org/ecosystem/).
+
 When you refer to ***torchdistill*** in your paper, please cite [these papers](https://github.com/yoshitomo-matsubara/torchdistill#citation)
 instead of this GitHub repository.
 **If you use** ***torchdistill*** **as part of your work, your citation is appreciated and motivates me to maintain and upgrade this framework!**
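The README excerpt above centers on designing experiments through a declarative yaml config rather than Python code. As a rough, hypothetical sketch of that idea only (the config keys `teacher_model`/`student_model` and the `build_model` helper below are illustrative assumptions, not torchdistill's actual schema or API), a training script can build its models from a config instead of hard-coding them:

```python
# Hypothetical sketch of config-driven experiment design (NOT torchdistill's
# actual schema/API): model choices live in a yaml config, and the script
# only interprets the config instead of hard-coding the experiment.
import yaml
from torchvision import models

CONFIG = """
teacher_model:      # omit this entry to fall back to standard (non-KD) training
  name: resnet50
  pretrained: true
student_model:
  name: resnet18
  pretrained: false
"""

def build_model(entry):
    # Look up a torchvision model constructor by name from the config entry.
    factory = getattr(models, entry['name'])
    return factory(weights='DEFAULT' if entry.get('pretrained') else None)

config = yaml.safe_load(CONFIG)
student = build_model(config['student_model'])
teacher = build_model(config['teacher_model']) if 'teacher_model' in config else None
print(type(student).__name__, type(teacher).__name__ if teacher else 'no teacher')
```

Removing the `teacher_model` entry from such a config reduces the same script to ordinary single-model training, which is the kind of config-only switch the README paragraph describes; torchdistill's real configs (see configs/sample/ linked above) are richer and also declare datasets, losses, forward hooks, and training schedules.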
