From f752571829f3ce5ff30460324eb284272703ed52 Mon Sep 17 00:00:00 2001
From: Yoshitomo Matsubara
Date: Fri, 10 Jan 2025 14:37:58 -0800
Subject: [PATCH] Update news

---
 README.md | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)

diff --git a/README.md b/README.md
index 7d0b49a8..c4044826 100644
--- a/README.md
+++ b/README.md
@@ -7,7 +7,6 @@
 [![DOI:10.1007/978-3-030-76423-4_3](https://zenodo.org/badge/DOI/10.1007/978-3-030-76423-4_3.svg)](https://doi.org/10.1007/978-3-030-76423-4_3)
 [![DOI:10.18653/v1/2023.nlposs-1.18](https://zenodo.org/badge/DOI/10.18653/v1/2023.nlposs-1.18.svg)](https://doi.org/10.18653/v1/2023.nlposs-1.18)
 
-
 ***torchdistill*** (formerly *kdkit*) offers various state-of-the-art knowledge distillation methods
 and enables you to design (new) experiments simply by editing a declarative yaml config file instead of Python code.
 Even when you need to extract intermediate representations in teacher/student models,
@@ -19,6 +18,8 @@ In addition to knowledge distillation, this framework helps you design and perfo
 simply by excluding teacher entries from a declarative yaml config file.
 You can find such examples below and in [configs/sample/](https://github.com/yoshitomo-matsubara/torchdistill/tree/main/configs/sample/).
 
+In December 2023, ***torchdistill*** officially joined [PyTorch Ecosystem](https://pytorch.org/ecosystem/).
+
 When you refer to ***torchdistill*** in your paper, please cite [these papers](https://github.com/yoshitomo-matsubara/torchdistill#citation)
 instead of this GitHub repository.
 **If you use** ***torchdistill*** **as part of your work, your citation is appreciated and motivates me to maintain and upgrade this framework!**