Merge pull request #408 from yoshitomo-matsubara/dev
Update README and documentation
yoshitomo-matsubara authored Oct 27, 2023
2 parents bc8ab5e + 4254ad8 commit b465626
Showing 5 changed files with 52 additions and 9 deletions.
22 changes: 17 additions & 5 deletions README.md
@@ -11,14 +11,14 @@
and enables you to design (new) experiments simply by editing a declarative yaml config file instead of Python code.
Even when you need to extract intermediate representations in teacher/student models,
you will **NOT** need to reimplement the models, which often requires changing the forward interface, but can instead
specify the module path(s) in the yaml file. Refer to [this paper](https://github.com/yoshitomo-matsubara/torchdistill#citation) for more details.
specify the module path(s) in the yaml file. Refer to [these papers](https://github.com/yoshitomo-matsubara/torchdistill#citation) for more details.

In addition to knowledge distillation, this framework helps you design and perform general deep learning experiments
(**WITHOUT coding**) for reproducible deep learning studies. i.e., it enables you to train models without teachers
simply by excluding teacher entries from a declarative yaml config file.
You can find such examples below and in [configs/sample/](https://github.com/yoshitomo-matsubara/torchdistill/tree/main/configs/sample/).
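
To make the declarative style above concrete, here is a minimal, hypothetical sketch of such a config. The key names (`models`, `teacher_model`, `student_model`, `train`, `forward_hook`) are illustrative assumptions based on the description above, not a guaranteed match for the current torchdistill schema; the files in [configs/sample/](https://github.com/yoshitomo-matsubara/torchdistill/tree/main/configs/sample/) are the authoritative reference.

```yaml
# Hypothetical, simplified sketch of a declarative experiment config.
# Key names are illustrative assumptions; see configs/sample/ for real, working configs.
models:
  teacher_model:
    name: 'resnet34'        # model identifier resolved by the framework
    params:
      num_classes: 100
      pretrained: True
  student_model:
    name: 'resnet18'
    params:
      num_classes: 100
      pretrained: False

train:
  teacher:
    # Module paths whose intermediate outputs should be captured via hooks,
    # instead of reimplementing forward() to return them.
    forward_hook:
      output: ['layer3', 'layer4']
    requires_grad: False
  student:
    forward_hook:
      output: ['layer3', 'layer4']
```

Removing the teacher entries from such a file corresponds to the teacher-free training mentioned above.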

When you refer to ***torchdistill*** in your paper, please cite [this paper](https://github.com/yoshitomo-matsubara/torchdistill#citation)
When you refer to ***torchdistill*** in your paper, please cite [these papers](https://github.com/yoshitomo-matsubara/torchdistill#citation)
instead of this GitHub repository.
**If you use** ***torchdistill*** **as part of your work, your citation is appreciated and motivates me to maintain and upgrade this framework!**

@@ -127,12 +127,12 @@ pipenv install "-e ."
```

## Issues / Questions / Requests
The documentation is work-in-progress. In the meantime, feel free to create an issue if you find a bug.
Feel free to create an issue if you find a bug.
If you have either a question or feature request, start a new discussion [here](https://github.com/yoshitomo-matsubara/torchdistill/discussions).
Please make sure the issue/question/request has not been addressed yet by searching through the issues and discussions.

## Citation
If you use ***torchdistill*** in your research, please cite the following paper.
If you use ***torchdistill*** in your research, please cite the following papers:
[[Paper](https://link.springer.com/chapter/10.1007/978-3-030-76423-4_3)] [[Preprint](https://arxiv.org/abs/2011.12913)]
```bibtex
@inproceedings{matsubara2021torchdistill,
@@ -145,12 +145,24 @@
}
```

[[Preprint](https://arxiv.org/abs/2310.17644)]
```bibtex
@article{matsubara2023torchdistill,
title={{torchdistill Meets Hugging Face Libraries for Reproducible, Coding-Free Deep Learning Studies: A Case Study on NLP}},
author={Matsubara, Yoshitomo},
journal={arXiv preprint arXiv:2310.17644},
year={2023}
}
```

## Acknowledgments

Since June 2022, this project has been supported by [JetBrains Free License Programs (Open Source)](https://www.jetbrains.com/community/opensource/?utm_campaign=opensource&utm_content=approved&utm_medium=email&utm_source=newsletter&utm_term=jblogo#support).
This project has been supported by Travis CI's OSS credits and [JetBrains Free License Programs (Open Source)](https://www.jetbrains.com/community/opensource/?utm_campaign=opensource&utm_content=approved&utm_medium=email&utm_source=newsletter&utm_term=jblogo#support)
since November 2021 and June 2022, respectively.
![PyCharm logo](https://resources.jetbrains.com/storage/products/company/brand/logos/PyCharm.svg)



## References
- [:mag:](https://github.com/yoshitomo-matsubara/torchdistill/tree/main/examples/image_classification.py) [pytorch/vision/references/classification/](https://github.com/pytorch/vision/blob/main/references/classification/)
- [:mag:](https://github.com/yoshitomo-matsubara/torchdistill/tree/main/examples/object_detection.py) [pytorch/vision/references/detection/](https://github.com/pytorch/vision/tree/main/references/detection/)
1 change: 1 addition & 0 deletions docs/source/benchmarks.rst
@@ -45,3 +45,4 @@ Original work
References
^^^^
* Yoshitomo Matsubara: `"torchdistill: A Modular, Configuration-Driven Framework for Knowledge Distillation" <https://link.springer.com/chapter/10.1007/978-3-030-76423-4_3>`_
* Yoshitomo Matsubara: `"torchdistill Meets Hugging Face Libraries for Reproducible, Coding-Free Deep Learning Studies: A Case Study on NLP" <https://arxiv.org/abs/2310.17644>`_
9 changes: 8 additions & 1 deletion docs/source/index.rst
@@ -21,7 +21,7 @@ In addition to knowledge distillation, this framework helps you design and perfo
simply by excluding teacher entries from a declarative yaml config file.
You can find such examples in `configs/sample/ of the official repository <https://github.com/yoshitomo-matsubara/torchdistill/tree/main/configs/sample/>`_.

When you refer to **torchdistill** in your paper, please cite `this paper <https://github.com/yoshitomo-matsubara/torchdistill#citation>`_
When you refer to **torchdistill** in your paper, please cite `these papers <https://github.com/yoshitomo-matsubara/torchdistill#citation>`_
instead of this GitHub repository.
**If you use torchdistill as part of your work, your citation is appreciated and motivates me to maintain and upgrade this framework!**

@@ -53,6 +53,13 @@ References
organization={Springer}
}
@article{matsubara2023torchdistill,
title={{torchdistill Meets Hugging Face Libraries for Reproducible, Coding-Free Deep Learning Studies: A Case Study on NLP}},
author={Matsubara, Yoshitomo},
journal={arXiv preprint arXiv:2310.17644},
year={2023}
}
Questions / Requests
==================
25 changes: 22 additions & 3 deletions docs/source/projects.rst
Expand Up @@ -27,9 +27,28 @@ It is pip-installable and published as a PyPI package i.e., you can install it b
Papers
*****

torchdistill Meets Hugging Face Libraries for Reproducible, Coding-Free Deep Learning Studies: A Case Study on NLP
----
* Author(s): Yoshitomo Matsubara
* Venue: EMNLP 2023 Workshop for Natural Language Processing Open Source Software (NLP-OSS)
* PDF: `Preprint <https://arxiv.org/abs/2310.17644>`_
* Code: `GitHub <https://github.com/yoshitomo-matsubara/torchdistill>`_

**Abstract**: Reproducibility in scientific work has been becoming increasingly important in research communities
such as machine learning, natural language processing, and computer vision communities due to the rapid development of
the research domains supported by recent advances in deep learning. In this work, we present a significantly upgraded
version of torchdistill, a modular-driven coding-free deep learning framework significantly upgraded from the initial
release, which supports only image classification and object detection tasks for reproducible knowledge distillation
experiments. To demonstrate that the upgraded framework can support more tasks with third-party libraries, we reproduce
the GLUE benchmark results of BERT models using a script based on the upgraded torchdistill, harmonizing with various
Hugging Face libraries. All the 27 fine-tuned BERT models and configurations to reproduce the results are published at
Hugging Face, and the model weights have already been widely used in research communities. We also reimplement popular
small-sized models and new knowledge distillation methods and perform additional experiments for computer vision tasks.


SC2 Benchmark: Supervised Compression for Split Computing
----
* Authors: Yoshitomo Matsubara, Ruihan Yang, Marco Levorato, Stephan Mandt
* Author(s): Yoshitomo Matsubara, Ruihan Yang, Marco Levorato, Stephan Mandt
* Venue: TMLR
* PDF: `Paper + Supp <https://openreview.net/forum?id=p28wv4G65d>`_
* Code: `GitHub <https://github.com/yoshitomo-matsubara/sc2-benchmark>`_
@@ -49,7 +68,7 @@ researchers better understand the tradeoffs of supervised compression in split c

Supervised Compression for Resource-Constrained Edge Computing Systems
----
* Authors: Yoshitomo Matsubara, Ruihan Yang, Marco Levorato, Stephan Mandt
* Author(s): Yoshitomo Matsubara, Ruihan Yang, Marco Levorato, Stephan Mandt
* Venue: WACV 2022
* PDF: `Paper + Supp <https://openaccess.thecvf.com/content/WACV2022/html/Matsubara_Supervised_Compression_for_Resource-Constrained_Edge_Computing_Systems_WACV_2022_paper.html>`_
* Code: `GitHub <https://github.com/yoshitomo-matsubara/supervised-compression>`_
@@ -70,7 +89,7 @@ latency. We furthermore show that the learned feature representations can be tun

torchdistill: A Modular, Configuration-Driven Framework for Knowledge Distillation
----
* Author: Yoshitomo Matsubara
* Author(s): Yoshitomo Matsubara
* Venue: ICPR 2020 International Workshop on Reproducible Research in Pattern Recognition
* PDF: `Paper <https://link.springer.com/chapter/10.1007/978-3-030-76423-4_3>`_
* Code: `GitHub <https://github.com/yoshitomo-matsubara/torchdistill>`_
4 changes: 4 additions & 0 deletions docs/source/subpkgs/models.rst
Expand Up @@ -63,6 +63,10 @@ To reproduce the test results for CIFAR datasets, the following repositories wer
- 95.53
- 77.14

These results are reported in the following paper:

* Yoshitomo Matsubara: `"torchdistill Meets Hugging Face Libraries for Reproducible, Coding-Free Deep Learning Studies: A Case Study on NLP" <https://arxiv.org/abs/2310.17644>`_

.. automodule:: torchdistill.models.classification
:members:

