# self-training

Here are 80 public repositories matching this topic...

This repository collects papers for "A Survey on Knowledge Distillation of Large Language Models". We break down KD into Knowledge Elicitation and Distillation Algorithms, and explore the Skill & Vertical Distillation of LLMs.

  • Updated Oct 22, 2024
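For readers new to the topic, the core self-training loop (train on labeled data, pseudo-label the unlabeled points the model is confident about, retrain) can be sketched on a toy 1-D task. Everything below — the threshold classifier, the `self_train` helper, the confidence cutoff — is an illustrative assumption, not code from any repository listed on this page.

```python
# Minimal self-training (pseudo-labeling) sketch on a toy 1-D problem.
# All function names and parameters here are hypothetical illustrations.
import math

def train_threshold(points):
    """Fit a trivial 1-D classifier: the midpoint between class means."""
    xs0 = [x for x, y in points if y == 0]
    xs1 = [x for x, y in points if y == 1]
    return (sum(xs0) / len(xs0) + sum(xs1) / len(xs1)) / 2

def predict_proba(threshold, x):
    """Confidence that x is class 1, from its signed distance to the threshold."""
    return 1 / (1 + math.exp(-(x - threshold)))

def self_train(labeled, unlabeled, rounds=3, conf=0.9):
    """Repeatedly retrain, absorbing only high-confidence pseudo-labels."""
    data = list(labeled)
    pool = list(unlabeled)
    for _ in range(rounds):
        t = train_threshold(data)
        keep = []
        for x in pool:
            p = predict_proba(t, x)
            if p >= conf:
                data.append((x, 1))   # confident pseudo-label: class 1
            elif p <= 1 - conf:
                data.append((x, 0))   # confident pseudo-label: class 0
            else:
                keep.append(x)        # not confident enough; retry next round
        pool = keep
    return train_threshold(data), data

labeled = [(0.0, 0), (1.0, 0), (9.0, 1), (10.0, 1)]
unlabeled = [0.5, 1.5, 8.5, 9.5, 5.2]
threshold, augmented = self_train(labeled, unlabeled)
print(round(threshold, 2), len(augmented))
```

The confidence cutoff is what keeps the loop from reinforcing its own mistakes: the ambiguous point at 5.2 is never pseudo-labeled, while the four points near the class centers are absorbed into the training set.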
