diff --git a/docs/blog/posts/distilation-part1.md b/docs/blog/posts/distilation-part1.md
index 04c801001..3a0184d10 100644
--- a/docs/blog/posts/distilation-part1.md
+++ b/docs/blog/posts/distilation-part1.md
@@ -14,6 +14,8 @@ tags:
 
 Get ready to dive deep into the world of fine-tuning task-specific language models with Python functions. We'll explore how `instructor.instructions` streamlines this process, making the task you want to distil more efficient and powerful while preserving its original functionality and backwards compatibility.
 
+If you want to see the full example, check out [examples/distillation](https://github.com/jxnl/instructor/tree/main/examples/distilations).
+
 ## Why You Need Instructor
 
 Imagine you're developing a backend service that uses a mix of old and new school ML practices. It may involve pipelines with multiple function calls, validations, and data processing. Sounds cumbersome, right? That's where `Instructor` comes in. It simplifies complex procedures, making them more efficient and easier to manage by adding a decorator to your function that automatically generates a dataset for fine-tuning and helps you swap out the function implementation.
diff --git a/examples/distilations/readme.md b/examples/distilations/readme.md
new file mode 100644
index 000000000..4c7583446
--- /dev/null
+++ b/examples/distilations/readme.md
@@ -0,0 +1,41 @@
+# What to Expect
+
+This script demonstrates how to use the `Instructor` library to fine-tune a Python function that performs three-digit multiplication. It uses Pydantic for type validation and logging to generate a fine-tuning dataset.
+
+## How to Run
+
+### Prerequisites
+
+- Python 3.9
+- `Instructor` library
+
+### Steps
+
+1. **Install Dependencies**
+   If you haven't already installed the required libraries, you can do so using pip:
+   ```
+   pip install instructor pydantic
+   ```
+
+2. **Set Up Logging**
+   The script uses Python's built-in `logging` module to log the fine-tuning process.
+   Ensure you have write permissions in the directory where the log file `math_finetunes.jsonl` will be saved.
+
+3. **Run the Script**
+   Navigate to the directory containing `three_digit_mul.py` and run it:
+   ```
+   python three_digit_mul.py
+   ```
+
+   This executes the function ten times with random three-digit numbers for multiplication. The function outputs and logs are saved in `math_finetunes.jsonl`.
+
+4. **Fine-Tuning**
+   Once you have the log file, you can start a fine-tuning job with the following `Instructor` CLI command:
+   ```
+   instructor jobs create-from-file math_finetunes.jsonl
+   ```
+   Wait for the fine-tuning job to complete.
+
+### Output
+
+That's it! You've successfully run the script and can now proceed to fine-tune your model.
+
+### Dispatch
+
+Once you have the fine-tuned model, replace the model name in `three_digit_mul_dispatch.py` and run the script again. This time, it will use the fine-tuned model to predict the output of the function.
\ No newline at end of file
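
The decorator-plus-JSONL-log pattern the README walks through can be sketched in plain Python. This is a hand-rolled illustration of the concept only, not `Instructor`'s actual API: the decorator name `distil_to`, the `three_digit_multiply` function body, and the `messages` record layout are all assumptions made for the sketch.

```python
import json
import random

LOG_PATH = "math_finetunes.jsonl"


def distil_to(path):
    """Record every call to the wrapped function as a JSONL fine-tuning example."""

    def decorator(fn):
        def wrapper(*args, **kwargs):
            result = fn(*args, **kwargs)
            # One chat-style example per call: the invocation as the user turn,
            # the function's output as the assistant turn.
            example = {
                "messages": [
                    {"role": "user", "content": f"{fn.__name__}(*{args}, **{kwargs})"},
                    {"role": "assistant", "content": json.dumps(result)},
                ]
            }
            with open(path, "a") as f:
                f.write(json.dumps(example) + "\n")
            return result

        return wrapper

    return decorator


@distil_to(LOG_PATH)
def three_digit_multiply(a: int, b: int) -> dict:
    return {"a": a, "b": b, "result": a * b}


if __name__ == "__main__":
    # Mirror the README: ten calls with random three-digit operands.
    for _ in range(10):
        three_digit_multiply(random.randint(100, 999), random.randint(100, 999))
```

Each run appends examples to `math_finetunes.jsonl`, which is then the file you hand to `instructor jobs create-from-file math_finetunes.jsonl`.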