From 11b1116503430f15d860bc037f6327ce6d852de0 Mon Sep 17 00:00:00 2001
From: zhouyu5
Date: Fri, 3 Nov 2023 15:55:45 +0000
Subject: [PATCH] update readme

---
 e2eAIOK/deltatuner/README.md | 2 ++
 1 file changed, 2 insertions(+)

diff --git a/e2eAIOK/deltatuner/README.md b/e2eAIOK/deltatuner/README.md
index 394d436d8..39f8ce2c2 100644
--- a/e2eAIOK/deltatuner/README.md
+++ b/e2eAIOK/deltatuner/README.md
@@ -68,6 +68,7 @@ def optimize(model, tokenizer, algo: str="auto", deltatuning_args: DeltaTunerArg
 - "auto" – If the input model is mpt, the algorithm is ssf; elif the algorithm is lora
 - "lora" – use the lora algotihm
 - "ssf" – use the ssf algotithm
+ - deltatuning_args.best_model_structure – Specifies the pre-searched best delta structure, so the model can be initialized directly without searching.
 - kwargs - used to initilize deltatuning_args through key=value, such as algo="lora"
 Return
 DeltaTunerModel - a wrapper of model, which composed of the original properties/function together with adavance properties/function provided by deltatuner
@@ -80,6 +81,7 @@ def optimize(model, tokenizer, algo: str="auto", deltatuning_args: DeltaTunerArg
 Please refer to [example page](https://github.com/intel/e2eAIOK/tree/main/example) for more use cases on fine-tuning other LLMs with the help of DeltaTuner.
 
 ## Model supported matrix
+We have uploaded the searched best delta structures to the [conf dir](https://github.com/intel/e2eAIOK/tree/main/e2eAIOK/deltatuner/deltatuner/conf/best_structure), so users can fine-tune directly with our searched structures by passing `DeltaTunerArguments.best_model_structure` to the `deltatuner.optimize` function.
 
 ### Causal Language Modeling
 