From 8129479f345d762dba710f652629950330073617 Mon Sep 17 00:00:00 2001
From: leonardozcm
Date: Wed, 10 Jul 2024 09:50:01 +0800
Subject: [PATCH] baichuan2

---
 python/llm/example/NPU/HF-Transformers-AutoModels/LLM/README.md | 1 +
 1 file changed, 1 insertion(+)

diff --git a/python/llm/example/NPU/HF-Transformers-AutoModels/LLM/README.md b/python/llm/example/NPU/HF-Transformers-AutoModels/LLM/README.md
index 57b5a1f33d5..3e0b108151b 100644
--- a/python/llm/example/NPU/HF-Transformers-AutoModels/LLM/README.md
+++ b/python/llm/example/NPU/HF-Transformers-AutoModels/LLM/README.md
@@ -12,6 +12,7 @@ In this directory, you will find examples on how you could apply IPEX-LLM INT4 o
 | MiniCPM | [openbmb/MiniCPM-2B-sft-bf16](https://huggingface.co/openbmb/MiniCPM-2B-sft-bf16) |
 | Phi-3 | [microsoft/Phi-3-mini-4k-instruct](https://huggingface.co/microsoft/Phi-3-mini-4k-instruct) |
 | Stablelm | [stabilityai/stablelm-zephyr-3b](https://huggingface.co/stabilityai/stablelm-zephyr-3b) |
+| Baichuan2 | [baichuan-inc/Baichuan2-7B-Chat](https://huggingface.co/baichuan-inc/Baichuan2-7B-Chat) |
 
 ## 0. Requirements
 To run these examples with IPEX-LLM on Intel NPUs, make sure to install the newest driver version of Intel NPU.