
Commit
fix
JinBridger committed Jun 12, 2024
1 parent 222e966 commit 9fbcf97
Showing 2 changed files with 1 addition and 4 deletions.
@@ -31,7 +31,6 @@ conda activate llm
pip install --pre --upgrade ipex-llm[all]
# install tiktoken required for GLM-4
pip install tiktoken
```

@@ -122,7 +121,6 @@ conda activate llm
pip install --pre --upgrade ipex-llm[all]
# install tiktoken required for GLM-4
pip install tiktoken
```

3 changes: 1 addition & 2 deletions python/llm/example/CPU/PyTorch-Models/Model/glm4/README.md
@@ -32,7 +32,6 @@ conda activate llm
pip install --pre --upgrade ipex-llm[all]
# install tiktoken required for GLM-4
pip install tiktoken
```

@@ -68,7 +67,7 @@ In the example, several arguments can be passed to satisfy your requirements:
- `--n-predict`: int, argument defining the max number of tokens to predict. The default value is `32`.
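The `--n-predict` handling described above can be sketched with Python's `argparse`; only the flag name and its default of `32` come from the README, so the parser description and help text here are assumptions, not the example script's actual code:

```python
import argparse

# Minimal sketch of the example's argument parsing. Only --n-predict
# (default 32) is documented in the README; everything else is assumed.
def build_parser():
    parser = argparse.ArgumentParser(description="GLM-4 example arguments")
    parser.add_argument(
        "--n-predict", type=int, default=32,
        help="max number of tokens to predict (default: 32)",
    )
    return parser

# argparse exposes --n-predict as the attribute n_predict
args = build_parser().parse_args([])  # no flags: falls back to the default
print(args.n_predict)                 # 32
args = build_parser().parse_args(["--n-predict", "64"])
print(args.n_predict)                 # 64
```

Passing the flag overrides the default, so a longer generation only requires `--n-predict 64` on the command line.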

#### 2.4 Sample Output
- #### [THUDM/glm-4-9b-chat](https://huggingface.co/THUDM/glm-4-9b-chat)
+ ##### [THUDM/glm-4-9b-chat](https://huggingface.co/THUDM/glm-4-9b-chat)
```log
Inference time: xxxx s
-------------------- Output --------------------

0 comments on commit 9fbcf97