From 105ceca64d0bfbdde85e48fd9a531318d8419983 Mon Sep 17 00:00:00 2001
From: garyzhang99
Date: Fri, 17 May 2024 14:24:49 +0800
Subject: [PATCH] fix precommit

---
 docs/sphinx_doc/en/source/tutorial/206-prompt.md | 8 ++++----
 1 file changed, 4 insertions(+), 4 deletions(-)

diff --git a/docs/sphinx_doc/en/source/tutorial/206-prompt.md b/docs/sphinx_doc/en/source/tutorial/206-prompt.md
index 552d70406..bf7273bf5 100644
--- a/docs/sphinx_doc/en/source/tutorial/206-prompt.md
+++ b/docs/sphinx_doc/en/source/tutorial/206-prompt.md
@@ -306,12 +306,12 @@ print(prompt)
 
 `LiteLLMChatWrapper` encapsulates the litellm chat API, which takes a list of
 messages as input. The litellm support different types of models, and each model
 might need to obey different formats. To simplify the usage, we provide a format
-that could be compatible with most models. If more specifical formats are needed,
-you can refer to the specifical model you use as weel as the
-[litellm](https://github.com/BerriAI/litellm) documentation to customize your
+that could be compatible with most models. If more specific formats are needed,
+you can refer to the specific model you use as well as the
+[litellm](https://github.com/BerriAI/litellm) documentation to customize your
 own format function for your model.
 
-- format all the messages in the chat history, into a single message with `"user"` as `role`
+- format all the messages in the chat history into a single message with `"user"` as `role`
 
 #### Prompt Strategy
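The docs text being patched describes a formatting strategy: collapse the whole chat history into a single message whose `role` is `"user"`. A minimal sketch of that idea is shown below; the function name `format_as_single_user_message` and the message-dict layout (`name`/`role`/`content` keys) are illustrative assumptions, not AgentScope's or litellm's actual implementation.

```python
def format_as_single_user_message(messages: list[dict]) -> list[dict]:
    """Collapse a chat history into one message with role "user".

    Each input message is assumed (hypothetically) to be a dict with
    "name", "role", and "content" keys; the output is a one-element
    list suitable for chat APIs that accept a simple user prompt.
    """
    lines = []
    for msg in messages:
        # Prefix each utterance with the speaker's name so the model
        # can still tell the turns apart inside the merged prompt.
        lines.append(f"{msg['name']}: {msg['content']}")
    return [{"role": "user", "content": "\n".join(lines)}]


# Example usage with a small made-up history:
history = [
    {"name": "system", "role": "system", "content": "You are a helpful assistant."},
    {"name": "Alice", "role": "user", "content": "Hi!"},
]
print(format_as_single_user_message(history))
```

A merged prompt like this trades away per-role structure for broad compatibility, which matches the patched text's point that one generic format can serve most litellm-backed models.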