From a4bc2186fd3b1f2fc724a3713d7f4af6e855f3c4 Mon Sep 17 00:00:00 2001
From: wenhao
Date: Thu, 22 Feb 2024 14:51:03 +0800
Subject: [PATCH 1/6] fix minor example bug

---
 docs/sphinx_doc/source/tutorial/103-example.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/sphinx_doc/source/tutorial/103-example.md b/docs/sphinx_doc/source/tutorial/103-example.md
index 068b42ab7..08ead7cf5 100644
--- a/docs/sphinx_doc/source/tutorial/103-example.md
+++ b/docs/sphinx_doc/source/tutorial/103-example.md
@@ -52,7 +52,7 @@ from agentscope.agents import DialogAgent, UserAgent
 
 agentscope.init(model_configs="./openai_model_configs.json")
 
 # Create a dialog agent and a user agent
-dialogAgent = DialogAgent(name="assistant", model_config_name="gpt-4")
+dialogAgent = DialogAgent(name="assistant", model_config_name="gpt-4", sys_prompt="You are a helpful ai assistant")
 userAgent = UserAgent()
 ```

From b9f381d23a13e981fd0873b6adc4871a58ebd01e Mon Sep 17 00:00:00 2001
From: wenhao
Date: Thu, 22 Feb 2024 16:52:19 +0800
Subject: [PATCH 2/6] fix single typo

---
 docs/sphinx_doc/source/tutorial/201-agent.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/sphinx_doc/source/tutorial/201-agent.md b/docs/sphinx_doc/source/tutorial/201-agent.md
index f4d513e53..76b6400b6 100644
--- a/docs/sphinx_doc/source/tutorial/201-agent.md
+++ b/docs/sphinx_doc/source/tutorial/201-agent.md
@@ -2,7 +2,7 @@
 
 # Customizing Your Own Agent
 
-This tutorial helps you to understand the `Agent` in mode depth and navigate through the process of crafting your own custom agent with AgentScope. We start by introducing the fundamental abstraction called `AgentBase`, which serves as the base class to maintain the general behaviors of all agents. Then, we will go through the *AgentPool*, an ensemble of pre-built, specialized agents, each designed with a specific purpose in mind. Finally, we will demonstrate how to customize your own agent, ensuring it fits the needs of your project.
+This tutorial helps you to understand the `Agent` in more depth and navigate through the process of crafting your own custom agent with AgentScope. We start by introducing the fundamental abstraction called `AgentBase`, which serves as the base class to maintain the general behaviors of all agents. Then, we will go through the *AgentPool*, an ensemble of pre-built, specialized agents, each designed with a specific purpose in mind. Finally, we will demonstrate how to customize your own agent, ensuring it fits the needs of your project.
 
 ## Understanding `AgentBase`

From e88ffad9155fa5e14ef7406cebd33ed980080709 Mon Sep 17 00:00:00 2001
From: wenhao
Date: Thu, 22 Feb 2024 17:31:18 +0800
Subject: [PATCH 3/6] fix single typo in 105-logging.md

---
 docs/sphinx_doc/source/tutorial/105-logging.md | 3 +--
 1 file changed, 1 insertion(+), 2 deletions(-)

diff --git a/docs/sphinx_doc/source/tutorial/105-logging.md b/docs/sphinx_doc/source/tutorial/105-logging.md
index dd6cbbc56..913fc92cc 100644
--- a/docs/sphinx_doc/source/tutorial/105-logging.md
+++ b/docs/sphinx_doc/source/tutorial/105-logging.md
@@ -79,8 +79,7 @@ agentscope.web.init(
 )
 ```
 
-By this way, you can see all the running instances and projects in `http://127.
-0.0.1:5000` as follows:
+By this way, you can see all the running instances and projects in `http://127.0.0.1:5000` as follows:
 
 ![webui](https://img.alicdn.com/imgextra/i3/O1CN01kpHFkn1HpeYEkn60I_!!6000000000807-0-tps-3104-1849.jpg)

From 2d351f25a569c7491189ce64ba0474eb5de6c05d Mon Sep 17 00:00:00 2001
From: garyzhang99
Date: Fri, 17 May 2024 14:16:42 +0800
Subject: [PATCH 4/6] litellm format doc

---
 .../en/source/tutorial/206-prompt.md    | 49 +++++++++++++++++++
 .../zh_CN/source/tutorial/206-prompt.md | 40 +++++++++++++++
 2 files changed, 89 insertions(+)

diff --git a/docs/sphinx_doc/en/source/tutorial/206-prompt.md b/docs/sphinx_doc/en/source/tutorial/206-prompt.md
index e30e8abd8..552d70406 100644
--- a/docs/sphinx_doc/en/source/tutorial/206-prompt.md
+++ b/docs/sphinx_doc/en/source/tutorial/206-prompt.md
@@ -300,6 +300,55 @@ print(prompt)
 ]
 ```
 
+
+### LiteLLMChatWrapper
+
+`LiteLLMChatWrapper` encapsulates the litellm chat API, which takes a list of
+messages as input. The litellm support different types of models, and each model
+might need to obey different formats. To simplify the usage, we provide a format
+that could be compatible with most models. If more specifical formats are needed, 
+you can refer to the specifical model you use as weel as the 
+[litellm](https://github.com/BerriAI/litellm) documentation to customize your 
+own format function for your model.
+
+- format all the messages in the chat history, into a single message with `"user"` as `role` 
+
+#### Prompt Strategy
+
+- Messages will consist dialogue history in the `user` message prefixed by the system message and "## Dialogue History".
+
+```python
+from agentscope.models import LiteLLMChatWrapper
+
+model = LiteLLMChatWrapper(
+    config_name="", # empty since we directly initialize the model wrapper
+    model_name="gpt-3.5-turbo",
+)
+
+prompt = model.format(
+    Msg("system", "You are a helpful assistant", role="system"),
+    [
+        Msg("user", "What is the weather today?", role="user"),
+        Msg("assistant", "It is sunny today", role="assistant"),
+    ],
+)
+
+print(prompt)
+```
+
+```bash
+[
+    {
+        "role": "user",
+        "content": (
+            "You are a helpful assistant\n\n"
+            "## Dialogue History\nuser: What is the weather today?\n"
+            "assistant: It is sunny today"
+        ),
+    },
+]
+```
+
 ### OllamaChatWrapper
 
 `OllamaChatWrapper` encapsulates the Ollama chat API, which takes a list of
diff --git a/docs/sphinx_doc/zh_CN/source/tutorial/206-prompt.md b/docs/sphinx_doc/zh_CN/source/tutorial/206-prompt.md
index 7ed143cfe..c2767d902 100644
--- a/docs/sphinx_doc/zh_CN/source/tutorial/206-prompt.md
+++ b/docs/sphinx_doc/zh_CN/source/tutorial/206-prompt.md
@@ -271,6 +271,46 @@ print(prompt)
 ]
 ```
 
+### LiteLLMChatWrapper
+
+`LiteLLMChatWrapper`封装了litellm聊天API,它接受消息列表作为输入。Litellm支持不同类型的模型,每个模型可能需要遵守不同的格式。为了简化使用,我们提供了一种与大多数模型兼容的格式。如果需要更特定的格式,您可以参考您所使用的特定模型以及[litellm](https://github.com/BerriAI/litellm)文档,来定制适合您模型的格式函数。
+- 格式化聊天历史中的所有消息,将其整合成一个以`"user"`作为`role`的单一消息
+#### 提示策略
+- 消息将包括对话历史,`user`消息由系统消息(system message)和"## Dialog History"前缀。
+
+
+```python
+from agentscope.models import LiteLLMChatWrapper
+
+model = LiteLLMChatWrapper(
+    config_name="", # empty since we directly initialize the model wrapper
+    model_name="gpt-3.5-turbo",
+)
+
+prompt = model.format(
+    Msg("system", "You are a helpful assistant", role="system"),
+    [
+        Msg("user", "What is the weather today?", role="user"),
+        Msg("assistant", "It is sunny today", role="assistant"),
+    ],
+)
+
+print(prompt)
+```
+
+```bash
+[
+    {
+        "role": "user",
+        "content": (
+            "You are a helpful assistant\n\n"
+            "## Dialogue History\nuser: What is the weather today?\n"
+            "assistant: It is sunny today"
+        ),
+    },
+]
+```
+
 ### `OllamaChatWrapper`
 
 `OllamaChatWrapper`封装了Ollama聊天API,它接受消息列表作为输入。消息必须遵守以下规则(更新于2024/03/22):

From 105ceca64d0bfbdde85e48fd9a531318d8419983 Mon Sep 17 00:00:00 2001
From: garyzhang99
Date: Fri, 17 May 2024 14:24:49 +0800
Subject: [PATCH 5/6] fix precommit

---
 docs/sphinx_doc/en/source/tutorial/206-prompt.md | 8 ++++----
 1 file changed, 4 insertions(+), 4 deletions(-)

diff --git a/docs/sphinx_doc/en/source/tutorial/206-prompt.md b/docs/sphinx_doc/en/source/tutorial/206-prompt.md
index 552d70406..bf7273bf5 100644
--- a/docs/sphinx_doc/en/source/tutorial/206-prompt.md
+++ b/docs/sphinx_doc/en/source/tutorial/206-prompt.md
@@ -306,12 +306,12 @@ print(prompt)
 
 `LiteLLMChatWrapper` encapsulates the litellm chat API, which takes a list of
 messages as input. The litellm support different types of models, and each model
 might need to obey different formats. To simplify the usage, we provide a format
-that could be compatible with most models. If more specifical formats are needed, 
-you can refer to the specifical model you use as weel as the 
-[litellm](https://github.com/BerriAI/litellm) documentation to customize your 
+that could be compatible with most models. If more specifical formats are needed,
+you can refer to the specifical model you use as weel as the
+[litellm](https://github.com/BerriAI/litellm) documentation to customize your
 own format function for your model.
-- format all the messages in the chat history, into a single message with `"user"` as `role` 
+- format all the messages in the chat history, into a single message with `"user"` as `role`
 
 #### Prompt Strategy

From e156c3dc844c33929e4eed5c010ed60a6c22c610 Mon Sep 17 00:00:00 2001
From: garyzhang99
Date: Fri, 17 May 2024 15:02:15 +0800
Subject: [PATCH 6/6] fix typo

---
 docs/sphinx_doc/en/source/tutorial/206-prompt.md | 7 ++++---
 1 file changed, 4 insertions(+), 3 deletions(-)

diff --git a/docs/sphinx_doc/en/source/tutorial/206-prompt.md b/docs/sphinx_doc/en/source/tutorial/206-prompt.md
index bf7273bf5..28a785ed5 100644
--- a/docs/sphinx_doc/en/source/tutorial/206-prompt.md
+++ b/docs/sphinx_doc/en/source/tutorial/206-prompt.md
@@ -304,13 +304,14 @@ print(prompt)
 ### LiteLLMChatWrapper
 
 `LiteLLMChatWrapper` encapsulates the litellm chat API, which takes a list of
-messages as input. The litellm support different types of models, and each model
+messages as input. The litellm supports different types of models, and each model
 might need to obey different formats. To simplify the usage, we provide a format
-that could be compatible with most models. If more specifical formats are needed,
-you can refer to the specifical model you use as weel as the
+that could be compatible with most models. If more specific formats are needed,
+you can refer to the specific model you use as well as the
 [litellm](https://github.com/BerriAI/litellm) documentation to customize your
 own format function for your model.
+
 - format all the messages in the chat history, into a single message with `"user"` as `role`
 
 #### Prompt Strategy
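
The prompt strategy that patches 4 through 6 document, collapsing the system prompt and the whole dialogue history into one `user` message under a "## Dialogue History" header, can be sketched in plain Python as follows. Note this is an illustrative stand-in: `format_prompt` and its tuple-based history are hypothetical, not the AgentScope API (the real logic lives in `LiteLLMChatWrapper.format`).

```python
# Illustrative sketch of the "single user message" prompt strategy from the
# patches above. format_prompt is a hypothetical helper, not AgentScope's API.

def format_prompt(sys_prompt, history):
    """Merge a system prompt and (speaker, text) history into one user message."""
    dialogue = "\n".join(f"{name}: {text}" for name, text in history)
    content = f"{sys_prompt}\n\n## Dialogue History\n{dialogue}"
    return [{"role": "user", "content": content}]

prompt = format_prompt(
    "You are a helpful assistant",
    [
        ("user", "What is the weather today?"),
        ("assistant", "It is sunny today"),
    ],
)
print(prompt)
```

Run on the tutorial's example conversation, this reproduces the single-element message list shown in the docs' expected output.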