English · 中文
An LLM AI Agent service that supports multiple sessions.
- Supports the OpenAPI/OpenRPC/OpenModbus/OpenTool JSON specs.
- Supports LLM function calling to HTTP APIs, JSON-RPC 2.0 over HTTP, Modbus, and more custom tools.
- Prepare OpenSpec JSON files describing the callable tools; see `/example/json/open*/*.json` for examples.
- Run the tool server described in the JSON file.
- Add a `.env` file to the `example` folder with the following content (one way to load these values is sketched after these setup steps):

      baseUrl = https://xxx.xxx.com    # LLM API BaseURL
      apiKey = sk-xxxxxxxxxxxxxxxxxxxx # LLM API ApiKey
- Use one of the two methods below to run the agent service.
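Both methods below rely on helper functions such as `_buildLLMConfig()` and `_buildOpenSpecList()` that are defined in the example files. The following is a minimal sketch, using only `dart:io`, of the raw steps behind them: parsing the `.env` values and loading OpenSpec JSON files as strings. The file paths are placeholders, and how the values are passed into the package's config classes is only indicated in comments.

```dart
import 'dart:io';

// Hypothetical .env parser: returns e.g. {'baseUrl': ..., 'apiKey': ...}.
// The examples presumably feed these values into _buildLLMConfig().
Map<String, String> loadEnv(String path) {
  final env = <String, String>{};
  for (final line in File(path).readAsLinesSync()) {
    final entry = line.split('#').first.trim(); // drop inline comments
    final eq = entry.indexOf('=');
    if (eq <= 0) continue; // skip blank or malformed lines
    env[entry.substring(0, eq).trim()] = entry.substring(eq + 1).trim();
  }
  return env;
}

// Load OpenSpec JSON files as raw strings, matching the
// "OpenSpec description string list" expected by CapabilityDto below.
Future<List<String>> loadOpenSpecList(List<String> specPaths) =>
    Future.wait(specPaths.map((p) => File(p).readAsString()));

Future<void> main() async {
  final env = loadEnv('example/.env');
  print('baseUrl = ${env['baseUrl']}');
  // The spec path below is a placeholder; point it at your own files under example/json/open*/.
  final specs = await loadOpenSpecList(['example/json/openapi/your_tool.json']);
  print('loaded ${specs.length} OpenSpec string(s)');
}
```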
- Method 1: Agent Service, following `/example/agent_service_example.dart`.
- Supports multiple agent sessions via session ID (see the multi-session sketch after the example below).
Future<void> main() async {
  CapabilityDto capabilityDto = CapabilityDto(
    llmConfig: _buildLLMConfig(),            // LLM config
    systemPrompt: _buildSystemPrompt(),      // System prompt
    openSpecList: await _buildOpenSpecList() // OpenSpec description string list
  );
  SessionDto sessionDto = await agentService.initChat(
    capabilityDto,
    listen // Subscribe to AgentMessage: the agent chats with the User/Client/LLM/Tool roles
  ); // Returns the session id
  String prompt = "<USER PROMPT, e.g. call any one tool>";
  await agentService.startChat(
    sessionDto.id, // Start the chat with the session id
    [UserMessageDto(type: UserMessageType.text, message: prompt)] // User content list; supported types: text/imageUrl
  );
}
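Since each session is addressed by its ID, the same `initChat`/`startChat` calls can presumably drive several independent sessions. A minimal sketch, reusing only the calls shown above and assuming `agentService`, `capabilityDto`, and `listen` are set up exactly as in the example (package imports omitted):

```dart
// Hypothetical sketch: two independent sessions on the same AgentService.
// agentService, capabilityDto and listen are assumed to be the same objects
// as in the example above.
Future<void> runTwoSessions() async {
  SessionDto sessionA = await agentService.initChat(capabilityDto, listen);
  SessionDto sessionB = await agentService.initChat(capabilityDto, listen);

  // Each session id keeps its own chat context.
  await agentService.startChat(sessionA.id,
      [UserMessageDto(type: UserMessageType.text, message: "<PROMPT FOR SESSION A>")]);
  await agentService.startChat(sessionB.id,
      [UserMessageDto(type: UserMessageType.text, message: "<PROMPT FOR SESSION B>")]);
}
```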
- Method 2: Tool Agent, following `/example/tool_agent_example.dart`.
- Pure native calling; supports a single session.
- The `AgentService` in Method 1 is a friendly encapsulation of this.
Future<void> main() async {
  ToolAgent toolAgent = ToolAgent(
    llmRunner: _buildLLMRunner(),
    session: _buildSession(),
    toolRunnerList: await _buildToolRunnerList(),
    systemPrompt: _buildSystemPrompt()
  );
  String prompt = "<USER PROMPT, e.g. call any one tool>";
  toolAgent.userToAgent([Content(type: ContentType.text, message: prompt)]);
}
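Both examples call a `_buildSystemPrompt()` helper whose content is not shown in this section. As an illustration only, such a helper can simply return a plain instruction string; the wording below is an assumption, not the package's actual prompt.

```dart
// Hypothetical _buildSystemPrompt(); the real examples define their own prompt.
String _buildSystemPrompt() {
  return 'You are a tool-calling assistant. '
      'When a user request matches one of the available tools, call that tool '
      'and answer based on its result.';
}
```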