This is a very basic maubot plugin for invoking LLMs.
It's very new and very rough. Use at your own risk, and expect problems.
The LLM must be supplied by an OpenAI-compatible server. For example, if you run LM Studio, this plugin can connect to its server. Anthropic is also supported.
You can and probably should configure it to only respond to messages from specific users.
- Set up maubot
- Clone this repo and use `mbc build -u` to build the plugin (see the sketch below)
- Create a client and an instance
- Update the configuration; see base-config.yaml for documentation of the available options
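For example, the clone-and-build step might look like this. The repository URL is a placeholder; `mbc build -u` builds the plugin and uploads it to your configured maubot server:

```sh
# Clone the plugin repository (placeholder URL) and enter it.
git clone https://example.com/llm-maubot.git
cd llm-maubot

# Build the plugin and upload it to the maubot server.
mbc build -u
```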
Once it's added to a room, every message from any user on the allowlist will cause the bot to invoke the LLM and respond with its output.
You can configure multiple backends. One of them should be designated as the default, but the bot can also use a different backend in each room. You can also use different models and system prompts in different rooms.
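As an illustration of how multiple backends might be laid out, here is a hypothetical sketch. Apart from the `backends` map (referenced by the commands below), every key name, URL, and model name here is an assumption; base-config.yaml documents the real options:

```yaml
# Hypothetical sketch only -- see base-config.yaml for the actual schema.
allowlist:                 # key name assumed: users the bot will respond to
  - "@alice:example.com"
backends:                  # the backends map used by `!llm backend KEY`
  local:                   # an OpenAI-compatible server, e.g. LM Studio
    base_url: http://localhost:1234/v1
    model: llama-3.1-8b-instruct
  anthropic:
    api_key: sk-ant-placeholder
    model: claude-3-5-sonnet-latest
default_backend: local     # key name assumed: one backend acts as the default
```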
The following commands are available for managing the bot in a room:
- To see the current backend, model, and system prompt, along with a list of available models (note: not supported for Anthropic), use `!llm info`.
- To change to a different backend, use `!llm backend KEY`, where KEY is the key from the `backends` map in the configuration.
- To use a specific model, use `!llm model NAME`. Currently the name is passed directly as the `model` field in the request JSON when invoking the server.
- To change the system prompt, use `!llm system WRITE YOUR PROMPT HERE`.
- To clear the context (forget all past messages in the room), use `!llm clear`.
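For example, switching a room to a hypothetical `anthropic` backend and resetting its behavior could look like this (the backend key and model name are illustrative):

```
!llm backend anthropic
!llm model claude-3-5-sonnet-latest
!llm system You are a concise assistant. Keep answers short.
!llm clear
```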