brokensandals/maubot-llm

This is a very basic maubot plugin for invoking LLMs.

It's very new and very rough. Use at your own risk, and expect problems.

The LLM must be supplied by an OpenAI-compatible server. For example, if you run LM Studio, this plugin can connect to its server. Anthropic is also supported.

You can, and probably should, configure it to respond only to messages from specific users.

Installation

Usage

Once the bot is added to a room, every message from a user on the allowlist causes it to invoke the LLM and reply with the output.

You can configure multiple backends. One of them should be designated as the default, but the bot can also use a different backend in each room. You can also use different models and system prompts in different rooms.
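A multi-backend configuration might look like the sketch below. The key names (`default_backend`, `backends`, `base_url`, `api_key`, `allowed_users`) are illustrative guesses, not the plugin's actual schema; check the plugin's base config for the real field names:

```yaml
# Hypothetical sketch only -- key names are assumptions, not the
# plugin's actual schema.
default_backend: lmstudio
backends:
  lmstudio:                              # key used with `!llm backend lmstudio`
    base_url: http://localhost:1234/v1   # any OpenAI-compatible server
  anthropic:
    base_url: https://api.anthropic.com
    api_key: YOUR_KEY_HERE
allowed_users:
  - "@you:example.org"
```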

The following commands are available for managing the bot in a room:

  • To see the current backend, model, and system prompt, along with a list of available models (note: not supported for Anthropic), use !llm info.
  • To change to a different backend, use !llm backend KEY, where KEY is the key from the backends map in the configuration.
  • To use a specific model, use !llm model NAME. The name is passed verbatim as the model field in the request JSON sent to the server.
  • To change the system prompt, use !llm system WRITE YOUR PROMPT HERE.
  • To clear the context (forget all past messages in the room), use !llm clear.
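To make the model-name behavior above concrete, here is a sketch of the shape of an OpenAI-compatible chat request. This is not the plugin's actual code; the function name is hypothetical, but the payload shape ({"model": ..., "messages": [...]}) is the standard /v1/chat/completions format that servers like LM Studio accept, and the string you set with !llm model NAME lands in the "model" field unchanged:

```python
import json

def build_chat_request(model, system_prompt, history):
    """Build an OpenAI-compatible chat completion payload (illustrative).

    `model` is passed through verbatim, which is why `!llm model NAME`
    works with whatever identifier your server expects.
    """
    messages = [{"role": "system", "content": system_prompt}] + history
    return {"model": model, "messages": messages}

payload = build_chat_request(
    "mistral-7b-instruct",                # whatever your server expects
    "You are a helpful assistant.",       # set via `!llm system ...`
    [{"role": "user", "content": "Hello!"}],
)
print(json.dumps(payload, indent=2))
```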
