Implement OpenRouter backend for better support of OSS LLMs #433

Open
krschacht opened this issue Jun 24, 2024 · 0 comments
krschacht commented Jun 24, 2024

The OpenRouter API does not perfectly conform to the OpenAI spec, so it has its own Ruby gem. What's great about OpenRouter is that it supports tons of models and seems to be a much more reliable, long-term way of running Llama 3, among other LLMs. Groq isn't a good option if someone actually wants to use Llama 3 long term.

Obie started this task but didn't finish it. He's the author of the OpenRouter gem, so we can ask him questions if we have any:

obie@250467d

These are the API docs I found:
https://openrouter.ai/docs/requests
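For reference while building the backend, here is a minimal sketch of what a raw OpenRouter chat-completion request looks like, based on the request docs linked above. The helper names, the model slug, and the `OPENROUTER_API_KEY` variable are illustrative assumptions, not anything from Obie's branch:

```ruby
require "net/http"
require "json"
require "uri"

# Build the JSON payload for an OpenRouter chat completion.
# The request shape (model + messages) follows the docs at
# https://openrouter.ai/docs/requests; the model slug is illustrative.
def openrouter_payload(model, messages)
  { model: model, messages: messages }
end

# Send the request over HTTPS. Requires a real API key, so this
# helper is defined but not called below.
def openrouter_chat(api_key, model, messages)
  uri = URI("https://openrouter.ai/api/v1/chat/completions")
  req = Net::HTTP::Post.new(uri,
                            "Content-Type"  => "application/json",
                            "Authorization" => "Bearer #{api_key}")
  req.body = JSON.generate(openrouter_payload(model, messages))
  res = Net::HTTP.start(uri.host, uri.port, use_ssl: true) do |http|
    http.request(req)
  end
  JSON.parse(res.body)
end

payload = openrouter_payload("meta-llama/llama-3-70b-instruct",
                             [{ role: "user", content: "Hello" }])
puts JSON.generate(payload)
```

The gem presumably wraps this same endpoint, so the main integration work is mapping our existing message/params structures onto this payload and parsing the (OpenAI-style) response back out.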

Additional considerations for full OSS LLM support:
#389 (comment)

@krschacht krschacht added this to the 0.8 milestone Jun 24, 2024