Is your feature request related to a problem? Please describe.
I have used the Python LangChain library in a previous project with a Django backend. Now I'm working on a new project that uses Tauri and its Rust backend, so naturally I was looking for a LangChain implementation in Rust. In Python I started with the Groq chat model; I really liked its simple pricing model and fast inference, so I would like to use it in this project as well, but currently there is no support for it.
Describe the solution you'd like
I would like an integration of the Groq LLM in langchain-rust, in the same way that OpenAI, Claude, and Ollama are supported. Currently there are some unofficial SDKs for Groq, such as groq-api-rust or groq-rust.
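For context, Groq exposes an OpenAI-compatible chat completions endpoint, so it can already be reached from Rust with a plain HTTP client today. Below is a minimal sketch using reqwest and serde_json; the model name and the dependency versions in the comment are just examples, not part of this request:

```rust
// Assumed Cargo.toml deps: reqwest = { version = "0.12", features = ["json"] },
// serde_json = "1", tokio = { version = "1", features = ["full"] }
use serde_json::{json, Value};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let api_key = std::env::var("GROQ_API_KEY")?;

    // Groq's chat completions endpoint follows the OpenAI request schema.
    let body = json!({
        // Example model name; use any model listed in the Groq console.
        "model": "llama-3.1-8b-instant",
        "messages": [
            {"role": "user", "content": "Say hello from Groq"}
        ]
    });

    let resp: Value = reqwest::Client::new()
        .post("https://api.groq.com/openai/v1/chat/completions")
        .bearer_auth(api_key)
        .json(&body)
        .send()
        .await?
        .error_for_status()?
        .json()
        .await?;

    // First choice's message content, as in the OpenAI response schema.
    println!("{}", resp["choices"][0]["message"]["content"]);
    Ok(())
}
```

A proper langchain-rust integration would wrap this so it can be used with the crate's chains and agents instead of raw JSON.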
Describe alternatives you've considered
The other, more scalable alternative for issues like this would be the ability to create a custom chat model, similar to how the Python SDK works by subclassing BaseChatModel (see the docs link and the rough sketch below).
https://python.langchain.com/docs/how_to/custom_chat_model/
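A rough sketch of what that could look like in Rust. The trait name and signature here are hypothetical, only mirroring the Python BaseChatModel idea; they are not an existing langchain-rust API:

```rust
// Hypothetical Rust analogue of Python's BaseChatModel, for illustration only.
use async_trait::async_trait; // assumed dep: async-trait = "0.1"

pub struct ChatMessage {
    pub role: String,
    pub content: String,
}

#[async_trait]
pub trait BaseChatModel: Send + Sync {
    /// Generate a single assistant reply for the given conversation.
    async fn generate(
        &self,
        messages: &[ChatMessage],
    ) -> Result<ChatMessage, Box<dyn std::error::Error + Send + Sync>>;
}

/// With such a trait, a Groq-backed model could live outside the core crate.
pub struct GroqChatModel {
    pub api_key: String,
    pub model: String,
}

#[async_trait]
impl BaseChatModel for GroqChatModel {
    async fn generate(
        &self,
        messages: &[ChatMessage],
    ) -> Result<ChatMessage, Box<dyn std::error::Error + Send + Sync>> {
        // Sketch: forward `messages` to Groq's OpenAI-compatible endpoint
        // (see the reqwest example above) and map the first choice back.
        let _ = (&self.api_key, &self.model, messages);
        todo!("call https://api.groq.com/openai/v1/chat/completions and parse the reply")
    }
}
```

That way any provider without first-party support could be plugged into chains and agents by implementing one trait, rather than waiting for a dedicated integration.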