why move to REST endpoints? #257
Replies: 2 comments
-
Hi @krrishdholakia! Let me just convert this to a discussion.
-
Yes, that's all accurate. We originally started off with minichain because it exposes a very simple and intuitive API, and it saved us the work of implementing our own backend. So it was easy to kick things off by relying on minichain. langchain was added because (1) it offers a lot more features and LLM providers/models and (2) it's rather popular at the moment - we wanted our users to be able to use all langchain features, if they want to do so.

We added our own REST-based backend because we ultimately wanted more control over how calls are made and how error responses are processed. LLM providers/models are rather different in this regard, and we don't want to rely on (potentially unstable) third-party libraries to handle this. Since our REST-based backend is simple - when compared to the feature-rich langchain, at least - we didn't see a need to keep maintaining support for minichain.

Finally, why didn't we just use the libraries supplied by OpenAI/Cohere/Anthropic? We want to keep the dependencies required to run spacy-llm as small as possible. If at one point we realize that using the existing providers' libraries makes more sense, we might switch to doing that. Right now we are good with the trade-off of fewer dependencies vs. a somewhat higher maintenance effort.
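For context, a REST-based backend for these calls can be as small as the sketch below. This is a hypothetical minimal example (not spacy-llm's actual code) that assumes OpenAI's chat completions endpoint, the `requests` library, and an `OPENAI_API_KEY` environment variable - the point is that the HTTP call and the error handling are fully under your own control, with no third-party LLM wrapper in between:

```python
import os
import requests

# OpenAI's public chat completions REST endpoint (assumed for this sketch).
API_URL = "https://api.openai.com/v1/chat/completions"


def call_llm(prompt: str, model: str = "gpt-3.5-turbo", timeout: int = 30) -> str:
    """Send a single prompt to the REST endpoint and return the completion text."""
    headers = {
        "Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}",
        "Content-Type": "application/json",
    }
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    response = requests.post(API_URL, headers=headers, json=payload, timeout=timeout)

    # Inspect the provider's error response explicitly instead of relying on a
    # wrapper library to interpret it - each provider formats errors differently.
    if response.status_code != 200:
        raise RuntimeError(
            f"LLM request failed ({response.status_code}): {response.text}"
        )

    return response.json()["choices"][0]["message"]["content"]
```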
-
Hey @rmitsch @kabirkhan @ljvmiranda921
I noticed y'all started with abstraction libraries for llm calls (langchain + minichain). Why did y'all move to your REST endpoint implementation for the llm api calls?
Looks like the migration took a bit of work (handling edge cases, etc.) - and even led to y'all dropping support for minichain.
looking at this - https://github.com/explosion/spacy-llm/blob/811c47c0e1a5b54be3da51d84ad8dc70084561f2/requirements-dev.txt