
Can running the LLM be separated from using it? #536

Open
yanhuixie opened this issue Dec 6, 2024 · 3 comments
Labels
enhancement New feature or request

Comments

@yanhuixie

  • [x] I checked to make sure that this is not a duplicate issue

In short, could you consider supporting external LLM services?
For example, shibing624/chinese-text-correction-7b can be run with ollama — this has been verified to work.
pycorrector could then use the model by calling ollama's HTTP API.
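What the requester describes could be sketched roughly as follows. This is a minimal sketch, assuming ollama is running locally on its default port (11434) and that the model has been imported under a tag such as `chinese-text-correction:7b` — the exact tag is an assumption and depends on how the model was pulled or created.

```python
import json
import urllib.request

# Assumption: ollama listens on its default local address.
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_payload(model: str, text: str) -> dict:
    """Build a non-streaming request body for ollama's /api/generate endpoint."""
    return {
        "model": model,
        "prompt": f"文本纠错:\n{text}",  # simple correction prompt; adjust to the model's expected format
        "stream": False,
    }


def correct(model: str, text: str) -> str:
    """Send one correction request to the local ollama server and return the response text."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_payload(model, text)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


# Usage (requires a running ollama server with the model pulled):
#   corrected = correct("chinese-text-correction:7b", "我今天很高心。")
```

The HTTP call itself uses only ollama's documented `/api/generate` endpoint; the prompt format and model tag would need to be adapted to the actual model.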

@yanhuixie yanhuixie added the enhancement New feature or request label Dec 6, 2024
@shibing624
Owner

The ollama HTTP API is already set up, so there is no need to go through pycorrector — just send a request with `requests`, or use `curl`.

@yanhuixie
Author

> The ollama HTTP API is already set up, so there is no need to go through pycorrector — just send a request with `requests`, or use `curl`.

That's true, but pycorrector does a lot of pre- and post-processing. Calling ollama directly means missing out on those "perks". 😂

@shibing624
Owner

After sending the request, call pycorrector's pre- and post-processing functions yourself — pycorrector is an open-source library, so its internal functions are exposed.
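The maintainer's suggestion could look roughly like this. Note that `split_sentences` and `merge_sentences` below are hypothetical placeholders standing in for pycorrector's actual internal pre-/post-processing functions, which would need to be imported from the library itself:

```python
import re
from typing import Callable, List


def split_sentences(text: str) -> List[str]:
    """Hypothetical pre-processing stand-in: split text into sentence chunks
    on Chinese/Latin sentence-ending punctuation."""
    parts = re.split(r"(?<=[。!?!?])", text)
    return [p for p in parts if p.strip()]


def merge_sentences(parts: List[str]) -> str:
    """Hypothetical post-processing stand-in: join corrected chunks back together."""
    return "".join(parts)


def correct_text(text: str, llm_correct: Callable[[str], str]) -> str:
    """Pre-process, send each chunk to an external LLM, then post-process.

    `llm_correct` is whatever callable wraps the HTTP request to ollama
    (e.g. a `requests.post` call against the local server).
    """
    return merge_sentences([llm_correct(s) for s in split_sentences(text)])
```

The point of the pattern is only the sandwich: pycorrector-style pre-processing, then the external LLM over HTTP, then pycorrector-style post-processing — the real helper names have to be looked up in the pycorrector source.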
