local llm #64
Comments
@zhanpengjie which local model are you trying to use? Does it have vision capability?

I also have a problem using llama3.2-90B-Vision with vLLM. The error says an environment variable is missing.

@torrischen refer to the model and api_base params here and pass them accordingly in zerox: https://docs.litellm.ai/docs/providers/vllm Also see #65

Thanks, that's helpful.
How do I set base_url and model in the Python SDK?
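Building on the litellm vLLM provider docs linked above, here is a minimal sketch of how the model string and API base might be passed through the zerox Python SDK. This is an assumption, not a confirmed API: the `hosted_vllm/` prefix and `api_base` parameter come from litellm's vLLM provider docs, while the `pyzerox.zerox` call signature, the model name, and the endpoint URL are placeholders to verify against your installed version.

```python
# Hedged sketch: wiring a local vLLM server into zerox via litellm.
# MODEL and API_BASE are placeholder values; replace with your own.
import asyncio

MODEL = "hosted_vllm/llama-3.2-90b-vision"  # litellm vLLM model route (assumed name)
API_BASE = "http://localhost:8000/v1"       # your local vLLM server endpoint (assumed)

def build_zerox_kwargs(file_path: str) -> dict:
    """Collect the keyword arguments to forward to zerox / litellm."""
    return {
        "file_path": file_path,
        "model": MODEL,
        "api_base": API_BASE,  # passed explicitly so no environment variable is required
    }

async def main() -> None:
    # Hypothetical import path; check the py-zerox README for your version.
    from pyzerox import zerox
    result = await zerox(**build_zerox_kwargs("document.pdf"))
    print(result)

if __name__ == "__main__":
    asyncio.run(main())
```

Passing `api_base` explicitly (rather than relying on an environment variable) may also address the "environment variable missing" error mentioned above for vLLM.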