
enhance: openai-model-provider proxy - combine options into single baseURL and add customPathHandleFuncs for customizability #375

Merged

Conversation

@iwilltry42 (Contributor) commented on Jan 28, 2025

For example, Google Gemini (Vertex AI) offers OpenAI API compatibility, but not for every endpoint: /embeddings and /models are not supported. We still gain the benefit of /chat/completions, since we don't have to translate completion requests and responses ourselves. Also, Gemini, as the example at hand, doesn't use the /v1 API base path.

With the baseURL setting we are closer to the settings we have/had in GPTScript and Knowledge.
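The actual implementation lives in openai-model-provider/proxy/proxy.go; the Go sketch below only illustrates the idea of combining the upstream settings into a single baseURL and letting a provider override individual paths via customPathHandleFuncs. The Config field names, the Gemini base URL, the env var, and the static /v1/models response are assumptions for illustration, not the API of this repository.

```go
package main

import (
	"log"
	"net/http"
	"net/http/httputil"
	"net/url"
	"os"
)

// Config is a hypothetical sketch of the options this PR describes: a single
// BaseURL instead of separate scheme/host/path settings, plus per-path handlers
// that bypass the reverse proxy for endpoints the upstream doesn't support.
type Config struct {
	BaseURL               string
	APIKey                string
	CustomPathHandleFuncs map[string]http.HandlerFunc
}

// NewHandler forwards every request to cfg.BaseURL unless a custom handler is
// registered for the request path.
func NewHandler(cfg Config) (http.Handler, error) {
	target, err := url.Parse(cfg.BaseURL)
	if err != nil {
		return nil, err
	}

	rp := &httputil.ReverseProxy{
		Director: func(r *http.Request) {
			r.URL.Scheme = target.Scheme
			r.URL.Host = target.Host
			// Prepend the upstream base path; Gemini, for instance, doesn't use /v1.
			r.URL.Path = target.Path + r.URL.Path
			r.Host = target.Host
			r.Header.Set("Authorization", "Bearer "+cfg.APIKey)
		},
	}

	mux := http.NewServeMux()
	// Paths the upstream doesn't implement (e.g. /embeddings, /models on
	// Gemini's OpenAI-compatible endpoint) are answered locally instead of proxied.
	for path, fn := range cfg.CustomPathHandleFuncs {
		mux.HandleFunc(path, fn)
	}
	mux.Handle("/", rp)
	return mux, nil
}

func main() {
	handler, err := NewHandler(Config{
		// Example OpenAI-compatible upstream; note the absence of a /v1 suffix.
		BaseURL: "https://generativelanguage.googleapis.com/v1beta/openai",
		APIKey:  os.Getenv("PROVIDER_API_KEY"), // hypothetical env var
		CustomPathHandleFuncs: map[string]http.HandlerFunc{
			// Serve a static model list because the upstream has no /models endpoint.
			"/v1/models": func(w http.ResponseWriter, _ *http.Request) {
				w.Header().Set("Content-Type", "application/json")
				_, _ = w.Write([]byte(`{"object":"list","data":[{"id":"gemini-1.5-pro","object":"model"}]}`))
			},
		},
	})
	if err != nil {
		log.Fatal(err)
	}
	log.Fatal(http.ListenAndServe("127.0.0.1:8000", handler))
}
```

Compared to separate scheme/host/port-style options, a single baseURL also matches how GPTScript and the Knowledge tool are configured, which is the motivation mentioned above.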

@iwilltry42 force-pushed the change/openai-model-provider-proxy branch from 4f1c2d4 to cff2c76 on January 28, 2025 (commit: "…seURL and add customPathHandleFuncs for customizability")
Review threads (outdated, resolved):
openai-model-provider/proxy/proxy.go (2 threads)
openai-model-provider/proxy/validate.go
@iwilltry42 requested a review from @cjellick on January 29, 2025 15:43
@iwilltry42 merged commit 5ffecc9 into obot-platform:main on January 29, 2025
1 check passed
@iwilltry42 deleted the change/openai-model-provider-proxy branch on January 29, 2025 16:55