Override OpenAI temperature #20805
There are a handful of POST request keys which are supported on certain OpenAI models via the
Note that for some models, notably
I'm not sure whether it makes sense to specifically add support for just
I'm personally in favor of adding such an option for OpenAI, as well as for Ollama. It matters even more when you use a non-default OpenAI endpoint, like LocalAI, Groq, or Cerebras.
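As a sketch of what such an option would translate to on the wire (the function name and model string below are illustrative, not from any real client), a user-configured temperature would simply be merged into the chat-completions request body, which is what makes it work unchanged against OpenAI-compatible servers like LocalAI or Groq:

```python
import json

def build_chat_payload(model, messages, temperature=None):
    """Build an OpenAI-compatible /v1/chat/completions request body.

    temperature is optional: the key is omitted entirely when unset,
    so the server's own default applies.
    """
    payload = {"model": model, "messages": messages}
    if temperature is not None:
        payload["temperature"] = temperature
    return payload

# Placeholder model name; the same payload shape is accepted by
# OpenAI-compatible endpoints (LocalAI, Groq, Cerebras, ...).
body = build_chat_payload(
    model="llama-3.1-70b",
    messages=[{"role": "user", "content": "Hello"}],
    temperature=0.2,
)
print(json.dumps(body))
```

Leaving the key out when unset matters: sending an explicit default would silently override whatever default the alternative backend ships with.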
As reported in: Some OpenAI-API-compatible models (
Add a parameter, reasoning_effort, which determines whether the model runs as o3-mini-high, o3-mini-medium, or o3-mini-low; the differences between them are significant. https://platform.openai.com/docs/guides/reasoning
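Per the reasoning guide linked above, reasoning_effort takes the string values "low", "medium", or "high", and the o-series reasoning models do not accept sampling parameters like temperature. A minimal sketch of how a client could route the two options (the prefix check is an illustrative heuristic, not an official model list):

```python
import json

# Illustrative heuristic: treat o1*/o3* names as reasoning models.
# Reasoning models accept reasoning_effort but reject temperature.
REASONING_MODEL_PREFIXES = ("o1", "o3")

def build_payload(model, messages, temperature=None, reasoning_effort=None):
    payload = {"model": model, "messages": messages}
    if model.startswith(REASONING_MODEL_PREFIXES):
        if reasoning_effort is not None:
            # Valid values per the OpenAI docs: "low", "medium", "high".
            payload["reasoning_effort"] = reasoning_effort
    elif temperature is not None:
        payload["temperature"] = temperature
    return payload

body = build_payload(
    "o3-mini",
    [{"role": "user", "content": "Hi"}],
    temperature=0.2,        # ignored for reasoning models
    reasoning_effort="high",
)
print(json.dumps(body))
```

This is why a single free-form "extra request keys" setting may be easier to maintain than one flag per parameter: which keys are valid depends on the model.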
Check for existing issues
Describe the feature
Support passing temperature to OpenAI models.

Environment
none