Handle multiple providers/llms #45
base: main
Conversation
```diff
@@ -43,3 +43,6 @@ KEYCLOAK_CLIENT_ID=apisix
 KEYCLOAK_CLIENT_SECRET=HckCZXToXfaetbBx0Fo3xbjnC468oMi4 # pragma: allowlist-secret
 KEYCLOAK_DISCOVERY_URL=http://${KEYCLOAK_SVC_HOSTNAME}:${KEYCLOAK_PORT}/realms/ol-local/.well-known/openid-configuration
 KEYCLOAK_SCOPES="openid profile ol-profile"
+
+#AWS settings
+AWS_DEFAULT_REGION=us-east-1
```
Will be required for AWS Bedrock models
```diff
+# AWS_ACCESS_KEY_ID=
+# AWS_SECRET_ACCESS_KEY=
```
Required for AWS Bedrock models
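For reference, a sketch of what the AWS section of the env file might look like once filled in. The key values here are hypothetical placeholders; LiteLLM (via boto3) reads the standard AWS environment variables:

```shell
# AWS settings read by boto3/LiteLLM for Bedrock models
AWS_DEFAULT_REGION=us-east-1
# AWS_ACCESS_KEY_ID=<your-key-id>          # hypothetical placeholder
# AWS_SECRET_ACCESS_KEY=<your-secret-key>  # hypothetical placeholder
```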
```diff
+    "ignore",
+    module=".*(pydantic).*",
+    category=UserWarning,
+)
```
Pydantic warnings coming from ChatLiteLLM class
```diff
     **(self.proxy.get_api_kwargs() if self.proxy else {}),
     **(self.proxy.get_additional_kwargs(self) if self.proxy else {}),
     **kwargs,
 )
-if self.temperature:
+if self.temperature and self.model not in settings.AI_UNSUPPORTED_TEMP_MODELS:
```
The new o3-mini model does not support the temperature parameter and will raise an exception if it is passed along.
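A minimal sketch of the kwargs assembly with this guard, using stand-in names for the settings value and proxy object (both hypothetical here, not the PR's actual classes):

```python
# Stand-in for settings.AI_UNSUPPORTED_TEMP_MODELS (assumed value).
AI_UNSUPPORTED_TEMP_MODELS = ["openai/o3-mini"]

def build_chat_kwargs(model, temperature=None, proxy=None, **kwargs):
    """Assemble ChatLiteLLM-style kwargs, skipping temperature for
    models that reject it (o3-mini raises if it is passed along)."""
    merged = {
        "model": model,
        # Proxy-specific kwargs are merged in only when a proxy is set.
        **(proxy.get_api_kwargs() if proxy else {}),
        **kwargs,
    }
    if temperature and model not in AI_UNSUPPORTED_TEMP_MODELS:
        merged["temperature"] = temperature
    return merged
```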
What are the relevant tickets?
Closes https://github.com/mitodl/hq/issues/6622
Description (What does it do?)
Uses the ChatLiteLLM class for all LLM models.

How can this be tested?
Set OPENAI_API_KEY in your environment and go to http://ai.open.odl.local:8003. Ask some questions, it should work. Look at the logs, you should find this: (log output omitted in this capture). To test a different model, add AI_DEFAULT_RECOMMENDATION_MODEL=openai/o3-mini to your backend.local.env file and restart containers.