Issues: BerriAI/litellm


[Bug]: Baseten model integration not working as expected — labels: bug
#7836 opened Jan 17, 2025 by usersina
[Feature]: Support OPENAI_BASE_URL environment variable — labels: enhancement
#7829 opened Jan 17, 2025 by blairhudson
LiteLLM.Info
#7812 opened Jan 16, 2025 by filtershekanahmad
[Bug]: Cannot see the http request that was sent — labels: bug
#7802 opened Jan 16, 2025 by mrm1001
[Bug]: Gemini response with streaming not returning usage — labels: bug
#7798 opened Jan 16, 2025 by AyrennC
[Bug]: tools not passed to databricks — labels: bug, mlops user request
#7788 opened Jan 15, 2025 by lewiesnyder
Issue
#7786 opened Jan 15, 2025 by Harshal292004
[Bug]: Windows Compatibility Issue with uvloop — labels: bug
#7783 opened Jan 15, 2025 by zanynik
[Bug]: Cooldown Not Working in LiteLLM — labels: bug
#7779 opened Jan 15, 2025 by ZPerling
[Refactor]: Generate networking.tsx from openapi spec — labels: enhancement
#7763 opened Jan 14, 2025 by yujonglee
[Bug]: Team-based usage UI fails — labels: bug, mlops user request
#7761 opened Jan 14, 2025 by superpoussin22
[Bug]: response_format in HF is unsupported — labels: bug, mlops user request
#7745 opened Jan 13, 2025 by jorado