Issues: BerriAI/litellm
[Feature]: aiohttp migration - 10-100x Higher RPS Master ti... #7544 opened Jan 4, 2025 by ishaan-jaff
[Bug]: Function calling with Claude models always responds with no arguments (bug) #7848 opened Jan 18, 2025 by ukanwat
[Bug]: Cannot configure db model sync job schedule (bug, mlops user request) #7841 opened Jan 17, 2025 by siddhantgawsane
[Bug]: Basten model integration not working as expected (bug) #7836 opened Jan 17, 2025 by usersina
[Bug]: ValueError: invalid literal for int() with base 10: 'generateContent' (gemini) (bug) #7830 opened Jan 17, 2025 by j0yk1ll
[Feature]: Support OPENAI_BASE_URL environment variable (enhancement) #7829 opened Jan 17, 2025 by blairhudson
[Bug]: Error spam due to prometheus metrics trying to be updated for non-premium user (bug) #7817 opened Jan 16, 2025 by mikstew
Error 400 when using pydantic objects with default options defined with Google models. #7808 opened Jan 16, 2025 by andrewn3
[Bug]: Fallbacks with model specified cause a TypeError: litellm.main.acompletion() got multiple values for keyword argument 'model' (bug) #7807 opened Jan 16, 2025 by david1542
[Bug]: Gemini model parameters not set by using GoogleAIStudioGeminiConfig (bug) #7804 opened Jan 16, 2025 by mrm1001
[Bug]: Cannot see the http request that was sent (bug) #7802 opened Jan 16, 2025 by mrm1001
[Bug]: Problem with langfuse_tags when using litellm proxy with langfuse integration (bug) #7801 opened Jan 16, 2025 by yuu341
[Bug]: Gemini response with streaming not returning with usage (bug) #7798 opened Jan 16, 2025 by AyrennC
[Bug]: Anthropic usage prompt cache details missing from logging callbacks when streaming (bug) #7790 opened Jan 15, 2025 by jgregory-valence
[Bug]: tools not passed to databricks (bug, mlops user request) #7788 opened Jan 15, 2025 by lewiesnyder
[Feature]: Add RunwayML Support - https://docs.dev.runwayml.com/guides/quickstart/ (enhancement) #7787 opened Jan 15, 2025 by ishaan-jaff
[Bug]: together_ai/meta-llama/Llama-3.3-70B-Instruct-Turbo error using tool calls (bug) #7785 opened Jan 15, 2025 by DanielChico
[Bug]: Windows Compatibility Issue with uvloop (bug) #7783 opened Jan 15, 2025 by zanynik
[Bug]: Cannot pass provider-specific parameters to Bedrock Anthropic models (bug) #7782 opened Jan 15, 2025 by mrm1001
[Bug]: Cooldown Not Working in LiteLLM (bug) #7779 opened Jan 15, 2025 by ZPerling
[Bug]: litellm slower then python's request (awaiting: user response, bug, mlops user request) #7764 opened Jan 14, 2025 by jouDance
[Refactor]: Generate networking.tsx from openapi spec (enhancement) #7763 opened Jan 14, 2025 by yujonglee
[Bug]: Team base usage ui fail (bug, mlops user request) #7761 opened Jan 14, 2025 by superpoussin22
[Bug]: response_format in HF in unsupported (bug, mlops user request) #7745 opened Jan 13, 2025 by jorado