
Downpinned LiteLLM for Router.acompletion typing break #32

Merged: 1 commit merged into main on Jan 10, 2025

Conversation

jamesbraza (Contributor):
As seen in this CI run, BerriAI/litellm#7594 broke Router.acompletion's typing:

llmclient/llms.py: note: In member "achat" of class "LiteLLMModel":
llmclient/llms.py:621:74: error: Argument 2 has incompatible type
"list[dict[Any, Any]]"; expected
"list[ChatCompletionUserMessage | ChatCompletionAssistantMessage | ChatCompletionToolMessage | ChatCompletionSystemMessage | ChatCompletionFunctionMessage]"
 [arg-type]
    ...ponse = await track_costs(self.router.acompletion)(self.name, prompts)
                                                                     ^~~~~~~
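The error above comes down to mypy rejecting a `list[dict[Any, Any]]` where a union of `TypedDict` message types is expected, even though the dicts are structurally compatible at runtime. A minimal self-contained reproduction of that pattern (using simplified stand-in types, not litellm's actual definitions):

```python
from typing import TypedDict, cast


class ChatCompletionUserMessage(TypedDict):
    # Simplified stand-in; litellm's real type is a wider union of
    # user/assistant/tool/system/function message TypedDicts.
    role: str
    content: str


Message = ChatCompletionUserMessage


def acompletion(model: str, messages: list[Message]) -> int:
    # Stand-in for Router.acompletion; just counts the messages it got.
    return len(messages)


prompts: list[dict] = [{"role": "user", "content": "hi"}]

# mypy flags this call with the same arg-type error as above, but the
# call succeeds at runtime because the dict is structurally compatible:
assert acompletion("some-model", prompts) == 1  # type: ignore[arg-type]

# A local alternative to pinning is an explicit cast, which silences the
# checker without changing runtime behavior:
typed_prompts = cast(list[Message], prompts)
assert acompletion("some-model", typed_prompts) == 1
```

This illustrates why the break is a type-checker-only problem: `TypedDict` annotations are erased at runtime, so only static analysis sees the mismatch.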

This doesn't actually impact runtime usage, but it does impact type checkers. So as a short-term workaround, I just downpin LiteLLM here.
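The downpin itself would be a one-line upper bound on the dependency. A hypothetical `pyproject.toml` sketch (the exact version cap is not stated in this PR, so the bound below is illustrative only):

```toml
[project]
dependencies = [
    # Hypothetical cap: stay below the first litellm release containing
    # BerriAI/litellm#7594, until the upstream typing regression is fixed.
    "litellm<1.57",  # illustrative bound, not this PR's actual pin
]
```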

@jamesbraza added the "bug: Something isn't working" label on Jan 9, 2025
@jamesbraza requested review from sidnarayanan and a team on Jan 9, 2025
@jamesbraza self-assigned this on Jan 9, 2025
@jamesbraza merged commit ab9202c into main on Jan 10, 2025 (6 checks passed)
@jamesbraza deleted the downpinning-litellm branch on Jan 10, 2025