Replies: 8 comments 1 reply
-
Getting the same error with the together.ai async client when using the new Llama 3.1 8B Instruct.
-
Any updates on this?
-
I can see that the DeepInfra issue from here and the Llama 3.1 issue from here are syntax-related problems, because both providers deviate from OpenAI-compatible function calling. DeepInfra only supports
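For reference, an OpenAI-compatible function-calling (tools) request body looks roughly like the sketch below; the model name and the `get_weather` function are illustrative only, and providers that deviate from the spec may reject or reshape parts of it:

```python
# Sketch of the standard OpenAI-compatible tools request shape.
# Model name and function definition are illustrative, not from the thread.
request_body = {
    "model": "meta-llama/Meta-Llama-3.1-8B-Instruct",
    "messages": [{"role": "user", "content": "What's the weather in Paris?"}],
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "get_weather",
                "description": "Get the current weather for a city.",
                "parameters": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            },
        }
    ],
    # Providers with partial compatibility often differ here.
    "tool_choice": "auto",
}
```

A provider that only partially implements this shape (for example, ignoring `tool_choice` or requiring a different `tools` layout) will break clients that assume full OpenAI compatibility.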
-
So what do you think can be done about it? There's no workaround other than updating the lib?
-
I think function calling and JSON mode are two different things in Together AI, right? https://docs.together.ai/docs/json-mode How can I use JSON mode instead of tool mode?
-
Yeah, so using JSON_SCHEMA mode works just fine.
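For anyone hitting the same problem, a JSON-mode request for Together AI can be sketched as below, following the linked docs: the target schema goes in `response_format` instead of the tools/function-calling parameters. The model name and schema are illustrative; check the docs for which models currently support JSON mode:

```python
# Sketch of a Together AI JSON-mode request (per
# https://docs.together.ai/docs/json-mode). Model and schema are examples.
schema = {
    "type": "object",
    "properties": {
        "name": {"type": "string"},
        "age": {"type": "integer"},
    },
    "required": ["name", "age"],
}

request_body = {
    "model": "meta-llama/Meta-Llama-3.1-8B-Instruct-Turbo",
    "messages": [
        {"role": "system", "content": "Answer only in JSON."},
        {"role": "user", "content": "Generate a person named Alice, age 30."},
    ],
    # JSON mode: constrain the output with a schema instead of tools.
    "response_format": {"type": "json_object", "schema": schema},
}
```

This sidesteps the provider's non-standard function-calling layer entirely, which is why it avoids the error above.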
-
I'm not using together.ai, why do you bring it up?
-
What model are you using?

Describe the bug
When the model returns an unrelated response, the structured output fails with:

To Reproduce
See BerriAI/litellm#4699.

Expected behavior
Return an empty response and a warning.

Screenshots
If applicable, add screenshots to help explain your problem.