When response_model is optional #115
timconnorz started this conversation in General

response_model is currently not an optional parameter, but it would be nice if it were, for cases where it's unnecessary. In that case I would expect Instructor to skip the LLMValidator step.

Replies: 2 comments 2 replies
-
It should do a straight pass-through if the additional params are not provided - can you share under what circumstances this is not the case?
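For reference, a minimal sketch of the pass-through behaviour described above, assuming the `instructor.patch()` wrapper around the v1 OpenAI client; the `UserDetail` model, prompts, and model string are placeholders, not taken from this thread:

```python
import instructor
from openai import OpenAI
from pydantic import BaseModel

# Patch the OpenAI client so chat.completions.create accepts response_model.
client = instructor.patch(OpenAI())

class UserDetail(BaseModel):
    name: str
    age: int

# With response_model: the call returns a validated UserDetail instance.
user = client.chat.completions.create(
    model="gpt-3.5-turbo",
    response_model=UserDetail,
    messages=[{"role": "user", "content": "Jason is 25 years old"}],
)

# Without response_model: the call passes straight through and returns the
# plain ChatCompletion object from the OpenAI SDK.
completion = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Say hello"}],
)
print(completion.choices[0].message.content)
```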
0 replies
-
Ah, my bad, I was confused by the errors I was getting. What confused me was the fact that when response_model is specified, you get your response in the exact shape of the model, but when it's omitted, you have to go back to reading choices[0].delta and appending the chunks to the string yourself.
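As a rough sketch of that contrast (again assuming a patched client; the prompt and model name are illustrative), streaming without a `response_model` means rebuilding the text from `choices[0].delta` by hand:

```python
import instructor
from openai import OpenAI

client = instructor.patch(OpenAI())

# response_model omitted with stream=True: the chunks come back raw, so the
# text has to be accumulated from choices[0].delta manually.
stream = client.chat.completions.create(
    model="gpt-3.5-turbo",
    stream=True,
    messages=[{"role": "user", "content": "Write one sentence about ducks."}],
)

text = ""
for chunk in stream:
    delta = chunk.choices[0].delta
    if delta.content is not None:
        text += delta.content
print(text)
```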
2 replies