Cannot retrieve log probabilities for generated tokens #742
Replies: 4 comments 13 replies
-
I managed to return log probs with instructor 1.2.4 and OpenAI by passing these parameters to
-
I am having the same issue after a couple of hours of troubleshooting. I have tried instructor 1.2.4 and 1.3.5 and can only get None back. I can return logprobs when I cut instructor out of the pipeline and call OpenAI directly. What are the next steps in resolving this? If anyone can suggest an avenue of investigation, I am happy to take it forward.
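For reference, this is roughly what the direct OpenAI call looks like when instructor is removed from the pipeline. The model name, prompt, and max_tokens are my own placeholders, and the live request only fires if an API key is set in the environment:

```python
# Sketch: calling the OpenAI Chat Completions API directly (no instructor),
# which does populate the logprobs field.
import math
import os


def first_token_probability(response) -> float:
    # Convert the first generated token's log probability back to a probability.
    return math.exp(response.choices[0].logprobs.content[0].logprob)


if os.environ.get("OPENAI_API_KEY"):
    from openai import OpenAI

    client = OpenAI()
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model
        messages=[{"role": "user", "content": "Answer only Yes or No: is water wet?"}],
        logprobs=True,     # request per-token log probabilities
        top_logprobs=2,    # also return the two most likely alternatives per position
        max_tokens=1,
    )
    print(first_token_probability(response))  # a value in (0, 1]
```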
-
I think this is a llama-cpp-python problem, simply because the API for non-OpenAI models is not implemented.
-
I created issue #1223 with a recommended solution for OpenAI.
-
What Model are you using?
Describe the bug
I was trying to build a binary classifier that answers only "Yes" or "No", and I also want to retrieve the log probability of the answer.
This should be as simple as setting logprobs=2 in the create_completion() call. Alas, in the raw response, the logprobs field is None.
To Reproduce
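A hypothetical reproduction sketch of the Yes/No classifier with an instructor-patched client follows. The model name, prompt, and response schema are my assumptions (the original reproduction code is not shown above), and the live call only runs when an API key is present:

```python
# Sketch: binary classifier via instructor; logprobs kwargs are forwarded to
# the underlying OpenAI call, yet the raw response reportedly has logprobs=None.
import os

VALID_ANSWERS = {"Yes", "No"}


def is_valid_answer(text: str) -> bool:
    # The classifier should only ever emit one of the two labels.
    return text.strip() in VALID_ANSWERS


if os.environ.get("OPENAI_API_KEY"):
    import instructor
    from openai import OpenAI
    from pydantic import BaseModel
    from typing import Literal

    class Answer(BaseModel):
        answer: Literal["Yes", "No"]

    client = instructor.from_openai(OpenAI())
    resp, completion = client.chat.completions.create_with_completion(
        model="gpt-4o-mini",  # placeholder model
        response_model=Answer,
        messages=[{"role": "user", "content": "Is water wet? Answer Yes or No."}],
        logprobs=True,
        top_logprobs=2,
    )
    # Reported bug: this is None even though logprobs were requested.
    print(completion.choices[0].logprobs)
```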
Expected behavior
logprob = response._raw_response.choices[0].logprobs.content[0].logprob # expect a number
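Once the logprobs field is populated, a Yes/No confidence can be derived from the top_logprobs of the first generated token. A minimal sketch with hand-built numbers (the logprob values below are illustrative, not real model output):

```python
import math


def yes_probability(top_logprobs: dict[str, float]) -> float:
    # Normalize the exp(logprob) mass over the two allowed answers.
    p_yes = math.exp(top_logprobs.get("Yes", float("-inf")))
    p_no = math.exp(top_logprobs.get("No", float("-inf")))
    return p_yes / (p_yes + p_no)


# Illustrative values: -0.105 ~ ln(0.9), -2.303 ~ ln(0.1)
print(round(yes_probability({"Yes": -0.105, "No": -2.303}), 2))  # → 0.9
```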