Commit

misc fix verbose printing in functionary model
abetlen committed Nov 24, 2023
1 parent 36048d4 commit de2e2bc
Showing 1 changed file with 4 additions and 2 deletions.
llama_cpp/llama_chat_format.py
@@ -955,9 +955,11 @@ def message_to_str(msg: llama_types.ChatCompletionRequestMessage):
     assert isinstance(function_call, str)
     assert stream is False  # TODO: support stream mode
 
-    print(new_prompt)
-    print(completion["choices"][0]["text"])
+    if llama.verbose:
+        print(new_prompt)
+        print(completion["choices"][0]["text"])
 
     # TODO: support stream mode
     return llama_types.CreateChatCompletionResponse(
         id="chat" + completion["id"],
         object="chat.completion",
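The fix is a standard debug-output gating pattern: dump the prompt and raw completion text only when the model object's `verbose` flag is set, so normal runs stay quiet. A minimal standalone sketch of the pattern follows; `FakeLlama` and `debug_print` are illustrative stand-ins, not the library's actual classes or functions:

```python
import io
from contextlib import redirect_stdout


class FakeLlama:
    """Illustrative stand-in for a model object carrying a `verbose` flag."""

    def __init__(self, verbose: bool):
        self.verbose = verbose


def debug_print(llama: FakeLlama, prompt: str, text: str) -> None:
    # Mirror the committed fix: emit debug output only when verbose is on.
    if llama.verbose:
        print(prompt)
        print(text)


# Capture stdout to show the flag's effect.
buf = io.StringIO()
with redirect_stdout(buf):
    debug_print(FakeLlama(verbose=False), "PROMPT", "TEXT")
quiet_output = buf.getvalue()  # nothing printed

buf = io.StringIO()
with redirect_stdout(buf):
    debug_print(FakeLlama(verbose=True), "PROMPT", "TEXT")
verbose_output = buf.getvalue()  # prompt and text printed
```

Guarding on a per-instance flag rather than a global keeps verbosity configurable per model object, which is why the patch checks `llama.verbose` instead of a module-level setting.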
