server: Fix has_new_line in JSON response (ggerganov#10818)
* Update server JSON response.

* Add unit test to check `has_new_line` JSON response

* Remove `has_new_line` unit test changes.

* Address code review comment: type check for `has_new_line` in unit test
MichelleTanPY authored Dec 14, 2024
1 parent e52aba5 commit 89d604f
Showing 3 changed files with 4 additions and 2 deletions.
2 changes: 1 addition & 1 deletion examples/server/server.cpp
@@ -459,7 +459,7 @@ struct server_task_result_cmpl_final : server_task_result {
     int32_t n_decoded;
     int32_t n_prompt_tokens;
     int32_t n_tokens_cached;
-    int32_t has_new_line;
+    bool has_new_line;
     std::string stopping_word;
     stop_type stop = STOP_TYPE_NONE;

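Why the type change matters: the result struct is converted to JSON with nlohmann::json (aliased as `json` in utils.hpp), and an `int32_t` field serializes as a JSON number (0/1), while a `bool` serializes as a JSON boolean (true/false), which is what the unit tests below check for. The following is a minimal standalone sketch of that serialization difference, not the server's actual response-building code:

    #include <cstdint>
    #include <iostream>
    #include <nlohmann/json.hpp>

    int main() {
        // Old field type: an int32_t value serializes as a JSON number.
        int32_t has_new_line_old = 1;
        // New field type: a bool value serializes as a JSON boolean.
        bool has_new_line_new = true;

        nlohmann::ordered_json before = {{"has_new_line", has_new_line_old}};
        nlohmann::ordered_json after  = {{"has_new_line", has_new_line_new}};

        std::cout << before.dump() << "\n"; // {"has_new_line":1}
        std::cout << after.dump()  << "\n"; // {"has_new_line":true}
    }
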
2 changes: 2 additions & 0 deletions examples/server/tests/unit/test_completion.py
@@ -25,6 +25,7 @@ def test_completion(prompt: str, n_predict: int, re_content: str, n_prompt: int,
     assert res.body["timings"]["prompt_n"] == n_prompt
     assert res.body["timings"]["predicted_n"] == n_predicted
     assert res.body["truncated"] == truncated
+    assert type(res.body["has_new_line"]) == bool
     assert match_regex(re_content, res.body["content"])


@@ -48,6 +49,7 @@ def test_completion_stream(prompt: str, n_predict: int, re_content: str, n_promp
     assert data["timings"]["predicted_n"] == n_predicted
     assert data["truncated"] == truncated
     assert data["stop_type"] == "limit"
+    assert type(data["has_new_line"]) == bool
     assert "generation_settings" in data
     assert server.n_predict is not None
     assert data["generation_settings"]["n_predict"] == min(n_predict, server.n_predict)
2 changes: 1 addition & 1 deletion examples/server/utils.hpp
@@ -22,7 +22,7 @@
 #include <vector>
 #include <memory>
 
-#define DEFAULT_OAICOMPAT_MODEL "gpt-3.5-turbo-0613"
+#define DEFAULT_OAICOMPAT_MODEL "gpt-3.5-turbo"
 
 using json = nlohmann::ordered_json;

