From 53dbba1ce8f87dc846c08c7a198447af332ac630 Mon Sep 17 00:00:00 2001
From: Pierrick HYMBERT
Date: Wed, 20 Mar 2024 05:43:28 +0100
Subject: [PATCH] server: tests: add new tokens regex on windows generated
 following new repeat penalties default changed in (#6127)

---
 examples/server/tests/features/server.feature | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)

diff --git a/examples/server/tests/features/server.feature b/examples/server/tests/features/server.feature
index 7448986e75a49..40fd01871d8c6 100644
--- a/examples/server/tests/features/server.feature
+++ b/examples/server/tests/features/server.feature
@@ -35,9 +35,9 @@ Feature: llama.cpp server
     And   metric llamacpp:tokens_predicted is <n_predicted>
 
     Examples: Prompts
-      | prompt                                                                    | n_predict | re_content                    | n_prompt | n_predicted | truncated |
-      | I believe the meaning of life is                                          | 8         | (read\|going)+                | 18       | 8           | not       |
-      | Write a joke about AI from a very long prompt which will not be truncated | 256       | (princesses\|everyone\|kids)+ | 46       | 64          | not       |
+      | prompt                                                                    | n_predict | re_content                          | n_prompt | n_predicted | truncated |
+      | I believe the meaning of life is                                          | 8         | (read\|going)+                      | 18       | 8           | not       |
+      | Write a joke about AI from a very long prompt which will not be truncated | 256       | (princesses\|everyone\|kids\|Anna)+ | 46       | 64          | not       |
 
   Scenario: Completion prompt truncated
     Given a prompt:
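The patch only widens the `re_content` alternation so the scenario still passes when the Windows CI model, under the new default repeat penalties from #6127, emits "Anna" instead of one of the previously expected tokens. Below is a minimal sketch of how such a table regex could be checked against generated text; the helper name and the pipe-unescaping step are assumptions for illustration, not the actual implementation in the server test steps.

```python
# Sketch only: how a re_content pattern from the Examples table might be
# matched against a completion. Names here are hypothetical, not steps.py.
import re


def content_matches(re_content: str, content: str) -> bool:
    # In the Gherkin table the pipe is escaped as \| so it does not split
    # the cell; undo that escaping before compiling the regex (assumption).
    pattern = re_content.replace("\\|", "|")
    return re.search(pattern, content) is not None


# Adding "Anna" keeps the assertion satisfied for completions produced with
# the new default repeat penalties on Windows.
assert content_matches(r"(princesses\|everyone\|kids\|Anna)+",
                       "Sure! Anna the AI walked into a bar...")
```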