Added max_tokens=100 to llama.cpp demo as they reduced the default nu… #142
Annotations: 2 warnings

test: succeeded Mar 1, 2024 in 6m 35s