Prerequisites
I searched using keywords relevant to my issue to make sure that I am creating a new issue that is not already open (or closed).
I reviewed the Discussions, and have a new and useful enhancement to share.
Feature Description
OuteAI has released a new small model that is very coherent for its size.
I am requesting that this model's chat template be added to llama.cpp's list of supported templates.
Motivation
The model is already supported by llama.cpp. However, it uses a new chat template that is not in the list of supported templates, so llama.cpp falls back to ChatML for it. Because of its small size, the model is very sensitive to the prompt format: with the wrong template it produces poor output, and users have a bad experience running it.
Possible Implementation
The model's template is very similar to ChatML, so the existing ChatML handling can be copied and adapted; a rough sketch follows.
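As a minimal, hypothetical sketch of what the adaptation could look like: llama.cpp formats known templates in llama_chat_apply_template_internal(), and the ChatML branch concatenates role/turn markers around each message. The standalone function below mirrors that ChatML-style layout; the <|im_start|>/<|im_end|> markers are placeholders, and the model's real markers would need to be taken from its tokenizer_config.json chat_template before wiring this into the actual template list.

```cpp
#include <iostream>
#include <sstream>
#include <string>
#include <vector>

// Hypothetical sketch only: ChatML-like formatting as a starting point.
// The role/turn markers below are placeholders, not the model's real tokens.
struct chat_msg {
    std::string role;
    std::string content;
};

static std::string format_chatml_like(const std::vector<chat_msg> & chat, bool add_assistant_prompt) {
    std::ostringstream ss;
    for (const auto & msg : chat) {
        // one turn per message: <marker>role\ncontent<end-marker>\n
        ss << "<|im_start|>" << msg.role << "\n" << msg.content << "<|im_end|>\n";
    }
    if (add_assistant_prompt) {
        // open an assistant turn so generation continues from here
        ss << "<|im_start|>assistant\n";
    }
    return ss.str();
}

int main() {
    std::vector<chat_msg> chat = {
        {"system", "You are a helpful assistant."},
        {"user",   "Hello!"},
    };
    std::cout << format_chatml_like(chat, /*add_assistant_prompt=*/true);
    return 0;
}
```

In llama.cpp itself this would presumably become a new template-detection entry plus a branch next to the existing ChatML case, with only the marker strings changed to the model's own.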