This repository has been archived by the owner on Dec 6, 2023. It is now read-only.

adds chatml prompt template as string to maintain configurability #124

Merged · 4 commits · Oct 26, 2023

Conversation

@biswaroop1547 (Collaborator) commented Oct 25, 2023

Also adds a generic `stitch_prompt` function, which can be reused by other services for prompt-template configurability.
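The PR names a `stitch_prompt` helper but does not show its signature here, so the following is a minimal sketch of what such a function might look like: the ChatML template is kept as a plain format string (hence configurable), and the helper joins a list of chat messages into a single prompt. All names and defaults below are assumptions for illustration, not the actual code in `cht-llama-cpp/models.py`.

```python
# Hypothetical sketch: ChatML template held as a plain string so callers
# can swap in a different template without code changes.
CHATML_TEMPLATE = "<|im_start|>{role}\n{content}<|im_end|>\n"

# Marker that cues the model to start generating the assistant's reply
# (assumed default, matching the ChatML convention).
ASSISTANT_PREFIX = "<|im_start|>assistant\n"


def stitch_prompt(messages, template=CHATML_TEMPLATE, assistant_prefix=ASSISTANT_PREFIX):
    """Render a list of {"role": ..., "content": ...} dicts into one prompt string."""
    body = "".join(
        template.format(role=m["role"], content=m["content"]) for m in messages
    )
    return body + assistant_prefix
```

Because the template is just a string parameter, another service could pass, say, an Alpaca- or Llama-2-style template to the same function, which is presumably the "configurability" the PR title refers to.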

Fixes bit-gpt/app#401

@biswaroop1547 (Collaborator, Author) commented:

relevant: abetlen/llama-cpp-python#717

@casperdcl added the `bug` label on Oct 25, 2023.
filopedraz previously approved these changes on Oct 26, 2023.

Review comments on cht-llama-cpp/models.py (resolved).
Co-authored-by: Casper da Costa-Luis <[email protected]>
@casperdcl merged commit c4b1dab into premAI-io:main on Oct 26, 2023.
5 checks passed
Labels: bug (Something isn't working)
Status: Done

Successfully merging this pull request may close this issue:

[Binary] Mistral model not configured properly
3 participants