Describe the bug
https://github.com/deepset-ai/haystack-core-integrations/actions/runs/10189627850/job/28188014440

Segmentation fault (core dumped)

The bug was introduced in llama-cpp-python==0.2.84.

Related issues/PRs: abetlen/llama-cpp-python#1636, abetlen/llama-cpp-python#1637

We can wait for the bug to be fixed or pin the dependency to a working version...
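If we go the pinning route, here is a minimal sketch of what the constraint could look like in the integration's pyproject.toml; the exact file layout and dependency table are assumptions, not copied from the repository:

```toml
# Hypothetical pyproject.toml excerpt for the llama-cpp integration.
# Capping below 0.2.84 keeps the last release that does not segfault,
# until a fixed upstream release is available.
[project]
dependencies = [
    "llama-cpp-python<0.2.84",
]
```

The cap could then be dropped (or turned into a floor) once a release containing the upstream fix is out.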
_convert_message_to_llamacpp_format
Maybe pin the dependency? I need to push a small change for DynamicPromptBuilder in an example file, and the failing tests are blocking my PR.
llama-cpp-python>=0.2.87
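Assuming that is the constraint that resolved the issue, a sketch of how the relaxed requirement might look in the same (assumed) pyproject.toml layout, replacing the temporary cap:

```toml
# Hypothetical pyproject.toml excerpt: raise the floor to the release that
# is expected to contain the upstream fix, instead of keeping a <0.2.84 cap.
[project]
dependencies = [
    "llama-cpp-python>=0.2.87",
]
```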