Error: llama runner process has terminated: exit status 127 #12471

Open
NikosDi opened this issue Nov 30, 2024 · 2 comments

Comments


NikosDi commented Nov 30, 2024

Hello.
Lately I very often get this message on both Windows and Ubuntu installations of IPEX-LLM using Ollama.

Everything seems to work fine and the scripts run without issues, but when I actually try to run a model I get this error:

(base) nikos@PC-9700:~$ ./ollama -v
ollama version is 0.3.6-ipexllm-20241116
(base) nikos@PC-9700:~$ ./ollama run mistral
Error: llama runner process has terminated: exit status 127

Sometimes re-installing Ollama IPEX-LLM for Windows or Linux fixes it, but this time I'm not so lucky.

Any idea why this keeps happening, and is there a possible workaround?


sgwhat commented Dec 2, 2024

Hi @NikosDi, could you please provide more detailed logs?
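
For example, running the server in the foreground with debug logging enabled and pasting its output when the run fails would help (this assumes the ipex-llm build honours the standard OLLAMA_DEBUG variable from upstream Ollama):

OLLAMA_DEBUG=1 ./ollama serve
# in a second terminal, reproduce the failure and copy the server output above
./ollama run mistral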


NikosDi commented Dec 2, 2024

Unfortunately there are no detailed logs.

It just happens, and when it does, I only get this message while everything looks fine in the background.

I have noticed that on Linux it can happen after a system update.

Anyway, I re-installed oneAPI and IPEX-LLM for Linux and now it works again.

Maybe those Linux updates delete or change files necessary for oneAPI and IPEX-LLM to work.
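
Since exit status 127 usually means a binary or shared library could not be found, next time it happens I will probably start with a check along these lines (assuming the default oneAPI install location, /opt/intel/oneapi):

# re-source the oneAPI environment; adjust the path if oneAPI is installed elsewhere
source /opt/intel/oneapi/setvars.sh
# look for unresolved shared libraries in the ollama binary
ldd ./ollama | grep "not found"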
