build llama-cpp-python #101
Annotations
1 error and 1 warning
multi
buildx failed with: ERROR: failed to solve: process
  "/bin/sh -c apk add --no-cache --virtual .build-deps build-base ccache cmake ninja-build
    && apk add --no-cache curl git gfortran openblas-dev runit tzdata
    && rm -rf /var/lib/apt/lists/*
    && python -m venv /venv
    && /venv/bin/pip install --upgrade pip anyio pytest scikit-build setuptools fastapi uvicorn sse-starlette pydantic-settings starlette-context huggingface-hub huggingface_hub[cli]
    && LLAMA_OPENBLAS=ON CMAKE_ARGS=$CMAKE_ARGS /venv/bin/pip install --no-cache-dir llama-cpp-python --verbose
    && apk del --no-network .build-deps
    && mkdir -p /runit-services/llama-cpp-python /runit-services/syslogd
    && echo -e \"#!/bin/sh\\nbusybox syslogd -n -O /dev/stdout\" > /runit-services/syslogd/run
    && echo -e \"#!/bin/sh\\n/venv/bin/python3 -B -m llama_cpp.server --model /model/model.gguf\" > /runit-services/llama-cpp-python/run
    && chmod +x /runit-services/syslogd/run /runit-services/llama-cpp-python/run"
  did not complete successfully: exit code: 1
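The annotation only surfaces the exit code; the actual pip or compiler error is in the step's full verbose log, which is not included here. One way to narrow it down (a sketch only, not the project's actual Dockerfile; the Alpine base image, the /venv layout, and the later runit service setup are assumed unchanged) is to split the single chained RUN into separate instructions, so buildx attributes the failure to one specific step:

```dockerfile
# Sketch: restructured version of the failing RUN, assuming the same base
# image and paths as the quoted command. Each RUN fails independently, so
# the build log pinpoints which step returns the non-zero exit code.
RUN apk add --no-cache --virtual .build-deps \
        build-base ccache cmake ninja-build
RUN apk add --no-cache \
        curl git gfortran openblas-dev runit tzdata
# Note: the original `rm -rf /var/lib/apt/lists/*` is apt/Debian cleanup
# and is a no-op on Alpine, so it is dropped here.
RUN python -m venv /venv && \
    /venv/bin/pip install --upgrade pip anyio pytest scikit-build setuptools \
        fastapi uvicorn sse-starlette pydantic-settings starlette-context \
        "huggingface_hub[cli]"
# The llama-cpp-python install compiles llama.cpp from source; this is the
# step most likely to fail and the one whose --verbose output matters.
RUN LLAMA_OPENBLAS=ON CMAKE_ARGS=$CMAKE_ARGS \
    /venv/bin/pip install --no-cache-dir llama-cpp-python --verbose
RUN apk del --no-network .build-deps
```

Splitting costs some image-layer compactness, but during debugging the per-step attribution (and layer caching of the steps that already succeed) is usually worth it; the RUNs can be re-chained once the build is green.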
multi
ubuntu-latest pipelines will use ubuntu-24.04 soon. For more details, see https://github.com/actions/runner-images/issues/10636
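This warning is GitHub's notice that the floating `ubuntu-latest` label is migrating to ubuntu-24.04. If the build depends on the current image, one option is to pin the runner explicitly (a sketch assuming a standard workflow file; the job name `multi` is taken from the annotation above and may not match the actual workflow):

```yaml
jobs:
  multi:
    # Pin a concrete runner image instead of the floating `ubuntu-latest`
    # label, so the ubuntu-24.04 migration cannot silently change the
    # build environment.
    runs-on: ubuntu-22.04
```

Alternatively, switching to `ubuntu-24.04` now surfaces any incompatibilities before the label flips.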