
Add llama.cpp backend #283
Triggered via pull request July 24, 2024 10:44
Status Cancelled
Total duration 5m 32s
test_cli_cuda_py_txi.yaml

on: pull_request
run_cli_cuda_py_txi_tests
2m 49s

Annotations

2 errors
run_cli_cuda_py_txi_tests: Canceling since a higher priority waiting request for 'CLI CUDA Py-TXI Tests-231' exists
run_cli_cuda_py_txi_tests: The operation was canceled.
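
Both errors point to GitHub Actions' concurrency-based auto-cancellation: a newer run in the same concurrency group superseded this one, so the job was stopped mid-test. A minimal sketch of the kind of concurrency block in test_cli_cuda_py_txi.yaml that produces this message follows; the group expression, runner labels, and test command are assumptions for illustration, not taken from the run page.

name: CLI CUDA Py-TXI Tests

on:
  pull_request:

concurrency:
  # Runs sharing a group are serialized; with cancel-in-progress set,
  # a newly queued run cancels the older one, producing the
  # "higher priority waiting request" annotation seen above.
  # The exact group expression here is an assumption.
  group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.ref }}
  cancel-in-progress: true

jobs:
  run_cli_cuda_py_txi_tests:
    runs-on: [self-hosted, single-gpu]  # runner labels are an assumption
    steps:
      - uses: actions/checkout@v4
      - name: Run CLI CUDA Py-TXI tests
        run: pytest -k "cli and cuda and py_txi"  # hypothetical test selector

Under this setup, a new push to the same pull request enqueues a fresh run and cancels the one in flight, which is consistent with this run ending as Cancelled after 5m 32s rather than failing on a test.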