Add llama.cpp backend #365

Triggered via pull request on July 25, 2024 at 05:32
Status: Cancelled
Total duration: 1h 2m 36s
Artifacts
run_cli_rocm_pytorch_single_gpu_tests (0s)

Annotations

1 error
run_cli_rocm_pytorch_single_gpu_tests
The run was canceled by @baptistecolle.