llama-cpp multi server support #183

Triggered via pull request (synchronize, PR #316) by @cdoern on October 21, 2024 14:12
Status: Failure
Total duration: 16m 8s
Artifacts: none

e2e-nvidia-t4-x1.yml
on: pull_request_target

Jobs:
  Start external EC2 runner    2m 40s
  Stop external EC2 runner     4s
  e2e-workflow-complete        0s
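For context, the job chain above (provision an external EC2 runner, run the E2E test on it, tear the runner down, then gate on a completion job) typically looks something like the sketch below. This is a hypothetical reconstruction, not the repository's actual e2e-nvidia-t4-x1.yml: the job names, the machulav/ec2-github-runner action, the secrets, and the AMI/subnet placeholders are all assumptions.

```yaml
# Hypothetical sketch of a workflow with this job chain; names, action,
# secrets, and IDs are assumptions, not the repository's real config.
name: e2e-nvidia-t4-x1

on:
  pull_request_target:
    types: [opened, synchronize, reopened]

jobs:
  start-runner:
    # Provision a self-hosted EC2 runner (e.g. a T4 GPU instance).
    runs-on: ubuntu-latest
    outputs:
      label: ${{ steps.start.outputs.label }}
      ec2-instance-id: ${{ steps.start.outputs.ec2-instance-id }}
    steps:
      - id: start
        uses: machulav/ec2-github-runner@v2
        with:
          mode: start
          github-token: ${{ secrets.GH_PAT }}      # assumed secret name
          ec2-image-id: ami-xxxxxxxx               # placeholder
          ec2-instance-type: g4dn.xlarge           # T4 GPU instance type
          subnet-id: subnet-xxxxxxxx               # placeholder
          security-group-id: sg-xxxxxxxx           # placeholder

  e2e-test:
    # Run the end-to-end tests on the freshly started runner.
    needs: start-runner
    runs-on: ${{ needs.start-runner.outputs.label }}
    steps:
      - uses: actions/checkout@v4
      - name: Run E2E tests
        run: ./scripts/e2e.sh   # hypothetical script; exits non-zero on failure

  stop-runner:
    # Always tear the EC2 runner down, even if the tests failed.
    needs: [start-runner, e2e-test]
    if: ${{ always() }}
    runs-on: ubuntu-latest
    steps:
      - uses: machulav/ec2-github-runner@v2
        with:
          mode: stop
          github-token: ${{ secrets.GH_PAT }}
          label: ${{ needs.start-runner.outputs.label }}
          ec2-instance-id: ${{ needs.start-runner.outputs.ec2-instance-id }}

  e2e-workflow-complete:
    # Single required status check that summarizes the run.
    needs: [e2e-test]
    if: ${{ always() }}
    runs-on: ubuntu-latest
    steps:
      - run: |
          if [ "${{ needs.e2e-test.result }}" != "success" ]; then
            exit 1
          fi
```

Under an assumed layout like this, a failing E2E Test job (as in the annotation below) still lets the stop-runner job tear the instance down, which matches the short "Stop external EC2 runner" duration seen in this run.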

Annotations

1 error
E2E Test: Process completed with exit code 1.