Properly support batched/non-batched with vllm/llama.cpp #148

Re-run triggered: July 3, 2024 22:43
Status: Success
Total duration: 1h 3m 41s
Billable time: 1h 4m

Workflow: e2e.yml (on: pull_request)
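
This run was produced by the e2e.yml workflow on the pull_request trigger. A minimal sketch of what such a trigger block might look like is below; only the filename and the pull_request trigger come from this run page, while the job name and steps are assumptions, not the contents of the actual workflow file.

```yaml
# Hypothetical sketch of an e2e.yml triggered on pull requests.
# Only the filename and the "pull_request" trigger are known from the run page.
name: e2e

on: pull_request

jobs:
  e2e:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      # The real workflow would run the end-to-end tests exercised by this PR
      # (batched/non-batched paths with vllm and llama.cpp); the step below is
      # only a placeholder.
      - run: echo "run e2e tests"
```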