Actions: fairydreaming/llama.cpp

CI

11 workflow runs

CI #13 · fix: graceful shutdown for Docker images (#10815)
Commit 11e07fd pushed by fairydreaming to master · December 13, 2024 17:53 · 43m 37s

CI #12 · vulkan: optimize and reenable split_k (#10637)
Commit cc98896 pushed by fairydreaming to master · December 3, 2024 20:15 · 45m 11s

CI #11 · sycl : Reroute permuted mul_mats through oneMKL (#10408)
Commit 266b851 pushed by fairydreaming to master · November 29, 2024 12:35 · 42m 37s

CI #10 · cann: add doc for cann backend (#8867)
Commit cfac111 pushed by fairydreaming to master · August 19, 2024 17:58 · 1h 15m 4s

CI #9 · Update README.md to fix broken link to docs (#8399)
Commit fd560fe pushed by fairydreaming to master · July 9, 2024 19:50 · 1h 4m 3s

CI #6 · disable publishing the full-rocm docker image (#8083)
Commit 8cb508d pushed by fairydreaming to master · June 24, 2024 07:05 · 46m 5s

CI #4 · common: fix warning (#8036)
Commit abd894a pushed by fairydreaming to master · June 20, 2024 17:45 · 54m 39s

CI #3 · rpc : fix ggml_backend_rpc_supports_buft() (#7918)
Commit 172c825 pushed by fairydreaming to master · June 13, 2024 19:21 · 56m 45s

CI #2 · server : do not get prompt in infill mode (#7286)
Commit a5cabd7 pushed by fairydreaming to master · June 7, 2024 08:19 · 41m 6s

CI #1 · CUDA: remove incorrect precision check (#7454)
Commit 95fb0ae pushed by fairydreaming to master · May 22, 2024 09:10 · 1h 50m 51s