llama : (proposal) return enum for llama_decode and llama_encode #15187
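The proposal behind this run is to replace the bare int32_t return codes of llama_decode and llama_encode with an enum. A minimal caller-side sketch of that idea is below; the enum name, its values, and the helper function are illustrative assumptions, not the actual definitions from this PR. The mapping mirrors the documented integer codes of the current API (0 = success, 1 = no KV cache slot found, < 0 = error).

```c
#include "llama.h"

// Hypothetical status enum for llama_decode / llama_encode results.
// These names and values are assumptions for illustration only; the PR may
// define different identifiers.
enum llama_decode_status {
    LLAMA_DECODE_STATUS_OK         =  0,  // batch processed successfully
    LLAMA_DECODE_STATUS_NO_KV_SLOT =  1,  // no KV cache slot found; caller may retry with a smaller batch
    LLAMA_DECODE_STATUS_ERROR      = -1,  // unrecoverable error
};

// Map the current int32_t return code onto the hypothetical enum.
// Under the proposal, llama_decode would return something like this directly,
// so callers would not need to interpret magic integers themselves.
static enum llama_decode_status decode_with_status(struct llama_context * ctx, struct llama_batch batch) {
    const int32_t res = llama_decode(ctx, batch);
    if (res == 0) return LLAMA_DECODE_STATUS_OK;
    if (res == 1) return LLAMA_DECODE_STATUS_NO_KV_SLOT;
    return LLAMA_DECODE_STATUS_ERROR;
}
```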
Triggered via pull request on September 11, 2024 13:38
Status: Failure
Total duration: 1h 10m 32s
Artifacts: –
Workflow: build.yml (on: pull_request)
Matrix: windows-latest-cmake-cuda
Matrix: windows-latest-cmake
macOS-latest-cmake-arm64: 2m 42s
macOS-latest-cmake-x64: 9m 15s
ubuntu-focal-make: 4m 3s
ubuntu-latest-cmake: 2m 45s
macOS-latest-make: 2m 57s
macOS-latest-cmake: 2m 23s
ubuntu-focal-make-curl: 2m 55s
ubuntu-latest-cmake-rpc: 2m 26s
ubuntu-22-cmake-vulkan: 2m 58s
ubuntu-22-cmake-hip: 18m 47s
ubuntu-22-cmake-sycl: 10m 17s
ubuntu-22-cmake-sycl-fp16: 10m 9s
macOS-latest-cmake-ios: 1m 53s
macOS-latest-cmake-tvos: 2m 47s
windows-latest-cmake-sycl: 11m 20s
windows-latest-cmake-hip: 14m 24s
ios-xcode-build: 1m 35s
android-build: 16m 30s
Matrix: macOS-latest-swift
Matrix: ubuntu-latest-cmake-sanitizer
Matrix: windows-msys2
release: 0s
Annotations: 7 errors and 7 warnings