Harness/run 1110 #44

Triggered via pull request on November 13, 2023 at 04:10
Status: Cancelled
Total duration: 16m 6s
Artifacts: 8

llm-harness-evaluation.yml

on: pull_request
llm-cpp-build / check-linux-amx-artifact          1s
llm-cpp-build / check-linux-avx512-artifact       3s
llm-cpp-build / check-linux-avxvnni-artifact      4s
llm-cpp-build / check-windows-avx-artifact        3s
llm-cpp-build / check-windows-avx-vnni-artifact   2s
llm-cpp-build / check-windows-avx2-artifact       2s
llm-cpp-build / linux-build-amx                   1m 44s
llm-cpp-build / linux-build-avx512                1m 36s
llm-cpp-build / linux-build-avxvnni               1m 50s
llm-cpp-build / windows-build-avx                 50s
llm-cpp-build / windows-build-avx-vnni            3m 40s
llm-cpp-build / windows-build-avx2                59s
Matrix: llm-harness-evalution
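
The jobs labeled "llm-cpp-build / <job>" above are how GitHub displays jobs that come from a reusable workflow call, and "Matrix: llm-harness-evalution" is the evaluation job expanded over a matrix. A minimal sketch of how llm-harness-evaluation.yml could be structured to produce this listing; the workflow name is taken from the cancellation messages below, while the reusable workflow path is an assumption:

# Sketch only: assumed skeleton; only the job ids and the trigger are taken from this run.
name: LLM Harness Evalution                       # name as it appears in the cancellation messages below
on: pull_request

jobs:
  llm-cpp-build:
    uses: ./.github/workflows/llm-cpp-build.yml   # assumed path of the reusable build workflow
  # llm-harness-evalution: the matrix evaluation job, sketched after the annotations below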

Annotations

11 errors and 1 warning
llm-harness-evalution (3.9", Llama-2-7b-chat-hf, mmlu, sym_int4)
The version '3.9"' with architecture 'x64' was not found for Ubuntu 22.04. The list of all available versions can be found here: https://raw.githubusercontent.com/actions/python-versions/main/versions-manifest.json
llm-harness-evalution (3.9", Llama2-7b-guanaco-dolphin-500, truthfulqa, sym_int4)
The version '3.9"' with architecture 'x64' was not found for Ubuntu 22.04. The list of all available versions can be found here: https://raw.githubusercontent.com/actions/python-versions/main/versions-manifest.json
llm-harness-evalution (3.9, Llama2-7b-guanaco-dolphin-500, arc, mixed_fp8)
Canceling since a higher priority waiting request for 'LLM Harness Evalution-llm-nightly-test-9425' exists
llm-harness-evalution (3.9, Llama-2-7b-chat-hf, truthfulqa, mixed_fp8)
Canceling since a higher priority waiting request for 'LLM Harness Evalution-llm-nightly-test-9425' exists
llm-harness-evalution (3.9, Llama2-7b-guanaco-dolphin-500, hellaswag, mixed_fp8)
Canceling since a higher priority waiting request for 'LLM Harness Evalution-llm-nightly-test-9425' exists
llm-harness-evalution (3.9, Llama2-7b-guanaco-dolphin-500, mmlu, mixed_fp8)
Canceling since a higher priority waiting request for 'LLM Harness Evalution-llm-nightly-test-9425' exists
llm-harness-evalution (3.9, Llama2-7b-guanaco-dolphin-500, truthfulqa, mixed_fp8)
Canceling since a higher priority waiting request for 'LLM Harness Evalution-llm-nightly-test-9425' exists
llm-harness-evalution (3.9, Llama-2-7b-chat-hf, arc, mixed_fp8)
The operation was canceled.
llm-harness-evalution (3.9, Llama-2-7b-chat-hf, arc, mixed_fp8)
no submodule mapping found in .gitmodules for path 'python/llm/dev/benchmark/harness/lm-evaluation-harness'
llm-harness-evalution (3.9, Llama-2-7b-chat-hf, mmlu, mixed_fp8)
The operation was canceled.
llm-harness-evalution (3.9, Llama-2-7b-chat-hf, hellaswag, mixed_fp8)
The operation was canceled.
llm-harness-evalution (3.9, Llama-2-7b-chat-hf, arc, mixed_fp8)
Unable to clean or reset the repository. The repository will be recreated instead.
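
The first two annotations point at a quoting problem: the python-version matrix value reaches actions/setup-python as the literal string '3.9"' (with a stray trailing double quote), so it cannot be matched against the versions manifest. Most of the remaining annotations come from the run being canceled (see the concurrency note below), plus one missing .gitmodules entry for the lm-evaluation-harness path, which has to be fixed in the repository itself. Continuing the skeleton above, a sketch of the matrix job under the jobs: key; the matrix dimensions are read off the annotation entries, while the key names, runner label, and steps are assumptions:

  # Sketch only: assumed shape of the llm-harness-evalution matrix job.
  llm-harness-evalution:
    needs: llm-cpp-build                          # assumption: evaluation waits for the builds above
    runs-on: ubuntu-22.04                         # assumption; the setup-python errors mention Ubuntu 22.04
    strategy:
      matrix:
        python-version: ["3.9"]                   # quoted once; a value like '3.9"' is what setup-python rejects
        model_name: [Llama-2-7b-chat-hf, Llama2-7b-guanaco-dolphin-500]
        task: [arc, hellaswag, mmlu, truthfulqa]
        precision: [sym_int4, mixed_fp8]
    steps:
      - uses: actions/checkout@v3
        with:
          submodules: true                        # checkout also needs a matching .gitmodules entry for lm-evaluation-harness
      - uses: actions/setup-python@v4
        with:
          python-version: ${{ matrix.python-version }}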

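The "Canceling since a higher priority waiting request ... exists" annotations, and the overall Cancelled status, are what GitHub emits when a newer run joins the same concurrency group while cancel-in-progress is enabled. A sketch of such a block; the exact expression behind the group name 'LLM Harness Evalution-llm-nightly-test-9425' is an assumption reconstructed from that message:

# Sketch only: reconstructed concurrency settings; the group expression is a guess.
concurrency:
  group: ${{ github.workflow }}-llm-nightly-test-${{ github.event.pull_request.number }}
  cancel-in-progress: true
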
Artifacts

Produced during runtime

Name               Status    Size
linux-amx          Expired   4.94 MB
linux-avx          Expired   1.77 MB
linux-avx2         Expired   1.71 MB
linux-avx512       Expired   3.31 MB
linux-avxvnni      Expired   10.8 MB
windows-avx        Expired   1.72 MB
windows-avx-vnni   Expired   5.15 MB
windows-avx2       Expired   2.6 MB