
fuse fp8 quant in kv copying and add flashinfer decode mla operator in the attention module #654


Triggered via pull request: February 26, 2025 09:59
Status: Failure
Total duration: 51s
Artifacts

pre-commit.yml

on: pull_request
pre-commit (40s)
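The workflow file itself is not shown on this page. As a rough, hypothetical sketch (runner, action versions, and step layout are assumptions, not taken from the repository), a pre-commit.yml workflow of this shape typically looks like:

    # Hypothetical sketch of a pre-commit workflow; the repository's actual
    # pre-commit.yml may differ in runner, versions, and steps.
    name: pre-commit

    on: pull_request

    jobs:
      pre-commit:
        runs-on: ubuntu-latest
        steps:
          - uses: actions/checkout@v4
          - uses: actions/setup-python@v5
            with:
              python-version: "3.10"
          # Runs every hook defined in .pre-commit-config.yaml against all files;
          # a non-zero exit code from this step is what surfaces as the failure below.
          - run: |
              pip install pre-commit
              pre-commit run --all-files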

Annotations

1 error
pre-commit: Process completed with exit code 1.
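The annotation reports only the exit code, not which hook failed. To see the failing hook's output, the usual approach is to run the same checks locally from the repository root (assuming a standard .pre-commit-config.yaml is present; the specific hooks and their output depend on that file, which is not shown here):

    pip install pre-commit
    pre-commit run --all-files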