Support BF16 KV cache, RoPE, and attention for inference of GGUF/GGML models #4382

Annotations: 4 warnings

Check (windows-latest, stable) succeeded on Dec 28, 2024 in 6m 0s