Support BF16 kvcache, rope and attentions for inference of GGUF/GGML models #4370

Annotations

5 warnings

Typos: succeeded Dec 27, 2024 in 5s