Support BF16 kvcache, rope and attentions for inference of GGUF/GGML models #4371

Annotations: 8 warnings

Docs: succeeded Dec 27, 2024 in 3m 3s