Support BF16 kvcache, rope and attentions for inference of GGUF/GGML models #4382

Annotations: 8 warnings

Docs check succeeded Dec 28, 2024 in 2m 47s