Commit

fix: Remove unused LLM_KV_ATTENTION_LAYER_COUNT
I'd added this at one point, but it's not actually needed

Branch: BambaArchitecture

Signed-off-by: Gabe Goodhart <[email protected]>
gabe-l-hart committed Dec 12, 2024
1 parent 97e6ba8 commit b83e9a6
Showing 1 changed file with 0 additions and 1 deletion.

src/llama.cpp: 0 additions & 1 deletion
@@ -310,7 +310,6 @@ enum llm_kv {
     LLM_KV_ATTENTION_RELATIVE_BUCKETS_COUNT,
     LLM_KV_ATTENTION_SLIDING_WINDOW,
     LLM_KV_ATTENTION_SCALE,
-    LLM_KV_ATTENTION_LAYER_COUNT,
     LLM_KV_ATTENTION_LAYER_INDICES,
 
     LLM_KV_ROPE_DIMENSION_COUNT,
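
For context, each value in the llm_kv enum is normally paired with a GGUF metadata key string in llama.cpp's LLM_KV_NAMES table; an enum entry that no code ever registers or reads is dead weight, which is why LLM_KV_ATTENTION_LAYER_COUNT can be dropped without touching anything else. Below is a minimal, self-contained sketch of that pairing, not the actual llama.cpp source: the trimmed-down enum and the key string assumed for LLM_KV_ATTENTION_LAYER_INDICES ("%s.attention.layer_indices") are illustrative, while the sliding-window and scale key strings mirror the real table.

#include <cstdio>
#include <map>

// Trimmed-down stand-in for the llm_kv enum in src/llama.cpp.
enum llm_kv {
    LLM_KV_ATTENTION_SLIDING_WINDOW,
    LLM_KV_ATTENTION_SCALE,
    LLM_KV_ATTENTION_LAYER_INDICES,
};

// Sketch of llama.cpp's LLM_KV_NAMES table. LLM_KV_ATTENTION_LAYER_COUNT
// never had an entry here, which is one reason it could be removed cleanly.
static const std::map<llm_kv, const char *> LLM_KV_NAMES = {
    { LLM_KV_ATTENTION_SLIDING_WINDOW, "%s.attention.sliding_window" },
    { LLM_KV_ATTENTION_SCALE,          "%s.attention.scale"          },
    { LLM_KV_ATTENTION_LAYER_INDICES,  "%s.attention.layer_indices"  }, // assumed key for this branch
};

int main() {
    // The "%s" placeholder is filled with the architecture prefix when the
    // key is looked up; "bamba" is assumed here for the BambaArchitecture branch.
    char buf[128];
    std::snprintf(buf, sizeof(buf),
                  LLM_KV_NAMES.at(LLM_KV_ATTENTION_LAYER_INDICES), "bamba");
    std::printf("%s\n", buf);  // prints: bamba.attention.layer_indices
    return 0;
}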
