llama : default n_swa for phi-3 #8931
Conversation
src/llama.cpp
Outdated
// default value for Phi-3-medium-128k-instruct
hparams.n_swa = 131072;
}
ml.get_key(LLM_KV_ATTENTION_SLIDING_WINDOW, hparams.n_swa, false);
Might want to handle get_key returning false and hparams.n_swa remaining uninitialized to 0 by throwing an error, though that's probably never going to happen.
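A rough sketch of what such a check could look like, assuming get_key returns whether the key was found when the required flag is false; the exact error message and error-handling style here are illustrative, not part of this PR:

```cpp
// Illustrative only: fail loudly if the key is absent and no model-specific
// default was set above, instead of silently keeping n_swa at 0.
const bool found_swa = ml.get_key(LLM_KV_ATTENTION_SLIDING_WINDOW, hparams.n_swa, false);
if (!found_swa && hparams.n_swa == 0) {
    throw std::runtime_error("missing or invalid value for phi3.attention.sliding_window");
}
```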
* default n_swa for phi-3
* fix
* double check swa
Related to #8627

phi3.attention.sliding_window is marked as required, which breaks many existing models because some users cannot re-convert them with the new version (ref: ngxson/wllama#106). This PR proposes default values for certain models:
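The per-model defaults themselves are not reproduced above. As a hedged sketch of the overall approach, based on the diff hunk earlier in this thread: set a model-specific fallback first, then let the GGUF key override it when present. The condition used to select the default below is an assumption for illustration; only the 131072 value for Phi-3-medium-128k-instruct comes from the diff.

```cpp
// Sketch: choose a model-specific fallback for n_swa, then let the GGUF key
// override it when present. The selection condition is illustrative.
if (hparams.n_ctx_train == 131072) {
    // default value for Phi-3-medium-128k-instruct (from the diff above)
    hparams.n_swa = 131072;
}
// Reading phi3.attention.sliding_window as optional (required = false) keeps
// older GGUF conversions, which lack the key, loadable.
ml.get_key(LLM_KV_ATTENTION_SLIDING_WINDOW, hparams.n_swa, false);
```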