Issues: ikawrakow/ik_llama.cpp
Bug: Quantized KV cache produces garbage in a situation where llama.cpp does not
#92 · opened Oct 17, 2024 by saood06
Feature Request: Eliminate/reduce unnecessary copies (enhancement)
#67 · opened Sep 28, 2024 by ikawrakow · 4 tasks done
Feature Request: Improve CPU processing speed for large contexts (enhancement)
#26 · opened Aug 22, 2024 by ikawrakow · 4 tasks done