I'm trying to load various models on Linux with LM Studio version 3.5. Loading works, but only if I keep the context size below ~8192.

I'm running in CPU-only mode and have 128 GB of RAM, so memory shouldn't be an issue for a 32B Q4 model.

7B models do work with that context size, but as I said, I have plenty of RAM, so it can't be a memory problem.

With vanilla llama.cpp (built from source), loading a 32B model, or even bigger models, with a context size of 32768 or more works with no issues.
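For reference, the working vanilla llama.cpp comparison described above might look like the sketch below. The `-m`, `-c`, and `-t` flags are real llama.cpp CLI options, but the model filename, build path, and thread count are illustrative assumptions, not taken from the report:

```shell
# Hypothetical repro of the working llama.cpp case: build from source
# (cmake -B build && cmake --build build), then load a ~32B Q4 GGUF with a
# 32768-token context in CPU-only mode. Path and thread count are examples.
MODEL="./models/example-32b-q4_k_m.gguf"
CMD="./build/bin/llama-cli -m $MODEL -c 32768 -t 16 -p 'Hello'"
echo "$CMD"   # command to run once llama.cpp is built and the model is in place
```

If this command succeeds while LM Studio fails at the same context size on the same machine, that points at how LM Studio allocates or limits the KV cache rather than at available RAM.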