
cmake .. -DLLAMA_CLBLAST=ON error #3

Closed
lumiamilk opened this issue Jul 17, 2024 · 4 comments

lumiamilk commented Jul 17, 2024

Termux version: 0.119.0-beta.1
CLBlast version: 1.6.3
Device: vivo iQOO Z8 (MediaTek Dimensity 8200)

Following the steps in "Based on OpenCL + CLBlast (Recommend)", when I run cmake .. -DLLAMA_CLBLAST=ON I get:

CMake Warning:
Manually-specified variables were not used by the project:

LLAMA_CLBLAST

If I ignore this warning and continue with cmake --build . --config Release,
then clinfo -l produces no output.
The GPU is not detected.
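
For reference, the reported steps in one place (a minimal sketch; the directory layout is an assumption, the commands are the ones above):

```bash
# Reproduction sketch (build directory layout is an assumption; commands are from the tutorial)
cd llama.cpp && mkdir -p build && cd build
cmake .. -DLLAMA_CLBLAST=ON       # CMake warns: "Manually-specified variables were not used by the project: LLAMA_CLBLAST"
cmake --build . --config Release  # build continues, but without CLBlast support
clinfo -l                         # prints nothing; no OpenCL GPU is detected
```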

@JackZeng0208 (Owner)

MediaTek Dimensity chips are probably not supported at the moment.

@JackZeng0208 (Owner)

I don't have a phone with a Dimensity chip here either, so I can't do any concrete testing.

@lumiamilk (Author) commented Jul 18, 2024

> I don't have a phone with a Dimensity chip here either, so I can't do any concrete testing.

I searched a bit more and found:

ggerganov/llama.cpp#7735 ("ggml : remove OpenCL")
ggerganov/llama.cpp#8139 ("devops : remove clblast + LLAMA_CUDA -> GGML_CUDA")

I had been git cloning the latest version the whole time 😵
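
Until the tutorial is updated, one possible workaround (untested sketch; <last-commit-before-7735> is a placeholder for the last llama.cpp revision before PR #7735 removed the OpenCL backend, and needs to be looked up manually):

```bash
# Untested workaround sketch: build from a llama.cpp revision that still includes the CLBlast backend.
# <last-commit-before-7735> is a placeholder; find the actual commit in the history around PR #7735.
git clone https://github.com/ggerganov/llama.cpp.git
cd llama.cpp
git checkout <last-commit-before-7735>
mkdir build && cd build
cmake .. -DLLAMA_CLBLAST=ON
cmake --build . --config Release
```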

@JackZeng0208 (Owner)

Understood, I'll find some time to rewrite this tutorial later.
