Crash on loading specific model #57
Same here. It used to work a few weeks ago; I noticed this issue after a recent update. Since I was testing the Android version in the emulator, I didn't catch it right away, so I'm not sure which update caused it.
If this is still an issue, you might want to check the build types for your phone's arch (like the one here: #67). This solved my issue on an Android device:

android/src/main/CMakeLists.txt
...
# Default target (no specific CPU features)
build_library("rnllama" "")

if (${ANDROID_ABI} STREQUAL "arm64-v8a")
    # ARM64 targets
    build_library("rnllama_v8_4_fp16_dotprod" "-march=armv8.4-a+fp16+dotprod")
    build_library("rnllama_v8_2_fp16_dotprod" "-march=armv8.2-a+fp16+dotprod")
    build_library("rnllama_v8_2_fp16" "-march=armv8.2-a+fp16")
    build_library("rnllama_v8" "-march=armv8-a")
elseif (${ANDROID_ABI} STREQUAL "x86_64")
    # x86_64 target
    build_library("rnllama_x86_64" "-march=x86-64" "-mtune=intel" "-msse4.2" "-mpopcnt")
endif ()

android/src/main/java/com/rnllama/LlamaContext.java
...
static {
  Log.d(NAME, "Primary ABI: " + Build.SUPPORTED_ABIS[0]);
  if (LlamaContext.isArm64V8a()) {
    String cpuFeatures = LlamaContext.getCpuFeatures();
    Log.d(NAME, "CPU features: " + cpuFeatures);
    boolean hasFp16 = cpuFeatures.contains("fp16") || cpuFeatures.contains("fphp");
    boolean hasDotProd = cpuFeatures.contains("dotprod") || cpuFeatures.contains("asimddp");
    boolean isAtLeastArmV82 = cpuFeatures.contains("asimd") && cpuFeatures.contains("crc32") && cpuFeatures.contains("aes");
    boolean isAtLeastArmV84 = cpuFeatures.contains("dcpop") && cpuFeatures.contains("uscat");
    if (isAtLeastArmV84 && hasFp16 && hasDotProd) {
      Log.d(NAME, "Loading librnllama_v8_4_fp16_dotprod.so");
      System.loadLibrary("rnllama_v8_4_fp16_dotprod");
    } else if (isAtLeastArmV82 && hasFp16 && hasDotProd) {
      Log.d(NAME, "Loading librnllama_v8_2_fp16_dotprod.so");
      System.loadLibrary("rnllama_v8_2_fp16_dotprod");
    } else if (isAtLeastArmV82 && hasFp16) {
      Log.d(NAME, "Loading librnllama_v8_2_fp16.so");
      System.loadLibrary("rnllama_v8_2_fp16");
    } else {
      Log.d(NAME, "Loading librnllama_v8.so");
      System.loadLibrary("rnllama_v8");
    }
  } else if (LlamaContext.isX86_64()) {
    Log.d(NAME, "Loading librnllama_x86_64.so");
    System.loadLibrary("rnllama_x86_64");
  } else {
    Log.d(NAME, "Loading default librnllama.so");
    System.loadLibrary("rnllama");
  }
}

private static boolean isArm64V8a() {
  return Build.SUPPORTED_ABIS[0].equals("arm64-v8a");
}

private static boolean isX86_64() {
  return Build.SUPPORTED_ABIS[0].equals("x86_64");
}

private static String getCpuFeatures() {
  File file = new File("/proc/cpuinfo");
  StringBuilder stringBuilder = new StringBuilder();
  try {
    BufferedReader bufferedReader = new BufferedReader(new FileReader(file));
    String line;
    while ((line = bufferedReader.readLine()) != null) {
      if (line.startsWith("Features")) {
        stringBuilder.append(line);
        break;
      }
    }
    bufferedReader.close();
    return stringBuilder.toString();
  } catch (IOException e) {
    Log.w(NAME, "Couldn't read /proc/cpuinfo", e);
    return "";
  }
}
...
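For what it's worth: the emulator usually runs the x86_64 ABI while a physical phone is arm64-v8a, so a crash that only shows up in built APKs on a device points at the ARM-specific builds selected above. Below is a small standalone diagnostic sketch (not part of llama.rn; the class name is just an illustration) that mirrors the same /proc/cpuinfo checks and prints each flag on its own, so you can see which optimized variant a given device should end up with:

import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;

public class RnllamaVariantCheck {
  public static void main(String[] args) {
    String features = "";
    // Read the "Features" line from /proc/cpuinfo, like getCpuFeatures() above.
    try (BufferedReader reader = new BufferedReader(new FileReader("/proc/cpuinfo"))) {
      String line;
      while ((line = reader.readLine()) != null) {
        if (line.startsWith("Features")) {
          features = line;
          break;
        }
      }
    } catch (IOException e) {
      System.err.println("Couldn't read /proc/cpuinfo: " + e);
      return;
    }

    // The same flags the loader checks before picking a library variant.
    boolean hasFp16 = features.contains("fp16") || features.contains("fphp");
    boolean hasDotProd = features.contains("dotprod") || features.contains("asimddp");
    boolean isAtLeastArmV82 = features.contains("asimd") && features.contains("crc32") && features.contains("aes");
    boolean isAtLeastArmV84 = features.contains("dcpop") && features.contains("uscat");

    System.out.println("fp16/fphp:        " + hasFp16);
    System.out.println("dotprod/asimddp:  " + hasDotProd);
    System.out.println("armv8.2 baseline: " + isAtLeastArmV82);
    System.out.println("armv8.4 baseline: " + isAtLeastArmV84);

    // Reproduce the loader's selection order to name the expected .so.
    String variant;
    if (isAtLeastArmV84 && hasFp16 && hasDotProd) {
      variant = "rnllama_v8_4_fp16_dotprod";
    } else if (isAtLeastArmV82 && hasFp16 && hasDotProd) {
      variant = "rnllama_v8_2_fp16_dotprod";
    } else if (isAtLeastArmV82 && hasFp16) {
      variant = "rnllama_v8_2_fp16";
    } else {
      variant = "rnllama_v8";
    }
    System.out.println("Expected library: lib" + variant + ".so");
  }
}

Comparing its output with the "Loading librnllama_..." log line from the snippet above should tell you whether the device is being handed a build its CPU can't actually run.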
Related to: Vali-98/ChatterUI#20
Model used: https://huggingface.co/Crataco/stablelm-2-1_6b-chat-imatrix-GGUF/blob/main/stablelm-2-1_6b-chat.Q4_K_M.imx.gguf
llama.rn version: 0.3.1
Error provided by llama.rn:
From what I can tell, it's attempting to access memory outside its address space. Oddly enough, this doesn't occur in the emulator, only in built APKs.