Hi, I noticed that the inference logic of OmniVision (libomni_vlm_shared) seems to be downloaded during install.
Answered by zhiyuan8 on Nov 29, 2024
@pinyin libomni_vlm is generated during compilation of the project; you can find more in our code implementation here:
https://github.com/NexaAI/nexa-sdk/blob/main/nexa/gguf/nexa_inference_vlm_omni.py
https://github.com/NexaAI/nexa-sdk/blob/main/nexa/gguf/llama/omni_vlm_cpp.py
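To illustrate the answer, here is a minimal, hypothetical sketch of how a locally compiled shared library such as libomni_vlm_shared could be located and loaded at runtime with Python's ctypes. The function name, path layout, and error message are illustrative assumptions, not nexa-sdk's actual API; see the linked files for the real implementation.

```python
# Hypothetical sketch: locating and loading a compiled shared library.
# The helper name and directory layout are assumptions for illustration;
# they are not part of the nexa-sdk API.
import ctypes
import pathlib
import sys


def load_shared_lib(base_dir: str, stem: str = "libomni_vlm_shared") -> ctypes.CDLL:
    # Pick the platform-specific extension for the compiled artifact.
    ext = {"linux": ".so", "darwin": ".dylib", "win32": ".dll"}.get(sys.platform, ".so")
    path = pathlib.Path(base_dir) / f"{stem}{ext}"
    if not path.exists():
        # The library is produced when the project is compiled,
        # so a missing file usually means the build step has not run.
        raise FileNotFoundError(f"{path} not found; build the project first")
    return ctypes.CDLL(str(path))
```

The point of the sketch is the lookup order: the loader expects the artifact to exist in the build output directory, which is why it appears only after compilation rather than being shipped in the source tree.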