Replies: 1 comment 1 reply
-
Given that Llamafile exposes an OpenAI API compatible proxy, you can have Khoj use any Llamafile-based chat model. See the Khoj OpenAI Proxy Docs for how to set it up. P.S: Not sure how I missed replying before; my notifications for discussion topics may be wonky.
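
As a minimal sketch of what this looks like (assuming a llamafile running locally in server mode on its default port 8080, and a placeholder model name), the same OpenAI-compatible endpoint that Khoj would be pointed at can be exercised with the standard `openai` Python client:

```python
# Minimal sketch, assuming a llamafile is already running as a local server,
# e.g. `./mistral-7b-instruct.llamafile --server --nobrowser`, which by default
# listens on http://localhost:8080 and serves an OpenAI-compatible /v1 API.
# The port and model name below are assumptions; adjust them to your setup.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8080/v1",  # llamafile's OpenAI-compatible endpoint
    api_key="sk-no-key-required",         # llamafile does not validate the API key
)

response = client.chat.completions.create(
    model="LLaMA_CPP",  # llamafile accepts an arbitrary model name; placeholder here
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(response.choices[0].message.content)
```

If that round-trip works, pointing Khoj's OpenAI-compatible chat model configuration at the same base URL (per the Khoj OpenAI Proxy Docs) should let Khoj use the Llamafile-served model.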
-
Not sure if it's just hype, but Llamafile recently noted that they found a way to speed up CPU inference. It would be nice to run fast chatbots over docs on a laptop.