How to actually launch? #481
This is definitely a dumb question, but I don't see an answer anywhere: I'm running a headless AI server that I SSH into remotely. I've installed the AppImage, but how can I connect to it? Does it need to be used on a machine with a keyboard/video/mouse attached, or is there some other way to access it remotely that I'm missing?

Comments
Hi there. The AppImage has Ollama binaries included, so I think it will try to start them, and if you have another instance already running locally this may cause a problem. You can try to run the AppImage with the option … Disclaimer: I am in no way associated with the Reor project, just a regular user.
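Since the exact option got cut off above, here is a sketch of one possible workaround, assuming the default Ollama port (11434) and a systemd-managed `ollama` service; the AppImage filename is a placeholder:

```bash
# Check whether another Ollama instance is already listening on the
# default port; Ollama's root endpoint replies "Ollama is running".
curl -s http://localhost:11434

# If it is, stop the host's service so the bundled binary can bind
# the port, then launch the AppImage:
sudo systemctl stop ollama
chmod +x Reor_*.AppImage
./Reor_*.AppImage
```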
My problem isn't so much with having additional Ollama instances running as that, on a headless system, there's no way to actually launch the app. From what I'm reading everywhere, it looks like it needs to be installed locally, which is unfortunate because I use multiple computers and my phone to add notes to my current note-taking system (TrilliumNext). I would absolutely love to use this, as it looks like exactly what I'm looking for, but not having a web-based front end is a non-starter for me, unless I'm misunderstanding something.
@Someguitarist Ah, yes, you are correct. Reor is an Electron app, not a regular web app. It must be run locally; only the AI model can be accessed remotely, as far as I can tell. What you could do is use Reor locally and then use an SSH tunnel to access the remote file vault and Ollama instance. I am not aware of other FOSS projects that let you do what Reor does from a web app, besides maybe the latest NextCloud versions, but I don't know whether those can use a local LLM.
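A rough sketch of that setup, assuming Ollama's default port (11434) and SSHFS for the vault; host names and paths are placeholders:

```bash
# Forward the headless server's Ollama API to this machine, so a
# locally running Reor can reach it at http://localhost:11434:
ssh -N -L 11434:localhost:11434 user@headless-server &

# Mount the remote vault over SSHFS so Reor sees the same markdown
# files (requires sshfs on the client):
mkdir -p ~/vault
sshfs user@headless-server:/srv/notes/vault ~/vault
```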
Dang, yeah, that's what I feared. There's a bunch of FOSS note-taking projects, and a few that involve AI, but nothing with tree-based note-taking and AI questioning. I'd really, really like to use Reor, but I add notes constantly from my phone, work computer, home computer, etc.
@Someguitarist My suggestion: use your headless server to host the vault and the LLM, and use Reor on desktop and Obsidian on mobile; both would have access to the same vault, since both use plain markdown files. If you also want a web app, use NextCloud to host the vault.
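If NextCloud feels too heavy, one simpler (hypothetical) way to keep a desktop copy of the vault in sync with the server is plain rsync; the host and paths below are placeholders:

```bash
# Pull the server's copy of the vault before an editing session...
rsync -av user@headless-server:/srv/notes/vault/ ~/vault/

# ...and push local changes back afterwards:
rsync -av ~/vault/ user@headless-server:/srv/notes/vault/
```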