
How to actually launch? #481

Open
Someguitarist opened this issue Nov 12, 2024 · 5 comments

Comments

@Someguitarist

This is definitely a dumb question, but I don't see an answer anywhere: I'm running a headless AI server that I SSH into remotely. I've installed the AppImage, but how can I connect to it? Does it need to run on a machine with a keyboard/video/monitor attached, or is there some way to access it remotely that I'm missing?

@LFd3v

LFd3v commented Nov 13, 2024

Hi there.

The AppImage has ollama binaries included, so I think it will try to start them, and if you have another instance running locally this may cause a conflict. You can try running the AppImage with the --appimage-extract option in the terminal, delete the binaries directory inside resources, and then run AppRun to start the app. If you have ollama available on localhost:11434, it should be picked up automatically.
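The steps above can be sketched as the following commands. The AppImage filename and the exact path of the binaries directory are assumptions; check them against your actual download and extraction:

```shell
# Extract the AppImage contents into a squashfs-root/ directory.
# "Reor.AppImage" is a placeholder for your actual file name.
./Reor.AppImage --appimage-extract

# Remove the bundled ollama binaries so the app does not try to start
# its own instance (verify this path exists in your extraction).
rm -rf squashfs-root/resources/binaries

# Launch the extracted app directly.
./squashfs-root/AppRun
```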

Disclaimer: I am in no way associated with Reor project, just a regular user.

@Someguitarist
Author

My problem isn't having additional Ollama instances running; it's that on a headless system there's no way to actually launch the app. From what I'm reading everywhere, it looks like it needs to be installed locally, which is unfortunate because I use multiple computers and my phone to add notes to my current note-taking system (TrilliumNext).

I would absolutely love to use this as it looks like exactly what I'm looking for, but unfortunately not having a web based front end is a non-starter for me, unless I'm misunderstanding something.

@LFd3v

LFd3v commented Nov 15, 2024

@Someguitarist Ah, yes, you are correct. Reor is an Electron app, not a regular web app. It must be run locally; only the AI model can be accessed remotely, as far as I can tell. What you could do is use Reor locally, and then use an SSH tunnel to access the remote file vault and ollama instance.
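A minimal sketch of that SSH-tunnel setup, assuming ollama listens on its default port 11434 on the server and that you have sshfs installed locally; the hostname and vault path are placeholders:

```shell
# Forward the remote ollama API to localhost so a locally running Reor
# can reach it at localhost:11434 ("user@server" is a placeholder).
ssh -N -L 11434:localhost:11434 user@server

# Mount the remote notes vault locally with sshfs
# (the vault path is a placeholder).
mkdir -p ~/reor-vault
sshfs user@server:/path/to/vault ~/reor-vault
```

You would then point Reor at ~/reor-vault as its vault directory and let it use the tunneled ollama endpoint.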

I am not aware of other FOSS projects that let you do the same as Reor through a web app, except maybe the latest NextCloud versions, but I do not know whether those can use a local LLM.

@Someguitarist
Author

Dang, yeah, that's what I feared. There's a bunch of FOSS note-taking projects, and a few that involve AI, but nothing with tree-based note-taking and AI questioning. I'd really, really like to use Reor, but I add notes constantly from my phone, work computer, home computer, etc.

@LFd3v

LFd3v commented Nov 15, 2024

@Someguitarist My suggestion: use your headless server to host the vault and the LLM, and use Reor on desktop and Obsidian on mobile; both would have access to the same vault, since both use regular markdown files. If you also want a web app, use NextCloud to host the vault.
