Problem
SpeziLLM supports local LLMs, but the app currently relies on the OpenAI API. Using a local LLM seems particularly important when dealing with health data.
Solution
I see two possibilities: either allow for on-device LLMs through SpeziLLM, or allow customization of the OpenAI API base URL so that the app can be used with, for example, a local Ollama server.
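For the second option, little more than the base URL would need to change, since Ollama exposes an OpenAI-compatible chat endpoint. Below is a minimal Swift sketch of such a request; the base URL, port, model name ("llama3"), and helper function are illustrative assumptions, not part of the existing app, and would need to match the local Ollama configuration.

```swift
import Foundation

// Minimal OpenAI-compatible chat request/response types.
// Ollama serves an OpenAI-compatible endpoint at /v1/chat/completions.
struct ChatMessage: Codable {
    let role: String
    let content: String
}

struct ChatRequest: Codable {
    let model: String
    let messages: [ChatMessage]
}

struct ChatResponse: Codable {
    struct Choice: Codable {
        let message: ChatMessage
    }
    let choices: [Choice]
}

// Sends a single chat completion request to a local Ollama server.
// The URL and model name are assumptions for illustration only.
func queryLocalOllama(prompt: String) async throws -> String {
    let url = URL(string: "http://localhost:11434/v1/chat/completions")!
    var request = URLRequest(url: url)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = try JSONEncoder().encode(
        ChatRequest(
            model: "llama3",
            messages: [ChatMessage(role: "user", content: prompt)]
        )
    )

    let (data, _) = try await URLSession.shared.data(for: request)
    let response = try JSONDecoder().decode(ChatResponse.self, from: data)
    return response.choices.first?.message.content ?? ""
}
```

If the base URL were exposed as a configuration option, the same client code could target either api.openai.com or a local server without further changes.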
Additional context
No response
Code of Conduct
I agree to follow this project's Code of Conduct and Contributing Guidelines