Add support for models extra path #58
Comments
Yes, it would be nice to use the models from the local Ollama installation.

#88

I was just about to ask why the models are hardwired, i.e., why they won't work with 'extra_model_paths.yaml', then I found this post.
Any updates on this?

# Comfy Utils
import os

import folder_paths

comfy_model_dir = os.path.join(folder_paths.models_dir, "LLM")

This seems to be enough to work with. (Edit: never mind, it's a bit more complex and the PR #88 works just fine.)
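The snippet above only yields the default models directory; honoring extra paths additionally means searching every registered directory for a file. A minimal self-contained sketch of that lookup logic, assuming nothing about ComfyUI's internals (`find_model` and `search_dirs` are hypothetical names for illustration, not ComfyUI APIs):

```python
import os

def find_model(filename, search_dirs):
    """Return the first existing path to `filename` among `search_dirs`.

    Hypothetical helper: mimics the kind of multi-directory resolution a
    node needs so that models placed under an extra_model_paths.yaml
    location are found as well as those under the default models dir.
    """
    for d in search_dirs:
        candidate = os.path.join(d, filename)
        if os.path.isfile(candidate):
            return candidate
    return None
```

In practice the list of directories would come from ComfyUI's `folder_paths` module rather than being hardcoded, which is what PR #88 addresses.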
Hi Kijai,
Thank you for creating this node.
ComfyUI lets you provide a custom path so models can be stored in a single location and loaded from there:
https://github.com/comfyanonymous/ComfyUI/blob/master/extra_model_paths.yaml.example
I have been using it successfully with other models, but it seems the Florence 2 nodes do not support this feature:
My extra_model_paths.yaml is as follows:

ComfyUI-Florence2 returns the error:

It should search for the models in the path
D:\alexis\ComfyUI-Models\models\LLM\
Cheers!
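For reference, a minimal sketch of what such a mapping could look like, following the layout of ComfyUI's extra_model_paths.yaml.example. The base_path and LLM entries here are assumptions inferred from the path above, not the reporter's actual file:

```yaml
# Hypothetical extra_model_paths.yaml entry (assumed, not the reporter's file).
# base_path points at the shared model root; each key maps a model-type
# folder name to a subdirectory under base_path.
comfyui:
    base_path: D:\alexis\ComfyUI-Models\models\
    LLM: LLM/
```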