
Add support for models extra path #58

Open
alexisrolland opened this issue Aug 19, 2024 · 4 comments

@alexisrolland

Hi Kijai,
Thank you for creating this node.

ComfyUI allows you to provide a custom path to store models in a single location and load them from there:
https://github.com/comfyanonymous/ComfyUI/blob/master/extra_model_paths.yaml.example

I have been using it successfully with other models, but it seems the Florence2 nodes do not support this feature.

My extra_model_paths.yaml is as follows:

comfyui:
    base_path: D:\alexis\ComfyUI-Models\
    checkpoints: models\checkpoints\
    clip_vision: models\clip_vision\
    controlnet: models\controlnet\
    depthanything: models\depthanything\
    insightface: models\insightface\
    ipadapter: models\ipadapter\
    LLM: models\LLM\
    loras: models\loras\
    upscale_models: models\upscale_models\

ComfyUI-Florence2 returns the error:

FileNotFoundError: [WinError 3] The system cannot find the path specified: 'D:\alexis\ComfyUI-Creative-Upscale\ComfyUI\models\LLM'

It should instead search for the models in D:\alexis\ComfyUI-Models\models\LLM\
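
For reference, the expectation is that the nodes resolve this directory through ComfyUI's folder_paths helpers rather than a hard-coded path. A minimal sketch (assuming the "LLM" folder name has been registered as a model folder, for example through the YAML above):

import folder_paths

# With the extra_model_paths.yaml above loaded, this list should include
# D:\alexis\ComfyUI-Models\models\LLM in addition to the default models\LLM
llm_dirs = folder_paths.get_folder_paths("LLM")
print(llm_dirs)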

Cheers!

@VicRejkia

Yes, it would be nice to be able to use the models from the local Ollama installation.

@CPPAlien

You can look at PR #88, or you can use this repo: https://github.com/CPPAlien/ComfyUI-Florence2

@vootox

vootox commented Nov 17, 2024

I was just about to ask why the model paths are hard-wired (i.e., won't work with extra_model_paths.yaml), then I found this post.

@franckdsf

franckdsf commented Jan 21, 2025

Any updates on this?

# Comfy Utils
import os
import folder_paths

# Resolve the LLM folder under ComfyUI's base models directory
comfy_model_dir = os.path.join(folder_paths.models_dir, "LLM")

This seems to be enough to get the LLM folder working with the extra path.

(Edit: never mind, it's a bit more complex and PR #88 works just fine.)
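
For anyone patching this locally before the PR lands, here is a rough sketch of the fuller approach: register the LLM directory with folder_paths so that entries from extra_model_paths.yaml are picked up as well, then search every registered location. This is only a sketch under those assumptions, not the actual code from PR #88.

import os
import folder_paths

# Register the default models/LLM directory under the "LLM" folder name.
# Any "LLM" entries loaded from extra_model_paths.yaml are stored under the
# same name, so both locations become visible to get_folder_paths().
folder_paths.add_model_folder_path("LLM", os.path.join(folder_paths.models_dir, "LLM"))

def find_llm_model_dir(model_name):
    # Look for the requested model folder in every registered "LLM" location
    for base in folder_paths.get_folder_paths("LLM"):
        candidate = os.path.join(base, model_name)
        if os.path.isdir(candidate):
            return candidate
    return None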
