Getting Error Box on ComfyUI startup #5301
Closed · chnisar515 started this conversation in General
Replies: 1 comment · 24 replies
-
Do you really need xformers? As you said, ComfyUI is working fine because it now uses PyTorch cross attention.
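If you do want to keep xformers, the usual fix is to install a build that matches the PyTorch shown in the log (2.5.0+cu124); if you don't need it, uninstalling it also makes the error box go away, since ComfyUI falls back to its PyTorch cross-attention path either way. A minimal sketch run from the portable folder, assuming the embedded Python from the log and that an xformers wheel built against torch 2.5.0 is actually available:

```
rem Run from C:\ComfyUI_windows_portable

rem Option A (assumes a wheel built for torch 2.5.0+cu124 is published):
rem upgrade xformers inside the embedded Python; --no-deps keeps pip from touching torch itself
.\python_embeded\python.exe -m pip install -U xformers --no-deps

rem Option B: remove xformers entirely; the startup warning disappears and
rem ComfyUI keeps using PyTorch cross attention
.\python_embeded\python.exe -m pip uninstall -y xformers
```

Either way, check the next startup log: the WARNING[XFORMERS] block should be gone.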
-
Hello Experts,
This is the second time I am getting this error box, and the old method I used to fix it is not working this time.
Today I updated ComfyUI, and now when I restart it an error box appears on startup. I have attached the file, so please guide me on how to fix this. ComfyUI itself still works fine, but the error box is irritating. Also, once the CMD window has fully loaded it only shows the URL (http://127.0.0.1:8188) for opening ComfyUI in the browser manually, whereas before it would launch automatically in my default browser. Please guide me on how to fix that as well.
Below is the log from my CMD:
```
C:\ComfyUI_windows_portable>.\python_embeded\python.exe -s ComfyUI\main.py --windows-standalone-build
[START] Security scan
[DONE] Security scan
ComfyUI-Manager: installing dependencies done.
** ComfyUI startup time: 2024-10-21 02:06:12.666732
** Platform: Windows
** Python version: 3.11.9 (tags/v3.11.9:de54cf5, Apr 2 2024, 10:12:12) [MSC v.1938 64 bit (AMD64)]
** Python executable: C:\ComfyUI_windows_portable\python_embeded\python.exe
** ComfyUI Path: C:\ComfyUI_windows_portable\ComfyUI
** Log path: C:\ComfyUI_windows_portable\comfyui.log
Prestartup times for custom nodes:
0.0 seconds: C:\ComfyUI_windows_portable\ComfyUI\custom_nodes\rgthree-comfy
0.0 seconds: C:\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-Easy-Use
1.5 seconds: C:\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-Manager
Total VRAM 12288 MB, total RAM 32607 MB
pytorch version: 2.5.0+cu124
WARNING[XFORMERS]: xFormers can't load C++/CUDA extensions. xFormers was built for:
PyTorch 2.4.1+cu124 with CUDA 1204 (you have 2.5.0+cu124)
Python 3.11.9 (you have 3.11.9)
Please reinstall xformers (see https://github.com/facebookresearch/xformers#installing-xformers)
Memory-efficient attention, SwiGLU, sparse and more won't be available.
Set XFORMERS_MORE_DETAILS=1 for more details
xformers version: 0.0.28.post1
Set vram state to: NORMAL_VRAM
Device: cuda:0 NVIDIA GeForce RTX 3060 : cudaMallocAsync
Using pytorch cross attention
```
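On the second question (the browser no longer opening automatically): ComfyUI accepts an --auto-launch flag, and the portable build's launcher normally relies on --windows-standalone-build to open the browser for you. A minimal sketch of the launcher line, assuming the stock run_nvidia_gpu.bat that ships with the portable package; the explicit --auto-launch flag is the only addition:

```
rem run_nvidia_gpu.bat (assumed stock portable launcher)
rem --auto-launch asks ComfyUI to open the default browser once the server is up
.\python_embeded\python.exe -s ComfyUI\main.py --windows-standalone-build --auto-launch
pause
```

It is also worth checking that the .bat file did not gain a --disable-auto-launch flag after the update, since that would override the auto-launch behaviour.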