Hi everyone,
Could someone tell me what the difference is between PyTorch cross attention and xformers cross attention?
What are the pros and cons of each, and why did you choose one over the other?

Replies: 2 comments

- I guess nobody knows...
- While the performance is the same, some custom nodes' dependencies occasionally require xformers.
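Since the thread never spells out what the two options actually call, here is a minimal sketch contrasting the two backends: PyTorch's built-in `torch.nn.functional.scaled_dot_product_attention` versus xformers' `memory_efficient_attention`. The shapes, dtype, and tolerance below are illustrative assumptions, not values from this discussion.

```python
import torch
import torch.nn.functional as F
import xformers.ops as xops

# Illustrative shapes (assumptions): batch=1, heads=8, seq_len=1024, head_dim=64
q = torch.randn(1, 8, 1024, 64, device="cuda", dtype=torch.float16)
k = torch.randn_like(q)
v = torch.randn_like(q)

# PyTorch >= 2.0: fused attention that dispatches to FlashAttention or
# memory-efficient kernels automatically; expects (batch, heads, seq, head_dim).
out_pt = F.scaled_dot_product_attention(q, k, v)

# xformers expects (batch, seq, heads, head_dim), so transpose in and out.
out_xf = xops.memory_efficient_attention(
    q.transpose(1, 2), k.transpose(1, 2), v.transpose(1, 2)
).transpose(1, 2)

# The two backends agree to within fp16 tolerance.
print(torch.allclose(out_pt, out_xf, atol=1e-2, rtol=1e-2))
```

On recent PyTorch, `scaled_dot_product_attention` already dispatches to fused kernels on its own, which matches the observation above that performance is about the same; the practical difference is mostly whether one of your dependencies explicitly imports xformers.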