YAML input for Sage Attention node? #12
Replies: 5 comments 1 reply
-
No need to apologize. There isn't really specific documentation for it because it's an advanced feature that lets you directly pass (or override) arguments for the SageAttention function(s) provided by the SageAttention package. So if something in the package changes, what you can/should pass changes too. This reply got pretty long, so if you want a TL;DR: if you don't already know you need to change something there, it's probably not something you need to worry about or mess with. If you still want to know what you could use it to do, read on!

You pretty much need to look at the SageAttention source code to see what parameters you can change: https://github.com/thu-ml/SageAttention/blob/main/sageattention/core.py Most keys you specify will just be passed to the function directly, but there are a couple of exceptions. Just for example, the only head dimension in SDXL that SageAttention currently supports is 64. Note that I'm not saying it's better to change anything here, just that you can for a different effect (I like having knobs to turn!).

Passing incorrect parameters, forcing a kernel your architecture doesn't support, and so on are all likely to make SageAttention behave unexpectedly, perform worse, or maybe even crash ComfyUI or Torch. In other words, if you mess with this, you shouldn't bug the SageAttention authors if you run into problems. Hope this helps answer your question!
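Purely as an illustration of the general idea: assuming the keys map directly onto arguments of the sageattn_* functions in core.py, the YAML input could look something like the sketch below. The specific names here are taken from the SageAttention 2.x CUDA kernels, so whether any of them apply depends on the version you have installed and which kernel actually gets dispatched; treat them as examples, not a reference.

```yaml
# Hypothetical example only; check core.py for the arguments your
# installed SageAttention version actually accepts.
qk_quant_gran: per_warp   # granularity used when quantizing Q/K to int8
pv_accum_dtype: fp32      # accumulator dtype for the PV matmul
smooth_k: true            # apply K smoothing before quantization
```

Anything you set this way overrides what the node would otherwise pass, which is also why bad values can degrade quality or crash the attention call.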
-
By the way, I recommend using the sampler-specific version; it has fewer sharp edges.
-
Thank you for the helpful explanation.
-
@EnragedAntelope I just merged some changes to the bleh SageAttention support that should make it work with Flux now (in my limited testing I didn't notice a massive performance increase, though). I recommend using the recently released SageAttention 2.0.1 if possible. It will also work with some other models, like SD 1.5, which it didn't before.
-
Thank you! FWIW, Kijai's nodes also do this, so now there are two SageAttention options for Flux and other models.
-
Apologies if I missed it, but I read the readme and didn't see info on what the YAML input for the new Sage Attention nodes is expecting specifically. Please share guidance on how best to use this... I tried SageAttn2 with CogVideoX via Kijai's nodes and am very impressed by the speedup, so I'm looking forward to using it more. Thanks!