
Is it possible to use opt-sdp-attention instead of xformers? #293

Yes, the command line option to enable it is `--use-pytorch-cross-attention`.
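For example, with a standard checkout you would pass it at launch, e.g. `python main.py --use-pytorch-cross-attention` (assuming the usual `main.py` entry point).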

The reason I don't really advertise it is that, according to tests, its performance is worse than xformers for most people at higher resolutions, where speed matters more than it does at lower resolutions.
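For anyone curious what the flag actually switches between, here is a minimal sketch, assuming PyTorch >= 2.0, a CUDA device, and xformers installed; the tensor shapes are illustrative, not ComfyUI's actual wiring:

```python
import torch
import torch.nn.functional as F
import xformers.ops

# Toy q/k/v in the (batch, heads, tokens, head_dim) layout PyTorch expects;
# 4096 tokens roughly corresponds to a high-resolution latent.
q = torch.randn(1, 8, 4096, 64, device="cuda", dtype=torch.float16)
k = torch.randn(1, 8, 4096, 64, device="cuda", dtype=torch.float16)
v = torch.randn(1, 8, 4096, 64, device="cuda", dtype=torch.float16)

# --use-pytorch-cross-attention path: PyTorch's built-in fused attention.
out_sdp = F.scaled_dot_product_attention(q, k, v)

# xformers path: its kernel wants (batch, tokens, heads, head_dim),
# hence the transposes in and out.
out_xf = xformers.ops.memory_efficient_attention(
    q.transpose(1, 2).contiguous(),
    k.transpose(1, 2).contiguous(),
    v.transpose(1, 2).contiguous(),
).transpose(1, 2)
```

Both calls compute the same `softmax(QK^T / sqrt(d)) V` attention; only the fused kernel underneath differs, which is why throughput between the two can diverge at long sequence lengths, i.e. high resolutions.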

Answer selected by bitcrusherrr