When I try to train a StripedHyena model, I keep running into import errors: the stripedhyena modules appear to import from Flash Attention in an outdated way.
Example:
AttributeError: module 'dropout_layer_norm' has no attribute 'dropout_add_ln_fwd'
For some modules I could work around this by creating mock classes that redirect the old import to the corresponding module in the current flash-attn implementation. But others (like 'dropout_add_ln_fwd') I can't find anywhere in flash-attn.
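For context, the redirect trick described above can be sketched as a module-level shim. This is an illustrative sketch only: `install_shim` and `legacy_mod` are made-up names, not real flash-attn or stripedhyena identifiers, and `math` stands in for whatever module the legacy import actually moved to.

```python
import sys
import types

def install_shim(legacy_name, target):
    """Register a stand-in module under `legacy_name` that forwards
    attribute lookups to `target` (via PEP 562 module __getattr__)."""
    shim = types.ModuleType(legacy_name)
    # Any attribute not set on the shim itself is looked up on `target`.
    shim.__getattr__ = lambda attr: getattr(target, attr)
    sys.modules[legacy_name] = shim
    return shim

# Hypothetical example: pretend `math` is the relocated module that a
# legacy `import legacy_mod` expects to find under its old name.
import math
install_shim("legacy_mod", math)

import legacy_mod  # resolves to the shim registered above
print(legacy_mod.sqrt(9.0))
```

This only helps when the missing symbol still exists somewhere under a new name; for functions like 'dropout_add_ln_fwd' that were removed entirely, there is nothing to forward to.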
On a related note, this repo seems to track StripedHyena 0.2.1 while the newest release is 0.2.2? (The issue persists on both 0.2.1 and 0.2.2, though.)
Could you provide guidance on how to resolve these import issues or update the repository to ensure compatibility with the latest Flash Attention modules?
Thank you!
I also hit the error for module 'dropout_layer_norm'. It was resolved by installing the layer_norm extension directly: pip install git+https://github.com/HazyResearch/flash-attention.git#subdirectory=csrc/layer_norm