Proposal
Currently, FLA supports only one hybrid pattern: interleaving sliding-window attention layers with linear-attention layers. We would like to see more interesting hybrid patterns implemented in the future, such as those used in Hamba, Titans, and others.
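As a rough sketch of what a more flexible hybrid schedule could look like, the snippet below tiles a short per-layer pattern string across the model depth. The `HYBRID_PATTERNS` names and the `build_layer_types` helper are hypothetical illustrations, not part of FLA's current API, and the "samba-like"/"titans-like" schedules are only loose approximations of those architectures.

```python
# Hypothetical sketch: specifying arbitrary hybrid layer schedules.
# None of these names exist in FLA today; they only illustrate how
# interleavings beyond linear/SWA alternation could be expressed.

from typing import List

# 'L' = linear attention, 'S' = sliding-window attention,
# 'F' = full softmax attention, 'M' = Mamba-style SSM block.
HYBRID_PATTERNS = {
    "fla-current": "LS",    # what FLA supports today: alternate linear / SWA
    "samba-like": "MS",     # SSM blocks interleaved with SWA (assumption)
    "titans-like": "LLLF",  # mostly linear, periodic full attention (assumption)
}

def build_layer_types(pattern: str, num_layers: int) -> List[str]:
    """Tile a short pattern string across the full depth of the model."""
    return [pattern[i % len(pattern)] for i in range(num_layers)]

if __name__ == "__main__":
    # A 12-layer model with full attention at every 4th layer.
    print(build_layer_types(HYBRID_PATTERNS["titans-like"], 12))
    # ['L', 'L', 'L', 'F', 'L', 'L', 'L', 'F', 'L', 'L', 'L', 'F']
```

A pattern-string (or per-layer list) interface like this would let users mix any registered block types without a combinatorial explosion of model classes.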
Rationale
No response