Efficient Transformers: A Survey #934

junxnone opened this issue Feb 22, 2021 · 0 comments

Reference

  • 2020-09 Efficient Transformers: A Survey, Tay et al. [Paper]

Brief

Motivation

  • Transformer self-attention has O(n^2) computational complexity in the sequence length (see the baseline sketch after this list)
  • Efficient Transformers - research on methods that reduce this complexity
  • Sparse Attention
    • Fixed/Factorized/Random patterns
      • Blockwise Patterns - chunking - split the sequence into blocks and compute attention only within each block (blockwise sketch below)
      • Strided Patterns - dilated window - each token attends only to tokens in a local, possibly dilated, window around it (windowed sketch below)
      • Compressed Patterns - down-sample - down-sample the sequence first, then compute attention over the shorter sequence
    • Learnable patterns - learn how to partition the sequence rather than fixing the pattern by hand
  • Memory - add global memory tokens that can access the whole sequence at once
  • Low rank/Kernels - approximate the attention matrix with low-rank factorizations or kernel feature maps to reduce computation (kernelized sketch below)
  • Recurrence - blockwise attention plus recurrent connections across blocks
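
To make the O(n^2) bottleneck concrete, below is a minimal NumPy sketch of standard scaled dot-product attention (single head, no masking; an illustration, not code from the survey). The (n, n) scores matrix is exactly what the methods above try to avoid materializing.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def full_attention(Q, K, V):
    """Standard scaled dot-product attention.

    Q, K, V: (n, d) arrays. The scores matrix is (n, n), so time and
    memory both grow quadratically with sequence length n.
    """
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)      # (n, n) -- the quadratic bottleneck
    return softmax(scores, axis=-1) @ V
```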
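
A minimal sketch of the blockwise pattern, reusing full_attention above. It assumes n is divisible by block_size and simply drops all cross-block interactions, which is the accuracy/efficiency trade-off these methods make:

```python
def blockwise_attention(Q, K, V, block_size=64):
    """Chunk the sequence and attend only within each block.

    Cost drops from O(n^2) to O(n * block_size); tokens in different
    blocks never interact. Assumes n % block_size == 0.
    """
    n, _ = Q.shape
    out = np.empty_like(V)
    for start in range(0, n, block_size):
        s = slice(start, start + block_size)
        out[s] = full_attention(Q[s], K[s], V[s])
    return out
```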
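
A sketch of the strided/dilated-window pattern: token i attends only to positions i ± k*dilation for k up to `window`. The per-token Python loop is for clarity only; practical implementations batch these gathers:

```python
def windowed_attention(Q, K, V, window=4, dilation=1):
    """Local attention: each token attends to at most 2*window + 1
    neighbors spaced `dilation` apart, so cost is O(n * window).
    """
    n, _ = Q.shape
    out = np.empty_like(V)
    for i in range(n):
        idx = np.arange(i - window * dilation, i + window * dilation + 1, dilation)
        idx = idx[(idx >= 0) & (idx < n)]      # clip the window at the edges
        out[i] = full_attention(Q[i:i+1], K[idx], V[idx])[0]
    return out
```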
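
And a sketch of the kernel idea behind low-rank/linear attention: replace softmax(Q K^T) V with phi(Q) (phi(K)^T V) and multiply right-to-left, so no (n, n) matrix is ever formed. The ReLU-based feature map here is an illustrative choice (in the spirit of the linear-attention papers the survey covers), not the survey's prescription:

```python
def linear_attention(Q, K, V, feature_map=lambda x: np.maximum(x, 0.0) + 1e-6):
    """Kernelized attention in O(n * d^2) time.

    phi(K)^T V is a (d, d) summary, so the sequence dimension is
    touched only linearly. The positive feature map keeps the
    normalizer Z strictly positive.
    """
    Qf, Kf = feature_map(Q), feature_map(K)        # (n, d)
    KV = Kf.T @ V                                  # (d, d) key/value summary
    Z = Qf @ Kf.sum(axis=0, keepdims=True).T       # (n, 1) normalizer
    return (Qf @ KV) / Z
```

All three variants return an (n, d) array like full_attention; only the baseline ever materializes the (n, n) scores matrix.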

Key Papers


Evaluation

Tricks
