Why does Informer call a FullAttention layer instead of a ProbAttention layer?
Why isn't the custom ProbAttention layer implemented, or have I just not found it?
Because I think it's just an efficiency setup for better training. Do you have a scenario that needs ProbAttention? If so, I can add it.
Yes, thanks. Much appreciated!
@aanxud888
The ProbAttention implementation is here, but the tests currently cover only the layer, not the Informer model. You can refer to it.
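In case it helps anyone reading later, below is a minimal NumPy sketch of the ProbSparse attention idea from the Informer paper: estimate a per-query sparsity measure from a sampled subset of keys, run exact attention only for the top-u "active" queries, and fill the remaining outputs with the mean of V. This is an illustrative sketch under those assumptions, not this repository's actual ProbAttention layer; the function name and shapes are hypothetical.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def prob_sparse_attention(Q, K, V, factor=5):
    """Single-head ProbSparse self-attention sketch (no batching, hypothetical API).

    Q, K, V: arrays of shape (L, d). Only the top-u queries get exact
    attention; the other outputs fall back to mean(V), as described in
    the Informer paper.
    """
    L_Q, d = Q.shape
    L_K = K.shape[0]

    # 1. Sample a subset of keys to estimate each query's sparsity measure.
    u_k = min(L_K, int(factor * np.ceil(np.log(L_K))))
    sample_idx = np.random.choice(L_K, u_k, replace=False)
    scores_sample = Q @ K[sample_idx].T / np.sqrt(d)      # (L_Q, u_k)

    # 2. Sparsity measurement M = max - mean over the sampled scores.
    M = scores_sample.max(axis=1) - scores_sample.mean(axis=1)

    # 3. Keep only the top-u queries with the largest measurement.
    u_q = min(L_Q, int(factor * np.ceil(np.log(L_Q))))
    top_q = np.argsort(-M)[:u_q]

    # 4. Lazy queries get mean(V); active queries get exact attention.
    out = np.tile(V.mean(axis=0), (L_Q, 1))
    scores_top = Q[top_q] @ K.T / np.sqrt(d)              # (u_q, L_K)
    out[top_q] = softmax(scores_top, axis=-1) @ V
    return out

# Usage: same input/output shapes as an ordinary full-attention call.
L, d = 96, 64
Q, K, V = (np.random.randn(L, d) for _ in range(3))
print(prob_sparse_attention(Q, K, V).shape)  # (96, 64)
```

The trade-off the paper motivates is that this reduces the per-layer attention cost from quadratic in the sequence length to roughly L log L, at the price of the extra query-selection step.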
Thanks!