Why does Informer call a FullAttention rather than a ProbAttention layer? #30

Open
aanxud888 opened this issue Jun 2, 2023 · 4 comments
Labels: question (Further information is requested)

Comments

@aanxud888

❔ Question

Why does Informer call a FullAttention rather than a ProbAttention layer?

Additional context

Why isn't the custom ProbAttention layer implemented, or did I just not find it?

aanxud888 added the question label Jun 2, 2023
@LongxingTan
Owner

Because I think it's just an efficiency setup for faster training. Do you have a scenario where you'd use ProbAttention? If so, I can add it.
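
For context on what ProbAttention buys you: in the Informer paper (Zhou et al., 2021), only the top-u queries with the most non-uniform score distributions get full softmax attention, which cuts the self-attention cost from O(L²) toward O(L log L). Here is a minimal NumPy sketch of that idea, not this repo's implementation; the `factor` default and the mean-of-V fallback follow the paper, and the sparsity measurement is computed on all keys for clarity (the paper estimates it on sampled keys):

```python
import numpy as np

def prob_sparse_attention(q, k, v, factor=5):
    """Single-head ProbSparse attention sketch (Informer, Zhou et al. 2021).

    q: (L_q, d), k: (L_k, d), v: (L_k, d).
    Only the top-u "active" queries get softmax attention; the lazy
    queries fall back to mean(V), the paper's self-attention default.
    """
    L_q, d = q.shape

    # Sparsity measurement M(q_i, K) = max_j(score_ij) - mean_j(score_ij).
    scores = q @ k.T / np.sqrt(d)                    # (L_q, L_k)
    m = scores.max(axis=1) - scores.mean(axis=1)     # (L_q,)

    # Keep the u ~ factor * ln(L_q) queries with the largest measurement.
    u = min(L_q, int(factor * np.ceil(np.log(L_q))))
    top = np.argsort(m)[-u:]

    # Lazy queries: mean of V. Active queries: standard softmax attention.
    out = np.tile(v.mean(axis=0), (L_q, 1))          # (L_q, d)
    s = scores[top]
    w = np.exp(s - s.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)
    out[top] = w @ v
    return out
```

So with FullAttention the model is exact but quadratic in sequence length; ProbAttention trades a little accuracy on the "lazy" queries for speed on long sequences.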

@aanxud888
Author

Yes, thanks. Much appreciated!

@LongxingTan
Owner

@aanxud888

The prob attention implementation is here.

But the tests only cover the layer so far, not the Informer model. You can use it as a reference.
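
A quick shape test for the standalone layer might look like the following; the import path, class name, and constructor/call signatures are assumptions on my part, so please adapt them to whatever the linked implementation actually exposes:

```python
import tensorflow as tf

# Hypothetical import path and signature; check the linked file for the real ones.
from tfts.layers.attention_layer import ProbAttention

layer = ProbAttention(hidden_size=64, num_heads=4)  # assumed constructor args
x = tf.random.normal([2, 96, 64])                   # (batch, time, hidden)
y = layer(x, x, x)                                  # assumed (query, key, value) call
assert y.shape == (2, 96, 64)                       # output keeps the input shape
```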

@aanxud888
Author

Thanks!
