
Node vector not concatenated with hyperedge vector before computing attention in UniGATConv #46

Open
ShangGuanShiHu opened this issue Jan 9, 2024 · 0 comments

Comments

ShangGuanShiHu commented Jan 9, 2024

Describe the bug
In the paper 'UniGNN: A Unified Framework for Graph and Hypergraph Neural Networks', the input to UniGAT's attention module is the concatenation of the node vector and the hyperedge vector. In the implementation, however, the forward function of UniGATConv feeds atten_e the hyperedge vector alone, rather than the vector formed by concatenating the node vector and the hyperedge vector (see the sketch below).
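The following is a minimal sketch of the difference being reported, not the library's actual code. It assumes GAT-style scoring as in the paper, i.e. a score per (vertex, hyperedge) pair computed from the concatenation [W x_i || W h_e]; all names here (W, atten, atten_e, v_feats, e_feats, pair_index) are hypothetical placeholders.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

in_dim, out_dim = 16, 8
W = nn.Linear(in_dim, out_dim, bias=False)        # shared projection
atten = nn.Linear(2 * out_dim, 1, bias=False)     # a^T [W x_i || W h_e], as in the paper
atten_e = nn.Linear(out_dim, 1, bias=False)       # hyperedge-only scorer (the reported behavior)

# toy data: 5 vertices, 3 hyperedges (hyperedge features assumed already aggregated)
v_feats = torch.randn(5, in_dim)
e_feats = torch.randn(3, in_dim)
# incidence pairs: pair_index[0][k] = vertex id, pair_index[1][k] = hyperedge id
pair_index = torch.tensor([[0, 1, 2, 2, 3, 4],
                           [0, 0, 0, 1, 1, 2]])

x = W(v_feats)   # (5, out_dim)
h = W(e_feats)   # (3, out_dim)

# Paper's formulation: the score depends on BOTH the vertex and the hyperedge.
pair_feats = torch.cat([x[pair_index[0]], h[pair_index[1]]], dim=-1)
scores_pair = F.leaky_relu(atten(pair_feats)).squeeze(-1)   # one score per (i, e) pair

# What this issue reports the implementation does: one score per hyperedge,
# independent of which vertex is attending to it.
scores_e_only = F.leaky_relu(atten_e(h)).squeeze(-1)        # one score per hyperedge
```

In a full layer, the pairwise scores would then be softmax-normalized over the hyperedges incident to each vertex before aggregating hyperedge features back to that vertex; the sketch stops at the scoring step because that is where the concatenation matters.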

I appreciate your contributions to such a valuable project, which has been immensely helpful to me in studying hypergraph techniques. Thank you for your attention.
