About mini-batch training and edge bucket #120

Open
YijianLiu opened this issue Oct 27, 2022 · 1 comment
Labels
question Further information is requested

Comments

@YijianLiu

Is mini-batch training performed within each edge bucket? In your paper, does each bucket run 4 (the bound) mini-batches of training? Is my understanding correct?
Thanks a lot!

@YijianLiu YijianLiu added the question Further information is requested label Oct 27, 2022
@JasonMoho
Collaborator

For link prediction, batching is not performed over individual edge buckets, but over the in-memory subgraph, which is the union of the edge buckets currently in memory. From the in-memory subgraph, a set of training edges is selected and mini-batches are created over that selection. The reasoning behind this selection process and mini-batch generation is given in the final paragraph of Section 5.1 of our GNN paper.
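
To make the flow concrete, here is a minimal sketch (illustrative only, not the Marius API; all names are hypothetical) of forming mini-batches over the union of the edge buckets currently in memory:

```python
import numpy as np

def make_minibatches(in_memory_buckets, num_train_edges, batch_size, rng=None):
    """in_memory_buckets: list of (n_i, 3) arrays of (src, rel, dst) edges."""
    rng = rng or np.random.default_rng()
    # The in-memory subgraph is the union of the edge buckets currently in memory.
    subgraph = np.concatenate(in_memory_buckets, axis=0)
    # Select a set of training edges from the in-memory subgraph ...
    idx = rng.choice(len(subgraph), size=num_train_edges, replace=False)
    selected = subgraph[idx]
    # ... and carve that selection into mini-batches.
    for start in range(0, num_train_edges, batch_size):
        yield selected[start:start + batch_size]

# Example: 4 buckets in memory, sample 8000 training edges, batches of 1000.
buckets = [np.random.randint(0, 1000, size=(5000, 3)) for _ in range(4)]
for batch in make_minibatches(buckets, num_train_edges=8000, batch_size=1000):
    pass  # forward/backward pass on this mini-batch
```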

Batches can be processed synchronously, or asynchronously with some configurable staleness bound.
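
As a rough illustration of what the staleness bound means (function names below are placeholders, not Marius interfaces): with a bound of k, a batch's gradients may be computed while up to k earlier updates are still unapplied, and a bound of 0 degenerates to synchronous processing.

```python
from collections import deque

def run_batches(batches, compute_gradients, apply_update, staleness_bound=4):
    pending = deque()  # gradient updates computed but not yet applied
    for batch in batches:
        # Admit a new batch only while the number of outstanding updates
        # stays within the staleness bound; a bound of 0 is synchronous training.
        while len(pending) > staleness_bound:
            apply_update(pending.popleft())
        pending.append(compute_gradients(batch))  # may run against stale parameters
    while pending:
        apply_update(pending.popleft())
```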

Let me know if that answers your question.
