Why does each batch have a memory? #21

Open
MrYaoH opened this issue May 20, 2023 · 1 comment

Comments

@MrYaoH

MrYaoH commented May 20, 2023

Dear author:
Your NTM code is nice work, and its structure is concise and easy to follow. But I am a little confused about why each batch has its own memory. Why not use a single memory shared across every batch, like an LSTM but with an expanded memory cell size? Could you help me resolve this confusion? Thank you very much!

@wabbajack1

Consider why this makes sense by analogy with your computer: a computer often has multiple RAM modules, so an NTM can likewise be designed with additional memory. Additionally, processing in batches is much more efficient than processing without batching.
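
To make the analogy concrete, here is a minimal sketch (not code from this repository) of how a batched NTM memory is commonly laid out in PyTorch: each sequence in the batch gets its own `(N, M)` memory matrix, just as each sequence in a batched LSTM gets its own hidden state. The tensor names, sizes, and the read step below are illustrative assumptions, not this repo's implementation.

```python
# Sketch: one memory matrix per batch element, so independent sequences
# cannot overwrite each other's stored contents.
import torch

batch_size, N, M = 4, 128, 20          # N memory rows of width M (typical NTM sizes)

# Batched memory: shape (batch_size, N, M), i.e. B independent (N, M) memories.
memory = torch.zeros(batch_size, N, M)

# A read step then becomes a batched operation. For per-sequence read weights
# w of shape (batch_size, N), all read vectors are computed at once.
w = torch.softmax(torch.randn(batch_size, N), dim=1)          # illustrative weights
read_vectors = torch.bmm(w.unsqueeze(1), memory).squeeze(1)   # (batch_size, M)

# If a single (N, M) memory were shared across the batch, writes from one
# sequence would corrupt another's state; batching just runs B independent
# NTMs in parallel for efficiency.
```

In other words, the per-batch memory plays the same role as the per-sequence hidden and cell states in a batched LSTM; it is duplicated across the batch dimension for parallelism, not because the model needs several memories for a single sequence.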
