About parameter number and in_dim #10
Thanks for your feedback.
Thank you so much for your quick reply!
Your problems are not unusual; we also find it tricky to deal with large model parameters. For reference, Latent_AE_cnn_big in core/module/modules/autoencoder.py is the autoencoder model for large model parameters. However, it also takes a long time to train, and large-scale parameter generation remains future work for us. Thanks again for your feedback.
Thank you!!! I will try using Latent_AE_cnn_big to train the autoencoder. Thanks again!!
Sorry to hear that you are having such difficulty, but I have never tried more checkpoints.
Hi @1zeryu, would you mind adding more details about your code to the README? I am having trouble understanding the hierarchical structure and how to make modifications.
Sure, I will do it soon. |
Hi, I find that in the experiments on entire networks, the parameter count is very large (more than 50,000 or 100,000).
How should the encoder be used to train the parameter embedding? Should in_dim equal the total parameter count?
Could you provide the training details (especially for the autoencoder) used in entire-network generation?
Thank you for your reply!!!
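For anyone else hitting this question: a common convention in parameter-generation setups is to flatten all of a target network's weight tensors into a single vector, whose length then serves as the autoencoder's input dimension. The snippet below is a minimal sketch of that idea in PyTorch; the small two-layer network is a hypothetical example, not the model used in this repo, and the actual Latent_AE_cnn_big constructor may expect its input in a different form.

```python
import torch
import torch.nn as nn

# Hypothetical target network whose parameters we want to embed.
model = nn.Sequential(
    nn.Linear(28 * 28, 64),
    nn.ReLU(),
    nn.Linear(64, 10),
)

# Flatten every parameter tensor into one long vector; its length is
# the candidate in_dim for an autoencoder over parameters.
flat = torch.cat([p.detach().flatten() for p in model.parameters()])
in_dim = flat.numel()
print(in_dim)
```

Even this tiny network already yields more than 50,000 parameters (784*64 + 64 + 64*10 + 10 = 50,890), which illustrates why entire-network embeddings get large quickly.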