
How to avoid a large batch being killed #132

Open
lxrswdd opened this issue Sep 18, 2024 · 1 comment


lxrswdd commented Sep 18, 2024

I have 5176 images with 5176 prompts.

(dynamicrafter) xiangrui@ward-lambda01:~/video_generators/DynamiCrafter$ sh scripts/run_1024.sh 1024
@DynamiCrafter cond-Inference: 2024-09-18-15-48-40
Global seed set to 666
AE working on z of shape (1, 4, 32, 32) = 4096 dimensions.
>>> model checkpoint loaded.
Inference with 16 frames
Killed
(dynamicrafter) xiangrui@ward-lambda01:~/video_generators/DynamiCrafter$ 

The model worked well with 8 samples, but once I replaced the default prompts with my 5000+ prompts, the task was killed.

Owner

Doubiiu commented Sep 18, 2024

Hi. It may be due to a memory issue, as we load all of the input data into memory at the same time. I think a solution is to use a data loader that fetches the input data batch by batch.
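As a rough illustration of that suggestion, here is a minimal PyTorch-style sketch (not the repository's actual inference script): the prompt list and image paths are wrapped in a `Dataset` so a `DataLoader` can read each image lazily, one batch at a time, instead of loading all 5000+ samples up front. The directory/file paths, the batch size, and the assumption that all images share the same resolution (so the default collate works) are placeholders you would adapt to your setup.

```python
from pathlib import Path

from torch.utils.data import Dataset, DataLoader
from torchvision.io import read_image


class PromptImageDataset(Dataset):
    """Pairs each image file with its prompt line; images are read lazily."""

    def __init__(self, image_dir, prompt_file):
        self.image_paths = sorted(Path(image_dir).glob("*.png"))   # assumed layout
        self.prompts = Path(prompt_file).read_text().splitlines()  # one prompt per line
        assert len(self.image_paths) == len(self.prompts)

    def __len__(self):
        return len(self.prompts)

    def __getitem__(self, idx):
        # The image is only opened when its batch is requested,
        # so peak memory stays bounded by the batch size.
        image = read_image(str(self.image_paths[idx])).float() / 255.0
        return image, self.prompts[idx]


# Hypothetical paths; all images must have the same resolution for default collation.
dataset = PromptImageDataset("prompts/1024", "prompts/1024/test_prompts.txt")
loader = DataLoader(dataset, batch_size=8, shuffle=False, num_workers=2)

for images, prompts in loader:
    # Run the actual DynamiCrafter sampling code here on one batch at a time.
    ...
```

With this pattern, only one batch of images sits in memory at once, so a 5000+ sample run should no longer be killed by the OS for exhausting RAM.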
