
Why does the memory usage keep increasing along with steps? #184

Open
cocktailpeanut opened this issue Nov 15, 2024 · 1 comment

Comments

@cocktailpeanut
Contributor

I've been doing some experiments; here are my observations (when using CPU offload, which is the default):

  1. At the beginning it uses very little memory.
  2. But as it progresses through more steps, memory usage increases a lot.

This is especially pronounced with the 768p model. On my 4090, when I run a text2vid or img2vid inference with 768p, it starts out using only around 8 GB of VRAM, but by around step 13 it uses up the entire 24 GB, at which point inference slows down significantly. Basically, the first 13 steps take about 5 minutes, whereas the last 2 steps alone (14 and 15) take about 15 minutes.

Is this the intended behavior? If we're using CPU offload, shouldn't memory usage stay below a certain threshold instead of continually increasing?
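
For context, this is roughly how I've been checking memory at each step (a minimal sketch using PyTorch's allocator stats; the loop and `num_steps` are stand-ins for the actual sampling loop, not Pyramid Flow's real code):

```python
import torch

num_steps = 15  # placeholder for the actual number of inference steps
torch.cuda.reset_peak_memory_stats()

for step in range(num_steps):
    # ... run one denoising step here ...
    allocated = torch.cuda.memory_allocated() / 1024**3          # tensors currently held
    peak_reserved = torch.cuda.max_memory_reserved() / 1024**3   # peak reserved by the caching allocator
    print(f"step {step}: allocated={allocated:.1f} GiB, peak reserved={peak_reserved:.1f} GiB")
```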

@feifeiobama
Collaborator

This is because Pyramid Flow is an autoregressive video generation model, and its VRAM usage grows as the number of history frames increases. Perhaps you can try emptying the CUDA cache at the end of each frame (roughly as sketched below) and see if that saves VRAM.
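
Something along these lines (just a rough sketch; the loop and `num_frames` below are illustrative placeholders, not the actual pipeline code):

```python
import gc
import torch

num_frames = 16  # placeholder for however many latent frames are generated

for frame_idx in range(num_frames):
    # ... autoregressively generate the next frame conditioned on the history ...
    gc.collect()              # drop Python references to intermediate tensors
    torch.cuda.empty_cache()  # return cached, unused blocks to the driver
```

Note that `empty_cache()` only releases memory that is no longer referenced, so it mainly trims reserved/fragmented memory rather than the memory held by the growing frame history itself.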
