1. In the first run I haven't initialized the model yet.
2. When I run the cell, the model is loaded and about 6 GB of VRAM is occupied.
3. When I run the same cell again, the VRAM usage doubles.
4. On subsequent runs the model never occupies more than 12 GB. But the interesting thing I observed is that when I run this inside a loop (for example, I want to create an index for each file I have, and I don't see any other way to do it), the model gives me VRAM issues. How do I remove the model from VRAM? I tried `torch.cuda.empty_cache()` and deleting the variable, but neither works for me. Can you please help, or is there something I am doing wrong?
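For what it's worth, the usual cause is that the old model object is still referenced somewhere (the notebook's output history, a global, a traceback), so `empty_cache()` alone can't release its memory. Below is a minimal sketch of the cleanup pattern; since the actual model class isn't shown in the issue, it uses a stand-in `FakeModel` object and checks collection with a `weakref`, with the torch-specific calls noted in comments:

```python
import gc
import weakref

class FakeModel:
    """Stand-in for a large model object (e.g. a torch module holding VRAM)."""
    pass

def load_model():
    return FakeModel()

# Re-running a notebook cell like `model = load_model()` rebinds the name,
# but anything else still referencing the old object keeps it alive --
# and, for a torch model, keeps its VRAM allocated.
model = load_model()
tracker = weakref.ref(model)   # lets us check whether the old object died

model = load_model()           # simulate re-running the cell
gc.collect()                   # collect the now-unreferenced old object
assert tracker() is None       # old model is gone once nothing references it

# With PyTorch the same pattern applies, plus returning cached blocks:
#   del model
#   gc.collect()
#   torch.cuda.empty_cache()   # hands cached VRAM back to the driver
```

Note that `torch.cuda.empty_cache()` only releases memory the allocator has *cached*; tensors that are still reachable (including via `Out[...]` history in Jupyter) stay allocated no matter how often you call it.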
Hey, here is a Colab notebook; this is just a basic observation I had. You might understand it better than me. I added comments in the notebook describing what I observed. I am trying to run the model in a Jupyter notebook.