I am learning how to work with arrays that exceed memory capacity and have been following the tutorial notebook tutorial_nbs/11_How_to_train_big_arrays_faster_with_tsai.ipynb.
To practice, I copied and modified the example code from the tutorial to create a .npy file on disk, holding an array of approximately 3.9 GB (to be on the safe side I started with a relatively small array). I am using Spyder as my IDE. The code I used is as follows:
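For reference, a minimal sketch of the kind of script described above. The shape, dtype, and file name here are placeholders (the real array was ~3.9 GB; a tiny shape keeps this sketch runnable), and the tutorial's actual code may differ:

```python
import numpy as np

# Assumed layout: (samples, variables, timesteps). The real run used ~3.9 GB;
# this tiny shape is just for illustration.
shape = (100, 3, 50)

# Create the .npy file directly on disk as a memory map, so the full array
# never has to fit in RAM at once.
X = np.lib.format.open_memmap("X_on_disk.npy", mode="w+",
                              dtype=np.float32, shape=shape)
X[:] = np.random.rand(*shape)  # for a truly huge array, write in chunks instead
X.flush()

# Reopen lazily: pages are read from disk on demand, not loaded up front.
X_mm = np.load("X_on_disk.npy", mmap_mode="r")
print(X_mm.shape, X_mm.dtype)
```

Creating the file with `open_memmap` (rather than building the array in RAM and calling `np.save`) is what keeps peak memory low during the write step.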
Observed Behavior
1. High Memory Usage During Execution
When I run the script, memory usage climbs until my computer runs out of memory (it has 16 GB of RAM).
The memory usage does not drop even after the script finishes executing.
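One generic way to check whether the array object itself is what is being retained after the run (this is a diagnostic sketch, not tsai-specific; the small array stands in for the large one):

```python
import gc
import numpy as np

X = np.zeros((1000, 1000), dtype=np.float32)  # stand-in for the large array
size = X.nbytes
print(size)   # bytes held by the array's data buffer
del X         # drop our reference...
gc.collect()  # ...and force a collection pass
# Note: in Spyder/IPython, the Variable Explorer and the console's output
# history (e.g. Out[n]) can keep extra references alive, so memory may not
# be returned to the OS even after del + gc.collect().
```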
2. Unexpected Memory Spike During Import (This is even more interesting)
After restarting Spyder (to release RAM) and starting to type a new script that begins with from tsai.all import *, memory usage starts rising again until the system runs out of memory.
This happens while I am merely TYPING the code, before I ever execute the script.
Attempts to Troubleshoot
I reproduced this issue on two different PCs, and the results were identical.
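To put a number on how much memory the import itself allocates, the standard-library tracemalloc module can be used. This is a generic measurement sketch: json is only a stand-in here, and would be replaced by `from tsai.all import *` to measure the real case:

```python
import tracemalloc

tracemalloc.start()
import json  # stand-in for `from tsai.all import *`
current, peak = tracemalloc.get_traced_memory()
tracemalloc.stop()
print(f"peak Python allocations during import: {peak} bytes")
```

Note this only tracks Python-level allocations; memory taken by compiled extensions (NumPy buffers, CUDA context, etc.) will not appear in the tracemalloc numbers.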
Request for Assistance
Is this a known issue, or is there a specific reason why from tsai.all import * causes such behavior?