Memory Usage Spike with create_array() function #926

Open
codegunx opened this issue Nov 19, 2024 · 0 comments
codegunx commented Nov 19, 2024

I am learning how to work with arrays that exceed memory capacity and have been following the tutorial notebook:
tutorial_nbs/11_How_to_train_big_arrays_faster_with_tsai.ipynb.

To practice, I copied and modified the example code from the tutorial to create a .npy file on disk, using an array of approximately 3.9 GB (to be safe, I started with a relatively small array). The code I used is as follows (my IDE is Spyder):

from tsai.all import *

path = Path('data')
X_large = create_array((100_000, 10, 1000), fname='X_large', path=path, mode='r+')
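For comparison, here is a numpy-only sketch of what a memory-mapped .npy creation should look like (an assumption about the intended behavior, not tsai's exact internals): `np.lib.format.open_memmap` creates the file on disk and returns a memmap, so the full array buffer should never need to be resident in RAM.

```python
import os
import tempfile
import numpy as np

tmpdir = tempfile.mkdtemp()
fname = os.path.join(tmpdir, 'X_large.npy')

# Small shape here for illustration; scale up to (100_000, 10, 1000)
# to match the report above.
X = np.lib.format.open_memmap(fname, mode='w+',
                              dtype=np.float32, shape=(1_000, 10, 100))
X.flush()  # persist any dirty pages to disk
print(X.shape, type(X).__name__)
```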

Observed Behavior

1. High Memory Usage During Execution

  • When I run the script, memory usage climbs until my computer runs out of memory (it has 16 GB of RAM).
  • Memory usage does not drop even after the script finishes executing.

2. Unexpected Memory Spike During Import (This is even more interesting)

  • After restarting Spyder (to release RAM) and starting to type a new script that begins with from tsai.all import *, memory usage starts rising again until the system runs out of memory.
  • This happens while I am simply TYPING the code, before the script is ever executed.
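To pin down where the spike happens, one way to instrument the script is to print the process's peak resident set size before and after the suspect call. A minimal sketch using the stdlib `resource` module (Unix-only; the call site marked below is a placeholder):

```python
import resource

def peak_rss_mb() -> float:
    # ru_maxrss is reported in kilobytes on Linux (bytes on macOS),
    # so this value is approximate across platforms.
    return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss / 1024

print(f"peak RSS before: {peak_rss_mb():.1f} MB")
# ... run the suspect call here, e.g. the create_array(...) line above ...
print(f"peak RSS after:  {peak_rss_mb():.1f} MB")
```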

Attempts to Troubleshoot

I reproduced this issue on two different PCs, and the results were identical.

Request for Assistance

  • Is this a known issue, or is there a specific reason why from tsai.all import * causes such behavior?
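In the meantime, a possible workaround (a sketch under my own assumptions, not tsai's documented fix) is to create the memmap and fill it in chunks, so only one slab is ever materialized in RAM at a time; `np.random.rand` below stands in for whatever the real data source is.

```python
import numpy as np

def fill_memmap_in_chunks(fname, shape, dtype=np.float32, chunk=10_000):
    # Create the .npy file on disk and get a writable memmap over it.
    X = np.lib.format.open_memmap(fname, mode='w+', dtype=dtype, shape=shape)
    for start in range(0, shape[0], chunk):
        stop = min(start + chunk, shape[0])
        # Placeholder data source: replace with your real data.
        X[start:stop] = np.random.rand(stop - start, *shape[1:]).astype(dtype)
        X.flush()  # write the slab to disk so dirty pages can be reclaimed
    return X
```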
