⚡️ Make memory estimates data-dependent #1480
MM suggested using [1200, 600, 300, 150, 75] TRs × a subject from [HCP, HBN, HNU, NYU test/retest] to generate the initial measurements from which to fit the formula for estimating.
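One way to turn those measured runs into an estimating formula is a simple linear fit of observed peak memory against the number of TRs. This is only a sketch: the peak-memory numbers below are made up, and `estimate_mem_gb` is a hypothetical helper, not anything in C-PAC.

```python
import numpy as np

# TR counts from the suggested test matrix; the peak-memory figures (GB)
# are invented placeholders standing in for real profiling measurements.
trs = np.array([1200, 600, 300, 150, 75])
peak_gb = np.array([9.8, 5.1, 2.7, 1.5, 0.9])

# Fit a first-degree polynomial: mem_gb ≈ slope * n_trs + intercept
slope, intercept = np.polyfit(trs, peak_gb, 1)

def estimate_mem_gb(n_trs: int) -> float:
    """Predict a node's memory estimate from the run's TR count."""
    return slope * n_trs + intercept
```

With real measurements per node, the fitted `(slope, intercept)` pair could replace that node's hard-coded estimate.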
Per @anibalsolon's suggestion, I'm trying to override the estimates. For the first iteration, I'm only considering the time dimension as variable. Once that works, I'll refactor to include the x, y, and z dimensions and see if I can figure out a way to simplify the parameterization. Currently,

`C-PAC/CPAC/pipeline/nipype_pipeline_engine/engine.py` (lines 68 to 79 in 0626a66)

like

`C-PAC/CPAC/generate_motion_statistics/generate_motion_statistics.py` (lines 196 to 199 in 0626a66)

for a memory estimate of …. It would of course be better to ….

For now, I just plugged in a couple of these and kicked off a couple of runs to test the proof of concept.
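As a rough illustration of a data-dependent estimate that varies only with the time dimension (treating x, y, and z as fixed for now), something like the following could stand in for a static `mem_gb`. The function name, the number of in-memory copies, and the overhead constant are all assumptions for the sketch, not C-PAC's actual parameterization:

```python
def mem_gb_for(shape, bytes_per_voxel=8, copies=3, overhead_gb=0.5):
    """Rough data-dependent memory estimate for a 4D image.

    Assumes the node holds `copies` float64 copies of the image in memory
    plus a fixed overhead; both knobs are made-up tuning parameters.
    """
    x, y, z, t = shape
    voxels = x * y * z * t
    return copies * voxels * bytes_per_voxel / 1024**3 + overhead_gb

# e.g. a 2 mm MNI-grid run: estimate grows linearly with the TR count
long_run = mem_gb_for((91, 109, 91, 1200))
short_run = mem_gb_for((91, 109, 91, 75))
```

A callable like this could be evaluated once the input image's header is known, instead of baking a single worst-case number into the node definition.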
Related problem
Originally posted by @shnizzedy in #1479 (comment)
Related #1166, #1301, #1404, #1453
Proposed feature
Starting with the memory-hungriest nodes (those in the table above), ….
Additional context
Some of these estimates could potentially get complicated as the pipeline progresses through transforms and resamplings.
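For the transform/resampling case, one simple heuristic (an assumption for this sketch, not something implemented in C-PAC) is to scale an upstream estimate by the ratio of voxel volumes, since resampling the same field of view to a finer grid multiplies the voxel count accordingly:

```python
def scale_estimate(mem_gb, old_zooms, new_zooms):
    """Scale a memory estimate across a resampling step.

    Assumes the field of view is unchanged, so the voxel count (and thus
    memory) grows by old voxel volume / new voxel volume.
    """
    old_vol = old_zooms[0] * old_zooms[1] * old_zooms[2]
    new_vol = new_zooms[0] * new_zooms[1] * new_zooms[2]
    return mem_gb * old_vol / new_vol

# e.g. resampling from 3 mm isotropic to 2 mm isotropic
downstream_gb = scale_estimate(4.0, (3, 3, 3), (2, 2, 2))
```

Chaining a factor like this through each transform would let a single measured baseline propagate through the pipeline instead of re-profiling every resampled node.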