Data transfer and HPSS storage of CS catalogs #394
I believe these directories comprise the whole of
I think we would need to check with @evevkovacs and @yymao to see if all the catalogs that will remain on disk should be referenced in GCR.
No, not all of the subdirectories that CS asks to keep active on CFS need to be made available in GCR.
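One way to audit this is to compare the kept-on-disk directories against what GCRCatalogs actually registers. Below is a minimal sketch, assuming a standard GCRCatalogs installation whose configs point at the CFS paths; the "cosmoDC2" name filter and the choice of which catalog to load are illustrative, not an official grouping:

```python
# Minimal sketch: list the catalogs GCRCatalogs registers and spot-check one.
# Assumes GCRCatalogs is installed and its configs resolve to the CFS paths.
import GCRCatalogs

# All registered catalog configs (mapping of catalog name -> config)
available = GCRCatalogs.get_available_catalogs()

# Substring filter is illustrative; adjust to the catalogs under discussion
cosmo_dc2_names = sorted(name for name in available if "cosmoDC2" in name)
print("\n".join(cosmo_dc2_names))

# Loading a catalog verifies its underlying files are still readable on disk
catalog = GCRCatalogs.load_catalog(cosmo_dc2_names[0])
print(catalog.list_all_quantities()[:5])
```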
The other directories are mostly auxiliary data that were used to make the catalogs and need to be kept. We can rethink this if need be, but for now I think it makes sense to keep them in the catalogs directory. They are all relevant to past or ongoing work.
With the new filesystem, has this all been sorted out? If not, what else needs to be done? @heather999 @yymao @evevkovacs. Maybe it's possible to write a very brief conclusion and close this issue? Thanks!
We still want to back up all the catalogs, and those that have been identified for removal can then be removed from CFS.
@heather999 Can this be marked as done?
Unfortunately, not yet. Hoping to get this completed over the next couple of weeks.
As discussed this week at the CS meeting and in issue LSSTDESC/desc-help#11, there is a planned data transfer at NERSC from projecta to CFS. This includes the cosmoDC2 catalogs currently stored under /global/projecta/projectdirs/lsst/groups/CS. The CS group has identified the catalogs that should remain active on CFS (as well as being stored to NERSC HPSS) and those that can be copied to HPSS and removed.

Here is the list of catalogs, and their sizes, that will remain available on CFS:

The list of catalogs that will be copied to NERSC HPSS and removed:
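For the copy-to-HPSS half of the plan, here is a minimal sketch of one per-directory transfer, assuming NERSC's standard htar and hsi clients are available (e.g., on a login or xfer node); the source directory and HPSS archive path below are placeholders, not the actual catalog names:

```python
# Minimal sketch: archive one catalog directory to HPSS with NERSC's htar,
# then verify it landed with hsi. Paths and archive names are placeholders.
import subprocess

SRC = "/global/projecta/projectdirs/lsst/groups/CS/some_catalog_dir"  # placeholder
DEST = "lsst/groups/CS/some_catalog_dir.tar"  # HPSS destination, placeholder

# htar bundles the directory into a tar (plus a .idx index) inside HPSS
subprocess.run(["htar", "-cvf", DEST, SRC], check=True)

# hsi ls confirms the archive and its index file now exist in HPSS
subprocess.run(["hsi", "ls", "-l", DEST + "*"], check=True)
```

Once an archive is verified in HPSS, the corresponding CFS directory from the removal list can be deleted.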