Recommended approach for cleaning old logs and test files #803
-
This is a question, not a bug report. I run the validator locally in a Docker container. The validator stores logs and test files in a few places under ~etf/. After a few weeks of intensive testing there is a large number of older logs and test files that are no longer relevant. What is the best approach for cleaning them up? I see that one can manually delete each report in the interface, but this is not practical when there are too many, and the test files persist even after doing so. Copies of the data are stored in "etf/testdata/", and reports and logs are stored in various directories under "etf/ds/". A brute-force "rm -r *" there is probably not a good idea, and perhaps there is a better way. I would appreciate recommendations on how to do this gracefully. Hernán
Replies: 2 comments
-
Dear @dhdeangelis
It is normal for these files to grow quickly. What we would recommend is to save the HTML report every time a validation is made, so that you have a backup of it, and to set up a nightly process to remove the old files.
To execute the removal, you can find here the very same process that is performed on the production instance every night to keep the disk usage limited. You can adapt this cron script to your needs.
It is very important to execute this process with the container stopped; otherwise you will get errors because the validator process still holds open file descriptors on those files.
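For reference, the removal step can be sketched with a couple of `find` invocations. Everything below is an assumption on my side (the directory layout, the 30-day retention window, and the container name `etf-validator`), not the actual production script, so adapt it to your installation:

```shell
# Hedged sketch of a nightly cleanup for old ETF validator artifacts.
# cleanup_old_files <dir> <days> deletes regular files older than <days>
# under <dir>, then prunes any directories left empty by the deletions.
cleanup_old_files() {
    find "$1" -type f -mtime +"$2" -delete
    find "$1" -mindepth 1 -type d -empty -delete 2>/dev/null
}

# Example invocation (commented out; paths and container name are assumptions):
# ETF_DIR="$HOME/etf"
# docker stop etf-validator     # stop the validator first -- see the note above
# cleanup_old_files "$ETF_DIR/testdata" 30
# cleanup_old_files "$ETF_DIR/ds" 30
# docker start etf-validator
```

Run from cron (e.g. a nightly entry such as `0 3 * * * /path/to/cleanup.sh`), with the `docker stop`/`docker start` wrapping the deletions so no files are held open while they are removed.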
-
Thank you, @carlospzurita, that's very helpful!