I have a 20-year-old 15G repo that filter-repo brings down to 1.9G. When I run git gc --aggressive after git filter-repo, the repo goes down to 1.5G, which is a 21% decrease in size. This is a one-time operation. I plan to create a new repo out of this and have my team clone the new repo.
git gc docs mention some concerns about --aggressive:
This will throw away any existing deltas and re-compute them, at the expense of spending much more time on the repacking.
The effects of this are mostly persistent, e.g. when packs and loose objects are coalesced into one another pack the existing deltas in that pack might get re-used, but there are also various cases where we might pick a sub-optimal delta from a newer pack instead.
It seems in my case the time spent is not significant, and the effect of possibly picking a sub-optimal delta does not matter much since there is an overall reduction in size.
They also mention:
It’s probably not worth it to use this option on a given repository without running tailored performance benchmarks on it. It takes a lot more time, and the resulting space/delta optimization may or may not be worth it. Not using this at all is the right trade-off for most users and their repositories.
I am not sure what they mean by "tailored performance benchmarks", but I understand there were some issues in the past with --aggressive using a --depth of 250 affecting the performance of future git operations, but it's now defaulting to 50 similar to the non-aggressive gc.
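One way to interpret "tailored performance benchmarks" is simply comparing repack time and resulting pack size on the repo in question, with and without --aggressive. The sketch below does that on a throwaway synthetic repo purely for illustration (the repo name and generated history are made up); on a real repo you would run the same two gc invocations on a disposable clone instead.

```shell
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q bench
cd bench

# Generate some compressible history so delta choices matter.
for i in $(seq 1 50); do
  seq 1 1000 > data.txt
  echo "rev $i" >> data.txt
  git add data.txt
  git -c user.name=bench -c user.email=bench@example.com commit -qm "rev $i"
done

git gc -q                 # baseline repack: reuses existing deltas
du -sk .git/objects       # size after normal gc

git gc -q --aggressive    # throws away deltas and re-computes them (slower)
du -sk .git/objects       # size after aggressive gc
```

Prefixing each gc invocation with `time` gives the other half of the benchmark; whether the extra CPU time buys a meaningful size reduction is exactly the repo-specific trade-off the docs are pointing at.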
What I am looking to know is whether there is some other reason git filter-repo doesn't support the --aggressive option in git gc?
I searched previous issues and looked at the history of the repo all the way back to the commit where git gc was added to filter-repo but couldn't find context around it.
As I mentioned, this is a one-time operation my team is doing, and having that extra bit of size reduction will be beneficial in the long term, so I would like to be aware of any additional pitfalls to using --aggressive.