Feature request: Expanding on the deduplication function #64
What happens to your files in this scenario currently? Which one is kept? It's been a long time since I wrote the de-dupe aspects, so I can't remember now how it determines this. But aside from that point, yes, I understand what you are asking.
Hi. For the issue mentioned above: let's say I have a folder with the following files, all with the exact same hash: After using the built-in de-dup function, the only file left behind is: So I assume the program keeps the first file and removes the rest that come after. For my case, I would prefer to keep the original file, a.txt. Another thing I would find useful is a refresh function, so the user wouldn't have to "Select Folder" again to refresh the list. Thank you.
Keeping the oldest file would make more sense, I guess. The name sort order would differ depending on how you sort. The program could also ask which of the two duplicates it found you want to keep or delete.
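The "keep the oldest file" rule mentioned above could be sketched like this in Python. The function name and its role are hypothetical, not part of the project's actual code; it just illustrates picking the keeper among duplicates by earliest modification time:

```python
import os

def pick_keeper_by_age(paths):
    """Among duplicate files (same hash), return the one to keep:
    the file with the oldest modification time. All other paths
    would be candidates for deletion."""
    return min(paths, key=os.path.getmtime)
```

Note that modification time survives most copy operations, but it can be unreliable (e.g. after editing or restoring from backup), which is presumably why the comment also suggests asking the user interactively.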
It would be great if the software could extend its functionality to what you outlined! =D
Hi,
I would like to request a supplementary function to the de-duplication feature.
In my case I have multiple files with the same name, so Windows automatically appends (1), (2), (3), and so on to the file name.
If the hashes turn out to be identical, I want the option to de-duplicate by removing the files with the numbers appended while retaining the one with the original file name.
Maybe this could be done via a simple sort of the file names in descending order, so the files without the numbers rank on top and are therefore retained?
Thank you.
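The "keep the original name" idea above could also be made explicit rather than relying on sort order. A minimal Python sketch (the function name is hypothetical, and the regex assumes the Windows-style " (n)" copy suffix described in the request):

```python
import os
import re

# Matches a file stem ending in a Windows copy suffix, e.g. "report (2)"
COPY_SUFFIX = re.compile(r"^.* \(\d+\)$")

def pick_keeper_by_name(paths):
    """Among duplicate files (same hash), prefer the one whose name
    lacks a ' (n)' copy suffix; tie-break by plain name sort."""
    def key(path):
        stem, _ext = os.path.splitext(os.path.basename(path))
        is_copy = 1 if COPY_SUFFIX.match(stem) else 0
        return (is_copy, os.path.basename(path))
    return min(paths, key=key)
```

Matching the suffix directly avoids the ambiguity of raw name sorting, where the relative order of "a.txt" and "a (1).txt" depends on how punctuation compares in the chosen sort.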