Mar 13, 2012 11:19 AM (in response to deborah90)
What kind of files? Finding duplicate image files is different from finding duplicate text files, which is different from finding duplicate audio files...
You can see where that is going. So if you described in a bit more detail what it is you're looking to do, someone might come up with something.
Mar 13, 2012 11:41 AM (in response to deborah90)
The link provided by DCJ1 looks like it would be worth a test drive. They have a free trial, so you could give it a try and see if it does what you need it to do.
Let us know how it works for you if you try it.
May 14, 2012 10:05 PM (in response to deborah90)
Did you try singlemizerapp?
I happened to run "locate protractor" to find an image of a protractor I had long ago filed away for future use and needed.
By doing so, I stumbled upon around 40 megabytes of wasted space in two nearly identical directories due to ~1200 duplicate images; I think these are found in a stock Lion install:
diff -r "/Volumes/SSD/Library/Dictionaries/New Oxford American Dictionary.dictionary/Contents/Images/" "/Volumes/SSD/Library/Dictionaries/Oxford Dictionary of English.dictionary/Contents/Images/" | wc
6 52 981
[21:26][elvey@Computer-of-ME /Downloads]$ diff -rs "/Volumes/SSD/Library/Dictionaries/New Oxford American Dictionary.dictionary/Contents/Images/" "/Volumes/SSD/Library/Dictionaries/Oxford Dictionary of English.dictionary/Contents/Images/" | wc
1207 15326 282827
I don't see why Apple lets this slide, especially with so many folks using SSDs to hold their OS install these days.
I'm hoping to run a dedupe that replaces duplicates with a hard link to the original. It would be cool if the filesystem were enhanced to support copy-on-write for hard links; I wonder if some of the cooler filesystems like ZFS or ReiserFS support that. Even without copy-on-write, in nearly all cases, replacing duplicates with a hard link to an original copy shouldn't cause problems. Normally (on HFS+ partitions), if two or more files point to the same inode and one is deleted, the refcount is decremented, but the non-deleted files remain intact. The only likely way this would be a problem is if a file were altered in place: all files using the same inode would change.
I haven't reached the point where I have a script ready to alpha-test, but I did start scheming.
Dupinator (http://code.activestate.com/recipes/362459/, in Python) and fdupes.pl are good starting points.
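A minimal Python sketch of the scheme described above: hash every file, then replace byte-identical copies with hard links to the first copy seen. This is illustrative, not a finished tool (no handling of resource forks, permissions mismatches, or files that get modified in place, which is the hazard noted above); `dedupe_hardlink` is a hypothetical name.

```python
# Sketch: replace duplicate files under a directory tree with hard links.
import hashlib
import os

def dedupe_hardlink(root, dry_run=True):
    """Hard-link byte-identical files under `root`; return bytes saved."""
    first_seen = {}  # sha256 digest -> path of the original copy
    saved = 0
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            if os.path.islink(path):
                continue  # leave symlinks alone
            with open(path, 'rb') as f:
                digest = hashlib.sha256(f.read()).hexdigest()
            original = first_seen.setdefault(digest, path)
            if original == path:
                continue  # this is the first copy seen
            if os.path.samefile(original, path):
                continue  # already hard-linked to the original
            saved += os.path.getsize(path)
            if not dry_run:
                os.unlink(path)          # decrements the inode's refcount
                os.link(original, path)  # new name pointing at the original
    return saved
```

With `dry_run=True` (the default) it only reports how much space would be reclaimed, which seems like the sane way to alpha-test against something like the dictionary `Images/` trees.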
I also ran a handful of commands and scripts to address the most piggish duplicates found by MacKeeper's "Duplicates Finder".
May 14, 2012 11:53 PM (in response to MrElvey)
Singlemizer seems to be a cut above:
"With Singlemizer you can save disk space occupied by such duplicates via replacing them with links to originals. All variants at your service: aliases, symlinks and even geeky hardlinks!"
And the version for 10.5 and 10.6 is currently free!