Why not defrag?
OS X only defrags files that are 10 MB or smaller.
OS X does not defrag larger data chunks.
If you have large files whose pieces are scattered around the drive in chunks larger than 10 MB, OS X isn't going to do anything to fix those files and put the data back into a contiguous stream. This can and does become a problem over time.
So, if a HD has never been regularly maintained, I submit that a lot of fragmented data builds up over time, and only a program with defragmentation capabilities can fix it!
Video files can get very large, and I bet their data can get fragmented into pieces that are greater than 10 MB in size.
Since I started participating in these particular iMac forums (I used to be a PPC Mac user), I have seen more posts from older iMac users who complain about their iMacs being slow, who have tens of gigabytes of data or more on their iMac drives, and who have never done any kind of disk maintenance since they originally bought the iMac, which can be at least 3+ years!
I believe drive defragging at some point is absolutely necessary!
Despite what some people believe here, TechTool Pro is a good program for doing file and HD maintenance.
This program has the ability to defrag an HD.
You have the option to defrag the entire drive (data files and free space) or just the data itself.
You can bet that once file data returns to a contiguous state, the Mac in question will be faster at accessing that data, including the data and files of OS X itself.
Make sure you have a recent backup before using a utility app like TechTool Pro, though.
OS X only defrags files that are 10 MB or smaller.
OS X does not defrag larger data chunks.
Where did you get that piece of info?
Unix-based file systems, with their inode form of disk access, don't care where the data is. I don't recommend defraggers of any kind. I don't trust them and I never will. The last thing I want is some third-party program shoving my stuff around a disk. Whether it's 10 bytes or 1 TB, they are totally unnecessary on this file system. And yes, I do keep backups (two).
People's machines slow down over time not because data is scattered but because they don't do proper housekeeping: checking their drives for disk errors periodically, and leaving enough free space for the OS (10 GB-15 GB over the 10 GB needed for the OS itself). Any additional head movement to read a large data file is insignificant.
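To put that housekeeping advice into practice, here is a minimal sketch of a free-space check. It only assumes the POSIX `df` and `awk` tools; the 20% threshold is the rule of thumb used later in this thread, and on a Mac you would pair this with a periodic `diskutil verifyVolume /` to check for directory errors.

```shell
#!/bin/sh
# Report how full the boot volume is, and warn when free space runs low.
# (On macOS, also run `diskutil verifyVolume /` periodically to catch
# directory errors; that command is macOS-only so it is not run here.)

# df -P gives portable one-line-per-filesystem output; column 5 is "Capacity"
# (percent used, e.g. "43%").
used_pct=$(df -Pk / | awk 'NR==2 { sub("%", "", $5); print $5 }')
free_pct=$((100 - used_pct))

echo "Boot volume free space: ${free_pct}%"

if [ "$free_pct" -lt 20 ]; then
    echo "Warning: under 20% free - clear some space or plan a bigger drive"
fi
```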
This is the best way to defrag a drive, and I still use this method today. (I only do this because, when testing software, before running it on the main drive I run it along with the OS on an external drive. If everything is OK, I clone the test software along with the OS over to the main drive.)
In the old days (before Norton) we used to clone the drive over to another drive and then clone it back to defrag it. This method will pack (defrag) the HD correctly. A few added benefits are that you end up with a backup, and it can be done at no added software cost. I won't go into it here, but I've read that defragging OS X can cause problems. Google it.
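The clone-out, wipe, clone-back method described above can be sketched roughly as follows. This is only an illustration on throwaway directories so it is safe to run anywhere; for a real boot volume you would use a dedicated cloner (SuperDuper, Carbon Copy Cloner, or Apple's `asr`) rather than plain `cp`, and the directory names here are placeholders.

```shell
#!/bin/sh
# Sketch of the clone-and-clone-back defrag method. In practice the source
# and destination would be whole volumes (e.g. /Volumes/Main and an external
# drive); this demo uses scratch directories so it is harmless to run.

clone() {
    rm -rf "$2"        # start from an empty destination
    cp -Rp "$1" "$2"   # -R recursive, -p preserve modes and timestamps
}

src=$(mktemp -d); dst=$(mktemp -d); rmdir "$dst"

echo "some data" > "$src/file.txt"

clone "$src" "$dst"    # step 1: clone out (this copy doubles as a backup)
rm -rf "$src"          # step 2: wipe the original...
clone "$dst" "$src"    # ...and clone back; every file is rewritten in one
                       # sequential pass, i.e. into contiguous blocks

cat "$src/file.txt"    # the data survives the round trip
```

The reason this defragments is in step 2's comment: writing each file out fresh onto an empty volume lets the file system allocate it contiguously.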
I ran my old PPC Mac for years, never defragging it, and never had a problem. With today's fast computers, drives, and OSes, I see no need to defrag a HD.
Do I need to defrag or clean up files? Terry
Defragmentation in OS X:
http://support.apple.com/kb/HT1375 which states:
You probably won't need to optimize at all if you use Mac OS X. Here's why:
- Hard disk capacity is generally much greater now than a few years ago. With more free space available, the file system doesn't need to fill up every "nook and cranny." Mac OS Extended formatting (HFS Plus) avoids reusing space from deleted files as much as possible, to avoid prematurely filling small areas of recently-freed space.
- Mac OS X 10.2 and later includes delayed allocation for Mac OS X Extended-formatted volumes. This allows a number of small allocations to be combined into a single large allocation in one area of the disk.
- Fragmentation was often caused by continually appending data to existing files, especially with resource forks. With faster hard drives and better caching, as well as the new application packaging format, many applications simply rewrite the entire file each time. Mac OS X 10.3 onwards can also automatically defragment such slow-growing files. This process is sometimes known as "Hot-File-Adaptive-Clustering."
- Aggressive read-ahead and write-behind caching means that minor fragmentation has less effect on perceived system performance.
Whilst 'defragging' OS X is rarely necessary, Rod Hagen has produced this excellent analysis of the situation which is worth reading:
Most users, as long as they leave plenty of free space available and don't work regularly in situations where very large files are written and rewritten, are unlikely to notice the effects of fragmentation much, on either their files or on the drive's free space.
As the drive fills, however, the situation becomes progressively more significant.
Some people will tell you that "OSX defrags your files anyway". This is only partly true. It defrags files that are less than 20 MB in size. It doesn't defrag larger files and it doesn't defrag the free space on the drive. In fact the method it uses to defrag the smaller files actually increases the extent of free space fragmentation. Eventually, once the largest free space fragments are down to less than 20 MB (not uncommon on a drive that has, say, only 10% free space left), it begins to give up trying to defrag altogether. Despite this, the system copes very well without defragging as long as you have plenty of room.
Again, this doesn't matter much when the drive is half empty or better, but it does when it gets fullish, and especially when it gets fullish and you are regularly dealing with large files, like video or serious audio stuff.
If you look through this discussion board you will see quite a few complaints from people who find that their drive gets "slow". Often you will see them say that they "still have 10 or 20 gigs free" or the like. On modern large drives, by this stage they are usually in fact down to the point where the internal defragmentation routines can no longer operate, where their drives are working like navvies to keep up with finding space for any larger files, together with room for "scratch files", virtual memory, directories, etc. Such users are operating in a zone where they put a lot more stress on their drives as a result, and often start complaining of increased "heat". Most obviously, though, the computer slows down to a speed not much better than that of molasses. Eventually the directories and other related files may collapse altogether, and they find themselves with next-to-unrecoverable disk problems.
By this time, of course, defragging itself has already become just about impossible. The amount of work required to shift the data into contiguous blocks is immense, puts additional stress on the drive, and takes forever. The extent of fragmentation of free space at this stage can be simply staggering, and any large files you subsequently write are likely to be divided into many, many tens of thousands of fragments scattered across the drive. Not only this, but things like the "extents files", which record where all the bits are located, will begin to grow astronomically as a result, putting even more pressure on your already stressed drive and increasing the risk of major failures.
Ultimately this adds up to a situation where you can identify maybe three "phases" of mac life when it comes to the need for defragmentation.
In the "first phase" (with your drive less than half full), it doesn't matter much at all - probably not enough to even make it worth doing.
In the "second phase" (between, say, 50% and 20% free space remaining) it becomes progressively more useful, but, depending on the use you put your computer to, you won't see much difference at the higher levels of free space unless you are a serious video buff who needs to keep their drives operating as efficiently and as fast as possible - chances are they will be using fast external drives over FW800 or eSATA to complement their internal HD anyway.
At the lower end, though (when boot drives get down around the 20% mark on, say, a 250 or 500 GB drive), I certainly begin to see an impact on performance and stability when working with large image files, mapping software, and the like, especially those which rely on their own "scratch" files, and especially in situations where I am using multiple applications simultaneously, if I haven't defragmented the drive for a while. For me, defragmenting (I use iDefrag - it is the only third-party app I trust for this after seeing people have problems using TechTool Pro and Drive Genius for such things) gives a substantial performance boost in this sort of situation and improves operational stability. These days I try to get in first and defrag more regularly (about once a month) when the drive is down to 30% free space or lower.
Between 20% and 10% free space is a bit of a "doubtful region". Most people will still be able to defrag successfully in this region, though the time taken and the risks involved increase as free space declines. My own advice to people in this region is to start choosing their new, bigger HD, because they are obviously going to need one very soon, and to "clear the decks" so that they maintain that 20% free buffer until they do. Defragging regularly (perhaps even once a fortnight) will actually benefit them substantially during this "phase", but maybe doing so will lull them into a false sense of security and keep them from seriously recognising that they need to be moving to a bigger HD!
Once they are down to that last ten per cent of free space, though, they are treading on glass. Free space fragmentation will already be a serious issue on their computers, and if they try to defrag with a utility without first making substantially more space available, they may find it runs into problems or is so slow that they give up halfway through and do the damage themselves, especially if they are using one of the less "forgiving" utilities!
In this case I think the best way to proceed is to clone the internal drive to a larger external with SuperDuper, replace the internal drive with a larger one and then clone back to it. No-one down to the last ten percent of their drive really has enough room to move. Defragging it will certainly speed it up, and may even save them from major problems briefly, but we all know that before too long they are going to be in the same situation again. Better to deal with the matter properly and replace the drive with something more akin to their real needs once this point is reached. Heck, big HDs are as cheap as chips these days! It is mad to struggle on with sluggish performance, instability, and the possible risk of losing the lot, in such a situation.
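The three "phases" above reduce to a simple rule of thumb that can be sketched as a shell function. The percentage thresholds (50% / 20% / 10%) are taken straight from the discussion; the function name and wording of the messages are my own.

```shell
#!/bin/sh
# Rule-of-thumb classifier for the "phases" of drive fullness described
# above. Argument: free space as a whole-number percentage.

phase_for_free_pct() {
    if   [ "$1" -gt 50 ]; then echo "phase 1: fragmentation barely matters"
    elif [ "$1" -ge 20 ]; then echo "phase 2: occasional defrag may help"
    elif [ "$1" -ge 10 ]; then echo "doubtful region: clear space, plan a bigger drive"
    else                       echo "phase 3: clone off, fit a larger drive, clone back"
    fi
}

phase_for_free_pct 60   # phase 1
phase_for_free_pct 30   # phase 2
phase_for_free_pct 15   # doubtful region
phase_for_free_pct 5    # phase 3
```

Pairing this with the free-space figure from `df` gives a quick way to decide whether defragging is worth the wear, or whether it is time for the clone-to-a-bigger-drive route instead.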