
Aperture 3 Library Fragmentation causes problems

Hi All,

I have read a number of the complaints about A3 conversion, and I think this may help solve some of them.

We have 6 systems with Libraries from 500GB to 1.2TB here and converted them all last weekend without serious problems, and all are enjoying superb increases in performance as I write this.

We did not have the conversion problems others have suffered, and I think I know why:

1) Disable Faces - Saves time in conversion, can be done afterwards.

2) DO NOT reprocess masters during the conversion. It can be done as needed once you are up and running. PLUS - WARNING - it will adversely affect many v2 images, changing exposure and colors, etc. PLUS, the conversion will go MUCH faster, as you have given it less to do. Check out the Apple Knowledge Base piece: "Aperture 3: Discussion about differences in exposure levels with Aperture 3 RAW decoding"
http://support.apple.com/kb/HT3948

3) Use an EMPTY freshly formatted volume that is at least double the size needed.

4) When conversion is complete, COPY your converted library to another new freshly-formatted volume with at least 40% free space before you use it.
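
For anyone who prefers Terminal to dragging in the Finder, step 4 can be sketched like this. The volume and library names are made up for illustration, and you should quit Aperture first so the library is quiescent:

```shell
#!/bin/sh
# Sketch of the post-conversion copy: duplicate the library bundle onto a
# freshly formatted, otherwise-empty volume. Writing file-by-file into
# empty space lays the data down contiguously, which is the whole point.
copy_library() {
    src="$1"     # e.g. /Volumes/ConvertScratch/Photos.aplibrary (hypothetical)
    dst="$2"     # e.g. /Volumes/FreshLibrary/Photos.aplibrary   (hypothetical)
    mkdir -p "$dst" || return 1
    # -R recurses into the bundle; -p preserves times and permissions.
    cp -Rp "$src/." "$dst/" || return 1
    # Eyeball the two sizes to confirm the copy is complete.
    du -s "$src" "$dst"
}
```

A Finder drag-copy accomplishes the same thing; the point is simply that the destination volume is empty and dedicated.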

Why?.....

The normal operation of Aperture has always resulted in some disk fragmentation, the larger the libraries the worse the problem. Always working from a copy that has just been made eliminates the great majority of the fragmentation, and ensures it does not become a performance issue.

Conversely, if someone just keeps using the same Library, it will just get more fragmented, and slower and slower until real problems develop. If they are also running out of disk space, then the fragments themselves become fragmented as the file system feverishly tries to fit all your data into a shrinking space. All this eats time, and given the size and number of files involved, it has become a problem that NO AMOUNT of fast hardware will fix entirely. We need to adopt some database best practices...but trust me, it works, and A3 simply screams on our systems here, 2-3 times as fast as A2.

To test this premise that conversion of a large library results in a tremendously fragmented mess on the volume, I reconverted my last Aperture 2 library today from last week's backup. It is 500GB and completed conversion to A3 as described above in about 4.5 hours. I then looked at the volume with iDefrag, and over 50% of the file was fragmented. My original A2 file was not fragmented, of course, as it had come from a backup. I then opened the converted file in A3, and it began to process previews VERY slowly, with really lousy disk read/write numbers reported in Activity Monitor.

I stopped the process, quit Aperture 3 and made a copy of the file to a fresh volume. I immediately noted the backup was going VERY slowly...I had not noticed this last week, as all our first backups were done overnight. As a benchmark, an unfragmented 500GB file will copy in a bit over 2.5 hours...this backup took over 6 hours! It had a lot of file fragments to assemble to put the copy together, and that all takes time and resources.
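
As a sanity check, those elapsed times translate into throughput easily enough. This is just toy shell arithmetic on the numbers above (integer rounding); real monitoring would be Activity Monitor or iostat:

```shell
#!/bin/sh
# Rough copy throughput: megabytes moved per minute of elapsed time,
# expressed as MB/sec. Integer math is fine at this scale.
mb_per_sec() {
    size_gb="$1"; minutes="$2"
    echo $(( size_gb * 1024 / (minutes * 60) ))
}
mb_per_sec 500 150   # unfragmented 500GB in ~2.5 hours: 56 MB/sec
mb_per_sec 500 360   # the same 500GB, fragmented, in 6 hours: 23 MB/sec
```

When a copy of a big sequential file is running at less than half the healthy rate, fragmentation is the usual suspect.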

When the copy was complete, I opened it again in A3, and the preview processing raced right along. Even before it was complete, the data was snappy and available...when the previews finished, it was a screamer...as our systems are today.

We use iDefrag, by Coriolis Systems, to look at the fragmentation in these large files. It is a $30 USD utility, and invaluable in providing some reality about why your system is slow. I invite you to view your Library and see where your performance has gone. http://www.coriolis-systems.com/iDefrag.php

Apple tries to make all of their programs look simple and carefree to use...fine...I submit that when a database is hundreds of gigs, NOTHING is simple and carefree. Performance requires understanding and some simple maintenance. Would you buy a Porsche, never change the oil, run it in a small parking lot and then complain to Porsche about performance? Kinda the same thing...

For those who are interested...

Our basic daily operation has the Aperture library running from a volume that consists of a RAID 0 pair in slots 1 and 2 of a 5-bay eSATA array. There is an identical RAID 0 pair that carries a backup of the first one in slots 3-4. There is a rotating single backup mechanism in bay 5 that is used for daily offsite rotation.

Every night a full finder copy is made from the day's working RAID 0 pair to the other pair, and to the offsite disk.

The next morning, the operator will verify the backups have occurred without incident, swap the offsite mechanism out and then start the day's work ON THE OTHER RAID 0 PAIR that was the backup of yesterday's data. Why?.....The normal operation of Aperture has always resulted in some disk fragmentation, the larger the libraries the worse the problem. Always working from a copy that has just been made eliminates the great majority of the fragmentation, and ensures it does not become a performance issue. We actually use 3 pairs to have redundant backups, but I won't confuse the basic issue here with that discussion.
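
The nightly pair-swap above can be sketched as a script. The volume names are hypothetical, and we actually drive this with a scheduled Finder copy, so treat this purely as an illustration of the idea:

```shell
#!/bin/sh
# Nightly rotation sketch: copy the working pair's library to the spare
# pair and the offsite disk, erasing the destinations first so each copy
# is written contiguously onto empty space. All paths are hypothetical.
WORK="/Volumes/PairA/Photos.aplibrary"
SPARE="/Volumes/PairB/Photos.aplibrary"
OFFSITE="/Volumes/Offsite/Photos.aplibrary"

nightly_copy() {
    for dest in "$SPARE" "$OFFSITE"; do
        rm -rf "$dest" && mkdir -p "$dest" || return 1
        cp -Rp "$WORK/." "$dest/"         || return 1
    done
    echo "copy finished: $(date)"
}
```

The next morning the operator verifies the copies and points Aperture at the spare pair, so the day's work always runs from a freshly laid-down library.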

In contrast to our methods, if someone just keeps using the same Library every day, it will just get more fragmented, gradually becoming slower and slower until real problems develop. If on top of that they are running out of disk space, then the fragments themselves become fragmented as the file system feverishly tries to fit all your data into a shrinking space. All this eats time, and given the size and number of files involved, it has become a problem that NO AMOUNT of fast hardware will fix entirely.

Given the volume of data we digital photographers keep collecting, we need to take responsibility and adopt some database best practices...but trust me, it works, and A3 simply screams on our systems here, 2-3 times as fast as A2.

Will this fix everything, for everyone???

Of course not...the wide variety of machines and configs makes it impossible to predict. I do feel pretty confident I can reliably make A3 run really fast on our systems here...and to note, the 6 workstations I have discussed are 2008 MBPs (4,1) with 6GB RAM and 30" displays, not even Mac Pros, and we are enjoying excellent throughput.

Hope this helps,

Sincerely,


K.J. Doyle

MBP 17" Glossy HiRes 2.6 6GB RAM, NVIDIA 8600 GT Video w 512MB, Mac OS X (10.6.2), 30" Cinema Display and External eSATA RAID for Library

Posted on Feb 19, 2010 7:43 PM

164 replies

Mar 3, 2010 11:46 PM in response to enchantedman

enchantedman wrote:
Kevin...you've been great. I'm an avid photographer with a relatively small database (about 110 GB). I have been using Aperture from the get-go and am extremely unhappy with the current performance of 3.01, especially for panoramas processed in Photoshop. Two questions:

Do you think my 4 GB (machine max) of memory will be enough going forward if I follow all of your steps?

If not, will any generation of iMac be able to handle Aperture as it develops, or do I need to look at Mac Pro?

Your advice has been great to follow.


Thanks for your kind words.

Your iMac6,1 came out in Sept 2006, in 2.16 and 2.33GHz models. It is a 64-bit architecture but actually only addresses 3GB of RAM, although 4GB can be installed. 3GB just ain't very much space for a pro app like this.

Also, you have either an Nvidia 7300 GT with 128MB of VRAM or a 7600 GT with 256MB of VRAM...even the larger one is going to have slowdowns doing adjustments with the OpenCL-heavy design of A3.

What makes A3 sing for machines with enough RAM and VRAM is going to make you wait...quite a lot, I am afraid.

In my experience, 6GB is the sweet spot for my MBP running Aperture to go all day and avoid pageouts.

While I would of course leave it to the user themselves to monitor this on their own machine and configuration, I am going to bet you will be hitting pageouts quite a bit on this machine, and that cuts into the responsiveness of the app a great deal.
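
If you want to watch for this yourself, `vm_stat` in Terminal reports a cumulative Pageouts counter. A minimal way to pull the number out (the awk parsing here is my own sketch, not anything Apple documents):

```shell
#!/bin/sh
# Extract the cumulative pageout count from `vm_stat` output.
# If this number keeps climbing while Aperture runs, the machine is
# paging to disk and is short on RAM for the job.
pageouts() {
    # split fields on colons, spaces and dots, then print the count
    awk -F'[: .]+' '/Pageouts/ {print $2}'
}
# On a Mac you would run:  vm_stat | pageouts
```

Check it before and after a long editing session; a large jump means every adjustment is fighting the swap file.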

So, bottom line, even if you do what I am suggesting...I don't think you will be a happy camper given the relative insufficiency of space and processing horsepower.

Time for a new machine...=)

"If not, will any generation of iMac be able to handle Aperture as it develops, or do I need to look at Mac Pro?"

Frankly, if it had a decent disk subsystem, the iMac11,1 would be the deal of the century. Quad-core 3.06, 16GB RAM, Radeon 4870 with 512MB VRAM (same as the Mac Pro)...awesome, if it were not for the lame performance of FW800 as your only choice for an external drive.

TO BE FAIR...you are not doing this professionally, and I guess if you made a 250GB partition on the internal drive DEDICATED to only your managed Aperture Library, that 11,1 beast would be pretty sweet. I would buy one single FW800 enclosure with removable drive trays and use it strictly for backup. Aperture as a computing experience is processing-intensive, not disk-intensive, and I am sure this rig would outperform my MBP in regular Aperture use.

Remember, the high speed disk rig for me is a professional necessity to prevent downtime at all costs, and to have daily offsite disks with all the work on it. I am paying salaries, and don't want to lose ANYTHING.

As an enthusiast however, with a much smaller library and assuming you are not at this every day, all day...delaying the offsite by a day or the restore to operating status after a failure by a day or 2 is not the end of the world. Of course, maintaining your rig with my methods (or similar) is assumed to keep the performance you paid for.

So yes, I think you would be happy with the 11,1 rig...BUT, if you can wait a bit (isn't that always the deal, lol)...

USB 3 is beginning to trickle out onto the market. It only works with PCs at the moment; no Mac drivers currently exist that I am aware of. If it works without incident, it will eclipse FW as the interface for high-speed disks... It is theoretically capable of 4.8Gb/s, compared to 3Gb/s for eSATA (SATA II) and 6Gb/s for SATA III. Of course hard drive mechanisms top out at about 136MB/sec each, so this spec will need to connect to SSDs or an array of drives to tap that bandwidth.

Given how these things develop, it may be a year or more before you can get much USB 3 gear, as Intel said it would not officially support it until 2011. http://www.eetimes.com/news/latest/showArticle.jhtml?articleID=220700486

No, that is quite a wait...maybe a new machine now is best. Of course, as an investment (of sorts) the Mac Pro can continue to be updated and live longer than a fixed configuration, plus you have eSATA out of the gate...but you pay for that privilege up front.

Hope this helps,

Sincerely,

K. J. Doyle

Mar 4, 2010 8:54 AM in response to Kevin J. Doyle

Kevin,

Superb...Just the answer I was hoping for. I have done some housecleaning and for most things Aperture is acceptable now...just barely...so I may be willing to wait.

To understand my need to wait...I believe you are saying FW800 is too slow to use for an external hard drive. Would I be using this for backups only, or are you suggesting that eventually my working library will be on an external drive, and in that case it would be too slow to work from?

You have been very kind with your advice and I thank you for your help.

Rick

Mar 4, 2010 10:09 AM in response to enchantedman

enchantedman wrote:
Superb...Just the answer I was hoping for. [...] I believe you are saying FW800 is too slow to use for an external hard drive. Would I be using this for backups only, or are you suggesting that eventually my working library will be on an external drive, and in that case it would be too slow to work from?


Hi Rick,

OK, just to be clear...if you are willing to wait until Apple ships an iMac with USB 3 support, from what I saw at CES it would be a SMOKIN' external solution.

The speed of eSATA (actually a bit faster theoretically) and the consumer features of USB...for instance, it will supply power to devices over the bus, whereas eSATA is simply a disk interface. If that were on the back of an iMac, given the quad core and 16GB RAM with the same video card as the Mac Pro...wow, no excuses there.

Actually, given the 2TB mechanism in the iMac we are talking about, you could continue to resize the partition for quite some time and do just fine with your library. Assuming you only need 200GB for a boot partition, I would have a 250GB partition dedicated to the Library only, and the rest as a big scratch disk for whatever you like. Run Time Machine on a USB drive and have just one device on the FW800 bus for backups of your Library, something with removable disk trays so you can store copies offsite.

Like this:

http://www.amazon.com/Sans-Digital-MS1CT-Interface-Enclosure/dp/B002AKKDD2/ref=sr_1_1?ie=UTF8&s=electronics&qid=1267726007&sr=8-1

Just buy extra trays and you can have a couple of rotating backups.

Hope this helps,

K.J. Doyle

Mar 4, 2010 12:28 PM in response to Kevin J. Doyle

Kevin, thanks for the suggestions. They were extremely beneficial and now my Aperture 3.01 is singing along. One other item reinforces the value of your workflow of doing a Finder copy as a way to clean up and defragment: as a test, I ran iDefrag on one of the 2TB drives. It worked just fine, but took almost 3 days. The Finder copies take much less time and defragment the giant 500+GB Aperture files just fine. Plus you still need a backup, since any defragmenting is risky, and the Finder copy handles that too.

Mar 4, 2010 2:02 PM in response to Aperture Failure

Aperture Failure wrote:
Kevin, thanks for the suggestions. They were extremely beneficial and now my Aperture 3.01 is singing along. [...]


Thanks for writing, I noticed your name in some posts above, lol...gotta change it to Aperture Singing, now =)

Seriously, I am glad to hear you are enjoying A3 now. You got the essence of what I am doing with the defrag/backup twofer. The key for me is ZERO downtime if one of my main drive pairs fails. No restoring, no rebuilding...just point Aperture at the spare pair and get back to work in seconds. Fix the main pair and reload it from the offsite backup in the background. No lost deadline or client, no excuses.

Full defrag is valuable in special cases involving executable files. Large data files just need to be contiguous on the disk so the heads can read sequentially instead of seeking randomly. Writing a file-by-file copy to an empty disk is gonna do that by default.

Keep the directories in good shape (run DiskWarrior for that) and the files contiguous, and you can grow that setup faster than you can shoot pictures to fill it.

Enjoy A3,

Sincerely,

K.J. Doyle

Mar 5, 2010 7:58 AM in response to dokgibbs

dokgibbs wrote:
Hi Kevin
I do not have A3 yet, but I think this is one of the best posts I have read about Aperture and treating Aperture as the database it is, could solve a lot of problems users have been having over the years and in the future.

Thanks. I will now get A3 with more confidence.

Ken


Hi Ken,

Thanks for the kind words, please let me know how you are faring with A3.

Sincerely,

K.J. Doyle

Mar 5, 2010 9:11 AM in response to Kevin J. Doyle

Kevin,
I just posted this to another thread and it's a setup based somewhat on what you have been talking about.

"I've recently been trying a new workflow which addresses the posted topic.

I used to use referenced files with my AP2 library on the MBP local drive.
When AP3 got here I found some performance issues, SO I decided to try this:

An external 2TB SATA enclosure from OWC with 7200rpm drives.
I got a Sonnet eSATA card (2 port) for my MBP.

I am now trying the AP3 library ON the external drive and Managed (so no more referenced files).
What I have found so far is that my Zoom (Z) and my skin smoothing quick brush are far smoother in responsiveness.
Especially the skin smoothing quick brush.

I'm still testing other variables like image export or even referenced files with the AP3 library on the external and original RAW files on the second 1TB drive in this enclosure.

The OWC 2TB has two 1TB drives in this enclosure.

I will post any results as I move along.

My conclusion so far, without a doubt, is that I'm experiencing better performance
having the Library (Managed) on an external SATA drive."



Anyway, right now I am just debating how to use the other 1TB drive in this OWC enclosure:
1) Store original RAWs and just reference them
2) Use the second drive in the enclosure for Vaults
3) Simply drag/drop the AP3 library from drive A to drive B

Either way I have another external which will be used for backups to go offsite.


Mar 5, 2010 4:21 PM in response to Falcon01

Though I can't remember who said it, "Correlation is not causation," and I believe it applies here.

Yes, further testing is needed. Though I haven't read all of the recent related threads, I have noted an apparent increase in disk fragmentation over time and with an ever-increasing amount of data; by copying your library over to a new disk drive, you have effectively de-fragged it. So referenced vs. managed may not be the (most) significant factor.

It would help to know the degree of fragmentation before the library move; it may be an increasing reality that we need to more closely monitor how upgrades, usage over time, as well as particular Aperture functions, etc., cause fragmentation.

Users of both Aperture and other products are increasingly encountering the digital data-volume-management realities that until recently were mostly in the realm of video data (but we're adding that into the mix, as well).

I, myself, am about to purchase a 2 TB G-Tech G-Safe mirrored drive to double my storage and give myself an easy way to create an offsite backup copy. Recent analysis of my nearly 1 TB of pics shows that, among other frag-factors, the largest segment of contiguous free-space is 1 MB or less even though there is over 10% free-space.

It's a Pro application. That also means we need to be operating at a Pro-level of disk management.

Let us know if you can track the degree of fragmentation, especially as it relates to rapid increases due to upgrades, new features, few vs. many adjustments, etc. You're on the right track; however, this is exactly the kind of volume testing that should've been done prior to release (but when it works, I'm thrilled).

Mar 6, 2010 6:50 AM in response to Kevin J. Doyle

I'd also like to send my thanks to Kevin.

Without his input I think I'd still be sitting in front of Aperture 3 gnashing my teeth. This thread ought to be compulsory reading for all who want to complain about Aperture 3, as I'm sure it would solve many problems. Before the upgrade I had my Mac Pro set up with a master HDD with the OS on it (300GB), a backup with a duplicated OS etc. on a 1TB WD Black HDD, and a further 2 WD Black 1TB drives set up as a mirrored RAID with my main Library on it (460GB of EOS 10D, 5D and 5DMkII RAWs, 35,000 or so, with a few scanned tiffs and psds). Backup was on LaCie drives of 300GB (Time Machine), 500GB and 1TB. The last 2 are D2 Quadras, and all drives were attached via FW800. I've seen all the issues listed on the fora.

Having latched on to this thread, I've:

Bought Ap3: Upgrade Cost £79.00
Bought a WD 2 TB HDD: Cost £245.00
Exported all Ap2 projects as Projects and reimported them into a new Ap3 library, with Faces, Places, automatic preview generation and sharing previews with iLife and iWork all turned off.
Also turned off were the screen saver and energy saver. Somehow having these on killed any lengthy import of multiple projects and turned the Mac's performance into slow-moving treacle.
As Kevin recommends, Reprocess Masters was also turned off and done later.
Let the system do its magic on several other Libraries (220GB, 140GB, 70GB & 40GB), then backed them up and spread them across the drives. I partitioned the 2TB into Boot and Data-only partitions too.

Until I saw Kevin's posts I thought FW800 was OK. Now I've changed my opinion of that.

I've bought one of the NewerTech eSATA PCI Express cards. Cost £65.00. The 2 D2 Quadras are connected via that card, and the non-Quadra could be daisy-chained through one of the Quadras, although it's attached by FireWire instead. I did try the LaCie 2-port card. My advice to anyone on SL is: don't. It relies on drivers, whereas the NewerTech card doesn't. Game over if you aren't bothered about PM (port multiplier) technology yet.

Finally, to add to all this, I've just acquired a 2nd-hand 23" Alu Cinema Display, as one came up through one of the guys in our local Mac User Group. Cost £275.00 plus £21.00 for a Mini DisplayPort to DVI adapter for my Radeon 4870.

This was previously a "Show" display leased from Apple by a company that went bust. Comprehensively calibrated last night with my existing 23" Cinema Display, it shows its lack of use as its brighter than my original monitor.

So Aperture has forced a major tidy-up on me, and I think the whole system is zippier. OK, it's cost me nigh on £700.00, but now that it's running I think it should work great, provided I keep fragmentation down.

Also, Kevin swears by iDefrag and SuperDuper, but don't forget Drive Genius too....

Once again Kevin many thanks for the help.

Mar 6, 2010 7:09 AM in response to Kevin J. Doyle

Like many others chiming in, I'd like to express profound thanks to Kevin for the evident painstaking care and intelligence, and sheer time, devoted to this thread.

I'm learning a lot.

I'm currently at about a 650GB library (representing about 2.5 years' work and assorted personal stuff) on a 1TB drive. I have two Vaults on separate 1TB drives backing it up. I ran the defrag process using Drive Genius (acquired with the drives, from OWC).

Hardware: I'm painted into a corner, but still up and running, as a one-man studio. (Nowhere near the same workplace demands as Kevin and others.) iMac for editing. Voyager Q to swap different 1TB drives. My main challenge: how to expand. 650GB will quickly increase (I'd estimate by half over the next 6 months).

Currently I separate my system etc. and photo data by holding the Ap3 library on the 1TB drive, daisy chained with the VoyagerQ, via Firewire 800. I did a quick test yesterday, building a new library back on the internal drive of the iMac, doing a white balance on a dozen pictures (using "lift and stamp adjustments"), which process took 68s. Then I repeated same via the FW800 library (the same dozen pictures, etc.), and it took the same time. A small "pilot" test. But enough for me not to rush to put my Ap3 library back inside my iMac.

So I'm looking at another 6 to 12 month work around: buy 3 x 2TB drives, and use them for working library, copy of same--to take advantage of defragging via finder copying--and a vault. Or an enclosure with 4 x 1TB drives, Raid 0 striping to take advantage of speed increase.

Or I'm looking at (dreaming about) an upgrade to a MacPro, or a MacBook Pro. So many variables. So many moving parts. What to do? Too difficult for anyone to give me an easy answer!

That brings me to another and I hope less ambitious question for this thread: how do I read Activity Monitor to measure rate of data transfer (i.e. how many MB/s the Ap3 via the FW800 connected library manages to process, etc.)? Alternatively, how should I be measuring performance speed?

(Other than that, I'm curious if anyone else has had problems in Ap3 incorporating video clips (from Canon 7D) in slideshows? That's probably a topic for a separate thread, however.)

One more thing: I would like to point readers towards the excellent pages by Lloyd Chambers: http://macperformanceguide.com/index.html

Thanks again, Kevin.

Patrick Snook

Mar 6, 2010 10:50 AM in response to Steven Jefferson1

Steven Jefferson1 wrote:
I'd also like to send my thanks to Kevin. [...] I've bought one of the NewerTech eSATA PCI Express cards. [...] Also, Kevin swears by iDefrag and Superduper, but don't forget Drive Genius too....


Hi Steven,

Thanks for your kind words, makes this all worth it when more folks are getting their performance back and enjoying A3.

I wanted to add a couple of suggestions if I may...

First, the Newer card, if paired with some SATA 3 6Gb/s mechanisms like the WD1002FAEX (1TB, 64MB cache) configured in a RAID 0 striped pair, is the fastest 2TB of non-enterprise storage you can get today.

The only drawback of the Newer card is that it cannot handle PM enclosures, and with all the backups and offsite swaps I do using the 5-bay arrays...PM is important. Also, I have not been able to verify whether the Newer card offers full 64-bit boot operation. I should check with Newer to verify, as all elements in the storage subsystem need to be operating in full 64-bit mode (not just tolerating it while actually running 32) to see max performance.

My personal choice, although I have not tried them all of course, is the SeriTek/2ME4-E by FirmTek.
It is a Snow Leopard-compatible, full 64-bit driver, 4-port eSATA host adapter with port multiplier compatibility for PCI Express-equipped Apple Macintosh computers, for $197. This card gives you lots of options for drive enclosures.
http://eshop.macsales.com/item/Firmtek/SATA2ME4E/
You can run a discrete 4-channel, 4-bay array for maximum single-drive performance, using all 4 ports and cables to connect...
OR...you can run up to four 5-bay port-multiplied arrays, for a total of 20 disks and a combined possible throughput of 700MB/sec, enough to run 4 streams of uncompressed HD video...
OR...since the external removable drives in this design are for offsite backups and Vaults, one 5-bay array is the simplest and cheapest solution at this point, and my personal choice.

Once again, as others have misinterpreted this: I am not advocating defragging with a defrag program...that takes too long. Copying a managed library onto an empty, freshly-formatted disk that is permanently dedicated to only this one Library defrags it AND makes a full backup...a TWOFER, if you will. ALSO MAKE CERTAIN that you have shut off Spotlight indexing for that drive, as well as Time Machine on that volume, or you will be bleeding performance and increasing fragmentation all day long.
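
A sketch of what that quieting looks like from Terminal. The volume path is hypothetical, and DRY_RUN=1 just prints the commands instead of running them, since mdutil needs a real Mac volume and usually sudo:

```shell
#!/bin/sh
# Quiet a dedicated library volume so background services stop touching it.
# DRY_RUN=1 prints the commands instead of executing them; the volume
# path below is a made-up example.
quiet_volume() {
    vol="$1"
    run() { if [ "${DRY_RUN:-0}" = "1" ]; then echo "$@"; else "$@"; fi; }
    run mdutil -i off "$vol"                 # turn off Spotlight indexing
    run touch "$vol/.metadata_never_index"   # marker file Spotlight honors
    # Time Machine: exclude the volume under System Preferences > Time
    # Machine > Options (Snow Leopard has no command-line switch for this).
}
DRY_RUN=1
quiet_volume "/Volumes/ApertureLib"
```

Run it for real (without DRY_RUN, with sudo) once per dedicated library volume.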

I originally suggested iDefrag because the free demo will let you view fragments before and after you make the copy I described above. While it is a very well-written program, and I would personally pay for it because I am using it to do real work (even if the free demo would suffice), I would only advocate using the program to optimize a disk containing executable code, like the boot disk. A disk with simply data and a directory, like an Aperture library, will do fine with a copy "defrag", and I follow it up with a run of DiskWarrior for some extra directory performance.

Sincerely,

K.J. Doyle

Mar 6, 2010 12:40 PM in response to Patrick Snook

Patrick Snook wrote:
Like many others chiming in, I'd like to express profound thanks to Kevin for the evident painstaking care and intelligence, and sheer time, devoted to this thread. [...] I'm currently at about 650GB library (representing about 2.5 years' work and assorted personal stuff) on a 1TB drive. [...] Or I'm looking at (dreaming about) an upgrade to a MacPro, or a MacBook Pro. [...] How do I read Activity Monitor to measure rate of data transfer (i.e. how many MB/s the Ap3 via the FW800 connected library manages to process, etc.)? Alternatively, how should I be measuring performance speed?

Patrick Snook


Hi Patrick,

Thanks for your support and compliments.

Looks like you have an expansion problem: you are very close to exceeding the safe zone of 70% full with 650GB on a 1TB drive, so that is job one. I would suggest getting another 1TB drive at a minimum and setting up two RAID 0 stripes, 2TB each. Then operate between them, with an offsite backup as copy #3. I use SoftRAID www.softraid.com to make the pairs, but Apple's RAID in Disk Utility is OK too, just with less tuning and fewer options.
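As a sketch of the Apple RAID route: the stripe can be inspected and created from the command line with diskutil. The set name "ApertureWork" and the disk identifiers disk2/disk3 below are hypothetical examples; the creation command is left commented because it destroys data on the member disks.

```shell
# Non-destructive check of existing AppleRAID sets (macOS only; guarded
# so this is a harmless no-op on other systems).
if command -v diskutil >/dev/null 2>&1; then
  RAID_INFO=$(diskutil appleRAID list)
else
  RAID_INFO="diskutil not found (macOS only); skipping"
fi
echo "$RAID_INFO"

# Creating the stripe itself DESTROYS all data on the member disks.
# "ApertureWork", disk2 and disk3 are examples only -- confirm your real
# disk identifiers first with `diskutil list`:
#   diskutil appleRAID create stripe ApertureWork JHFS+ disk2 disk3
```

Disk Utility's RAID tab does the same thing through the GUI; the command line just makes it repeatable.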

Vaults are fine and I use them as secondary backup, but they require rebuilding to use as a library, and in the event of your primary library media failing, that rebuilding means downtime. Having a secondary full Library as backup on the same storage config as your primary means you are back working immediately; simply switch Libraries in A3. Copying between them nightly and using the opposite Library in the next session ensures you are always operating from a maximally defragged data disk.

What are the exact specs on your MBP? Model ID (e.g. MacBookPro6,1), RAM, drive, VRAM, etc.
Since all but the latest non-17" MBPs can run an eSATA host, like a Sonnet Tempo Pro, this may be your best bet...sending the iMac out to pasture for now.

I believe in a year or so iMacs (actually all Macs) will have a USB 3 (4.8Gb/sec) interface, and that will be great, but until then for folks with big data, FW800 is not a professional solution.

"That brings me to another and I hope less ambitious question for this thread: how do I read Activity Monitor to measure rate of data transfer (i.e. how many MB/s the Ap3 via the FW800 connected library manages to process, etc.)? Alternatively, how should I be measuring performance speed?"

I plan on doing a post on benchmark software and performance monitors, but until then: just open Activity Monitor, select the Disk Activity tab, and look at Data read/sec and Data written/sec for actual throughput counts from the currently operating application(s) or system tasks. This will be real-world, and a lower number than what a dedicated benchmark app will report.
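For a quick, rough number without any extra software, you can also time a large sequential write from the shell. This is a sketch only: the test file path is an example, and the result reflects one sequential write, not Aperture's mixed workload.

```shell
# Rough sequential-write throughput check. Point TESTFILE at a path on
# the volume you want to measure (the /tmp default is just an example).
TESTFILE="${TESTFILE:-/tmp/throughput_test.bin}"
START=$(date +%s)
# Write 256 x 1MB blocks of zeros, then sync so the data actually lands.
dd if=/dev/zero of="$TESTFILE" bs=1048576 count=256 2>/dev/null
sync
END=$(date +%s)
SECS=$((END - START))
[ "$SECS" -eq 0 ] && SECS=1   # avoid divide-by-zero on very fast disks
echo "Wrote 256 MB in ${SECS}s (~$((256 / SECS)) MB/s)"
rm -f "$TESTFILE"
```

Run it against the internal drive, then against the FW800 volume, and compare the MB/s figures.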

I also suggest getting the atMonitor app, free donationware; I use it for permanent monitoring.
http://www.atpurpose.com/atMonitor/
Once you realize what it is doing you will see what an awesome app this is. Invest some reading time here; it will pay off.

For measuring performance I use three apps:

the DiskTester suite from Lloyd Chambers ($26.95)
http://macperformanceguide.com/Software-DiskTester.html

the SpeedTools Test Suite for Mac from Intech ($14.95)
http://www.SpeedTools.com/TestSuite.html

and the AJA System Test (free download)
http://www.aja.com/ajashare/AJASystem_Testv601.zip

I would suggest starting with the Intech suite, as it requires very little explanation to get working; the others need a bit of help to interpret.

What do you do with all this data...??

To begin with, general comparison of device/bus/interface performance. Test your internal drive vs FW, or USB, or eSATA. Keep track of the data from each.

Also, it is critically important to test and burn in new drives before using them. Not all drives are good out of the box, and there are many production failures that don't fail per se; they just underperform their specs all their life. All disk manufacturers are happy to replace such drives when identified, but sadly most people never identify them because they do not test. Doing a test and burn-in for 48 hours and looking at the drive's performance before trusting it with your data is well worth the effort, ESPECIALLY if it is going to be a member of a striped pair or array, where the slowest drive determines overall performance.
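A minimal burn-in can be sketched in the shell: repeatedly write a random-data file, read it back, and verify the checksum. The VOLUME path, pass count, and file size below are scaled-down examples; for a real 48-hour burn-in you would raise PASSES and FILE_MB dramatically and watch the timing of each pass for slowdowns.

```shell
# Minimal burn-in sketch (example parameters; scale up for real use).
VOLUME="${VOLUME:-/tmp}"   # point this at the new drive's mount point
PASSES=2
FILE_MB=32
for pass in $(seq 1 "$PASSES"); do
  # Write FILE_MB megabytes of random data, then flush to disk.
  dd if=/dev/urandom of="$VOLUME/burnin.bin" bs=1048576 count="$FILE_MB" 2>/dev/null
  sync
  # Read the file back twice and compare checksums; a mismatch on a
  # freshly written file is a serious red flag for the drive.
  SUM1=$(cksum "$VOLUME/burnin.bin" | awk '{print $1}')
  SUM2=$(cksum "$VOLUME/burnin.bin" | awk '{print $1}')
  [ "$SUM1" = "$SUM2" ] || { echo "pass $pass: read mismatch"; exit 1; }
  echo "pass $pass OK"
done
rm -f "$VOLUME/burnin.bin"
```

Note this catches gross faults only; the dedicated tools above (DiskTester, SpeedTools) do far more thorough surface and performance sweeps.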

I am not going to go into further detail here, that will come after more have the software and learned its basic use. Please feel free to ask specific questions after you get the software and start testing and monitoring.

Sincerely,


K.J. Doyle

Mar 6, 2010 2:14 PM in response to Kevin J. Doyle

Thanks for the suggestion about the NewerTech Card and Spotlight.
I'm not a pro just a very enthusiastic amateur so the seritek card is way beyond what I need as I've just got two LaCie D2 Quadras and one FW800 and currently have no need for PM Multiple drives. The attraction about the NewerTech Card is that there are no drivers needed so I assume 64 bit operation is a given. Just switched to 64bit & still working, and I can't see any 32 bit processes suggesting otherwise in Activity monitor.

I currently have enough space across the internal and external drives to cope with my anticipated usage for a while without exceeding 70% capacity.

For the future I plan on having a library solely for importing my initial selections which I will export as complete projects that can be imported into the "final" library to minimise fragmentation, though I plan on following your copy suggestions.

Mar 6, 2010 4:36 PM in response to Kevin J. Doyle

What a great post! Kevin, and all the others who have added to it, thanks.

Now I am going to take it to a simple level with 2 questions. I asked earlier but didn't get a response so I want to try again.

1. It was stated: "Make a Finder Copy" Does that mean drag and drop, or is an application like SuperDuper involved?

2. It was stated: "On a freshly reformatted drive" Does that mean use Disk Utility and erase and reformat? Quick erase? Repartition? What is the best method for this execution?

Thanks,
Jerry

Mar 6, 2010 8:59 PM in response to Steven Jefferson1

Steven Jefferson1 wrote:
Thanks for the suggestion about the NewerTech Card and Spotlight.
I'm not a pro just a very enthusiastic amateur so the seritek card is way beyond what I need as I've just got two LaCie D2 Quadras and one FW800 and currently have no need for PM Multiple drives. The attraction about the NewerTech Card is that there are no drivers needed so I assume 64 bit operation is a given. Just switched to 64bit & still working, and I can't see any 32 bit processes suggesting otherwise in Activity monitor.

I currently have enough space across the internal and external drives to cope with my anticipated usage for a while without exceeding 70% capacity.

For the future I plan on having a library solely for importing my initial selections which I will export as complete projects that can be imported into the "final" library to minimise fragmentation, though I plan on following your copy suggestions.


Hi Steven,

The reason I am asking about the ability to have multiple drives is the critical offsite backup drive(s), at least one, lol...

As long as your workflow puts a copy of your Library somewhere else and updates that copy on a regular basis, you are fine (which is why I usually talk about having two mechanisms that rotate locations). Bottom line: once you have created an offsite copy, there must never again be a point in time when all copies of your data share the same geographic location.

"The attraction about the NewerTech Card is that there are no drivers needed so I assume 64 bit operation is a given. Just switched to 64bit & still working, and I can't see any 32 bit processes suggesting otherwise in Activity monitor. "

OK, unfortunately you cannot assume 64-bit internal operation is a given. A process, internal or external, can be written and configured so it does not hinder 64-bit operation of the rest of the system while itself only operating in 32-bit mode. It is a confusing distinction, and until recently it was not really important, as most apps were not 64-bit anyway. Since this card has no OS-layer driver, it will be more difficult to verify 64-bit internal operation from the OS layer. I will check with Newer; they should have the information. If it is fully 64-bit, that fact would nowadays be an important selling point for the device; if it is merely compatible, they probably won't mention the processing status unless asked, as it isn't hurting anything, just not operating at maximum possible performance.

"For the future I plan on having a library solely for importing my initial selections which I will export as complete projects that can be imported into the "final" library to minimise fragmentation, though I plan on following your copy suggestions."

OK, what you are describing does not minimize fragmentation; it creates it, just in a slightly different way than using a single library. Any application-level manipulation of the images involves updating the directory, the thumbnails, the previews, the folder structure within the Library, and more. While one way may do it more efficiently than another, it frankly does not matter, as not all fragmentation is bad. Fragmentation is just a common byproduct of the operation of any application, Aperture being no exception. It will not hurt you as long as it is understood and maintained on a regular basis.

Think of fragmentation on a daily basis like preparing meals: pots and pans and dishes all get soiled with use, and as long as they get a regular washup, all is as expected. Conversely, if you don't wash up on a regular basis, lots of bad things happen to the people using and eating off those pots and dishes.

Only an operation like copying the files to a new empty volume reverses the application-generated fragmentation, and does a washup, if you will.

It accomplishes this because file-by-file copying has not the faintest idea what kind of data it is copying, or how the application that created the data uses it. Copying simply walks the source volume's directory, finds a file, gathers all the pieces of that file from the source, then writes it in one contiguous action to the next piece of open contiguous space on the target drive. Since we are using an empty drive for our purpose, there is no mucking about in where the copy operation will store the data; it goes down in one long string until all is copied.
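The copy "defrag" itself can be sketched as below. A throwaway directory stands in for the library here so the sketch is self-contained; in real use, SRC would be your Aperture library and DST a path on the empty, freshly formatted target volume. On Mac OS X, ditto (or a plain Finder copy) preserves resource forks and metadata; the generic `cp -Rp` is used here so the sketch runs anywhere.

```shell
# Demo of the copy "defrag" with a tiny stand-in "library".
SRC=$(mktemp -d)            # stand-in for the source library
DST=$(mktemp -d)/library_copy   # stand-in for the fresh, empty volume
mkdir -p "$SRC/Masters" "$SRC/Previews"
echo master1  > "$SRC/Masters/IMG_0001.CR2"
echo preview1 > "$SRC/Previews/IMG_0001.jpg"

# A plain recursive copy writes each file contiguously onto the empty
# target, which is the whole point of the exercise.
cp -Rp "$SRC" "$DST"

# Sanity check before trusting the copy: file counts must match.
SRC_COUNT=$(find "$SRC" -type f | wc -l)
DST_COUNT=$(find "$DST" -type f | wc -l)
[ "$SRC_COUNT" -eq "$DST_COUNT" ] && echo "copy verified: $DST_COUNT files"
```

A Finder drag-and-drop of the library package does exactly the same walk-and-copy; the script just adds the count check.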

For success we need the purest, simplest copy operation. If a background process like Spotlight is running, it will interfere with our purposes and fragment our copy operations, so we must make sure nothing else is running on our target.

I will caution you that in Apple's infinite wisdom, or at least as regards Spotlight, they do allow you to exclude drives you do not want indexed by adding them in the Privacy tab of the Spotlight System Preference. Unfortunately, if you erase a volume, re-partition it, or change its global status in the drive table, Spotlight decides it is OK to go ahead and index it again...grrrr! It took me quite a bit of time to figure that out, because Spotlight was messing up my solid defrag copy and I could not figure out why.

Just verify on a regular basis that your target drive is excluded in Spotlight's Privacy list and you will be OK.

The superb Cocktail utility, http://www.maintain.se/cocktail/index.php, also has a function to disable Spotlight indexing and erase the index Spotlight had made on a volume basis.
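The same Spotlight exclusion can be scripted with mdutil on Mac OS X, which survives a re-erase better than the Privacy list because you can re-run it after every reformat. The volume path below is a hypothetical example, and the block is guarded so it is a no-op on systems without mdutil.

```shell
# Disable Spotlight indexing on the defrag target and erase any index
# it has already built. VOL is an example path; set it to your target.
VOL="${VOL:-/Volumes/DefragTarget}"
if command -v mdutil >/dev/null 2>&1 && [ -d "$VOL" ]; then
  sudo mdutil -i off "$VOL"   # turn indexing off for this volume
  sudo mdutil -E "$VOL"       # erase the existing Spotlight index
  STATUS="indexing disabled on $VOL"
else
  STATUS="skipped: mdutil or $VOL not present (macOS only)"
fi
echo "$STATUS"
```

Re-running this right after each reformat of the target volume closes the loophole described above.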

Sincerely,

K.J. Doyle
