16 Replies. Latest reply: Feb 14, 2012 6:39 PM by DiploStrat
DiploStrat Level 2 (345 points)

Anyone have any experience using Aperture with an SSD?


I have a 120 GB SSD, running Lion, with all applications and a 13 GB Library on board. There are 175GB of masters on a conventional 7200 RPM HD. Mac Pro 1,1 with 2x2.66 Dual-Core Xeon and 13 GB of RAM.


My first experiences indicate that the speed improvements are minimal at best.


How are you set up and what have been your results?

  • Shuttleworth125 Level 2 (415 points)

    To get a meaningful comparison you need someone who upgraded in place, like you did, rather than bought a new computer with an SSD in it. We went from a 2.8 GHz C2D iMac with 4 GB RAM to the latest i7 with SSD + 2 TB HDD, a 2 GB video card, and 16 GB RAM; the difference was always going to be night and day!! Needless to say there aren't really any performance issues, though we still get Aperture locking up occasionally (it doesn't like TM backups for some reason).


    One thing I've not checked, though, is which library setup is best. All libraries are on the SSD, and all masters are referenced on the internal HDD except current jobs, which stay managed until complete. I just set it up this way as it seemed to make best use of the resources, but I've not tested any real-world difference.

  • DiploStrat Level 2 (345 points)

    You may have hit on a very important point. For a pro or serious amateur, or anyone pressed for time, the following might work well:


    -- Shoot pictures.


    (Optional: Make a deep archive copy of all camera cards. Some use CD/DVD, others a HD.)


    -- Import ALL images into Aperture as Managed masters. This assumes that they are now on your SSD.


    -- Cull/adjust/print/burn CD/etc.


    This should have blinding speed, as the master reads will now be from the SSD. When the work is done, that is, deadlines (professional or familial) met, then relocate the masters to your masters folder on the HD. (That is, convert them to Referenced.)


    This way you would have the advantage of the full speed of the SSD during the most intense part of the workflow. But then the keepers are migrated to the HD, keeping your SSD empty. This also has the benefit of allowing Aperture to clean up your masters, rename them, or whatever. (Indeed, Rob Boyer recommended exactly this workflow some time back, not specifically because he wanted Referenced masters, but because Aperture was such a good clean-up tool. Better to wash the files through Aperture and then export them, neatly organized, than to leave them scattered across your HD.)


    Hmmm, methinks I will reimport some large TIFFs and see what speeds I can achieve.


    Thanks for the good idea!

  • Kirby Krieger Level 6 (12,510 points)


    It's a great workflow -- if your Library fits on your SSD.  Mine and several I've set up don't.



    I experimented with Referenced Files on the system drive, on FW400, and on FW800.  In practice, I couldn't tell where the Masters were.  That was with a 7200 rpm system drive.  On that kind of set-up, performance bottlenecks seem to be in reading/writing the Library itself, or just in CPU speed -- and I tend to think as much the latter as the former.

  • Shuttleworth125 Level 2 (415 points)

    The workflow you describe is exactly how we work. The "optional" masters backup is done by Aperture at import to an external (FW800) drive.


    We have a separate library for personal/family work, and that is referenced all the way from import onwards.

  • DiploStrat Level 2 (345 points)

    Since my last post, I pulled (consolidated) some 40 masters, old scanned slides, about 100 MB each, back into Aperture on the SSD.


    Guess what. Things were no faster; there was still a one-to-two-second pause to load and perhaps as many seconds again to zoom to full resolution. To be sure, I went back and tried the same experiments with the same sort of TIFF where the masters were still on the HD. The lags were a bit longer, but not dramatically so.
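    (For anyone who wants to repeat this outside Aperture, a crude sanity check is to time a sequential read of a large file on each volume. A minimal sketch; the paths are placeholders, and note that the OS disk cache will flatter any repeat run of the same file:)

```python
import os
import time

def read_throughput_mb_s(path, block_size=1 << 20):
    """Sequentially read the file at `path` and return throughput in MB/s."""
    size_mb = os.path.getsize(path) / (1 << 20)
    start = time.perf_counter()
    with open(path, "rb") as f:
        while f.read(block_size):
            pass
    return size_mb / (time.perf_counter() - start)

# Hypothetical paths -- one master on the SSD, the same file on the HD:
# print(read_throughput_mb_s("/Users/me/Pictures/Masters/slide.tif"))
# print(read_throughput_mb_s("/Volumes/HD/Masters/slide.tif"))
```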


    The major edits were sharpening on/off at different levels, and brushing blur and skin smoothing into pictures of the sky where emulsion cracks and roller scars were very evident. With 13 GB of RAM, brushing blur or smoothing with a Wacom tablet was virtually instantaneous at full resolution. The pauses came when switching to a different quadrant of the image.


    Based on this, I will probably keep the SSD, as it is a bit faster overall. (Really nice when doing long reads like software updates or permissions repair.) But I will go out on a limb and say that for most users, a bigger hard drive to allow you to stay with managed masters and, more importantly, more RAM to reduce page outs may be a better investment for real-world Aperture speed.


    Usual caveats, my experience to date, YMMV, etc.


    So, how about this for a thesis. For Aperture speed:


    -- Keep your system/library disk lightly loaded, below, say, 70%.


    -- Add as much RAM as you can.


    Any thoughts or comments? 
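    (The first point is easy to keep an eye on with a few lines of script; a minimal sketch, where the 70% figure is just the rule of thumb above, not anything Aperture enforces:)

```python
import shutil

def disk_used_percent(path="/"):
    """Percentage of the volume containing `path` that is in use."""
    usage = shutil.disk_usage(path)
    return 100.0 * usage.used / usage.total

# Warn once the system/library volume passes the suggested 70% mark.
if disk_used_percent("/") > 70.0:
    print("Library volume is over 70% full -- consider relocating masters.")
```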

  • 1 Open Loop Level 2 (350 points)

    That makes sense, especially if most of your images are photos, not video, and even if they are RAW image files. It shouldn't take more than a second to open a single Master. But browsing in Aperture needs a fast processor, a good GPU, and a fast drive holding the Library (Versions, Previews, and Thumbnails).

  • Kirby Krieger Level 6 (12,510 points)

    DiploStrat wrote:


    So, how about this for a thesis. For Aperture speed:


    -- Keep your system/library disk lightly loaded, below, say, 70%.


    -- Add as much RAM as you can.


    Any thoughts or comments? 

    I would add: get the fastest processor you can afford.  From the systems I've looked at, once 8 GB of RAM is on board, money is probably better spent on CPU/GPU than on additional RAM.  (This is for stills.  I can't comment on using Aperture for video.)

  • DiploStrat Level 2 (345 points)



    Granted, all of this is anecdotal and not scientific, but the results are interesting. There is no way I can afford a new Mac, and certainly not a new Mac Pro. It does appear, however, that with a 100 MB file I can reach the limits of CPU and bus speed. (The GPU was updated to a 5770, so I am guessing that it is reasonably healthy, as Apple still offers it with new Mac Pros.)


    I am going to make some new Versions, one each of a TIFF on the SSD and on the HD, and start tearing them up with Activity Monitor open. I was pleased to notice that, when doing a mass re-render of all my previews, all four cores showed even activity. That would lead me to believe that, with Lion at least, the work is getting spread around all of the hardware.


    And lest it sound like I am whining or trashing the SSD:


    -- Aperture launch to first screen displayed is under four wall-clock seconds.


    -- ALL scrolling, opening projects, shifting views, etc. is essentially instantaneous. Nothing takes longer than a second.


    -- Interestingly, slamming sliders back and forth (e.g. exposure, shadows, etc.) displays results on screen faster than the trailing "Loading" and "Processing" wheels, which seem to lag by about one second. Real-world adjustments, made by nudging the sliders and considering the effect, are instantaneous.


    All of this on a healthy, but old machine. Sooooo, I would postulate:


    -- Get yer RAM up to the 8 GB region. (At some point there may be diminishing returns unless you really like to keep a lot of programs running. All of the "tests" cited above were with Safari and Mail running.)


    -- Really old Mac Pros will benefit from a newer video card. Ironically, I noticed this when I was running on a 4 GB RAM Mini -- it was actually faster than the Mac Pro with 5 GB of RAM and the original video card.


    -- After that, you are more likely to be bound by processor speed and bus throughput than by HD read/write time. There might be some benefit to putting all of the masters on a dedicated drive, but, using iDefrag, I note that while the masters are fragmented after reading in a full CF card, at worst the files are usually only two extents, and iDefrag's "Quick-On Line" setting will fix that very quickly. It does a similarly good job of cleaning up the Thumbs and Previews.


    Of course, common sense reminds us that none of this means anything if your OS is messed up or your HD is too full.


    Bottom line: SSDs are expensive and it can take a bit of work to shoehorn your OS, apps, and Aperture Library onto one, but the overall increase in system response is noticeable and nice. Frankly, the hardest part was converting to Referenced Masters and fighting those parts of OS X which are designed to guide you into a nice, neat, single-disk system. (I am still learning how to clean up the Sidebar.)


    As always, comments welcome.

  • 1 Open Loop Level 2 (350 points)

    Notice in this benchmark, the slower-clocked quad-core i7 MBP ran the Aperture test faster than the dual-core i7 MBP.


    Also interesting is the fact that the 3.4 GHz quad-core iMac came out the same as the 2.5 GHz quad-core MBP. That could be because they might have the same GPU.


    Although I'm not sure how representative this particular Aperture test is of the average workflow, I would assume that other tasks would be improved by the iMac's faster CPU, as well as by the addition of an SSD if you're running your Library from there (referenced).


  • Kirby Krieger Level 6 (12,510 points)

    DiploStrat wrote:

    -- Interestingly, slamming sliders back and forth (e.g. exposure, shadows, etc.) displays results on screen faster than the trailing "Loading" and "Processing" wheels, which seem to lag by about one second.


    I assume (that is to say, the following is unsupported conjecture) that Aperture always acts first on the file (albeit a temporary file) being displayed, and then on the actual full-res Master.  I take this as another example of how cleverly Aperture is engineered (which is, I suspect, Apple's true interest in the program).  Somewhat similarly, I'm not at all surprised to find out (in this thread) that Aperture is well-engineered to work with multiple CPU cores.  (I have not, myself, seen any GPU bottlenecks {using iStat, fwiw}.)


    It took me a while to get used to the idea that unless I am working at 100%, I am working with a _representative_ of the Master, and not the Master itself.  The results of adjustments are shown for that representative file first, and only later for the full-res Master.


    A simple (and well-known) example is the Edge Sharpening adjustments.  On a 27" monitor, with a 4,000 x 6,000 px Image shown, the default Edge Sharpening adjustments barely register (if at all).  At 100% I often have halos.
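    A back-of-envelope calculation shows why: a fit-to-window view of a 6,000 px wide image in a window roughly 2,560 px wide (an assumed 27-inch display width) is drawn at well under half size, so one-pixel halos mostly vanish until you zoom to 100%. A quick sketch:

```python
def displayed_scale(image_px, window_px):
    """Fraction of full size at which a fit-to-window image is drawn."""
    return window_px / image_px

# A 6,000 px wide image in a ~2,560 px wide window is shown at about
# 43% of full size, so single-pixel sharpening effects barely register.
print(round(displayed_scale(6000, 2560), 2))  # 0.43
```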


    If you really want to max the CPU/GPU, hold "{Option}" while clicking and dragging over the Image at 100% with the White Balance eye-dropper active.

  • 1 Open Loop Level 2 (350 points)

    Kirby Krieger wrote:

    If you really want to max the CPU/GPU, hold "{Option}" while clicking and dragging over the Image at 100% with the White Balance eye-dropper active.


    Why is that? Is that something that stresses both?

  • Kirby Krieger Level 6 (12,510 points)

    Sorry -- I didn't mean it to be that specific.  It's simply an operation that forces continuous rewrites of every pixel quickly, with the concomitant system stress.

  • behnamfromgatineau Level 1 (0 points)

    Just something to share with you:

    When I bought my iMac i7 with SSD + 2 TB HDD and 16 GB, I knew from the beginning that I didn't want to use the SSD for storing my files. I also didn't want to spend too much time managing my files in terms of relocating them and freeing up room on my SSD. It took me a while to discover this, but here is what I did:

    First, I dragged my user folder from the SSD to the conventional HD.

    Second, I opened Accounts in System Preferences, unlocked it, right-clicked on my account (as Admin), and clicked on 'Advanced options...'

    Third, I didn't change anything there except that in the 'Home directory' box I clicked the 'Choose...' button, selected the user folder I had just moved to the conventional HD, then clicked OK and closed.


    This modification means that any files belonging to me will be located on the conventional HD from the get-go. The default Aperture library (or the user folder of any application) will be placed on the other HD, and the SSD is used only for the OS, the application itself, and the temporary files it creates while processing an item. I don't think reading the original files from the other HD causes any substantial slowdown. Aperture takes its time to produce the preview and all related files for a newly imported RAW image, but those are all done at the full speed of the SSD, where the application and its temporary files reside.
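    For what it's worth, the same change can be made from Terminal. A hedged sketch of the equivalent commands, where 'yourname' and the volume name 'Storage' are placeholders; verify the copy and log out and back in before deleting anything:

```shell
# 1. Copy the home folder to the conventional HD, preserving metadata.
sudo ditto /Users/yourname /Volumes/Storage/Users/yourname

# 2. Point the account at the new location (the command-line equivalent
#    of the Accounts > Advanced options... step described above).
sudo dscl . -change /Users/yourname NFSHomeDirectory \
    /Users/yourname /Volumes/Storage/Users/yourname

# 3. After logging out and back in, confirm the new home directory.
dscl . -read /Users/yourname NFSHomeDirectory
```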

    It also seems that lowering the 'Photo Preview Quality' in Aperture's Preferences from its default made Aperture work faster.

  • DiploStrat Level 2 (345 points)

    Brilliant! Thank you very much!


    It is ironic that while UNIX simply loves to spread stuff over multiple disks and even multiple computers, OS X is very much set up to guide a user toward a single-HD system. It is taking me a bit of a slog to sort through exactly this kind of stuff. (To be fair, the last time I had to get under the hood of a UNIX box I was working with three outstanding programmers, and THEY did all of that nasty command-line stuff!)


    Thank you again! 
