Aperture 3 Library Fragmentation causes problems

Hi All,

I have read a number of the complaints about A3 conversion, and I think this may help solve some of them.

We have 6 systems with Libraries from 500GB to 1.2TB here and converted them all last weekend without serious problems, and all are enjoying superb increases in performance as I write this.

We did not have the conversion problems others have suffered, and I think I know why:

1) Disable Faces - Saves time in conversion, can be done afterwards.

2) DO NOT reprocess masters in the conversion; it can be done as needed once you are running. PLUS - WARNING - it will adversely affect many v2 images, changing exposure and colors, etc. PLUS, the conversion will go MUCH faster, as you have given it less to do. Check out the Apple Knowledgebase piece: "Aperture 3: Discussion about differences in exposure levels with Aperture 3 RAW decoding"
http://support.apple.com/kb/HT3948

3) Use an EMPTY freshly formatted volume that is at least double the size needed.

4) When conversion is complete, COPY your converted library to another new freshly-formatted volume with at least 40% free space before you use it.

Why?.....

The normal operation of Aperture has always resulted in some disk fragmentation, the larger the libraries the worse the problem. Always working from a copy that has just been made eliminates the great majority of the fragmentation, and ensures it does not become a performance issue.
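For the script-inclined, step 4 above can be sketched in shell. The volume and library names below are invented for illustration, not taken from anyone's actual setup, and the erase line DESTROYS everything on the named volume:

```shell
#!/bin/sh
# Sketch of step 4: erase the target, then make a full copy of the
# converted library onto the empty volume. Names are examples only.

copy_library() {
    src="$1"; dst="$2"
    # A plain recursive copy is the script equivalent of a Finder
    # drag-and-drop. On macOS, `ditto` is preferable because it also
    # preserves resource forks and extended attributes:
    #   ditto "$src" "$dst"
    cp -Rp "$src" "$dst"
}

# Erase the target first so the copy lands on empty, contiguous space
# (macOS only -- wipes the volume named "Fresh"):
#   diskutil eraseVolume "Journaled HFS+" Fresh /Volumes/Fresh

# copy_library "/Volumes/Scratch/Aperture Library.aplibrary" \
#              "/Volumes/Fresh/Aperture Library.aplibrary"
```

The copy itself is the whole trick: writing every file out fresh onto empty space is what leaves the library unfragmented.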

Conversely, if someone just keeps using the same Library, it will just get more fragmented, and slower and slower until real problems develop. If they are also running out of disk space, then the fragments themselves become fragmented as the file system feverishly tries to fit all your data into a shrinking space. All this eats time, and given the size and number of files to deal with, it has become a problem that NO AMOUNT of fast hardware will fix entirely. We need to deal with some database best practices...but trust me, it works, and A3 simply screams on our systems here, 2-3 times as fast as A2.

To test this premise that conversion of a large library is going to result in a tremendously fragmented mess on the volume, I reconverted my last Aperture 2 library today from last week's backup. It is 500GB, and it completed conversion to A3 as described above in about 4.5 hours. I then looked at the volume with iDefrag, and over 50% of the file was fragmented. My original A2 file was not fragmented, of course, as it had come from a backup. I then opened the converted file in A3, and it began to process previews VERY slowly, with really lousy disk read/written numbers reported in Activity Monitor.

I stopped the process, quit Aperture 3 and made a copy of the file to a fresh volume. I immediately noted the backup was going VERY slowly...I had not noticed this last week, as all our first backups were done overnight. As a benchmark, an unfragmented 500GB file will copy in a bit over 2.5 hours...this backup took over 6 hours! It had a lot of file fragments to assemble to put the copy together, and that all takes time and resources.

When the copy was complete, I opened it again in A3, and the preview processing raced right along. Even before it was complete, the data was snappy and available...when the preview finished, it was a screamer....as our systems are today.

We use iDefrag, by Coriolis Systems, to look at the fragmentation on these large files. It is a $30USD utility, and invaluable in providing some reality into why your system is slow. I invite you to view your Library and see where your performance has gone. http://www.coriolis-systems.com/iDefrag.php

Apple tries to make all of their programs look simple and carefree to use...fine...I submit that when a database is hundreds of gigs, NOTHING is simple and carefree. Performance requires understanding and some simple maintenance. Would you buy a Porsche and not change the oil, run it in a small parking lot, and then complain to Porsche about performance? Kinda the same thing...

For those who are interested...

Our basic daily operation has the Aperture library running from a volume that consists of a RAID 0 pair in slots 1 and 2 of a 5-bay eSATA array. There is an identical RAID 0 pair in slots 3 and 4 that carries a backup of the first one. A rotating single backup mechanism in bay 5 is used for daily offsite rotation.

Every night a full Finder copy is made from the day's working RAID 0 pair to the other pair, and to the offsite disk.

The next morning, the operator will verify the backups have occurred without incident, swap the offsite mechanism out and then start the day's work ON THE OTHER RAID 0 PAIR that was the backup of yesterday's data. Why?.....The normal operation of Aperture has always resulted in some disk fragmentation, the larger the libraries the worse the problem. Always working from a copy that has just been made eliminates the great majority of the fragmentation, and ensures it does not become a performance issue. We actually use 3 pairs to have redundant backups, but I won't confuse the basic issue here with that discussion.
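The morning swap amounts to a tiny bit of bookkeeping: work from whichever pair received last night's copy. A minimal sketch, assuming the two pairs mount as "PairA" and "PairB" (names invented for illustration, not from our actual scripts):

```shell
#!/bin/sh
# Sketch of the alternating-pair rule: given the volume you worked
# from yesterday, return the one to work from today.

other_pair() {
    case "$1" in
        PairA) echo PairB ;;
        PairB) echo PairA ;;
        *) echo "unknown volume: $1" >&2; return 1 ;;
    esac
}

# Example bookkeeping (hypothetical state file):
#   today=$(other_pair "$(cat ~/.last_pair)")
#   echo "$today" > ~/.last_pair
#   echo "Work from /Volumes/$today today"
```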

In contrast to our methods, if someone just keeps using the same Library every day, it will just get more fragmented, gradually becoming slower and slower until real problems develop. If on top of that they are running out of disk space, then the fragments themselves become fragmented as the file system feverishly tries to fit all your data into a shrinking space. All this eats time, and given the size and number of files to deal with, it has become a problem that NO AMOUNT of fast hardware will fix entirely.

Given the volume of data we digital photographers keep collecting, we need to take responsibility and deal with some database best practices....but trust me it works and A3 simply screams on our systems here 2-3 times as fast as A2.

Will this fix everything, for everyone???

Of course not...the wide variety of machines and configs makes it impossible to predict. I do feel pretty confident I can reliably make A3 run really fast on our systems here...and to note, the 6 workstations I have discussed are 2008 MBPs (4,1) with 6GB RAM and 30" displays, not even Mac Pros, and we are enjoying excellent throughput.

Hope this helps,

Sincerely,


K.J. Doyle

MBP 17" Glossy HiRes 2.6 6GB RAM, NVIDIA 8600 GT Video w 512MB, Mac OS X (10.6.2), 30" Cinema Display and External eSATA RAID for Library

Posted on Feb 19, 2010 7:43 PM

164 replies

Mar 6, 2010 9:46 PM in response to Merged Content 1

Jerry Shankin wrote:
What a great post! Kevin, and all the others who have added to it, thanks.

Now I am going to take it to a simple level with 2 questions. I asked earlier but didn't get a response so I want to try again.

1. It was stated: "Make a Finder Copy" Does that mean drag and drop, or is an application like SuperDuper involved?

2. It was stated: "On a freshly reformatted drive" Does that mean use Disk Utility and erase and reformat? Quick erase? Repartition? What is the best method for this execution?

Thanks,
Jerry


Hi Jerry,

I apologize if your previous questions got lost in the shuffle, just a lot of things going on...ok let's give you a good answer now. =)

"1. It was stated: "Make a Finder Copy" Does that mean drag and drop, or is an application like SuperDuper involved?"

OK, simple answer...you can accomplish the full defragged copy operation I specified without any external applications, PERIOD. Meaning drag and drop to an empty target will work, with certain caveats.

If you choose to do this manually, you are now responsible for verifying the condition of the source file before backup, and for verifying a solid backup was accomplished, 2 things backup apps do for you. I suggested SuperDuper! because it is simple and cheap with great support, but I also use other more complex systems like Synchronize Pro X to automate a number of operations, and log and report by email, etc.

Utility apps just make things easier; they are scriptable, report on results and do other valuable things. They also do the interim Smart Updates, which would be a pain to do manually, the level of effort basically making that operation impractical to do on a regular basis.

A Smart Update is a changes-only backup from source to target. It runs really fast, like coffee-break fast, and we do this 3-4 times a day as a rule, to two targets: the spare RAID 0 pair and the offsite rotating disk. Also...although we do the full defragging copies every night in our workflow, that does not mean you have to.

Maybe full defragging copies done one evening a week (and switching your operation to the copy the next session, of course) would be fine to prevent you ever seeing the effects of bad fragmentation. Use your defrag app to view the status and make that decision yourself.
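A changes-only sync of this kind can be approximated with rsync; this is a stand-in sketch, not SuperDuper!'s actual mechanism, and the paths are examples:

```shell
#!/bin/sh
# Sketch of a "Smart Update"-style changes-only sync using rsync.

smart_update() {
    src="$1"; dst="$2"
    # -a preserves permissions/times; --delete removes files from the
    # target that are gone from the source, keeping the target an
    # exact working copy. Trailing slashes sync directory contents.
    rsync -a --delete "$src"/ "$dst"/
}

# smart_update "/Volumes/PairA/Aperture Library.aplibrary" \
#              "/Volumes/PairB/Aperture Library.aplibrary"
```

Note that a changes-only sync updates the target in place, so it does NOT give you the defrag benefit of the full nightly copy; it is only the quick interim backup.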

"2. It was stated: "On a freshly reformatted drive" Does that mean use Disk Utility and erase and reformat? Quick erase? Repartition? What is the best method for this execution?"

OK...FIRST...before we erase or reformat anything, let's review our rules...

1) you need 3 (or more, at least one being offsite, I hope) copies of your data before any operation of this kind.

2) Ideally, the copies should not be connected to the computer, so that a possible driver-corruption-based event could not affect any more than the one you are dealing with.

3) Clean your target volume. An erase from Disk Utility will do. Other more extreme operations like reformat will trigger Spotlight to start indexing your target again, so beware.

4) VERIFY that stupid Spotlight has not decided on its own to start indexing your target volume again...grrr! Go to the Spotlight System Preferences and make sure the drive is listed in the exclude list under the Privacy tab....or use Cocktail to verify the same thing PLUS erase any indexing that Spotlight may have started. (Index file is invisible in the Finder)

5) Make sure you have no other processes that are accessing the target in the background, like Spotlight does.

6) Make your copy by drag and drop or via utility.

6a) Optional, but I like it...run DiskWarrior on the finished copy to tweak the directory for that last bit of enhanced performance.

7) Operate from the finished copy starting with your next Aperture session, and alternate each time thereafter.
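The Spotlight check in steps 4-5 can be scripted. The portable function below just looks for the Finder-invisible index folder; the commented mdutil lines are the macOS commands for reporting and disabling indexing per volume (the volume name "Fresh" is an example):

```shell
#!/bin/sh
# Sketch of the Spotlight check before copying to a target volume.

spotlight_index_present() {
    # Spotlight keeps its index in a folder that is invisible in the
    # Finder but visible to the shell.
    [ -d "$1/.Spotlight-V100" ]
}

# On macOS, for a volume named "Fresh" (example name):
#   mdutil -s /Volumes/Fresh        # report indexing status
#   mdutil -i off /Volumes/Fresh    # turn indexing off for this volume
#   sudo rm -rf /Volumes/Fresh/.Spotlight-V100   # remove a stale index

# if spotlight_index_present /Volumes/Fresh; then
#     echo "Spotlight has been here -- clean up before copying"
# fi
```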

Hope I got it all this time, let me know if you have other questions.


Sincerely,


K.J. Doyle

Mar 7, 2010 6:49 AM in response to Kevin J. Doyle

Kevin,

Thanks again for all that. I've downloaded the atMonitor. I have been relying on the Activity Monitor's floating cpu-monitor to give me a rough idea on the run of operating load. Gosh, that looks paleolithic in comparison to atMonitor's graphic interface! I have an older copy of disktester, which I could update, and learn how to use, but I've also got a current Drive Genius, which I think offers similar information, and which, I've recently discovered, has excellent technical support. I will get Lloyd Chambers' suite of testing software too. I've a lot of reading to do. . . .

My MacBook (5,2), which I mainly use as my portfolio when I meet with potential clients, and as my large "iPhone" (with Mail, Safari, Skype, etc. and no contract!), does not include an express card slot (it's not a MacBook Pro). I chose this low-end device deliberately because, among other things, I wanted to have a common consumer-level computer, so that I can be sensitive to the computing power, and monitor quality, of my customers. When I hand them a disk of my pictures, I want them to be able to enjoy them with minimum fuss and no inconvenience, and the best way for me to check that everything looks more or less the way it should--and quickly--is to use the same computer that they might have. Or so goes my thinking. . . .

I'm facing that familiar quandary: upgrade now, or limp along, waiting a little longer (guessing how long that might be) for more bang-for-the-buck with the next significant hardware advance--i.e. USB3--to be standard equipment?

I note your points about copying the library, disk burning and testing, etc. I will find a way to do all that asap. Also see that FW800 is a weak link in the chain. Not much I can do, until either I buy a MacPro (or previous generation MacBookPro, with express card slot, etc.), or until I buy the next generation with USB3.

My immediate concern: storage and library size, and anticipated increase.

Perhaps best way forward is a MacPro now, which would allow me to use the four internal drive bays to make two 2TB raid 0, twice, (and the system etc. on a fifth drive, in one of the optical bays), and back up to vaults via the (eSATA-capable) VoyagerQ and removable drives. Do I gamble (trust) that USB3 will be able to be added to the MacPro?

Is now a good time to go with the MacPro?

Or go part way, and buy an enclosure such as the one you recommended earlier, allowing me to put the 2 by 2TB drives, connected albeit non-ideally via FW800, etc?

In sum, I think I can prioritize by saying that my chief concern has to be the size of the library, and the back-up strategy (which should be working with 2 identical libraries, asap), both of which I can address with the external enclosure. I don't notice performance speed issues that much. Most of my adjustments are quick (white balance, exposure, b&w conversion), and use the brushes very lightly, if at all. I do not use Photoshop. Importing is fairly slow. Exporting from Ap3 takes a while too (I add borders via BorderFX), but that's the opportunity for me to walk away from my workstation and attend to some fun domestic chore or other. Or read Apple Discussions. . . .

Thanks again, Kevin.

Patrick

Mar 7, 2010 11:04 AM in response to Patrick Snook

Hi Patrick,

USB 3 will be available as a card interface for the Mac Pro. Cards already exist:

http://www.startech.com/item/PEXUSB3S2-2-Port-PCI-Express-SuperSpeed-USB-3-Card-Adapter.aspx

but OS X drivers do not exist at this point. That is moot anyway, because on a Mac Pro I would go with SATA III, which is even faster. USB 3 will be significant mostly for iMacs, which are stuck with FW 800 at the moment.

You can get an older MBP like mine for short money now; one of the guys bought one for $1,000 on Craigslist a couple of weeks ago with a year of AppleCare remaining. Add some RAM and the eSATA card and you are good to go for cheap.

The new 12-core Mac Pro should be out soon, making the Nehalem ones drop in price too. Once you are on eSATA, the size of the Library will not matter. Our biggest single library, at 1.4TB, copies in less than 6 hours from stripe to stripe. My library is a little over 500GB and copies in much less time.

Sincerely,


K.J. Doyle

Mar 14, 2010 5:03 PM in response to Kevin J. Doyle

Hey Kevin. So, with your guidance I've successfully upgraded all of the necessary hardware (including installing my OS drive into the optical bay) and purchased all of your suggested software for maintenance and monitoring, and now Aperture 3 is running like a dream. Thanks again for all of your help and enlightening information. I'm finally driving that F-1 I always wanted.

I just have a quick question: besides the 2 sets of RAIDed drives with my library that I'm frequently switching back and forth between, I want to institute a 3rd backup. So here's the question: for that backup, do you prefer an Aperture Vault or a SuperDuper Smart Update? It seems like the SuperDuper Smart Update would be the way to go, because then you have a fully working updated copy of the library, rather than a vault that you would need to restore your library from. Whattaya think?

-Greg

Mar 15, 2010 3:56 PM in response to geewonder

Hi Greg,

That is great to hear that everything is running so well.

I would recommend the SmartUpdate, for exactly the reason you stated, no rebuilding = no downtime.

I have been working on coming up with a definitive referenced vs. managed solution for the past couple of weeks. I have made some pretty startling discoveries, regarding how some things in Aperture actually work. I will be writing up these conclusions when I am done, but suffice it to say, you are set for the hardware and will be able to take advantage of the additional performance.

My actual work is cutting into this, but my partner is adamant about me actually contributing billable work...jeez...

Anyway, you know how to reach me if you have questions.

Sincerely,

K.J. Doyle

Mar 16, 2010 5:10 PM in response to Kevin J. Doyle

WOW, What a read...10 pages. Great work Kevin.

I've been meaning to put together some sort of storage/backup system but never knew where to start. But you have given me a **** good idea how to approach it now and improve Aperture in the process.

I've been relying too much on one FW drive, and several years ago lost 2 years of photos after a hard drive failure (a brand new hard drive which died after a matter of 48 hours)... After that you would think I would have done something sooner?

I currently have my Aperture library in the ol' Pictures Folder, and have a 'sort' folder where images live when I first import. Everything is referenced, so no masters in the Aperture Library. Once I'm happy that I'm finished with sorting/processing/metadata, the masters get relocated onto the FW drive (so it's not always on and only accessed if I need to access older images).

How have you found the software RAID? With this setup can you swap it to another computer easily?
I have an IT friend who says to steer clear; he is dead set against it.

Mar 20, 2010 12:08 PM in response to nigelch

I have to take issue with what has been said here. I am no expert, but it does not make sense, and the logical conclusion, if what is said here is true, is that the Aperture DB is awfully bad.

The main reason I switched from PC to Mac was to get away from having to defrag my hard drives. In my PC days I had to do it 2 or 3 times a week to get performance back, and I didn't have the same workload volume that I now have.

While the article has many good tidbits, and it is true that disk fragmentation is normal on every disk, some operating systems deal with it better, making the need to defrag rare.

If what he says is true, it would make sense that it would affect all apps in the system, especially those that depend on a DB, such as Lightroom. Yet I have used LR 2 for the last 3 years without any noticeable performance issues. And never once defragmented my HDs. As a matter of fact, I had never owned a copy of a defrag utility until this morning, and only based on this thread. I have been using Macs since 2003.

Because I want to make Ap 3 work for me I am willing to try anything, and because what was said made some sense, I downloaded and purchased iDefrag.

I ran it on my Ap and LR library disk which has been running for over 4 years without any issues and according to iDefrag, it is only 1% fragmented!!!

Think about it: I have continually added and deleted files at the rate of 30 weddings a year, with an average of 2,000 images per wedding, on the same HD (since 2007) without ever defragmenting it, and it is only 1% fragmented. I never had an issue with LR 2 either. Just when I install Ap 3 and create a new library, mind you, it is alleged that fragmentation is the cause of poor performance. Before now, fragmentation was never an issue with LR or Ap 2 performance.

As a matter of fact, in Ap 2 I let Ap manage the library of close to 300GB of data and never encountered the performance issues I am facing with Ap 3. Ap 2 runs just as fast now as the first time I installed it.

I am no expert and may be wrong, but if 1% fragmentation affects Aperture to the degree that it does on my machine but not LR, then Ap is junk.

Yet I am giving him the benefit of the doubt; as I said, I am no expert. But if that does the trick, and it means that I will have to regularly defrag the HD in order to use Ap 3 effectively, they have lost me for good and I will most definitely go back to LR.

Message was edited by: DavidPR


Mar 21, 2010 12:55 PM in response to Kevin J. Doyle

It is amazing to see the help here! (Esp. thank you, Kevin.) The only problem is that I need BAAAABY steps. Just tell me what to do/buy in order and I will follow the instructions.

I really, really want A3 to work (upgraded from A2 which was fine). I just installed it today, and it crashes even just upon opening a photo.

I have now uninstalled it and am wondering what is the best way to proceed from here.

Use iDefrag first?... Before opening the newly installed A3? And before I do that, am I supposed to get more back-up...(Just one connected HD and Time Capsule at present)? If so, what?

Instructions at
http://support.apple.com/kb/HT3805?viewlocale=en_US
Necessary?

I loved using A2, but unfortunately I don't really know HOW it works and the lingo here. If anyone can give me mucho simple instructions of how to proceed/what to buy, I will try it. (I need photos for a job this week)

Thank you!!!
SKJ

I surely have nothing that you have recommended so far, hence, I do not know where to start. In any case, I need to work on my photos this week!

Mar 21, 2010 4:39 PM in response to DavidPR

Hi David,

The length of the thread may have caused the basic message to be lost, and I find myself correcting folks to make sure they understand the simplicity of the solution.

I want to clarify, there is no specific babysitting/defragging involved, just normal use.

Remember, I did not say to do anything except make your daily backups in Aperture, and use volumes dedicated to just Aperture data. The only difference is I said to operate from the backup the next session, and continue this cycle. You will have no operational problems, PLUS you will have no downtime in the event of a media failure, just switch to the copy in Aperture and keep working!

*THE ACTION OF MAKING THE BACKUP COPY DOES A FILE BY FILE DEFRAG, NO SEPARATE OPERATION NEEDED.*

That's it... of course if you are not backing up each session, then you have bigger problems than just performance.

*AGAIN, I NEVER SAID RUN A DEFRAG PROGRAM TO DEFRAG!!!*

Just use it to view the file list and see if you have fragments...NOT the statistics screen; a percentage of total files is not a meaningful report for large static data.

If you Google large-database operation in LR, you will see things are not automatic there either. Any database system, once it grows to a certain size, must be maintained.

Sincerely,

K.J. Doyle

Mar 23, 2010 11:35 PM in response to m stan

m stan wrote:
Kevin J. Doyle wrote:


I am also assuming everyone with a Mac Pro has a UPS, if not you gotta buy one and plug all the arrays into it as well. I can help with that too, just not tonight.

Kevin, I would be interested in your thoughts on UPS'.


Ahh...plumbing talk.

A UPS is to protect against power outages and brownouts causing data loss; don't expect more.

First of all, everyone running a computer professionally needs a UPS, and a simple power strategy. The Mac OS has made the interface simple with built-in software for major brands of UPS.

There is a discussion to be had regarding all the options available, but it is really not necessary, and might even obscure the single most important fact:

*Frankly, the only mistake you can make to start is not to have one at all.*

Today the market leader is APC, and the Mac OS has a built-in USB-connected UPS monitoring system in the Energy Saver pref pane. You see the tab appear when you plug the UPS into your system via USB. Most major brands are supported, but I have always used APC. Basically, loss of mains power will signal the computer and trigger a controlled shutdown. A picture is worth 1000 words:

!http://www.cyberpowersystems.com/images/faqs/OSX_EnergySaver.jpg!
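For anyone who prefers the command line, the same settings that pane exposes can be read and set with pmset; a sketch (macOS only, and the pane itself is all most people need):

```shell
#!/bin/sh
# Command-line counterpart to the Energy Saver UPS tab (macOS only).
# Guarded so it is a harmless no-op on other systems.

if command -v pmset >/dev/null 2>&1; then
    pmset -g ps    # show current power source, including a USB-attached UPS
    # The pane's three shutdown checkboxes map to these keys (run as root):
    #   pmset -u haltafter 7     # shut down after N minutes on UPS power
    #   pmset -u haltremain 3    # shut down when N minutes of runtime remain
    #   pmset -u haltlevel 35    # shut down at N percent charge remaining
else
    echo "pmset not available (not macOS)"
fi
```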

Obviously, plug your computer, monitor and all storage into the UPS. DON'T put printers or other non-mission-critical items on the UPS, as they will drain it quicker. All UPS manufacturers give you a power calculator to help you select the right unit for your computer.

The basic UPS units will give you about 10-15 minutes of battery uptime, plenty of time to close everything and shut down. The bigger units cost more and can give you more time...but why? Data security and safety is what this is purchased for in an Aperture rig, not acting as a backup generator. Remember you will also be replacing the battery in 3 years or so, another reason to go with the biggest seller.

This model is $187 at Amazon, free shipping and will handle most any Aperture rig for a controlled shutdown: http://www.amazon.com/APC-Back-UPS-Watts-Output-System/dp/B000NDA5E0/ref=pdcp_e0

!http://www.apcmedia.com/resource/images/500/Front Left/532365BD-5056-AE36-FE86E1CD2436B4F5pr.jpg!

!http://g-ecx.images-amazon.com/images/G/01/ciu/67/6b/ba9f225b9da050c47a4b1110.L .jpg!

I could go into great depth on all the various safeguards and strategies these companies have available, but that is not necessary for an Aperture rig.

Offsite backups and disconnected, unpowered backups are the only things that will reliably survive lightning...trust me on this. Even with 6-figure server-room-class isolated power systems, lightning can still win, so unless you are stuck running a server farm, it's best not to fight at all. Just keep copies of the data in different spots; lightning has not learned how to find them.

Hope this helps,

Sincerely,


K.J. Doyle

Mar 27, 2010 9:58 AM in response to Kevin J. Doyle

Kevin,

I just bought the APC Smart-UPS 1500 at a decent discount from the local Fry's here in SOCAL. It is large (weighs over 50 lbs) and has lots of runtime, but I didn't buy it for that reason. I bought it for its pure sine wave capability, apparently needed to prevent the Mac Pro's large (980 watt) power supply from overloading the UPS. The other APC UPSes (BR1500 and BX1500) don't have this capability, which probably accounts for the buzzing from the Mac Pro power supply when on battery with those units (see this APC article for an explanation: http://www.apcmedia.com/salestools/RMUZ-7DTKRCR1EN.pdf ). Maybe the buzzing is just an irritant to be tolerated those few times when on battery power, but I am concerned by the Mac Pro being on substandard power for even a few minutes.

Now I wish there was a way to get access to the APC configuration settings on OS X without having to go through Windows. I understand that OS X has built-in UPS settings to do the essentials of shutting down, but it would be nice to access some of the other settings. In Windows, access is provided through a web browser and, according to the install notes, should work under any browser regardless of OS, but I can't get it to work under Safari. Have you been able to do it?

Thanks for your so very thorough and thoughtful posts - some of the best I have read on any forum.

Stan

Message was edited by: m stan

Mar 27, 2010 10:18 AM in response to Kevin J. Doyle

Kevin,

Your screenshot of the UPS energy saver settings triggered me to ask a question I have been wondering about all the years since Apple put the UPS shut down options in OSX:

You have ticked all three shut down options:

1. Shut down after 7 mins on UPS power
2. Shut down when time left on UPS is 3 mins
3. Shut down when UPS power level is 35%

Obviously one of these is going to take priority. Does it do whichever happens first?

I have always only ticked the first box and used 5 mins, plenty of time (if I am at the machine) to stop whatever and shut down manually. I have always thought that option 1 is the most trustworthy, being a simple timer. The other two require some estimation/calculation. OTOH, if the battery is near the end of its life, and unable to do 5 mins, then options 2 and 3 would take over. So if I knew it was "whichever happens first" I would tick all three.

BTW a small tip about replacing the UPS battery which I have just done after four years on my APC Smart 1500. (The weekly self test process failed). I accidentally ordered a cheaper third party battery made by a company called CSB, instead of a genuine APC one (I usually go for the OEM part).

After replacing the battery, I had a look at the original APC one I had taken out, and peeled off the APC sticker, and guess what: it had a CSB sticker underneath!

Message was edited by: Mike Boreham
