12 Replies Latest reply: Jun 14, 2014 8:07 AM by shammyd
wombat2007 Level 1 (5 points)

Q for the group

I've switched over to referenced masters and cut the size of my .aplibrary by 70%.

It is, however, still around 78 GB under working conditions.


My question is: when I include the library in a ChronoSync backup over the network, it takes quite a while, because ChronoSync sees the library package as modified every time and has to copy over the entire file.


I do keep vaults on two separate external drives ==

Any suggestions on workflow from ChronoSync users? Should I simply exclude the .aplibrary from my backup?




iMac, Mac OS X (10.7.4), 32 GB DDR3, Triple TB Display
  • Shuttleworth125 Level 2 (415 points)

    We use it in the manner you suggest: ChronoSync only backs up the referenced masters to the external drives, and we use vaults on the externals to back up the library. We also run Time Machine to a Time Capsule, though I'm not certain that it backs up Aperture libraries while they are open.


    I think you can set ChronoSync to back up a file when an application closes; that might be another option for you.

  • wombat2007 Level 1 (5 points)

    I wouldn't doubt it - CS gets pretty intense, but you answered my Q.

    The size of the library (whether referenced or not) is just too large to sync every time there's a change.

    The vaults do seem to go faster, so your choice of backing up to the externals makes sense.


    The reason I ask in the first place is that I have an external drive off-site that backs up daily. CS is smart enough to detect changes, but the Aperture library throws a wrench in it: it seems to have to copy the entire file even if one photo is changed.


    Thanks again ==



  • Kirby Krieger Level 6 (12,510 points)

    JJ --


    Fwiw, SuperDuper! is, afaict, smart enough to copy only the changed files within the Library.  At the same time, it still has to read all the files, and this can take as long as it takes Finder to simply copy the files.


    I set SuperDuper! to run when none of the Libraries or machines are in use (viz.: overnight), and to update sparse bundle files of every Library and every set of Referenced Originals.  I only do this once per week, however, on alternating back-up drives.  I back up all critical Libraries every day using Time Machine.  (TM worked _poorly_ with early editions of Aperture 3.x.  It has become usable -- kudos, I guess, to Apple's engineers.)


    I administer one very large and unwieldy Library (finishing the upgrade to 3.3.x took 170 hours of full-on processing on a 5,5 MBP).  I'm just getting to the point of setting this back up for back-ups.  I will be looking at efficient ways to do this.  I gave up on Vaults long ago -- I couldn't get consistent results on any Vault over 50 GB (iirc).  Afaik, this limitation has been removed in the most recent upgrades.


    My biggest Library contains over 3 million files (not Images -- files).


    Fwiw, I _think_ that TM now backs up at least part of open Aperture Libraries.  Have not taken the time to pin this down, however.  TM certainly exhibits much better behavior with Aperture -- but this may be the result of (programmatically) ignoring it.


    So my suggestions:

    - make sure your Vaults are working (verify!).

    - see if SuperDuper! is better at updating back-ups.  (Iirc, a trial is available.)


    I am semi-happily running a rMBP, and may move all external storage to USB3 or T-bolt.  As it is, I can't sell my old MBP until Apple releases the Firewire<>Thunderbolt adapter.  (Yo!!)  I am not savvy enough about hardware to know if the faster bus (is it the bus?) will increase overall performance.

  • jderuvophotography Level 1 (70 points)

    I believe you can set ChronoSync to dissect packages so that the whole library does not get copied over every time.

    This is how we have it set up.
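    For readers wondering what dissecting packages buys you: an .aplibrary is just a directory under the hood, so a package-aware sync copies only the files inside it that have changed, rather than the whole multi-gigabyte bundle. A minimal Python sketch of that idea (the size/mtime comparison rule is illustrative, not ChronoSync's actual algorithm):

    ```python
    import os
    import shutil

    def sync_package(src_root: str, dst_root: str) -> list[str]:
        """Copy only files whose size or mtime differs between src and dst.

        Treats the package (e.g. an .aplibrary) as an ordinary directory
        tree, which is effectively what a package-aware sync does.
        """
        copied = []
        for dirpath, _dirnames, filenames in os.walk(src_root):
            rel = os.path.relpath(dirpath, src_root)
            dst_dir = os.path.join(dst_root, rel)
            os.makedirs(dst_dir, exist_ok=True)
            for name in filenames:
                src = os.path.join(dirpath, name)
                dst = os.path.join(dst_dir, name)
                s = os.stat(src)
                if (not os.path.exists(dst)
                        or os.stat(dst).st_size != s.st_size
                        or os.stat(dst).st_mtime < s.st_mtime):
                    shutil.copy2(src, dst)  # copy2 preserves the mtime
                    copied.append(os.path.join(rel, name))
        return copied
    ```

    After the initial full copy, a run that touches one photo copies only that one file, which is why a backup can drop from days to minutes once the sync tool stops treating the package as a single opaque file.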

  • wombat2007 Level 1 (5 points)

    jderuvo /// you 'da man!!!!

    You are correct - my backups went from 1.7 days to 2.4 minutes!!! Thanks - yeehaw.


    kirby - in answer to your previous: the bus helps, but my understanding is it's more the drive. The specs say FireWire can handle peer-to-peer transfer rates of up to 3200 Mb/s (400 MB/s), but reality checks on the forums say 50 MB/s is reasonable. In my experience, half that is the norm, depending on your drive.

    The bus on my new iMac will handle up to 800 Mb/s (100 MB/s), and both external RAIDs claim to handle 100 MB/s but are usually undercut significantly.

    As for Thunderbolt, it will blow everything away, so I would simply count on getting a new machine ;> The price points, IMO, are still too high. I'm surviving on FW800 - for now.
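    For anyone checking the arithmetic in this thread: interface speeds are quoted in megabits per second (Mb/s), drive throughput in megabytes per second (MB/s), and there are 8 bits per byte, so FW800's 800 Mb/s ceiling works out to 100 MB/s. A quick back-of-envelope sketch (the 78 GB figure is the library size from earlier in the thread, and 50 MB/s is the "realistic" forum estimate quoted above):

    ```python
    def mbps_to_MBps(megabits_per_second: float) -> float:
        """Convert an interface speed in Mb/s to MB/s (8 bits per byte)."""
        return megabits_per_second / 8.0

    def transfer_hours(size_gb: float, rate_MBps: float) -> float:
        """Hours to move size_gb gigabytes at a sustained rate in MB/s."""
        return (size_gb * 1000.0) / rate_MBps / 3600.0

    print(mbps_to_MBps(800))                 # FW800 ceiling -> 100.0 MB/s
    print(round(transfer_hours(78, 50), 1))  # 78 GB at 50 MB/s -> 0.4 hours
    ```

    Even the best case is a long wait when the whole package has to be re-copied - and far longer over a network link - which is why incremental syncing matters more than raw bus speed.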

    Funny, I spent the weekend getting a new NanoBridge up and running, and I'm getting a blazing 300 Mb/s asymmetric, but my network will only take 100 -


    sometimes the technology simply leap-frogs itself ===




  • jderuvophotography Level 1 (70 points)

    Happy to help!

    Been using ChronoSync for years for just this purpose.

  • wombat2007 Level 1 (5 points)

    Well then, let me fire another one across the bow, if you don't mind.

    What really attracted me to ChronoSync was the ability to back up over a WLAN.

    I have an external drive that is off-site and on the same LAN, but USB is the best connection I can get off the drive to put it on a network.

    How does one hook an external drive into a LAN with a worthwhile connection?



  • jderuvophotography Level 1 (70 points)

    For us:

    We have a computer acting as the library host/server with a couple of Addonics cases plugged into it.

    This all runs over WiFi.

    When I come back from a job with a portable hard drive, I back it up over the network.

    But I am not using the wireless as part of my backup routine.

    To get things off-site, we go the swap-out hard drive route on a regular basis, depending on how many jobs are running at the time.

    Hope this helps a bit. It's not the exact scenario you were looking for, but it's how we work, and finding something that works for you and sticking to it goes a long way! :-)

  • wombat2007 Level 1 (5 points)

    Ya, I'm not a big fan of swapping out hardware. Also, the latency, or time delay, is tantamount to risking all that info in between swaps.

    I hooked into a neighbor's WAN and back up via CS. Do the initial backup locally, and then the updates take minutes.

    Never have to touch a drive.

    Of course, if you're in the field, your situation makes perfect sense, as you risk the same loss transporting the drive.

    Have you tried the WiFi connection with the Canons?

  • jderuvophotography Level 1 (70 points)

    The Addonics make the swap easy.

    Also, this is our off-site (3rd) backup.

    We also have a cradle running for the off-site stuff and to clone hard drives when needed.

    Running the Canons on WiFi? No, never; the closest is using the Eye-Fi card with ShutterSnitch to an iPad when we have clients in the studio.

    Shooting RAW with only small JPEGs being sent to the iPad runs pretty smoothly for the way we shoot.


  • wombat2007 Level 1 (5 points)

    That's really cool. I love the ShutterSnitch idea.

    I also love that you're shooting in black and white.


    My setup seems to be working. The secret is that if you initially sync the libraries locally, the remote CS sync is minimal (mine takes seconds if not too much has been uploaded).

    I took your advice and made everything referenced, except the current working projects.


    You had mentioned before that you were shooting RAW. I have just been shooting JPEGs.

    What do you find yourself using RAW for? Exposure? I just can't grok the benefit.

  • shammyd Level 1 (0 points)

    I have an iMac and a MacBook. I have some Aperture files in Dropbox. I would like to use ChronoSync to synchronize (not back up) the two versions of the Aperture library if I put a new version on Dropbox from my laptop. Any experience with using CS to sync the Aperture library? I would sure like this to work....