
ChronoSync and Aperture

Q for the group

I've switched over to referenced masters and cut down the size of my .aplibrary by 70%.

It is, however, still around 78 GB under working conditions.


My question is: when I include the library in a ChronoSync backup over the network, it takes quite a while, because ChronoSync sees the library as modified every time and has to copy over the entire file.


I do keep vaults on two separate external drives.

Any suggestions on workflow from ChronoSync users? Should I simply exclude the .aplibrary from my backup?
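For context, an .aplibrary is really a package, i.e. a folder containing many thousands of small files, so a backup tool that works at the file level inside the package only needs to move the few files that actually changed, while a tool that treats the package as a single item re-copies all 78 GB. A rough Python sketch of the file-level idea, just to illustrate the difference (the paths are hypothetical, and this is not how ChronoSync works internally):

    # Sketch only: copy files inside an Aperture library package that are missing
    # or newer than the copy at the destination. Paths are hypothetical placeholders.
    import os
    import shutil

    SRC = "/Volumes/Photos/Work.aplibrary"     # hypothetical source library
    DST = "/Volumes/Backup/Work.aplibrary"     # hypothetical backup location

    def sync_changed_files(src_root, dst_root):
        copied = 0
        for dirpath, _dirnames, filenames in os.walk(src_root):
            rel = os.path.relpath(dirpath, src_root)
            dst_dir = os.path.join(dst_root, rel)
            os.makedirs(dst_dir, exist_ok=True)
            for name in filenames:
                src_file = os.path.join(dirpath, name)
                dst_file = os.path.join(dst_dir, name)
                # Copy only when missing or older at the destination.
                if (not os.path.exists(dst_file)
                        or os.path.getmtime(src_file) > os.path.getmtime(dst_file)):
                    shutil.copy2(src_file, dst_file)  # copy2 preserves timestamps
                    copied += 1
        return copied

    if __name__ == "__main__":
        print("files copied:", sync_changed_files(SRC, DST))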



JJ

iMac, Mac OS X (10.7.4), 32 GB DDR3, Triple TB Display

Posted on Jul 5, 2012 11:33 AM


12 replies
Question marked as Best reply

Jul 5, 2012 12:12 PM in response to wombat2007

We use it in the manner you suggest: ChronoSync only backs up the referenced masters to the external drives, and we use vaults on the externals to back up the library. We also run Time Machine to a Time Capsule, though I'm not certain that it backs up Aperture libraries while they are open.


I think you can set ChronoSync to back up a file when an application closes; this might be another option for you.
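If ChronoSync's built-in trigger doesn't fit the schedule, the same idea can be scripted. A minimal Python sketch, assuming a command-line copy step of your own (the rsync call and the paths below are placeholders, not ChronoSync's mechanism): it waits until Aperture is no longer running and then fires the backup once.

    # Sketch only: poll for the Aperture process and run a backup command
    # after it has quit. The rsync source/destination paths are hypothetical.
    import subprocess
    import time

    BACKUP_CMD = ["rsync", "-a",
                  "/Volumes/Photos/Work.aplibrary/",
                  "/Volumes/Backup/Work.aplibrary/"]

    def aperture_running():
        # List running process names and look for Aperture among them.
        names = subprocess.check_output(["ps", "-axo", "comm"]).decode()
        return any("Aperture" in line for line in names.splitlines())

    if __name__ == "__main__":
        while aperture_running():
            time.sleep(60)               # check once a minute
        subprocess.check_call(BACKUP_CMD)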

Jul 5, 2012 4:10 PM in response to Shuttleworth125

I wouldn't doubt it. CS gets pretty intense, but you answered my question.

The size of the library (whether referenced or not) is just too large to sync every time there's a change.

The vaults do seem to go faster, so your choice of external backups makes sense.


The reason I asked in the first place is that I have an external drive off-site that backs up daily. CS is smart enough to detect changes, but the Aperture library throws a wrench into things, since CS seems to have to copy the entire file even if only one photo has changed.


Thanks again.


JJ

Jul 5, 2012 5:09 PM in response to wombat2007

JJ --


Fwiw, SuperDuper! is, afaict, smart enough to copy only the changed files within the Library. At the same time, it still has to read all the files, and this can take as long as it takes Finder to simply copy the files.


I set SuperDuper! to run when none of the Libraries or machines are in use (viz.: overnight), and to update sparse bundle files of every Library and every set of Referenced Originals. I only do this once per week, however, on alternating back-up drives. I back up all critical Libraries every day using Time Machine. (TM worked _poorly_ with early editions of Aperture 3.x. It has become usable -- kudos, I guess, to Apple's engineers.)
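For anyone who wants to try the sparse-bundle route, the bundle itself can be created ahead of time with OS X's hdiutil. A minimal Python sketch, with the size, volume name, and path made up for illustration:

    # Sketch only: create and mount a sparse bundle to hold a Library copy.
    # Sparse bundles grow on demand and store data in small "bands", so an
    # incremental copier only rewrites the bands that actually changed.
    import subprocess

    IMAGE = "/Volumes/Backup/ApertureBackup.sparsebundle"   # hypothetical path

    subprocess.check_call([
        "hdiutil", "create",
        "-type", "SPARSEBUNDLE",
        "-fs", "HFS+",
        "-size", "200g",            # upper bound, not allocated up front
        "-volname", "ApertureBackup",
        IMAGE,
    ])
    subprocess.check_call(["hdiutil", "attach", IMAGE])   # mounts under /Volumes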


I administer one very large and unwieldy Library (finishing the upgrade to 3.3.x took 170 hours of full-on processing on a 5,5 MBP). I'm just getting to the point of setting this back up for back-ups. I will be looking at efficient ways to do this. I gave up on Vaults long ago -- I couldn't get consistent results on any Vault over 50 GB (iirc). Afaik, this limitation has been removed in the most recent upgrades.


My biggest Library contains over 3 million files (not Images -- files).


Fwiw, I _think_ that TM now backs up at least part of open Aperture Libraries. Have not taken the time to pin this down, however. TM certainly exhibits much better behavior with Aperture -- but this may be the result of (programmatically) ignoring it.


So my suggestions:

- make sure your Vaults are working (verify!).

- see if SuperDuper! is better at updating back-ups. (Iirc, a trial is available.)


I am semi-happily running an rMBP, and may move all external storage to USB 3 or Thunderbolt. As it is, I can't sell my old MBP until Apple releases the FireWire-to-Thunderbolt adapter. (Yo!!) I am not savvy enough about hardware to know if the faster bus (is it the bus?) will increase overall performance.

Jul 5, 2012 7:39 PM in response to jderuvophotography

jderuvo, you da man!

You are correct: my backups went from 1.7 days to 2.4 minutes! Thanks, and yeehaw.


Kirby, in answer to your previous question: the bus helps, but my understanding is that it's more the drive. The specs say FireWire scales up to 3200 Mb/s (400 MB/s) peer to peer in its fastest (S3200) version, but reality checks on forums say 50 MB/s is reasonable for FW800. In my experience, half that is the norm, depending on your drive.

The FW800 bus on my new iMac will handle up to 800 Mb/s (100 MB/s), and both external RAIDs seem to be rated around 100 MB/s but usually fall significantly short.
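To keep the units straight: interface specs are quoted in megabits per second and drive throughput in megabytes per second, so divide by 8. A quick Python check of the theoretical peaks behind the numbers above; real-world figures are of course lower.

    # Bits-to-bytes arithmetic for the bus figures mentioned above.
    for name, mbits in [("FW800", 800), ("FireWire S3200", 3200), ("USB 2.0", 480)]:
        print(f"{name}: {mbits} Mb/s = {mbits / 8:.0f} MB/s theoretical peak")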

In terms of Thunderbolt, it will blow everything away, so I would simply count on getting a new machine ;> The price points, IMO, are still too high. I'm surviving on FW800 for now.

Funny, I spent the weekend getting a new NanoBridge up and running, and I'm getting a blazing 300 Mb/s asymmetric throughput, but my network will only take 100.


Sometimes the technology simply leapfrogs itself.



JJ

Jul 5, 2012 7:54 PM in response to jderuvophotography

Well then, let me fire another one across the bow, if you don't mind.

What really attracted me to ChronoSync was the ability to back up over a WLAN.

I have an external drive that is off-site and on the same LAN, but the best connection I can get off that drive is USB, to get it onto the network.

How does one hook an external drive into a LAN with a worthwhile connection?


JJ

Jul 5, 2012 8:04 PM in response to wombat2007

For us:

We have a computer acting as the library host/server with a couple of Addonics cases plugged into it.

This all runs over WiFi.

When I come back from a job with a portable hard drive, I will back it up over the network.

But I am not using the wireless as part of my backup routine.

To get things off-site, we go the swap-out hard drive route on a regular basis, depending on how many jobs are running at the time.

Hope this helped a bit. It's not the exact scenario you were looking for, but it's how we work, and finding something that works for you and sticking to it goes a long way! :-)
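For a rough picture of what "back it up over the network" can look like when a host machine has the archive drives attached, here is a minimal Python sketch using rsync over SSH across the LAN. The host name, user, and paths are hypothetical, and this is a generic stand-in, not a description of ChronoSync or of the setup described above.

    # Sketch only: push a job's files from a portable drive to a host machine
    # that has the archive drives attached, over the LAN via rsync + SSH.
    import subprocess

    SRC = "/Volumes/Portable/Job_2012_07_05/"                      # trailing slash: copy contents
    DST = "backup@studio-server.local:/Volumes/Archive/Masters/"   # hypothetical host and path

    subprocess.check_call([
        "rsync", "-a", "--delete",   # archive mode; mirror deletions at the target
        "-e", "ssh",                 # run over SSH
        SRC, DST,
    ])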

Jul 15, 2012 8:45 PM in response to jderuvophotography

Yeah, I'm not a big fan of swapping out hardware. Also, the latency, or time delay, amounts to risking all that info in between swaps.

I hooked into a neighbor's WAN and back up via CS. Do the initial backup locally, and then the updates take just minutes.

Never have to touch a drive.

Of course, if you're in the field, your situation makes perfect sense, as you risk the same loss transporting the drive.

Have you tried the WiFi connection with the Canons?

Jul 15, 2012 8:55 PM in response to wombat2007

The Addonics make the swap easy.

Also, this is our off-site (third) backup.

We also have a cradle running for the off-site stuff and to clone hard drives when needed.

Running the Canons on WiFi? No, never. The closest is using the Eye-Fi card with ShutterSnitch to the iPad when we have clients in the studio.

Shooting RAW with only small JPEGs being sent to the iPad runs pretty smoothly for the way we shoot.

Joseph

Aug 1, 2012 7:24 AM in response to jderuvophotography

That's really cool. I love the ShutterSnitch idea.

I also love that you're shooting in black and white.

Kudos.

My setup seems to be working. The secret is that if you initially sync the libraries locally, the remote CS sync is minimal (mine takes just seconds if not too much has been uploaded).

I took your advice and made everything referenced, except the current working projects.


You had mentioned before that you were shooting RAW. I have just been shooting JPEGs.

What do you find yourself using RAW for? Exposure? I just can't grok the benefit.

