Audio drift when recorded on different equipment
Composing music in Logic Pro. The bass player (who lives in another city) receives the Logic project file (with the MIDI lines of the music), records his bass lines in Logic Pro and sends back the recorded bass files (uncompressed). When the bass recording is imported into the master Logic project, the imported file drifts. Gradually it gets audibly out of sync, more and more as time goes on. And no... on his Mac it sounds great, tight, correct, but when imported into the master project alongside the other recorded audio files, it drifts. Why? How?
Not only in Logic.
Concert video. Video taken with 5 HD cameras, the cameras' audio at 48 kHz. Audio was also separately recorded on a Pro Tools system, also at 48 kHz.
The video was later captured into FCP using the standard HD 1080 50i setting. Syncing up the videos was no big hassle, and all the cameras' audio files read as they should. But since these are the camera mics, the sound quality is far from usable. So we add the "real" audio, the audio we recorded in Pro Tools during the concert. When the Pro Tools audio is added, it drifts out of sync... slowly but surely.
I found a note in the FCP manual saying that if audio and video are not synced using a master sync / time generator during the recording, the audio can drift in editing.
In my mind, the master program (in this case FCP) should read exactly 48,000 samples per second, regardless of whether the audio was captured on the camera or on any other 48 kHz device. But it does not do this... it reads maybe 48,090 samples per second, or some such figure. The result is that the audio drifts ahead of the video and the other audio tracks.
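To put a number on that hypothetical 48,090 figure, here is the back-of-the-envelope arithmetic as a small Python sketch (both rates are just the assumptions from the paragraph above, not measured values):

    # Drift produced when an imported track's effective sample rate
    # differs from the rate the project timeline assumes.
    project_rate = 48000     # samples/second the FCP timeline expects
    effective_rate = 48090   # hypothetical effective rate of the imported track

    ratio = effective_rate / project_rate   # how fast the track runs
    drift_per_hour = 3600 * (ratio - 1)     # seconds gained per hour
    print(f"drift after one hour: {drift_per_hour:.2f} s")   # ~6.75 s

Even a tiny mismatch compounds linearly, which matches what I see on the timeline.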
But how is this possible? I want to understand. I can see the need for an external sync during recording, to make it easier to line up the start point of each track when editing, but one second is exactly one second long, no more, no less... and 48,000 samples is 48,000 samples, whether recorded on one device or another. 48 kHz is 48 kHz, 1 second is 1 second, 1 hour is 1 hour, summing up to the exact same number of samples whatever the device. 1 hour read at 48 kHz is 172,800,000 samples. Nothing more. Nothing less. And a digital device should be able to read it at exactly that accuracy. Or?
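Spelling out that sample-count arithmetic, and what even a small clock tolerance would do to it (the 50 ppm figure below is a generic crystal-oscillator assumption on my part, not a spec for any of these devices):

    # Samples in one hour at a nominal 48 kHz, as claimed above.
    RATE = 48000
    samples_per_hour = RATE * 3600
    print(samples_per_hour)   # 172800000 -- exactly 172,800,000

    # If one device's clock is off by 50 parts per million:
    ppm = 50e-6               # assumed tolerance, not a measured value
    extra = samples_per_hour * ppm
    print(f"{extra:.0f} extra samples/hour = {extra / RATE:.2f} s drift")
    # 8640 extra samples/hour = 0.18 s drift

So if two converters' clocks differ by even a few tens of ppm, the number of samples they produce per real hour will differ too.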
In my mind, a bit is a bit, a byte is a byte, a sample is a sample... the master program should read each sample in its place, 48,000 per second, on a quantized time grid where each grid snap is 1/48,000th of a second. So, in my mind, this drifting just can't happen, but it does. It's as if each track had a separate clock, or a finer grid of its own, instead of one clock and one grid for the whole project. I could understand if the whole project drifted (though that would not be audible, since we are talking about a drift of 2 seconds per hour). But I can't understand how one track drifts independently of the other tracks.

And it is not intermittent. It drifts with total consistency, in a linear fashion. It does not depend on whether I press Play and run from start to end, or randomly move the playhead to some part of the project... the drift is linear and exact, so at the end of the project the amount of drift is the same, regardless of whether I played from start to end or jumped straight to the end.
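If I model the drift I am seeing, it behaves exactly like a linear function of timeline position, independent of how the playhead got there. A sketch (the actual-rate value is invented purely to illustrate the shape):

    NOMINAL_RATE = 48000.0   # rate stamped in the file header
    ACTUAL_RATE = 48001.0    # hypothetical true rate of the recording clock

    def drift_seconds(timeline_pos_s: float) -> float:
        """Offset of the imported track at a given timeline position.
        A pure function of position, so it is the same whether you
        play from the start or jump the playhead straight there."""
        return timeline_pos_s * (ACTUAL_RATE / NOMINAL_RATE - 1.0)

    for t in (60.0, 600.0, 3600.0):
        print(f"at {t:6.0f} s: {drift_seconds(t) * 1000:.1f} ms")

That is exactly what I observe: the drift at any given point in the project is always the same, however I arrive there.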
Can anyone explain why, and whether there is anything one can do in the program (FCP or Logic) when reading the files, to ensure actual sample-time accuracy for imported files?
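For what it's worth, the only workaround I can think of outside FCP/Logic would be to time-stretch the imported file by the measured drift ratio before importing it. A minimal sketch, assuming Python with the soundfile and scipy packages, and assuming the drift has already been measured against the timeline (file names, drift and length values are placeholders):

    import soundfile as sf
    from scipy.signal import resample

    data, rate = sf.read("imported_take.wav")   # hypothetical file

    measured_drift = 2.0      # seconds the track has gained...
    project_length = 3600.0   # ...by this point on the timeline

    # The track plays fast, so stretch it by the inverse of its speed
    # error. If it drifted the other way, invert the ratio.
    ratio = project_length / (project_length - measured_drift)
    corrected = resample(data, int(round(len(data) * ratio)))
    sf.write("imported_take_corrected.wav", corrected, rate)

But I would much rather understand why the drift happens and fix it at the source.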
iMac C2D 2 GHz / 2 GB RAM, Mac OS X (10.4.10); a couple of G5 2×2 GHz / 2.5 GB; iBook G4 966, 10.3.9; a couple of old Power Macs and iMacs