How does Core Audio set the sample rate?
When I play a particular movie in QuickTime, the audio and video are out of sync. The movie plays fine on other computers.
This Mac Pro (OS 10.6.8) is used exclusively for Pro Tools and Final Cut Pro. The audio hardware is a Pro Tools HD Native card:
Native card --> Pro Tools Digital I/O boxes --> Lavry D/A converters --> monitors
The sample rate of the Digital I/O boxes is set by an external master clock, a Lavry Gold A/D converter.
I opened the Digidesign Core Audio Manager and it says:
Connected @ 44.1K, 32 In/32 Out, Buffer Size 512
Yet the movie is at 48K (I know because I created it in Final Cut Pro), and the external clock is set to 48K. So I don't know where the 44.1K in the Digidesign Core Audio Manager came from. This is why I suspect the problem is a sample rate mismatch.
Please help me understand what sets the sample rate.
Does the application using Core Audio set it?
If so, how is this made consistent with the sample rate set for the hardware by the external master clock?
Do I have to make sure the external clock setting always matches the sample rate of the movie being played?
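From what I've read, an application can at least query (and try to set) a device's nominal sample rate through the Core Audio HAL. Here is my rough sketch of what I think QuickTime or any other app could do; it is untested on my setup and I may be misunderstanding the API:

```c
// Sketch (my understanding of the Core Audio HAL, not verified):
// find the default output device and read its nominal sample rate.
#include <CoreAudio/CoreAudio.h>
#include <stdio.h>

int main(void) {
    AudioObjectPropertyAddress addr = {
        kAudioHardwarePropertyDefaultOutputDevice,
        kAudioObjectPropertyScopeGlobal,
        kAudioObjectPropertyElementMaster
    };
    AudioDeviceID device = kAudioObjectUnknown;
    UInt32 size = sizeof(device);

    // Ask the system object which device is the default output.
    if (AudioObjectGetPropertyData(kAudioObjectSystemObject, &addr,
                                   0, NULL, &size, &device) != noErr)
        return 1;

    // Ask that device for its current nominal sample rate.
    addr.mSelector = kAudioDevicePropertyNominalSampleRate;
    Float64 rate = 0.0;
    size = sizeof(rate);
    if (AudioObjectGetPropertyData(device, &addr,
                                   0, NULL, &size, &rate) != noErr)
        return 1;

    printf("Device nominal sample rate: %.0f Hz\n", rate);

    // An app could request 48 kHz with AudioObjectSetPropertyData on the
    // same selector, but my guess is that an externally clocked interface
    // may refuse or ignore the change -- the external master clock wins.
    return 0;
}
```

If that guess is right, an app that plays 48K audio into a device reporting 44.1K would have to resample (or play out of sync), which would match what I'm seeing.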
BTW, I have not had a sync problem when I play video in Final Cut Pro; it is always in sync there, so I have never had to worry about how Core Audio works. The sync problem is only with QuickTime.
Final Cut Pro X, Mac OS X (10.6.8)