Quote:
Originally Posted by joaquin
Hi Mr. Katz. There's a statement in your book "Mastering Audio" (great book, and I really appreciate your detailed and instructional views) where you say that a digital interface, even when clocked to a great, great clock, will never outperform its own, due to a separate process (PLL), which in a bad scenario can only introduce artifacts to the sound. I believe you also state one should clock externally only when recording through external converters, and always clock internally when not. I assume that in a hybrid mixing stage, the benefits of great analog outboard gear will surpass the downside of clocking externally. Would you elaborate a bit more on the topic? Here on Gearslutz there has always been a popular consensus that a better external clock will improve your audio... if you have, let's say, a Digi 002, will a Big Ben make things better?
Anyway, thank you so much for stopping by and sharing your knowledge this month! Hope to have you around in the future.
Cheers....................Joaquin.
Hey, it's almost May... maybe there'll be another month :-). Others may of course chime in. My input is based on technical measurements, theory, and listening. First of all, your statement above was a bit confusing. When you said "one should clock externally only when recording through external converters," it gets muddled. Let's rephrase that as: "an A/D converter will PROBABLY perform best when running on its internal (crystal) clock" as opposed to being clocked externally. So, yes, PROBABLY your A/D should be the master clock and your DAW should be clocked to the A/D. Notice how I avoided the term "externally" in that last sentence, to sidestep the confusion caused by calling your DAW "externally clocked" when I'm talking about the A/D being internally clocked---which means the same thing, of course.
OK, only measurements and very careful listening tests will confirm whether your A/D performs better on internal or external sync. But I can say it is PROBABLE, even highly likely, that it will perform better clocked internally. Apogee makes the opposite claim about the Big Ben, and I have not seen a shred of objective measurement or evidence that would show it to be so. This is not voodoo, by the way; it is science. The ONLY WAY that an external clock can outperform an internal clock is:
---if the internal clock is inferior or poorly designed (which means the converter designer did a bad job). That's because the clock signal passes through far more potentially degrading circuitry when it is generated externally to the converter.
Another good question is whether, when you MUST clock externally, an external clock can ever IMPROVE performance instead of degrading it. In my book I claimed that an external clock will always degrade performance compared to the intrinsic jitter of the PLL, and that is the common wisdom. However, another authority (in my mind), Eelco Grimm, recently pointed out to me that an external clock with very low jitter below the corner frequency of the PLL can improve the converter's low-frequency jitter. BELOW the corner frequency, the external clock's jitter dominates; ABOVE the corner frequency, the PLL's intrinsic jitter dominates.
So, with an EXTREMELY low-jitter external clock, and if the converter's PLL has a relatively high corner frequency, the low-frequency jitter of the converter can be improved compared to other external clocks. A good converter, therefore, should have a PLL with a very low corner frequency. And the lower the corner frequency, the less likely an external clock will improve the converter's performance, and the more likely it will degrade it. For example, Prism converters have a corner frequency below 200 Hz, while typical converters' PLLs are above 2 kHz! So it is highly likely that a very good converter like a Prism will either be unaffected or possibly degraded, no matter what external clock you feed it.
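To make the corner-frequency idea concrete, here's a toy first-order model (my own illustration, not from the book or from any measured converter): treat the PLL as a low-pass filter on the external clock's jitter, so the external jitter dominates below the corner and the PLL's intrinsic jitter dominates above it. The `combined_jitter` helper and all the numbers are purely hypothetical.

```python
def combined_jitter(f, f_corner, ext_jitter, pll_jitter):
    """Toy first-order model of a PLL's jitter transfer.

    Below the corner frequency the PLL tracks the external clock, so
    the external clock's jitter dominates; above it, the PLL's own
    intrinsic jitter dominates. Inputs are illustrative jitter
    spectral-density values, not measurements of any real converter.
    """
    h2 = 1.0 / (1.0 + (f / f_corner) ** 2)   # |H(f)|^2 of a first-order low-pass
    return h2 * ext_jitter + (1.0 - h2) * pll_jitter

# Hypothetical numbers: a very clean external clock (density 1) feeding
# a PLL whose intrinsic jitter density is 50, comparing a 200 Hz corner
# (Prism-like) against a 2 kHz corner (typical).
for fc in (200.0, 2000.0):
    at_100hz = combined_jitter(100.0, fc, ext_jitter=1.0, pll_jitter=50.0)
    print(f"corner {fc:>6.0f} Hz: jitter density at 100 Hz = {at_100hz:.2f}")
```

With the high (2 kHz) corner, the clean external clock dominates further up the spectrum, so the low-frequency jitter comes out lower---which is exactly why an external clock can only help when the PLL's corner is relatively high.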
I'll be sure to include this addendum in my second edition.
-----
Now, what can you do if you do NOT have measurement equipment? This whole jitter thing is so subjective, isn't it? Well, you can be as objective as possible. Take a high-quality stereo music source (such as a 30 IPS analog tape or an SACD). Feed it into your A/D and record it into your DAW. Transfer it twice---once on internal clock, once on external---and if you have several models of external clock, transfer it several times.
Now, line up all of the transfers on different tracks in your DAW, as closely as possible.
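If you want to line the tracks up sample-accurately outside the DAW, cross-correlation will find the offset between two transfers of the same source. A minimal sketch, assuming NumPy is available; the `align_offset` helper and the simulated signals are my own illustration, not a tool from this post:

```python
import numpy as np

def align_offset(reference, other, max_lag=4800):
    """Estimate the sample offset between two transfers of the same
    source via cross-correlation, so they can be lined up exactly.
    A positive result means `other` starts later than `reference`.
    (Illustrative helper under stated assumptions.)"""
    n = min(len(reference), len(other))
    ref = reference[:n] - np.mean(reference[:n])
    oth = other[:n] - np.mean(other[:n])
    corr = np.correlate(oth, ref, mode="full")   # covers lags -(n-1)..(n-1)
    lags = np.arange(-(n - 1), n)
    window = np.abs(lags) <= max_lag             # only search +/- max_lag samples
    best = lags[window][np.argmax(corr[window])]
    return int(best)

# Simulate one transfer starting 123 samples later than the other.
rng = np.random.default_rng(0)
src = rng.standard_normal(48000)
delayed = np.concatenate([np.zeros(123), src])[:48000]
print(align_offset(src, delayed))   # → 123
```

Once you know the offset, nudge the later track earlier by that many samples and the transfers will be aligned for soloing back and forth.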
Then, with your DAW's DACs set to internal clock (which is likely to be most stable, and at least will be consistent), solo back and forth between any pair of tracks. The track with the widest and most stable image, the most solid bass, and the purest, warmest sound is the one that represents the transfer with the least jitter. Make the comparisons blind if you can.
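To keep the comparisons honest, you can randomize the playback order so you don't know in advance which transfer is which. A tiny helper sketch (purely illustrative; the `blind_pairing` function and the track names are my own invention, not from this post):

```python
import random

def blind_pairing(track_names, trials=10, seed=None):
    """Build a randomized, repeatable playback schedule for blind A/B
    comparisons between pairs of transfers. Pass a seed so an assistant
    can reproduce the order and score your answers afterwards.
    (Illustrative helper, not from the original post.)"""
    rng = random.Random(seed)
    schedule = []
    for trial in range(trials):
        a, b = rng.sample(track_names, 2)   # pick two distinct tracks, in random order
        schedule.append((trial + 1, a, b))
    return schedule

# Hypothetical track names for transfers made on different clocks.
for trial, first, second in blind_pairing(["internal", "big_ben", "other_clock"],
                                          trials=4, seed=42):
    print(f"trial {trial}: play {first}, then {second}")
```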
That's the best test you can do short of using measurement equipment. If you perform that test with your DAW that carefully and tell us about the results, then I'll believe you! And if you cite this testing method, then we will all benefit from some carefully performed tests instead of the usual half-baked claims.