Posted what I thought was a simplified explanation about sound quality in DD/DTS vs. PCM. Curious where I'm off, or whether maybe the guy correcting me is.
Some more expert insight into my misunderstanding? The other poster is an analog signal engineer, so I am guessing he knows more about this than me.
Thanks.
BB
My post:
In PCM audio (what CDs use) the bits are sent over the wire, and not only does the value of each bit (1 or 0) determine the signal, but the timing between the bits is part of the equation as well. Since the bits are not actual on/off states but an electrical signal whose voltage rises and falls sharply, there is some room for error.
If your digital cable lets interference in, or because of its electrical properties smears out the rise or fall of the voltage over time, the receiving equipment may have difficulty resolving each bit transition at exactly the right instant to interpret the signal without distortion. In effect this jitter (the arrival of each bit shifting back or forth in time by a small amount) can subtly blur or distort the shape of the analog signal derived from the converted bits. Hence the claim that some digital cables sound "better" than others. You probably need a fairly good system to hear the difference.
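To make the "blurring" concrete, here is a minimal numpy sketch (my own illustration; the 2 ns RMS jitter and the 10 kHz tone are arbitrary assumed figures, not measurements of any real gear) of how timing error at the conversion instant turns into amplitude error in the waveform:
code:
--------------------------------------------------------------------------------
import numpy as np

fs = 48_000                      # sample rate (Hz)
f = 10_000                       # test tone (Hz)
t_ideal = np.arange(fs) / fs     # one second of ideal conversion instants

rng = np.random.default_rng(0)
# Assumed figure: 2 ns RMS of random timing error on each conversion.
t_jittered = t_ideal + rng.normal(0.0, 2e-9, size=t_ideal.size)

on_time = np.sin(2 * np.pi * f * t_ideal)      # converted exactly on time
blurred = np.sin(2 * np.pi * f * t_jittered)   # converted slightly early/late

err = blurred - on_time
snr_db = 10 * np.log10(np.mean(on_time ** 2) / np.mean(err ** 2))
print(f"SNR limited by 2 ns RMS conversion jitter: {snr_db:.1f} dB")
--------------------------------------------------------------------------------
The error term scales with both the timing error and the signal's slew rate, so the same jitter hurts more on high-frequency content.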
Now Dolby Digital and DTS bitstreams are NOT handled this way. Data is sent in packets, which are collected until the correct number has been received and then transformed into an analog signal, much the way information comes over the internet. So there is no issue with whether the bits arrive at exactly the right time, as long as they arrive nearly so, all intact and in the right sequence. When they do not, you get things like loss of digital lock or dropouts, not continuous audio with degraded sound.
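A rough sketch of what I mean by buffering (the class and the four-packet threshold are just my illustration, not how any real decoder is written): packets go into a FIFO as they arrive, and a separate, stable local clock pulls them out, so arrival-time wobble never reaches the conversion step; an empty buffer shows up as a dropout, not as subtle distortion:
code:
--------------------------------------------------------------------------------
from collections import deque

class ReclockingBuffer:
    """Toy receive FIFO: ragged arrival timing in, steady local clock out."""

    def __init__(self, prefill=4):
        self.fifo = deque()
        self.prefill = prefill    # packets to collect before output starts
        self.started = False

    def receive(self, packet):
        # Packets may arrive early or late; only their presence matters.
        self.fifo.append(packet)
        if len(self.fifo) >= self.prefill:
            self.started = True

    def clock_out(self):
        # Called by the receiver's own stable clock, not by the input signal.
        if self.started and self.fifo:
            return self.fifo.popleft()
        return None               # underrun: a dropout, not subtle distortion
--------------------------------------------------------------------------------
Feed it packets at any ragged cadence and what comes out on the local clock is bit-for-bit what went in; the arrival jitter simply disappears into the queue.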
The response:
There is only ONE source of jitter in the digital (PCM) chain that is relevant: at the digital-to-analog conversion (i.e. the DAC in your receiver, preamp, etc.). Jitter anywhere else has no bearing on the sound. As long as the DAC gets it right, it makes no difference. If the DAC is dependent upon such timing, the designer of the DAC should be taken out and publicly humiliated for such an incredibly stupid design.
DD/DTS IS in fact handled the same way as PCM. The only relevant difference is that DD/DTS is (usually) buffered more heavily due to the variable bit rate (i.e. there may be periods when the incoming data rate is lower than the output rate).
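As a back-of-the-envelope sketch of why a variable bit rate forces deeper buffering (all the numbers below are made up for illustration, not DD/DTS spec values):
code:
--------------------------------------------------------------------------------
out_rate = 48_000     # frames/s the decoder must deliver, without interruption
min_in_rate = 40_000  # assumed worst-case delivery rate during a lean stretch
burst_s = 0.050       # assumed longest lean stretch (seconds)

# Frames consumed during the lean stretch minus frames delivered during it:
deficit = (out_rate - min_in_rate) * burst_s
print(f"buffer must pre-fill at least {int(deficit)} frames")  # 400 frames
--------------------------------------------------------------------------------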
Some actual examples of "high end" audio gear that mucked things up, despite the fact that proper implementations existed DECADES before digital audio was even a wet dream:
1. Using the incoming bitstream as the reference clock.
2. Poor mixed-signal layout leading to noise coupling into the clock.
3. And the best example (from Richard D. Pierce):
quote:
--------------------------------------------------------------------------------
In the process of testing, I connected a Tascam DA-30 DAT recorder to a "highly regarded and favorably reviewed" high-end DAC from a prestigious company. With certain cables, the residual noise floor on the output of the DAC was simply AWFUL. Why?
The reason was simple: you had two utterly incompetent implementations. The DA-30's SP/DIF output had miserable current drive capability: load it up with cable capacitance, and the output went into slew-rate limiting and failed to meet the rise-time requirement. The DAC, for all of its thousands of pretentious dollars, had the most MISERABLE clocking design around. The result was that, given the right cable with enough capacitance, the resulting output had oodles of jitter in it.
Now, here's the ironic thing: the DAC used was considered by many to be very "transparent" and was one of the few that, it was said, COULD reveal differences in digital cables.
Well, that's NOT what was really happening. In reality, the idiot who designed the reclocking circuitry in the DAC simply got it wrong in a seriously stupid fashion: the designer simply made a DAC clock recovery circuit that was SO sensitive to small changes in input conditions, that its performance was all over the map.
Yet, members of the high-end press praised this piece of crap for its "transparency." "It's obvious," it was said, "that anything that DOES show such large differences in cables MUST be transparent and high-resolution," when, in fact, precisely the opposite was the case.
Far less price- and name-pretentious DACs, those with FAR better clocking circuits, did NOT exhibit this ridiculous sensitivity to the cable: they were immune to the variation that the more expensive DAC simply could not handle properly.
So, if the problem exists, it's not because some of us are using DACs that are immune to these errors, it's because some of you AREN'T :-(.
Dick Pierce
Professional Audio Development
--------------------------------------------------------------------------------
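To put rough numbers on the rise-time failure described in the quote (the drive resistance and cable capacitances here are assumed for illustration, not measurements of the DA-30), using the usual first-order estimate that 10-90% rise time is about 2.2·R·C:
code:
--------------------------------------------------------------------------------
# 10-90% rise time of a weak driver (modeled as a source resistance R)
# charging the cable capacitance C: t_r ~= 2.2 * R * C.
R_ohms = 300                      # assumed feeble output drive
for c_pf in (100, 1_000):         # short cable vs. long, high-capacitance one
    t_r_ns = 2.2 * R_ohms * (c_pf * 1e-12) * 1e9
    print(f"{c_pf:>5} pF load -> rise time ~{t_r_ns:.0f} ns")
--------------------------------------------------------------------------------
The shortest pulses in a 48 kHz S/PDIF stream are only about 163 ns wide, so a ~660 ns rise time means the transitions the receiver clocks on are hopelessly smeared.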
My followup:
Understand that. But my understanding of PCM was that the signal clock is derived from the arrival timing of the incoming bitstream, and that clock data is NOT included in the digital data itself. Is this not so?
If it is so, would not anything that distorts the electrical signal in the digital cable also distort the clock timing the DAC derives, and hence the result of the transform that yields the analog waveform?
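To illustrate what I mean about cable timing leaking into the recovered clock, here is a toy first-order phase-tracking loop (entirely my own sketch, not any real receiver chip; the 1 ns input jitter and loop gains are assumed figures). How much of the input edge jitter survives depends on how twitchy the loop is:
code:
--------------------------------------------------------------------------------
import numpy as np

rng = np.random.default_rng(1)
N = 100_000
T = 325.5e-9                          # nominal bit period (~3.072 Mbit/s)
jitter_in = rng.normal(0.0, 1e-9, N)  # assumed 1 ns RMS edge jitter from cable
edges_in = np.arange(N) * T + jitter_in

def recovered_jitter(loop_gain):
    """Each recovered edge moves loop_gain of the way toward the input edge.
    Small gain = narrow loop bandwidth = more input jitter filtered out."""
    phase = 0.0
    out = np.empty(N)
    for i in range(N):
        err = (edges_in[i] - i * T) - phase  # input timing error vs. our clock
        phase += loop_gain * err             # track it, twitchily or slowly
        out[i] = phase
    return np.std(out)

for g in (0.5, 0.01):  # a twitchy loop vs. a well-filtered one
    print(f"loop gain {g}: recovered jitter = {recovered_jitter(g)*1e9:.2f} ns RMS")
--------------------------------------------------------------------------------
If that is right, a narrow, well-filtered loop (or a buffer feeding a free-running local clock) would largely reject the cable's timing wobble, while a twitchy loop would follow every wiggle on the wire.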
I know some good DACs buffer and reclock the signal for PCM, but these tend to be nicer ones ($1K range and up), certainly not the ones found in even otherwise very decent receivers.
"DD/DTS IS in fact handled the same way as PCM. The only relevant difference is DD/DTS is (usually) buffered more heavily due to the variable bit rate (i.e. there may be periods when the incoming data rate is slower than the output). "
This buffering and reclocking is what should make this type of bitstream more immune to the problem, no?
So are you saying I am incorrect about PCM jitter, or incorrect about DD/DTS immunity to jitter, or both? Or just incorrect that most DAC stages in receivers and CD/DVD players do not buffer and reclock the bitstream for PCM?
Thanks.
BB