Quote:
Originally posted by Nika Aldrich
If the noise floor is correlated to the signal itself then it isn't noise, is it?
Au contraire, it's still noise... similar to a floating point representation, the noise floor is now floating: it depends on the density of the neighboring pulses. But that's only half of the story; the other half comes from the fact that the bitstream produced by the converter is not a representation of an instantaneous sample value but a representation of the signal slope.
And this is where we go back to the question "What is a differential bitstream?"
A differential bitstream represents the difference between two points on the signal (in the DSD case, or to be precise in sigma-delta converters, the slope): it signals whether the signal increases or decreases between two consecutive samples. If the sample rate is high enough (and 2.8 MHz qualifies as high enough), this approach tracks the signal slope very accurately. Pulse Density Modulation (the raw output of a sigma-delta converter) is one type of differential bitstream; all forms of FSK modulation produce differential bitstreams too, whereas PSK, in the opposite case, forces you to do differential encoding before modulating.
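To make that concrete, here is a minimal sketch of a first-order sigma-delta loop in Python (my own toy example: real DSD modulators are 5th order or higher, and the signal here is an arbitrary 1 kHz tone). The 1-bit output follows the sign of the accumulated tracking error, so the local density of +1 pulses ends up tracking the input level:

```python
# Minimal first-order sigma-delta modulator: the 1-bit output tracks the
# signal's slope, and the local density of +1 pulses follows the input level.
import numpy as np

def sigma_delta_1st_order(x):
    """Convert a signal in [-1, 1] into a +/-1 pulse-density bitstream."""
    integrator = 0.0
    bits = np.empty(len(x))
    for i, sample in enumerate(x):
        # Quantize the accumulated tracking error to one bit...
        bit = 1.0 if integrator >= 0.0 else -1.0
        bits[i] = bit
        # ...and feed back the difference between input and output, so the
        # integrator carries the error the loop still has to correct.
        integrator += sample - bit
    return bits

fs = 2_822_400                                 # DSD64 rate: 64 x 44.1 kHz
t = np.arange(fs // 100) / fs                  # 10 ms of signal
x = 0.5 * np.sin(2 * np.pi * 1000 * t)         # 1 kHz tone at half scale
bits = sigma_delta_1st_order(x)

# A crude low-pass (64-sample moving average) already recovers the tone:
recovered = np.convolve(bits, np.ones(64) / 64, "same")
print(np.corrcoef(recovered, x)[0, 1])         # close to 1.0
```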
Now, when the converter saturates it only affects the slew rate of the signal, not an instantaneous sample value, so although the noise floor is higher and uncorrelated, this doesn't degrade the audible portion of the reconstructed signal: the quantization noise sits in the stop band of the DSD DAC's LPF at all times and for all samples, and is also below the timing sensitivity threshold of the ear.
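You can sanity-check that on the sketch above (continuing with the same bits, x and fs; a first-order toy loop shapes the noise far less aggressively than a real DSD modulator, but the effect is already visible): compare the quantization-noise power that lands in the audio band with the power parked above it, where the DAC's LPF removes it.

```python
# Where does the quantization noise of the bitstream above actually sit?
# Split the error power at 20 kHz: the portion above is what the DAC's
# low-pass filter strips away before the signal reaches your ears.
err = bits - x                                  # error of the 1-bit stream
spectrum = np.abs(np.fft.rfft(err)) ** 2
freqs = np.fft.rfftfreq(len(err), d=1 / fs)
in_band = spectrum[freqs < 20_000].sum()
out_of_band = spectrum[freqs >= 20_000].sum()
print(f"noise power above vs. below 20 kHz: {out_of_band / in_band:.0f}x")
```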
Reefman's paper (where Merging got their graphs from) shows the DSD response as it is, full of broadband noise... noise which, as I said, lies beyond the DAC's LPF.
Food for thought: which one is more sensitive to jitter, DSD or PCM?
Quote:
DSD is certainly broken in relation to a PCM system that, not using a single bit DSM, uses a multi-level DSM and randomizes the internal behavior so as to totally decouple the quantization error from the signal while also avoiding differential non-linearity, isn't it?
Yes, but that doesn't actually mean it's broken as a principle in itself; it means the quantization error can't be decoupled, only averaged over time... statistically, though, the error average should decrease over time. That whole business of randomizing behaviour to force linearity is a double-edged sword: sure, your noise floor is independent of the signal, which helps during processing; on the other hand, if the signal exhibits non-linear behavior you are screwed.
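For what it's worth, the "randomizing" the quote refers to boils down to dithering the quantizer. A hedged sketch (step size and signal level are arbitrary picks of mine): quantize a tone smaller than one quantizer step, with and without TPDF dither, and see how correlated the error is with the signal.

```python
# Why randomizing the quantizer helps: a sub-LSB tone through a mid-tread
# quantizer, with and without TPDF dither ahead of it.
import numpy as np

rng = np.random.default_rng(0)
step = 2 / 2**16                               # 16-bit quantizer step (arbitrary)
x = 0.4 * step * np.sin(2 * np.pi * np.linspace(0, 20, 200_000))

def quant_error(signal, dither):
    quantized = step * np.round((signal + dither) / step)
    return quantized - signal

# Triangular-PDF dither spanning +/-1 step: difference of two uniform sources.
tpdf = (rng.random(x.size) - rng.random(x.size)) * step

err_plain = quant_error(x, 0.0)                # undithered: error is just -x
err_dith = quant_error(x, tpdf)                # dithered: near-white noise

print("undithered error/signal correlation:", np.corrcoef(err_plain, x)[0, 1])  # ~ -1.0
print("dithered error/signal correlation:  ", np.corrcoef(err_dith, x)[0, 1])   # ~ 0.0
```

Without dither the quantizer output is stuck at zero and the error is perfectly (anti-)correlated with the signal; with dither the error decorrelates, at the price of a permanently raised, signal-independent noise floor, which is exactly the trade-off above.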
BTW, careful with Betamax vs. VHS comparisons: Beta was the better format from a technical standpoint (chroma, luminance, contrast, tape sensitivity and so on)... VHS won the war because Beta was delayed in court.