Possibly this is straying into areas that more properly belong on the techie board, but...
Why does this matter? The rate of phase change at a given frequency is the frequency-domain dual of time delay at that frequency. Hence, if the rate of phase change is the same at all frequencies, so is the time delay at all frequencies, and under that proviso there is no pulse distortion through the filter.
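In standard terms this is the group delay; writing $\phi(\omega)$ for the filter's phase response (textbook notation, not from the post itself):

$$\tau_g(\omega) = -\frac{d\phi(\omega)}{d\omega}, \qquad \phi(\omega) = -\tau_0\,\omega \;\Rightarrow\; \tau_g(\omega) = \tau_0 \text{ at every } \omega.$$

A phase that falls as a straight line with frequency delays every frequency component by the same $\tau_0$, which is exactly the no-pulse-distortion condition.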
> phase change rates get more severe if you increase the number of elements of the filter
More simply: the rate of change of phase can be as severe as you like; what matters is that it is constant (the second derivative of phase with respect to frequency is zero). A non-linear phase response, as in the elementary first-order RC analogue lowpass filter, leads to pulse distortion.
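Making the RC example concrete (standard first-order lowpass formulas, with $RC$ the time constant):

$$H(j\omega) = \frac{1}{1 + j\omega RC}, \qquad \phi(\omega) = -\arctan(\omega RC), \qquad \tau_g(\omega) = \frac{RC}{1 + (\omega RC)^2}.$$

The group delay is $RC$ at DC and falls towards zero at high frequencies, so the frequency components of a sharp pulse arrive smeared in time.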
Given enough oversampling (and therefore cost), i.e. effectively doing the DAC reconstruction filtering with a linear-phase FIR filter in the digital domain, it is possible to engineer a CD system with essentially perfect phase linearity, i.e. no pulse distortion.
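As a minimal sketch of that point (assuming numpy/scipy are available; the tap count, rates and cutoff are illustrative, not taken from any particular player):

```python
import numpy as np
from scipy.signal import firwin, group_delay

fs = 352800        # 8x the CD rate of 44.1 kHz (illustrative oversampling)
numtaps = 255      # odd length, symmetric taps -> exactly linear phase

# Linear-phase FIR lowpass of the kind used for digital-domain
# reconstruction filtering.
taps = firwin(numtaps, 20000, fs=fs)

# Group delay in samples, evaluated across the audio band.
w = np.linspace(100, 19000, 200)
_, gd = group_delay((taps, [1.0]), w=w, fs=fs)

# For a symmetric FIR the delay is the same constant, (numtaps - 1) / 2
# samples, at every frequency: no pulse distortion.
print(f"group delay: {gd.min():.6f} .. {gd.max():.6f} samples")
print(f"expected:    {(numtaps - 1) / 2} samples")
```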
Unless I'm mistaken, the problem Gordon is referring to arises even with a perfectly flat, linear-phase band-limited system (i.e. a "perfect" CD system). Hard band-limiting itself is the problem. Fiddling with the phase delay of lowpass filters to deliberately engineer pulse distortion into them is an attempt to compromise your way out of the ringing that hard band-limited systems induce on the hard edges of acoustic sources (e.g. an orchestral slam) whose bandwidth is greater than that of both the recording system and the human ear. More simply: recording system = hard band-limited, human ear = soft band-limited, mismatch = problem.
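A small sketch of that ringing (numpy only; the edge and cutoff are made up for illustration): even a zero-phase, perfectly flat brick-wall filter rings on both sides of a hard edge.

```python
import numpy as np

n = 4096
x = np.zeros(n)
x[n // 2:] = 1.0        # an idealised hard edge (the "orchestral slam")

X = np.fft.rfft(x)
f = np.fft.rfftfreq(n)  # normalised frequency, 0 .. 0.5
X[f > 0.1] = 0.0        # brick-wall lowpass: flat passband, zero phase

y = np.fft.irfft(X, n)

# Gibbs ringing: the output oscillates before and after the edge, and
# the ~9% overshoot does not shrink as the cutoff is raised.
print(f"overshoot: {100 * (y.max() - 1.0):.1f}%")
```

Because the filter here is zero-phase, half of the ringing lands before the edge; pushing the phase around (e.g. towards minimum phase) only moves ringing to after the edge, it does not remove it.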