Q. I read and enjoyed the recent article on audio from video players and how video players' DACs are sufficiently good that there is no audible difference between them and dedicated CD players. This triggered a follow-up question. There are companies that advertise modifications to Blu-ray players which include new analog output circuit boards to increase audio fidelity. According to these sources, the enemy is digital jitter. They claim that because an HDMI input necessarily interweaves audio and video information, the result will always be digital jitter, up to "7 nanoseconds" worth. Most of that is engineering-speak that is over my head. My question is whether this is true, and more important, does it matter? I don't doubt that they can show on paper that some aspect of the audio has improved, but is it likely that I will be able to hear a difference? Sometimes I feel like companies have consumers continuously seeking the Holy Grail of perfect specs, but that in reality, some of these expensive products and tweaks will not result in increased "real world" fidelity or enjoyment for 99.9% of those consumers. Thanks.—D.H.
A. Thanks very much for your compliments on my recent article.
As to digital "jitter": it does exist and it is measurable, but it should be placed in the proper context. Jitter is essentially a timing error in the data that's read from a CD and converted back to an analog signal in the DAC (digital-to-analog converter). But get this: jitter is measured in picoseconds and nanoseconds. One nanosecond is a billionth of a second; a picosecond is one-thousandth of a nanosecond. These timing errors, if and when they occur, are so minute that they show up in lab tests only when deliberately magnified enough to be measurable. In terms of audibility, they appear as tiny amounts of distortion, typically 80 dB to 100 dB below the audio signal level. Essentially, they are buried in the residual noise floor of the digital medium.
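For readers who want to see where numbers like "80 dB down" come from, here is a minimal back-of-the-envelope sketch (my own illustration, not anything from the modification vendors) using the standard first-order approximation for jitter on a full-scale sine wave: the error-to-signal ratio is roughly 2π·f·tj, where f is the audio frequency and tj is the RMS timing jitter. The 100-picosecond case below is my own illustrative ballpark for a competently engineered modern player, not a measurement of any particular unit.

```python
import math

def jitter_error_db(freq_hz: float, jitter_rms_s: float) -> float:
    """First-order estimate of jitter-induced error on a full-scale sine
    wave: error/signal ratio ~ 2*pi*f*tj, expressed in dB relative to the
    signal. Standard small-jitter approximation; assumes a sine input."""
    return 20.0 * math.log10(2.0 * math.pi * freq_hz * jitter_rms_s)

# The "7 nanoseconds" worst case cited in the question, at several audio
# frequencies, versus an illustrative ~100 picoseconds (my assumption):
for jitter_s, label in ((7e-9, "7 ns"), (100e-12, "100 ps")):
    for f in (1_000, 10_000, 20_000):
        print(f"{label:>7} jitter at {f:>6} Hz: "
              f"{jitter_error_db(f, jitter_s):7.1f} dB re signal")
```

Running this gives roughly -87 dB at 1 kHz, -67 dB at 10 kHz, and -61 dB at 20 kHz for the vendors' worst-case 7 ns figure, and about -124 dB, -104 dB, and -98 dB for the 100 ps case. Note that even the worst case only climbs toward -61 dB at 20 kHz, a frequency where full-scale musical content is essentially nonexistent and the ear is at its least sensitive; at picosecond-level jitter, the artifacts sit at or below the -80 to -100 dB range described above.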
From a practical point of view, it's impossible to detect distortion anomalies at such minute levels, especially with real-world musical content. I'm highly suspicious of companies claiming that their expensive modifications to existing Blu-ray players will increase audio fidelity.
I have yet to hear an audible artifact caused by digital timing errors, or "jitter".
It's likely that many audiophiles familiar with analog wow and flutter from the bad old days of vinyl playback and analog tape transports somehow equate "jitter" with those same phenomena, even though jitter operates on timescales roughly a million times shorter. The very word sounds bad: "jitter"! Yikes!
Your last statement is entirely correct. Virtually all of these after-market tweaks and expensive mods are aimed at wealthy, gullible audiophiles who are easily persuaded to part with their money. And of course, absent any proper blind A/B comparison, those who have paid a lot will "hear" a difference. – A.L.