
Main Area and Open Discussion > General Software Discussion

Monster Cables- The World should know!


One *could* say that the more expensive digital cables sound better because one paid more and is listening more closely...

Ok - Devil's advocate session is over...


.. There is no arguing this ...-f0dder (March 22, 2008, 09:07 AM)
--- End quote ---
- guess what?!

This is a constant topic among audiophiles...
--- End quote ---

Your contra-arguments have been used ever since the CD was first launched, and still they improve the digital media. How can they keep improving something that is "perfect"? Because it is not perfect, I would say.

 I really don't care if you can measure the difference in digital audio cables or not. The difference is there -Curt (March 22, 2008, 08:05 AM)
--- End quote ---

- but you may keep your attitude on this, and I will keep mine.

... For thirty years I was a hifiholic. I already have had this 'debate' too often. -Curt (March 22, 2008, 08:05 AM)
--- End quote ---

Other than this, I am glad sound quality has improved even more.

The way I understand it goes something like this...

Higher quality cables help do 2 things:
- reduce internal and external interference, and
- transmit the signal without loss

In the analog world the topic of "are those expensive cables worth it?" is certainly debatable. Personally I do not have the ears to tell the difference so I get the mid-range stuff (not the cheapest that anyone could ever get away with, but not $20 a foot either).

BUT when it comes to digital it's a whole new ball game. Your HDMI source, let's say a Blu-ray player, is more or less pumping out "1s" and "0s", or more accurately "Ons" and "Offs". All that matters is whether the device on the other end of the cable can tell if it's been delivered a "1" or a "0". I don't know how much influence interference can have on the signal, but with digital it would have to be enough to turn the 0s and 1s into "0.5s" (a bit where the receiver cannot tell if it's "on" or "off").
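To illustrate the "0.5s" point, here is a toy Python simulation (my own illustration, not how HDMI actually encodes data): bits are sent as nominal 0 V / 5 V levels, random noise is added, and the receiver decodes each level against a threshold. As long as the noise stays below the decision margin, every bit comes back exactly.

```python
import random

def transmit(bits, noise_amplitude):
    """Map bits to nominal 0 V / 5 V levels and add random noise."""
    return [(5.0 if b else 0.0) + random.uniform(-noise_amplitude, noise_amplitude)
            for b in bits]

def receive(voltages, threshold=2.5):
    """Decode each noisy voltage back to a bit by comparing to the threshold."""
    return [1 if v > threshold else 0 for v in voltages]

bits = [random.randint(0, 1) for _ in range(10_000)]

# Noise of up to 2 V never crosses the 2.5 V decision margin,
# so the received bitstream is identical to what was sent.
assert receive(transmit(bits, noise_amplitude=2.0)) == bits
```

Push the noise amplitude past the margin, though, and bits start flipping; that cliff-edge behavior is exactly the "it works or it doesn't" property of digital links.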

I'm reminded of this image from SpinRite, an excellent data recovery tool.

The progression of different digital media (CDs, DVDs, HD stuff, etc.) is a different topic altogether. For HDMI cables the only question is: do the 1s and 0s make it to the other end or not? With digital, this is easily tested scientifically.

Hirudin: please don't bring spinrite into this, it's snake-oil :), and mr. Gibson only has buzzwords and "it's super secret advanced tech!" to say, nothing quantitative.

Curt: as for improving digital media, there are a couple of factors. For CD players alone, there are a few. One is that audio CDs don't have synchronization tracks, which means you can't 100% accurately re-position (or even position) the laser. There's also the quality of the error-correction circuitry (and sadly, with copy-protected CDs, that circuitry IS necessary even if your CDs aren't scratched...). And if you don't have your CD player hooked to your amp with a digital connection, the D/A converter quality has quite something to say as well, and you might even have interpolation/up-sampling involved, too.
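On the interpolation point: when a sample can't be read and error correction gives up, a player can conceal the dropout by guessing from the neighbouring samples. A minimal Python sketch (the function name and the simple averaging rule are my own illustration, not any player's actual concealment circuitry):

```python
def interpolate_missing(samples, missing_index):
    """Replace an unreadable sample with the average of its two neighbours --
    a crude stand-in for the concealment real error circuitry performs."""
    left = samples[missing_index - 1]
    right = samples[missing_index + 1]
    return (left + right) / 2

samples = [0, 100, 200, 300, 400]
# Pretend index 2 was unreadable. On this linear ramp the guess is exact,
# but on real music it is only an approximation -- audible degradation.
print(interpolate_missing(samples, 2))  # → 200.0
```

The point is that interpolation produces plausible audio, not the original audio, which is one reason player quality can differ even though the disc data is digital.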

If you use a digital connection between your player and your amp, some of those points-of-degradation are eliminated, but not all of them.

High-definition formats are, afaik, stored on data discs with proper sync info. I haven't looked into the audio/video container formats used on those media, but I wouldn't be surprised if there's checksum info included. This eliminates some of the error points you have with CDs.
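On the checksum point: the idea is simply to store a short digest alongside the data and recompute it on read; a mismatch proves the read was bad. A hedged Python sketch using CRC-32 (the payload bytes are made up, and real disc formats use their own error-detection codes, not zlib's CRC):

```python
import zlib

payload = b"audio frame bytes read from the disc"
stored_checksum = zlib.crc32(payload)   # written alongside the data at mastering time

# On playback, recompute and compare; equality means the frame arrived intact.
assert zlib.crc32(payload) == stored_checksum

# A single corrupted byte changes the CRC, so the error is detected.
corrupted = b"audio frame bytes read from the d1sc"
assert zlib.crc32(corrupted) != stored_checksum
```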

Then, you add in an HDMI connection instead of a lossy analog connection, and you either have bit-perfect transfer or you have nothing. Cable quality does affect how long a cable run you can get away with, but it's not like paying 3x for a 1m cable will yield any difference in quality. Either it works at full quality, or it doesn't.

For analog transport, it's a somewhat different matter, but I'm with Renegade here. Self-suggestion and product buy-in are the "quality" factors here, as long as you go for a certain minimum quality.

Your contra-arguments have been used ever since the CD was first launched, and still they improve the digital media. How can they keep improving something that is "perfect"? Because it is not perfect, I would say.
-Curt (March 22, 2008, 11:32 AM)
--- End quote ---
Curt, you're misapplying that evidence. The CD itself is a perfect representation of the data stored on it. In that respect there is no improvement to be had with DVD, Blu-ray, FLAC, or MP3, regardless of bitrate. So long as the media isn't damaged, it's objectively a perfect representation.

On the other hand, we must question what it is a perfect representation of. We've found that its input data contains imperfections that are more perceivable than expected, so we have invented ways of improving the data -- higher sampling rates, greater resolution, etc. Similarly, there may be room for improving the quality of the electronics that turn the CD-stored digital data into audio (the DAC, amplification, speakers, etc.).
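On the resolution point: more bits per sample means finer quantization steps, so each stored value sits closer to the true one. A toy Python sketch (the quantizer is an idealized rounding model of my own, not any real ADC):

```python
def quantize(x, bits):
    """Quantize a value in [-1, 1] to the nearest level an N-bit sample can hold."""
    levels = 2 ** (bits - 1)
    return round(x * levels) / levels

x = 0.123456789
err16 = abs(quantize(x, 16) - x)   # CD-style 16-bit resolution
err24 = abs(quantize(x, 24) - x)   # "improved" 24-bit resolution

# The worst-case 24-bit error is 256x smaller than the 16-bit one.
assert err24 < err16
```

This is an improvement to what goes *onto* the disc, not to the disc's ability to store and return bits, which was already perfect.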

Any indictment of the CD's signal quality must look to the quality of what it's given and how its output is rendered. Yet the CD itself is unquestionably perfect (given undamaged media). Indeed, any of the subsequent improvements that I mentioned -- sampling rate and resolution -- can be encoded onto CD media as data, and recovered with perfection.

It's the same thing with digital cabling. The input signal might be questionable, and after the signal has traversed the cable, the output systems can be questioned. But as f0dder and Edvard have said, the perfection of a cable carrying a digital signal within its specifications (distance, input voltage, etc) can be easily verified.

That's the beauty of digital. There are no fine gradations of quality. It does its job or it doesn't, and you can immediately tell which.

If this weren't true, then Monster would have to be supplying all sorts of fancy wires inside your computer as well, and those little copper traces on the PC boards wouldn't work. But no: as long as the wire carries something reasonably close to 3.3V or 5V or whatever technology you're using, it is interpreted as if it is exactly that voltage; any slight variations are inherently erased. If the incoming signal is 3.5V or 3.1V, it doesn't matter; the behavior of the system is absolutely identical to how it would have been at 3.3V. Were this not true, modern computers could not function.

Edit: f0dder's comment about laser positioning is correct, but not necessarily relevant. Using ExactAudioCopy I can ensure that I'm getting the precise recorded data, so it is possible to retrieve the exact input data from the CD. More generally, the CD can also carry data, and if this weren't repeatably and precisely retrievable, then it would be useless as a mechanism for storing software and other files. Yet it works well for exactly that usage.

