Your contra-arguments have been used ever since the CD was first launched, and still they improve the digital media. How can they keep improving something that is "perfect"? Because it is not perfect, I would say.
-Curt
Curt, you're misapplying the data.
The CD itself is a perfect representation of the data stored on it. There's no improvement with DVD, Blu-Ray, FLAC, or MP3 regardless of bitrate. So long as the media isn't damaged, it's objectively a perfect representation.
On the other hand, we must question what it is a perfect representation of. We've found that
its input data contains imperfections that are more perceptible than expected, so we have invented ways of improving the data -- higher sampling rates, greater bit depth, etc. Similarly, there may be room for improving the quality of the electronics that turn the CD-stored digital data into audio (the DAC, amplification, speakers, etc.).
Any indictment of the CD's signal quality must look to the quality of what it's given and how its output is rendered. Yet the CD itself is unquestionably perfect (given undamaged media). Indeed, any of the subsequent improvements I mentioned -- higher sample rates and greater bit depth -- can be encoded onto CD media as data, and recovered with perfection.
It's the same thing with digital cabling. The input signal might be questionable, and after the signal has traversed the cable, the output systems can be questioned. But as f0dder and Edvard have said, the perfection of a cable carrying a digital signal within its specifications (distance, input voltage, etc.) can be easily verified.
That's the beauty of digital. There are no fine gradations of quality. It does its job or it doesn't, and you can immediately tell which.
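That verifiability is easy to demonstrate. Here's a small sketch (Python, using an arbitrary made-up payload) of why digital corruption can't hide: flip even a single bit and a CRC32 checksum no longer matches.

```python
# Sketch: digital transfers are pass/fail. Flip one bit in a payload
# and a CRC32 checksum immediately disagrees -- no "slightly worse" data.
# The payload here is arbitrary test data, not anything from a real disc.
import zlib

payload = bytes(range(256)) * 4          # 1 KB of arbitrary test data
sent_crc = zlib.crc32(payload)

# Perfect transfer: the checksum matches, the link did its job.
assert zlib.crc32(payload) == sent_crc

# Corrupt a single bit anywhere and the mismatch is unmistakable.
corrupted = bytearray(payload)
corrupted[100] ^= 0x01                   # flip the lowest bit of byte 100
assert zlib.crc32(bytes(corrupted)) != sent_crc
```

Real links (CDs included) use the same idea with stronger error-detection and correction codes, but the pass/fail nature is identical.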
If this weren't true, then Monster would have to be supplying all sorts of fancy wires inside your computer as well, and those little copper traces on the PC boards wouldn't work. But no: as long as the wire carries something reasonably close to 3.3V or 5V or whatever technology you're using, it is interpreted as if it is
exactly that voltage; any slight variations are inherently erased. If the incoming signal is 3.5V or 3.1V, it doesn't matter; the behavior of the system is absolutely identical to how it would have been at 3.3V. Were this not true, modern computers could not function.
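To illustrate the thresholding, here's a toy model in Python. (The half-rail threshold is a simplification; real parts specify V_IH/V_IL bands, but the principle is the same.) 3.1V, 3.3V, and 3.5V all read as the same logic 1 -- the variation is erased.

```python
# Toy model of a 3.3V logic input: anything above the threshold reads as 1,
# anything below reads as 0. The half-rail threshold is a simplification
# of real V_IH/V_IL specifications.
V_RAIL = 3.3

def read_logic_level(volts: float, threshold: float = V_RAIL / 2) -> int:
    """Interpret an analog voltage as a digital bit."""
    return 1 if volts >= threshold else 0

# 3.1V, 3.3V, and 3.5V are all treated as exactly a logic 1.
assert read_logic_level(3.1) == read_logic_level(3.3) == read_logic_level(3.5) == 1
assert read_logic_level(0.2) == 0
```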
Edit: f0dder's comment about laser positioning is correct, but not necessarily relevant. Using ExactAudioCopy I can ensure that I'm getting the precise recorded data, so it is possible to retrieve the exact input data from the CD. More generally, the CD can also carry arbitrary data, and if that data weren't repeatably and precisely retrievable, the CD would be useless as a mechanism for storing software and other files. Yet it works well for exactly that purpose.
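If you wanted to check this yourself, you could hash two separate rips of the same disc and compare the digests: identical hashes mean bit-identical data. A sketch (Python; the file names are hypothetical):

```python
# Sketch: verifying that two rips recovered identical data by comparing
# SHA-256 digests. File names below are hypothetical examples.
import hashlib

def sha256_of(path: str) -> str:
    """Hash a file in chunks so large rips don't need to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

# If the digests match, the two rips are bit-for-bit identical:
# sha256_of("rip_drive_a.wav") == sha256_of("rip_drive_b.wav")
```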