I have always had the idea that it is because software developers employed by hardware companies are a different breed. For most of their work they have to focus on not making any irrecoverable errors. If you write a desktop application and introduce a bug, it might crash that one application and annoy people, but they can just uninstall it -- if you write a hardware driver and introduce a bug, it could irreparably damage the hardware or require a somewhat complicated process to get the machine working again.
I assume this is why desktop software is fancy-looking, sleek to use, and full of bugs, while driver software is usually ugly and a pain to use, but generally doesn't blow up your machine.
It has gotten a lot better in many cases, though -- I suspect some companies are now employing a bit of a mix.