I think we're starting to have two separate conversations here. I was discussing what I believe are factors contributing to the inertia surrounding any attempts to radically change the Linux "standard" file hierarchy.
Linux is a kernel, not a guideline, nor a standard.
I was using the word Linux in its colloquial sense, as a term referring to the entire environment (i.e. the kernel, the GNU codebase, the user/developer community). So in this sense, it is a guideline. Or a convention, or a philosophy, or anything else you may want to call it. But the one thing that it is not is a 'standard.'
People have argued for 'standardizing' Unix for years without success. There have also been some halfhearted attempts to do the same with Linux. To date, it hasn't gotten very far. While most developers in the Linux environment recognize the need for some standards, very few seem willing to agree on just what those standards should be. So in place of a standard, there is now a corpus of "generally accepted ways of doing things."
Sure sounds like a guideline to me.
And yes, POSIX is a standard. Mostly on paper, IMHO, because in actual use it has also become more of a guideline. Even if you ignore actual deployments, only a relatively small number of OSs (BSD, OSX, Solaris, and a few others) completely comply with it. Most shoot for being "mostly compliant": they follow POSIX standards where there is a clear benefit in doing so, and deviate from them when there isn't. Microsoft implements what POSIX compliance it has via a compatibility feature.
It has also been noted that there is a fair amount of ambiguity in the POSIX standard itself. Furthermore, an OS does not need to implement every function in the standard in order to be certified as compliant with it; a "significant fraction" is sufficient for certification. So maybe POSIX isn't as much a 'standard' (in the colloquial sense) as many people think?
Re: #2 (BTW, since you "won't even comment on 2," should I ignore the next three sentences? Sorry... couldn't resist. Still friends, right?)
I was using the word amateur in its original sense: one who engages in a pursuit, study, science, or sport as a pastime rather than as a profession. I did not mean to imply anything negative by it.
So in that sense, yes - much of what gets developed in the Linux environment is done by amateurs, who may also be professional developers elsewhere. Linus Torvalds himself still often refers to Linux as his "hobby-thing."
The point I was trying to make was that one contributing factor to the 'inertia' is that most Linux developers are not doing it for a living. For many, there is a limit to the time and energy they can put into working on their hobby, so to speak. They are also not centralized, as they would be if working for a company. Each developer is working at his/her own pace on whatever he/she feels is most important.
And if they're like most people, they tend to be reluctant to do a rewrite of their working code just to comply with some vague standard unless they have to. In this environment, the general feeling is "What's good is what works!"
Which in turn leads to a lot of (dare I say it?) 'quick & dirty' development. Maybe I should have said 'recursive improvement cycles.'
If developers want their apps to run on your distro, they had better stick to good coding practices. It's the only way to change the situation. Every time a compatibility layer is added, developers have less incentive to change their ways.
Agree 100%. Which leads me back to numbers 1, 2, and 3 above...