In my experience working as a programmer, systems analyst, IT project manager and applications development manager, there are probably few or no mistakes one is likely to find in this or any other discussion of errors made by skilled programmers that would be statistically "uncommon", and that will likely remain the case until human error is eliminated from the process of writing code.
One of the more moronic statements I came across in IT management was at a meeting where a senior applications development manager, whose team had been hit with a rash of coding errors that manifested themselves only after operational implementation, said "We need to have a Zero Error target" instead of "We need to improve pre-production release testing". It was a gob-smackingly ignorant statement.
(The analyses of errors in program coding indicated that it is not a process in statistical control, and even if it were, or could be, a process in statistical control, the main source of error would still be human error, which is impossible to eliminate unless you eliminate the human element.)
This has happened to me many times: either the compiler or a run-time error points out my mistake.
I think the mind really gets 'tired' of repeated copy-paste-change work and starts registering 'Oh come on, it's done' before the last line is actually finished.
- and that could well be true. We don't really know what actually causes it in our heads.
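A minimal sketch (in Python, purely illustrative; the function and names are hypothetical) of the kind of slip described above: a block is copied, pasted, and partially edited, but one occurrence of the old variable name survives. When the stale name no longer exists at all, the compiler or runtime flags it (a NameError in Python, an undeclared-identifier error in a compiled language); when it happens to still be in scope, as here, the bug slips through silently.

```python
def summarize(xs, ys):
    """Return the (range of xs, range of ys) for two lists of numbers."""
    x_min = min(xs)
    x_max = max(xs)
    x_range = x_max - x_min

    # The block below was copied from the one above and renamed x -> y,
    # but one occurrence of the old name was never updated.
    y_min = min(ys)
    y_max = max(ys)
    y_range = y_max - x_min   # BUG: should be y_min; copied line not fully edited

    return x_range, y_range

print(summarize([1, 3], [10, 20]))  # (2, 19) -- the correct answer is (2, 10)
```

Because `x_min` is still in scope, nothing warns you; the wrong number simply comes out. It is exactly the last line of the pasted block, the one the mind had already registered as 'done', that carries the error.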
One thing is for sure: programming requires a collection of learned skills and habits of mind that are products of our socio-cultural development, rather than being associated with a necessary natural survival characteristic. If programming were associated with a necessary natural survival characteristic, then there would very probably be fewer common coding errors!
For example, you could observe which areas of the brain show activity on an EEG (electroencephalograph) whilst a skilled programmer was coding, and you would probably see no activity in those areas of the brain associated with survival, because there is no danger.
You could contrast programming with (say) a skilled juggler juggling 3 spherical objects. Juggling, like programming, is a learned skill, and similarly, you could observe the juggler's EEG whilst they juggled the 3 spherical objects, and again you would probably see no activity in those areas of the brain associated with survival.
However, using the same skilled juggler, if you then substituted 3 very sharp knives for the 3 spherical objects, the EEG would show the juggler's brain lighting up very rapidly all over the place, including those areas associated with survival, which is what one might intuitively expect. We are designed by our DNA and evolution to survive; all those who weren't simply dropped out of the gene pool. Once a threat to survival comes into it, our brain knows very well what is at risk, and literally "takes over".
Now, I haven't studied this closely, but it would be interesting to see the statistics for the same skilled juggler: the number of mistakes made whilst juggling 3 spherical objects versus the number made whilst juggling 3 very sharp knives.