Author Topic: User Requirements - the risks. (Read 2513 times)

IainB

  • Supporting Member
  • Joined in 2008
  • Posts: 7,540
User Requirements - the risks.
« on: December 17, 2017, 10:24 PM »
I have often struggled to understand the problems described by users when they state their requirements - i.e., what they actually say they "want to do" or "need to do". That is usually only what they think they want or need, whereas the best solution may be to fulfil a need they are unable to articulate, but which can usually be discovered through some business/systems analysis.

Until one has performed that analysis, one can, at best, only assume that one understands the user's requirements. It's a potential minefield of errors - and risks. I have seen the mismanagement of these risks effectively bring about the failure of very large projects, and sometimes whole companies, or cause layers of top management to be fired/made redundant, together with the staff who had been correspondingly mismanaged.
I was once fortunate enough to discover one such risk, embedded in a large (7-year) World Bank-funded government project in Thailand. This was in Year 2, before the risk had fully manifested itself, and corrective action was taken to mitigate it, with the project concluding successfully. However, at the time, it wasn't easy to get the Thai management's attention on the potential risk, because they were in a state of cognitive blindness and simply couldn't see it. I literally had to force their attention onto it, and only then did they start to see it - you could almost see the lights coming on in their heads. Then shocked silence, followed by prompt corrective action to mitigate the risk(s) identified and already being manifested. It was a really interesting demonstration of human group behaviour and adaptability to risks inherent in dynamically changing circumstances.

I was reminded of this when re-reading the attached document today:
Requirements Risks Can Drown Software Projects (2002) - leishman.pdf
(The file was originally downloaded from a now-defunct link at <http://www.stsc.hill.af.mil/crosstalk/2002/apr/leishman.pdf>, which is not available via the Wayback Machine.)
« Last Edit: December 17, 2017, 10:45 PM by IainB »