A software project is all about unknowns. At the beginning of a software project, the project charter takes a very simplistic view of the final product and tries to estimate the dollar amount (because, as we learned, the project sponsor always looks at the bottom line, which is the dollar cost of the project). At this stage, the large number of unknowns creates large uncertainty in the estimate. But as the project moves into the deeper levels of planning and implementation, more of the unknowns become known, and the uncertainty in the estimate shrinks compared to the previous stage. This phenomenon is described by the concept of “The Cone of Uncertainty”, originally used in the chemical industry by the founders of the American Association of Cost Engineers (now AACE International), which gained wide popularity after being featured in Steve McConnell’s famous book “Software Estimation: Demystifying the Black Art”.
Figure: The Cone of Uncertainty (from www.agilenutshell.com)
According to the above graph, it is evident that the later you are in the project life cycle, the better the estimate. But the catch-22 of this reality is that no one gets to move a stage closer to certainty unless an initial estimate (which is bound to be inaccurate due to the high error margin) is given at project inception.
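To make the shape of the cone concrete, here is a minimal sketch that applies the variability multipliers commonly cited for the Cone of Uncertainty (roughly 0.25x–4x at initial concept, narrowing toward 1x as the project completes) to a single point estimate. The phase names and exact multipliers below are illustrative assumptions, not values taken from this post or from the graph above.

```python
# Illustrative sketch: how one point estimate translates into a low/high
# range at different project phases, using commonly cited Cone of
# Uncertainty multipliers. Phase names and numbers are assumptions.

CONE_MULTIPLIERS = [
    # (phase, low multiplier, high multiplier)
    ("Initial concept",             0.25, 4.0),
    ("Approved product definition", 0.5,  2.0),
    ("Requirements complete",       0.67, 1.5),
    ("Design complete",             0.8,  1.25),
    ("Code complete",               0.9,  1.1),
]

def cone_range(point_estimate, phase_table=CONE_MULTIPLIERS):
    """Return (phase, low, high) tuples for a point estimate (cost or effort)."""
    return [(phase, point_estimate * lo, point_estimate * hi)
            for phase, lo, hi in phase_table]

if __name__ == "__main__":
    for phase, lo, hi in cone_range(100_000):  # e.g. a hypothetical $100k point estimate
        print(f"{phase:30s} ${lo:>10,.0f} - ${hi:>10,.0f}")
```

Note how wide the range is at inception (a factor of 16 between low and high in this illustration) compared to the range near the end, which is exactly why a single number quoted in the project charter should not be read as a commitment.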
So, can this cone be beaten? As Steve McConnell put it: “Consider the effect of the Cone of Uncertainty on the accuracy of your estimate. Your estimate cannot have more accuracy than is possible at your project’s current position within the Cone. Another important – and difficult – concept is that the Cone of Uncertainty represents the best-case accuracy that is possible to have in software estimates at different points in a project. The Cone represents the error in estimates created by skilled estimators. It’s easily possible to do worse. It isn’t possible to be more accurate; it’s only possible to be more lucky”. The only option left is to live with it. Below are some techniques for dealing with that reality:
- Be honest and upfront about the reality. It may not be received as a positive gesture initially, but be truthful about the risk of estimating with an expectation of high accuracy. If the project sponsors can be convinced of the reality of software projects (perhaps by showing the history of past software projects within the organization), padding the final numbers may give everyone sufficient wiggle room
- Attach the variability to the estimate when presenting it to the project sponsors. There is absolutely no benefit to anyone on the project in surprising the stakeholders
- Actively work on resolving the uncertainty by making the unknowns known. It is the estimation team's responsibility to force the cone of uncertainty to narrow. Without active and conscious effort, the cone will not narrow on its own as the project progresses through its life cycle
Let's take a postmortem look at why we have this cone of uncertainty in our software projects. The single biggest reason for the uncertainty is that an estimate is really a prediction (or forecast) of the capabilities of software developers, and human behavior is by nature unpredictable. The same person can behave differently depending on the presence or absence of surrounding factors: a programmer may come up with the solution to a complex programming problem in a few minutes, while the same person may struggle to solve a less complex problem at another time. The entire game of prediction is bound to fall apart when it tries to predict the most unpredictable thing of all, human psychology. So the strategy shouldn't be to try to hit the bull's-eye with a single estimation value, but rather to maximize the chance of coming close to the actual outcome by using techniques such as range values, plus-minus factors, and confidence level factors. That's why it is sometimes said that we don't have failed projects, just failed estimates.
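As one way of picturing what range values, plus-minus factors, and confidence level factors might look like when an estimate is presented, here is a minimal sketch. The helper names and the 90% confidence figure are illustrative assumptions, not a standard described in this post; the point is simply that the same best-case/worst-case pair can be communicated in any of these forms instead of as a single number.

```python
# Minimal sketch: presenting one estimate three ways instead of as a
# single point value. Function names and the confidence figure are
# illustrative assumptions.

def as_range(low, high):
    return f"{low}-{high} weeks"

def as_plus_minus(low, high):
    mid = (low + high) / 2
    return f"{mid:g} weeks +/- {(high - low) / 2:g}"

def as_confidence(low, high, confidence=0.90):
    return f"{int(confidence * 100)}% confident it lands between {low} and {high} weeks"

if __name__ == "__main__":
    low, high = 8, 14  # hypothetical best-case / worst-case effort in weeks
    print(as_range(low, high))        # "8-14 weeks"
    print(as_plus_minus(low, high))   # "11 weeks +/- 3"
    print(as_confidence(low, high))   # "90% confident it lands between 8 and 14 weeks"
```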
Though we live with this cone of uncertainty in every software project, we have somehow been oblivious to it, pretending it didn't exist. I hope we won't be from now on. In my next posts, I will talk about a model selection framework that helps identify a standard estimation model, and then I will provide some helpful tips and techniques on how to better communicate your estimates with confidence, following industry best practices.