How to improve your agile estimation by embracing uncertainty

Agile estimation abuse is everywhere

It’s no secret that in a world of complex environments and agile delivery, the practice of estimation is seen as a necessary evil. This can be traced back to a variety of reasons, but ultimately the culture of an organisation dictates how its employees approach the exercise.

Too often I’ve seen delivery managers manipulate, hesitate or flat out refuse to give ‘the powers that be’ a rough indication of when they intend to ship their latest product release. This comes down to the reliance on forecasting: senior leaders naturally want an understanding of where the company expects to position itself in the future, so they can plan around it. That intent is perfectly healthy, but we veer off the tracks when people are held accountable for the accuracy of the estimates they provide.

If things don’t go to plan, the finger of blame is pointed at the team delivering the work. After all, they had full control over their destiny, right? Wrong! This is what most working environments, especially large ones, seem to gloss over, and it can have a negative impact on their success over the long term.

My face when the boss tells me “Hey mate, can you give me the estimates for all the features in your 5-year roadmap, so I can take that into the next board meeting on Monday?”

If you’re not 100% sure, you’re doing Agile estimation correctly (probably)

If you’re familiar with the Cynefin framework, you’ll know that most companies are building their products and providing their services in a complex system of work. That inherently means we can never be guaranteed an outcome, no matter how long we spend analysing the system.

Therefore, because we are operating amidst uncertainty, any estimate we provide has a degree of ‘confidence’ attached to it. For example, if you were asked to provide 10 estimates for your next 10 releases, and you were 90% confident about those estimates, then 9 out of 10 outcomes should occur within the windows provided. Similarly, if you were given 10 different deadlines for 10 different activities, and you were 90% confident you could meet those targets, then you should only fail to meet the deadline once out of the ten times you performed those activities.
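To make that concrete, here’s a minimal Python sketch of a calibration check. The release names, delivery windows and ship dates are all made up for illustration; the point is simply to compare the confidence you claimed with the hit rate you actually achieved.

```python
from datetime import date

# Hypothetical 90%-confidence delivery windows and actual ship dates
# (all names and dates are invented for illustration).
estimates = [
    ("Release A", date(2024, 2, 1), date(2024, 3, 15), date(2024, 3, 10)),
    ("Release B", date(2024, 4, 1), date(2024, 5, 20), date(2024, 5, 27)),  # shipped late
    ("Release C", date(2024, 6, 1), date(2024, 7, 10), date(2024, 6, 28)),
]

stated_confidence = 0.90

# Count how many actual ship dates landed inside their estimated window.
hits = sum(1 for _, low, high, actual in estimates if low <= actual <= high)
hit_rate = hits / len(estimates)

print(f"Stated confidence: {stated_confidence:.0%}")
print(f"Actual hit rate:   {hit_rate:.0%} ({hits} of {len(estimates)})")

if hit_rate < stated_confidence:
    print("Overconfident: your windows are narrower than your track record supports.")
elif hit_rate > stated_confidence:
    print("Underconfident: your windows are wider than they need to be.")
else:
    print("Well calibrated, at least on this sample.")
```

With three estimates and one miss, the hit rate is 67%, well short of the stated 90%, which is the signal to widen future windows.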

Now, as if all this complexity wasn’t complex enough, we need to overlay the ‘human’ factor onto estimation. Daniel Kahneman, winner of the Nobel Memorial Prize in Economic Sciences and author of the famous book ‘Thinking, Fast and Slow’, explored this in great detail with his long-time collaborator Amos Tversky. In short, cognitive biases and illusions distort and warp our view of the world, and therefore our ability to predict what happens next, to the point where virtually all people can be classified as overconfident or underconfident in their estimation patterns over a given length of time. Going back to the previous paragraph, this means we consistently either overshoot or undershoot the mark in terms of hitting the ‘right’ number of outcomes based on our current knowledge.

So, to recap, we’re surrounded by complexity we can never fully understand, and on top of that we as people are prone to providing flawed estimates by default. The next question is, how do we start to correct this behaviour?

“The most important questions of life are indeed, for the most part, really only problems of probability.” – Pierre-Simon Laplace (1812)

How to align your brain to perform Agile estimation better

To improve our ability to account for uncertainty, we must first identify which category we fall into: the overconfident or the underconfident.

To do this we’re going to follow the calibration exercises provided by Douglas W. Hubbard in his fantastic book How to Measure Anything. There are 10 general knowledge questions in the table below. For each question, I invite you to provide a range that you feel 90% confident contains the true answer, i.e. a 90% confidence interval. You can either do this with pen and paper, or by typing your answers directly into this quiz. No cheating though! Use the information you currently possess and don’t Google anything.

Let’s take an example. Say I was asked to provide a range I was 90% confident in for the question ‘How many countries have at least one McDonald’s?’. My answer would be between 100 and 190. I’ll give you my rationale later, but for now have a go at answering the 10 questions yourself!

How did you go?

The point of this exercise is that although you might have a degree of familiarity with each question, you more than likely don’t have a definitive answer for most of them.

Take for instance my earlier example about the number of countries that have at least one McDonald’s. I don’t know the answer exactly, but I know some things related to the question that can help me narrow it down.

I know that there are close to 200 officially recognised countries, and that McDonald’s is the world’s most popular fast food chain. Therefore I’d expect the majority of countries to have one. But there’s a chance some of the more underdeveloped countries don’t have a fast food option at all. So the estimate I’d feel 90% confident in would be, as mentioned before, between 100 and 190. And it turns out the answer is 120 (at the time the book was published), which falls within my estimate!

The answers you’ve provided in the quiz above can reveal how optimistic or pessimistic your confidence levels are when dealing with incomplete information. If you’re too optimistic (or aggressive) in the estimates you provide, it would probably be worth validating your initial assumptions and widening your ranges accordingly. Conversely, pessimistic individuals should try to tighten up their ranges by focusing on identifying and removing the elements of uncertainty that are influencing their estimates.
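If you land on the overconfident side, one crude corrective (a sketch of my own, not a method prescribed in Hubbard’s book) is to stretch each range symmetrically about its midpoint and then re-check your hit rate. The widening factor below is an arbitrary starting point, not a rule.

```python
def widen(low: float, high: float, factor: float = 1.5) -> tuple[float, float]:
    """Widen a range symmetrically about its midpoint.

    A rough corrective for overconfidence: if fewer than ~9 of your ten
    '90%' ranges contained the true answer, stretch each range and see
    whether your hit rate climbs toward the confidence you stated.
    """
    mid = (low + high) / 2
    half = (high - low) / 2 * factor
    return (mid - half, mid + half)


# The McDonald's example from above: a 90% range of 100 to 190.
print(widen(100, 190))  # (77.5, 212.5) with the default factor
```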

If you’d like to try some more questions you can find additional quizzes here.

In closing

Estimation in complex delivery is an exercise that many organisations misunderstand and exploit. By emphasising the accuracy of an estimate in the midst of uncertainty, you are only setting your company up for long-term failure.

Instead, focus on identifying and managing the elements of uncertainty that exist within your complex delivery system. Once you have a better understanding of the unknown, or “you know what you don’t know”, you can start having much more mature conversations about your stakeholders’ risk appetite towards a particular target, rather than the target itself.

Want more control over your Jira dependencies?

If you want to see for yourself how Dependency Mapper empowers you to do Jira dependency management more effectively, you can try it free for one month.
