The mindset behind estimating and planning for agile
Ahmed Sidky, PhD, Executive Director
The International Consortium for Agile (ICAgile)
In the late 1950s Peter Drucker coined the term knowledge worker (Wikipedia, n.d.) and spent the rest of his life examining an age in which an unprecedented number of people use their brains more than their backs. More than 50 years later, Davenport (2005), in his book Thinking for a Living: How To Get Performance and Results from Knowledge Workers, stated that the rise of knowledge work has actually been foreseen for years.
As the complexity of knowledge work continues to increase, teams can no longer afford to estimate their work using techniques from the pre-knowledge (industrial) era; those techniques have proven futile for knowledge work.
This paper aims to examine the mindset behind estimating and planning, to explore the underlying issues that make estimating and planning so challenging, and to provide techniques to make them less painful and more fun. It also provides clarity on how to answer the most basic question teams encounter: When are we going to be done?
The Mindset Behind Knowledge Work
Knowledge work continues to grow at a rapid pace, and the way we plan and estimate our work needs to adapt to the needs of the knowledge age. In 2001 the Manifesto for Agile Software Development (Beck et al., 2001) was posted by software developers to “uncover better ways of developing software . . .” While the values and principles of the manifesto focus on developing software, they apply to other forms of knowledge work.
The manifesto has four values:
- Individuals and interactions over processes and tools;
- Working software over comprehensive documentation;
- Customer collaboration over contract negotiation;
- Responding to change over following a plan.
Those four values provide important clues on what to do to improve our estimating and planning. The manifesto challenged the status quo/silo mindset, in which disciplines like analysis, development, and testing work asynchronously in coordination with project management. The manifesto places more value on the items on the left.
In 2005 the Project Management Declaration of Interdependence (PMDOI) was created (Anderson et al., 2005). In the context of project management, the PMDOI has values that align with those of the agile manifesto. It also focuses on the value of delivering reliable results using agile and adaptive approaches. One of its values states, “We expect uncertainty and manage it through iterations, anticipation, and adaptation.”
Agile has been gaining momentum for over a decade now, and more disciplines like business analysis, testing, finance and marketing are exploring ways to develop their own set of values to improve organizations’ agility.
Plan-Driven Versus Value-Driven
Plan the work and work the plan. This phrase has been all too common when it comes to plan-driven delivery, an approach that aims to complete tasks according to plan. Plan-driven delivery usually starts with identifying all the requirements at the beginning of the project and then proceeds to defining the tasks required to satisfy those requirements. In most cases, a period of time is spent up front gathering the requirements before proceeding to development, testing, and finally delivery according to plan. This sequential approach does not provide value (a usable solution) until the very end of the project. The biggest assumption here is that “one can specify a satisfactory system in advance, get bids for its construction, have it built and install it.” (Brooks, 1986). Brooks goes on to say, “I think this assumption is fundamentally wrong, and that many software acquisition problems spring from this fallacy.”
In contrast, value-driven delivery follows the concept of progressive elaboration, which focuses more on the act of planning rather than on the plan. Progressive elaboration involves continuously improving and detailing the plan as more detailed and specific information and more accurate estimates become available (PMI, 2013).
The main idea of the value-driven delivery approach is to deliver value as quickly as possible. This approach forces the team to think about what is most important to the customer so that they do not treat all requirements as equal. The team tackles the customer's highest-priority requirements first, continuously inspects whether they are hitting the mark, and makes adjustments when necessary.
It is important to distinguish between those two approaches and to understand the drivers behind each of them. When we examine the plan-driven delivery approach, we find that the goal is to finish, with the schedule being the driver, while the goal in value-driven delivery is to learn, with the feedback being the driver. Exhibit 1 shows a comparison between the two approaches:
Defined Versus Empirical Process Control
The comparison in Exhibit 1 shows some characteristics of the plan-driven delivery and value-driven delivery. One of those characteristics was using a defined process control model for the plan-driven and an empirical process control model for the value driven. Let's take a look at the difference between those two models. Suppose you are planning a road trip from Baltimore, Maryland, to Daytona Beach, Florida. You conclude that, under normal conditions, this trip would take about 14 hours. You have used the distance of about 840 miles divided by 60 miles per hour to arrive at 14 hours (defined process).
The first four hours go by as planned, but then you get a flat tire during the fifth hour of the trip. You are stuck for two hours until the tire is fixed. At some point, the weather may change or traffic may get backed up. These are conditions that are uncontrollable and unlikely to be anticipated. Based on this new information, you need to re-estimate how long it will take to get there (empirical process).
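The road-trip numbers make the contrast concrete. Below is a minimal sketch of the empirical re-estimation, using the figures from the analogy plus one assumption (that the normal 60 mph speed resumes after the repair):

```python
# Empirical re-estimation of the road trip's arrival time.
# Original (defined-process) plan: 840 miles at 60 mph -> 14 hours.
total_miles = 840

# What actually happened: 4 hours of driving, then a flat tire
# that cost 2 hours of standing still.
hours_elapsed = 4 + 2          # driving time plus the flat-tire delay
miles_covered = 4 * 60         # only the driving hours made progress

# Re-estimate using observed data rather than the original plan:
miles_remaining = total_miles - miles_covered
assumed_speed = 60             # assumption: normal speed resumes after repair
hours_remaining = miles_remaining / assumed_speed

print(f"Hours elapsed so far: {hours_elapsed}")                       # 6
print(f"Revised total trip time: {hours_elapsed + hours_remaining:.0f} hours")  # 16
```

The defined-process answer (14 hours) never changes; the empirical answer (now 16 hours) is revised every time reality supplies new data.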
Similar to the road-trip analogy, as we proceed with our project we are hit with reality (per Mike Tyson's famous quote, “Everyone has a plan until they get punched in the mouth”). The idea is to continuously make adjustments (re-estimate) based on real data once it becomes available.
The Value of Estimating and Planning
Earlier we contrasted plan-driven delivery with value-driven delivery. The idea was to distinguish between the mere act of following a plan and the continuous act of planning. Value-driven delivery is about the latter. Planning is an essential activity on agile projects; in fact, more planning is done on agile projects than on traditional ones. Now that we understand the difference between the two, it is important to understand the value of planning.
Imagine being on a flight where there is no indication of how long it will take to get to your destination. You plan your trip, you book your tickets, and you arrive at the airport to find that your flight doesn't have an estimated time of arrival (ETA). You need to coordinate with a friend who will pick you up at the airport, and you have a business meeting later that day. When you ask the crew, they just tell you they don't know how long it will take, but they will let you know when they get there. Sounds ludicrous, right?
Just as it is important in the previous analogy to know how long it will take you to get to your destination so that you can plan subsequent activities, it is equally important for stakeholders to know when they can expect to get value from a project so they can plan next steps. Estimation is essential to good planning. It provides a reasonable indicator to answer the most frequently asked question at the start of every project: When are we going to be done?
Providing this type of information with reasonable accuracy is key to management. Organizations need to determine how long projects will take, when they will deliver reliable results, what skills are needed to deliver, and how much they will cost.
Estimation Without Fear
If estimation is essential to good planning, what makes estimating so challenging? Ron Jeffries, an Agile Manifesto signatory, writes that “estimates are often misused. Estimates are often used as a bludgeon to try to get programmers to work faster. This leads to unhappy programmers and to poor software. No one wins” (Jeffries, 2003).
When estimates are held against the team, fear sets in, and teams will either do their best to avoid estimating altogether or spend too much time up front gathering every detail needed to produce an estimate. It is only natural for teams to go to one of these two extremes to avoid being blamed for providing wrong estimates.
This usually derails the project and lengthens the cycle for delivering the most value as quickly as possible, as discussed earlier. Nurturing an environment in which teams feel safe to learn is key to avoiding these two extremes. Estimates usually improve as the team learns more about the project, which is rarely possible at the beginning. This builds the team's confidence and makes predictability easier over time. It also results in a good plan that conveys useful information to stakeholders and management so that they can make reliable decisions. Mike Cohn writes, “A good plan is one that stakeholders find sufficiently reliable that they can use it as the basis for making decisions” (Cohn, 2006).
The Law of Diminishing Returns
The more time we put into estimating, the better our estimates will be. This may sound logical at first glance, but it is not the case. The relationship between estimate accuracy and effort is shown in Exhibit 2. As we can see, more effort yields better results only up to a certain point. Beyond that point, spending more time becomes a wasteful activity and does not increase our accuracy.
When teams estimate their work, they need to be aware of the law of diminishing returns. We can often spend a little time thinking about an estimate and come up with a number that is nearly as good as if we had spent a lot of time thinking about it. Teams tend to fall into the trap of spending more time than necessary on coming up with estimates. This only gives an illusion of accurate estimates and drives a behavior in which we get attached to our estimates, even though, at the end of the day, they are just estimates.
As we embark on a new project, there are things that we know (known-knowns) and others we don't know (known-unknowns). There are also things we will discover along the way (unknown-unknowns). Exhibit 3 depicts the different areas from simple to anarchy (Stacey, 2002). It provides a sense of where our effort falls based on agreement and certainty. The more agreement and certainty we have, the simpler the effort becomes; the less agreement and certainty, the more complex. We need to assess where we stand at the beginning of the project so that there is awareness about where the project falls on the scale of agreement (regarding the requirements, the what) and on the scale of certainty (regarding the solution, the how). This level of awareness helps the team as well as the stakeholders work on reducing the risk of those two dynamics.
Another way to look at this is to consider the unknown and unknowable factors. An example of an unknown factor is not knowing what we want. An unknowable factor, on the other hand, is not knowing how our solution will behave within our environment. For example, selecting a new technology is an unknown, while how that technology will behave once implemented in our environment is unknowable.
Continuous learning is essential to course correction when estimating and planning. However, teams need to use a systematic approach so that they can improve estimating and planning over time. We need to learn what works and what doesn't so that we do not end up repeating the same mistakes while expecting different results. (The definition of insanity often attributed to Albert Einstein is “doing the same thing over and over again and expecting different results” [Wikipedia, n.d.].) Unlike the traditional approach, which waits until the end of the project to discuss lessons learned, planning in agile is a highly disciplined process that is repeated systematically at regular intervals.
A systematic approach in agile that promotes continuous learning and improvement is the use of retrospectives. Retrospectives are regular touch points with the team (usually at a regular cadence, for example, every two or three weeks). The outcomes of the retrospectives are action items the team can tackle to improve the process going forward. These are highly disciplined meetings that follow five steps: set the stage (discuss the focus/purpose of the retrospective), gather data (create a shared pool of data on what we see happening as a team, based on facts, not opinions), generate insights (observe patterns and build shared awareness), decide what to do (move from discussion to action), and finally close the retrospective (reiterate actions and follow-ups and appreciate everyone's contribution) (Derby & Larsen, 2006).
Metrics Drive Behavior
Organizations usually define successful projects as on time, on budget, and having met all requirements as originally specified. These metrics convey a fundamentally wrong assumption: that requirements do not change over the life of the project. In fact, most teams suffer the consequences after working at an unsustainable pace to meet deadlines and satisfy the original requirements, only to have the tables turned against them and be met with an unhappy customer. While the team was heads-down developing the original requirements, the market conditions changed, and the business needed a different set of requirements.
Establishing practices and metrics that promote collaboration and continuous learning is key to delivering value. When assessing where the team stands, it is important not just to focus on the process but also to observe behaviors. The metrics need to be displayed in the team's area and updated regularly so that the team can see how they are progressing. These displays are known as information radiators. An information radiator is a display placed where people interested in the project can see it as they walk by (Cockburn, 2001).
Examples of metrics to display include: burn down charts (showing how the team is progressing over time and what remains to be completed); velocity (displaying how many story points the team has completed each iteration or release); an impediments log (listing what is preventing the team from making progress, when the issue was identified, and when it was resolved); and bugs tracking (detailing how many escaped bugs were found and how long it took to resolve them).
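As a minimal sketch of how two of these metrics are derived (all figures here are hypothetical), the data behind a burn down chart and a velocity-based projection can be computed as follows:

```python
# Hypothetical backlog of 100 points and the points completed
# in each two-week iteration so far.
backlog_points = 100
completed_per_iteration = [12, 8, 11, 9]   # illustrative velocity history

# Remaining work after each iteration -- the data plotted on a burn down chart.
remaining = []
left = backlog_points
for done in completed_per_iteration:
    left -= done
    remaining.append(left)

print(remaining)  # [88, 80, 69, 60]

# Average velocity so far, used to project the iterations left:
avg_velocity = sum(completed_per_iteration) / len(completed_per_iteration)
print(f"Average velocity: {avg_velocity} points per iteration")        # 10.0
print(f"Projected iterations remaining: {remaining[-1] / avg_velocity:.0f}")  # 6
```

Posted on an information radiator and updated each iteration, numbers like these let anyone walking by see both progress to date and the current forecast.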
It is important to note that displaying metrics is not a worthwhile exercise if they are not kept up to date. Furthermore, if the right questions are not being asked, and actions are not taken when we examine those metrics, they become merely decorations on the walls.
Techniques for Estimating and Planning
Some techniques for estimating and planning in agile include:
- Estimation in size versus time;
- Relative estimation versus absolute estimation;
- Ideal versus elapsed time.
Estimation in Size Versus Time
In the traditional approach, estimates are made once project managers identify the tasks of each team member. For example, analysts provide their tasks; developers and testers do the same. Then it is time for estimation, so the project manager asks for the level of effort (usually in time). After a long pause, each individual gives a best guess pulled out of thin air. Since the project manager has been burned before, he or she pads each member's estimate. Sound familiar?
This kind of early estimation has been an activity undertaken by individuals on the team. Estimation in agile is more of a team sport. Based on their shared understanding, the team collectively provides estimates for the work that needs to be done. Agile teams plan using slices of functionality known as user stories. User stories are then discussed with the stakeholders to gain an understanding of what satisfies each user story and makes it complete. In the Scrum framework, the Product Owner acts as the representative of the stakeholders and provides direction to the team on what gets built first. During the conversation with the team, the product owner answers questions and defines acceptance criteria in a collaborative manner. There is also a definition of done that applies to all user stories. User stories (slices of value) are estimated in terms of size, not time.
Planning poker is one technique used to estimate the size of each work item. It uses the Fibonacci sequence of 0, 1, 1, 2, 3, 5, 8, 13, 21, etc. (Wikipedia, n.d.), in which each subsequent number is the sum of the previous two. During the estimating and planning session, the team has all the work units (usually user stories) in prioritized order. The team picks one story at a time, clarifies its understanding of what needs to be done to satisfy the story (the acceptance criteria), and everyone provides an estimate at the same time. This is usually done with a deck of cards, each bearing a number. All team members show their cards simultaneously to avoid bias. When estimates vary, the team discusses the lowest and highest estimates and continues to re-estimate until the estimates converge.
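The reveal-and-discuss loop of planning poker can be sketched in a few lines. This is a minimal illustration with hypothetical estimates; real decks are often a slightly modified Fibonacci sequence:

```python
# Cards available in the (simplified) planning poker deck.
CARDS = [0, 1, 2, 3, 5, 8, 13, 21]

def reveal(estimates):
    """All cards are shown at once; divergent rounds trigger discussion."""
    assert all(e in CARDS for e in estimates), "each estimate must be a card"
    low, high = min(estimates), max(estimates)
    if low == high:
        return f"Consensus: {low} points"
    # The holders of the lowest and highest cards explain their reasoning,
    # then the team re-estimates until the numbers converge.
    return f"Discuss: estimates range from {low} to {high}"

print(reveal([5, 5, 5, 5]))    # Consensus: 5 points
print(reveal([3, 5, 5, 13]))   # Discuss: estimates range from 3 to 13
```

The point of the simultaneous reveal is that no one anchors on a colleague's number; the point of the discussion is that the outliers often know something the rest of the team does not.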
Relative Estimation Versus Absolute Estimation
Suppose you are lost and want directions to your friend's house. You call your friend, describe where you are, and your friend tells you to drive down the road until you see a public library on your left, then turn left, continue for five blocks, and the house will be on the right. Another way to give directions would be to say, “Drive five miles south, then make a left and drive for another five miles, and your destination will be on the right.”
Most of us are more comfortable with the first approach since it establishes a point of reference that is familiar to us (a public library). Relative estimation is the process of estimating our work in relation to what we already know. In contrast, estimating in absolute terms is difficult, if not impossible, to do since it requires us to estimate something that we have no knowledge of.
One technique used in agile to facilitate the relative estimation process is affinity estimating. Affinity estimating is suited for large-scale projects with a sizable amount of units of work (epics or user stories). Unlike planning poker, affinity estimating is a more qualitative approach (usually t-shirt sizes of XS, S, M, L, XL). The team will organize the stories by size and discuss any disagreements until they come to a consensus.
Ideal Time Versus Elapsed Time
Another factor to be aware of as you estimate is the distinction between ideal time and elapsed time. A good analogy, provided by Mike Cohn, is the game of American football. While the game has four fifteen-minute quarters and is therefore 60 minutes long (ideal time), everyone who has watched a game of American football knows it ends up taking about three hours (elapsed time). “Ideal time is the amount of time that something takes when stripped of all peripheral activities. Elapsed time, on the other hand, is the amount of time that passes on the clock (or perhaps a calendar)” (Cohn, 2006).
When a team provides estimates for performing specific tasks, most likely the estimates are in ideal time; however, it is important to make this explicit to the team. It is much easier to predict the duration of a task in ideal time than in elapsed time, since it is not possible to foresee interruptions that will occur in the future. This is where using yesterday's weather provides an indicator on how much work the team has completed and most likely will be able to complete in the future. It is known as the team's velocity. Velocity is a capacity planning tool that is used on agile projects. It is calculated by counting the number of units of work completed in a certain interval, the length of which is determined at the start of the project.
Suppose the team completed 10 points in each of the past two iterations, and their iterations are two weeks long. Ten points would be considered their velocity. If their total work size (backlog) is 100 points, it would be predicted that they will complete the work in 10 two-week iterations. With two iterations already completed, this leaves the team eight iterations to finish the work on their backlog, assuming the backlog remains unchanged.
So, When Are We Going To Be Done?
Now that we have discussed different techniques for estimating and planning, how do we put it all together to answer the question, When are we going to be done? After all, our projects are measured in terms of time, not size. While it may be useful for the team to know the size of their work, we need to translate this for management. Answering the question by saying we will be done in 100 points will not convey useful information to our management or to other stakeholders.
Estimate size, derive duration (Cohn, 2006) is the key to answering this question. In order to do this, we need to know three variables:
- Product backlog size;
- Team velocity;
- Iteration cadence.
Suppose our estimates show a backlog size of 200 points and our team's average velocity is 20 points per iteration. This means we will complete our work in 10 iterations, assuming no more work is added (200 points divided by 20 points/iteration). If an iteration takes two weeks, the team will be done in 20 weeks (two weeks x 10 iterations), which translates to about five months. Starting the project in January means we will be done by May.
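The "estimate size, derive duration" arithmetic can be captured in a few lines. The figures are from the example above; the weeks-to-months conversion is a rough four-weeks-per-month approximation:

```python
# The three variables needed to derive duration from size.
backlog_size = 200        # points (product backlog size)
velocity = 20             # points per iteration (team average)
iteration_weeks = 2       # iteration cadence

iterations = backlog_size / velocity       # 10 iterations
weeks = iterations * iteration_weeks       # 20 weeks
months = weeks / 4                         # rough conversion: ~5 months

print(f"{iterations:.0f} iterations, {weeks:.0f} weeks, about {months:.0f} months")
```

Re-running this calculation every iteration, with the latest backlog size and observed velocity, is what keeps the "done" date grounded in empirical data rather than the original plan.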
The above example is utterly simplistic and does not take into account many other variables. This is an evolving process, and estimates are constantly adjusted based on empirical data.
In knowledge work, the work is about generating thoughts and ideas, which cannot be estimated with precision. The main idea to take away here is that an estimate is not about being precise; it is about being predictable. And the way to achieve this is not by using a defined process; it is by using an empirical process that factors in real data and makes course corrections the norm. This, in turn, yields a more robust plan that is more likely to be accurate over time.
We have to come to the realization that, as humans, we are not able to estimate well in absolute terms; we grasp estimating in relative terms far better. Relative estimating is valuable not only for better predictability but also for triggering discussion and negotiation between the team and the customer; it is invaluable for adjusting estimates and ultimately reducing risk and uncertainty. Shared understanding and better group buy-in of the resulting estimates are reached through consensus. Since we are creatures of habit, it remains challenging not to fall back into our comfort zone of estimating in absolute terms.
Fostering the discipline of estimating in relative terms will require awareness and constant practice, not only at the team level but also at the organization level. Management must understand that the estimates are just estimates, and they are more likely to get better results with relative estimates than absolute estimates. This means that management needs to encourage the practice of relative estimation and not give the team a free card to fall back into their old habits.
It is imperative to nurture a culture of continuous learning that promotes results based on fact-based data. In the end, we have to realize that estimates in the knowledge work era are more like guesstimates. Even so, it is better to base our estimates on real data, which gives us an educated guess, than to guess with no supporting data. Over time, our guesstimates will naturally improve and become more robust.
Simply applying traditional estimating methods to knowledge work ignores reality. Management and stakeholders should encourage continuous learning (re-estimation) and avoid blaming teams for providing imprecise estimates.
At the same time, teams should continue to use an empirical process to improve estimating and planning, and avoid both guessing with no supporting data and estimating in absolute terms.
Nothing could be less predictable than the weather. However, with enough historical data, meteorologists are able to predict rain showers, thunderstorms, and hurricanes with reasonable accuracy. Continued vigilance in reducing risk and uncertainty can only happen when we are intentional about evaluating the rigor of our estimates and how we arrived at them.
In the end, the goal is not just to adopt a specific set of practices so that we can call ourselves agile; the real goal is to build high-performing teams and deliver quality solutions. This takes a relentless approach to continuously improving our agility and enhancing organizational capabilities. Therefore, establishing the agile mindset and defining a set of values with a constant mechanism for feedback and alignment become critical for success.
Anderson, D., Augustine, S., Avery, C., Cockburn, A., Cohn, M., DeCarlo, D., Fitzgerald, D., Highsmith, J., Jepsen, O., Lindstrom, L., Little, T., McDonald, K., Pixton, P., Smith, P. & Wysocki, R. (2005). The declaration of interdependence. Retrieved from http://pmdoi.org
Beck, K., Grenning, J., Martin, R., Beedle, M., Highsmith, J., Mellor, S. et al. (2001). Manifesto for agile software development. Retrieved from http://agilemanifesto.org
Brooks, F. (1986). No silver bullet: Essence and accident in software engineering. Retrieved from http://faculty.salisbury.edu/~xswang/research/papers/serelated/no-silver-bullet.pdf
Cohn, M. (2006). Agile estimating and planning. Upper Saddle River, NJ: Prentice Hall.
Cockburn, A. (2001). Information radiators. Retrieved from http://alistair.cockburn.us/Information+radiator
Davenport, T. H. (2005). Thinking for a living: How to get better performances and results from knowledge workers. Watertown, MA: Harvard Business Publishing.
Derby, E. & Larsen, D. (2006). Agile retrospectives: Making good teams great. Sebastopol, CA: O'Reilly Media.
Einstein, A. (n.d.). Definition of insanity. Retrieved from http://www.answers.com/Q/Who_first_said_that_the_definition_of_insanity_is_to_do_the_same_thing_over_and_over_and_expect_different_results
Fibonacci sequence. (n.d.). In Wikipedia. Retrieved from http://en.wikipedia.org/wiki/Fibonacci_number
Jeffries, R. (2003). The noestimates movement. Retrieved from http://xprogramming.com/articles/the-noestimates-movement/
Knowledge work. (n.d.). In Wikipedia. Retrieved from http://en.wikipedia.org/wiki/Knowledge_worker
Project Management Institute. (2013). A guide to the project management body of knowledge (PMBOK® Guide) - Fifth edition. Newtown Square, PA: Author.
Stacey, R.D. (2002). Strategic management and organizational dynamics: The challenge of complexity. Retrieved from http://www.gp-training.net/training/communication_skills/consultation/equipoise/complexity/stacey.htm.
© 2014, Salah Elleithy
Originally published as a part of the 2014 PMI Global Congress Proceedings – Phoenix, Arizona, USA