- 1. Overview
- 2. What Is Software Estimation
- 3. Why Is Estimating Software Important
- 4. Why Is Software Estimation Hard
- 6. Software Estimation Techniques
- 7. Step-by-Step Guide to Better Software Estimation
- 8. Final Words
- 9. Further Reading
Many professionals in the software business believe that estimating software changes with arbitrary precision is near impossible. And that’s, in fact, very true.
Difficult as it is, effort estimation is a necessary evil, and you need to get it right, at least within a small, consistent margin of error.
In fact, unreliable estimation is cited as one of the top 10 reasons IT projects fail. Unreliable estimates lead to budget and schedule misalignments, a risk many businesses cannot afford to bear.
In this article, we will have an in-depth look at:
- Why it is essential to get estimations right
- The challenges of software estimation
- Powerful software estimation techniques
- A working formula for implementing those techniques to achieve the best results
2. What Is Software Estimation
Software estimation determines how much a project will cost in effort, usually expressed as time and materials (T&M). Time is typically estimated in person-days, while materials refer to the resources involved.
A prerequisite for starting the software estimation exercise is the locking and sign-off of the business requirements. The outcome of the estimation process is typically used in project planning sessions to determine what’s in scope or prepare detailed project plans once the scope is locked.
There are three main methods for estimating effort:
- Expert judgement (top-down)
- Bottom-up estimation
- Statistical methods
We will discuss each in detail in the following sections.
3. Why Is Estimating Software Important
Here are some advantages to getting the estimations right.
3.1 Impact on Budget and Schedule
In Agile, for example, estimates will be used during sprint planning and task prioritization. In Waterfall, they are used to create the schedule. In both cases, estimates will also drive the project cost.
Cost and schedule directly impact the customer’s expectations in terms of budgeting and planning. All clients expect to finish the project on time and within budget, and the software has to perform as promised.
Organisations (customers included) typically operate under strict budgets. Projects that require the allocation of new resources or hiring external consultants are riskier. If such a project runs out of money, its resources might be freed, and with no staff to execute it, it will be aborted.
It is not unreasonable to assume that customers (especially the large ones) might be juggling several projects simultaneously, hence the importance of delivering on time so as not to jeopardize other projects.
3.2 Improving Team Morale
Unreasonable deadlines induce stress, shattering morale and confidence. This is especially harmful when it is endemic rather than a one-off event.
Involving the team in the effort estimation process creates a highly desirable outcome: everybody has a say in determining the required effort for their tasks and can be held accountable fairly.
3.3 Managing Stakeholder Expectations
Reliable estimates are influential in managing stakeholder expectations and in planning and executing a successful implementation.
IT projects are complex enterprises, making them difficult to complete on time and within budget.
Even if your stakeholders acknowledge that, they will not hand you a blank cheque; they will still expect the supplier to do their utmost to meet the deadlines.
3.4 Product Quality and Technical Debt
Opportunities for process improvement can be found at any stage of the SDLC. The Analysis stage, particularly the effort estimation part, is no different.
Estimations have downstream impacts on the quality of the deliveries. Let’s consider the familiar scenario of projects running out of time and the usual sacrifices that are made to catch up with fleeting deadlines:
- Code Quality: rushing the implementation sacrifices code quality, accumulating technical debt with long-term consequences.
- User Experience: cutting corners on UX leads to delivering software that falls short of customer expectations; in extreme cases, the software can be unusable.
- Product Quality: cutting software testing activities is another such sacrifice, and this one immediately translates into more bug-fixing cycles and customer frustration.
- Launch Delays: This means slower time to market, potential profit loss, and losing to competition.
4. Why Is Software Estimation Hard
Estimates are difficult to calculate because of the uncertainty usually involved in complex systems like IT solutions, IT environments, and human groups. These uncertainties typically fall into two groups: known unknowns and unknown unknowns.
Examples of known unknowns are:
- Unclear or changing business requirements
- Changing regulations
- Legacy code
- Poor processes or outdated delivery methodologies
- Novel technologies
Unknown unknowns are more difficult to anticipate. Examples include:
- Changes in organisational strategy and priorities
- Natural disasters
- Loss of talent
- Mergers and acquisitions.
Uncertainty from poor processes can, however, be eliminated with adequate process management, governance, and continuous improvement.
A further obstacle to accurate estimates is that no two projects are identical. Every project has the potential to present a novel aspect, where novelty takes the form of innovative features, changing customer preferences, and emerging or new technology.
This dynamicity makes expert judgement and statistical analysis less effective.
4.2 Precision and Uncertainty
In most cases, you will never really find out how much effort a task needs until it’s almost done.
Naturally, you can always provide a bracket wide enough to be accurate. For example, 1-10 years is an accurate estimate for building a web application, but it is not very useful.
You need precision: a more precise and helpful answer to the question above would be 8-12 months.
The cone of uncertainty is typically used to illustrate how estimates vary with project progress:
At project inception, the estimation variance is highest, yet this is precisely when you need the most precision. Variance drops to zero only after go-live, when the feature is in production.
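As a rough illustration, the cone can be expressed as a phase-dependent multiplier applied to a nominal estimate. The multiplier values below follow the commonly cited figures from Steve McConnell's work on estimation and are an assumption for illustration, not data from this article:

```python
# Illustrative cone-of-uncertainty ranges. The multipliers follow commonly
# cited values (McConnell) and are assumptions, not data from this article.
CONE = {
    "initial concept":       (0.25, 4.0),
    "approved definition":   (0.50, 2.0),
    "requirements complete": (0.67, 1.5),
    "design complete":       (0.80, 1.25),
    "software complete":     (1.00, 1.0),
}

def estimate_range(nominal_days: float, phase: str) -> tuple[float, float]:
    """Return (low, high) bounds for a nominal estimate at a given phase."""
    low, high = CONE[phase]
    return nominal_days * low, nominal_days * high
```

For a nominal 100 person-day estimate made at inception, `estimate_range(100, "initial concept")` yields a 25 to 400 person-day bracket, which is exactly why early single-number estimates are so fragile.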
4.3 Estimation Challenges
Let us take a closer look at some specific situations where effort estimation becomes difficult.
4.3.1 Unclear Requirements
If you start with unclear business requirements, you will most certainly underestimate the amount of work involved, especially in adapting and redesigning solutions.
Requirement gathering and creation have limitations, especially when the customer is not clear on what they want or when many design decisions will need to be made further down the road. This lack of clarity is acceptable if everybody acknowledges it and understands its consequences.
In summary, if the requirements can be documented early in the project, before any design or development starts, they should be.
4.3.2 Scope Creep
Scope creep occurs whenever new changes are added to the scope without revisiting the effort estimations or the schedule.
Unfortunately, scope creep is more common and readily accepted when the requirements are not adequately documented and signed off with enough clarity.
It is typically the project manager’s responsibility to keep an eye on the scope and prevent additional activities from taking place if they impact the deadlines.
4.3.3 Novel Features
Having repeated the same task several times, you asymptotically approach a level where you are comfortable with its estimates.
Software products, however, have become very complex, and no two projects are the same. In most cases, there will be a degree of novelty, and with originality comes risk and uncertainty.
Unfortunately, these risks cannot be easily mitigated because they are Unknown Unknowns.
4.3.4 Legacy Code
Legacy code presents challenges on many levels, and estimating changes in legacy code can be among those challenges.
On a positive note, legacy code is usually old enough to ensure a degree of familiarity, especially when changes are small and isolated. Significant modifications, however, can be daunting to estimate.
4.3.5 External Dependencies
Dependencies usually come in two flavours: external from partners and suppliers or internal from teams within the organization.
Dependencies can be classified as known unknowns; this makes them easy to account for, although they always present a varying degree of risk.
4.3.6 Resource Efficiency
Resource efficiency can vary with seniority and capabilities. This should be factored in during the planning phase. Staff can also:
- Take days off
- Get pulled into other projects in emergencies
- Be distracted by admin work
Poor production processes will also impact delivery dates, causing them to fall behind the original estimates. Poor performance, if known beforehand, should be factored into project estimates.
4.3.7 Poor Planning and Design
It’s no secret that you will fall short of the estimations if you don’t thoroughly analyse the business requirements and produce a good design.
Naturally, the design phase itself needs to be estimated as well.
6. Software Estimation Techniques
Several techniques are commonly used in the industry for estimating software. They broadly fall under three categories:
- Top-Down approach — This method relies on expert judgement. It is probably the least reliable, as it doesn't involve the technical staff executing the tasks and may not consider the specific capabilities of each individual.
- Bottom-Up approach — This is the most reliable and accurate of the lot. However, it requires more effort since technical staff will be needed to estimate their tasks. It also requires changes to be broken down into sufficiently small pieces, an activity that requires some effort.
- Statistical methods — This approach looks at historical data and calculates estimates based on averages and variances. This method is cumbersome and prone to error when historical data is sparse or biased (see Thoughts on Six Sigma for Software Development).
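The statistical approach can be sketched in a few lines: take the recorded effort of comparable past tasks and quote a central estimate with a spread rather than a single number. The historical figures below are hypothetical, purely for illustration:

```python
import statistics

# Hypothetical historical effort figures (person-days) for similar past tasks.
history = [12, 9, 14, 11, 10, 13, 16, 11]

mean = statistics.mean(history)    # central estimate
stdev = statistics.stdev(history)  # sample standard deviation (the spread)

# Quote a range rather than a single number, e.g. mean +/- one standard deviation.
low, high = mean - stdev, mean + stdev
print(f"estimate: {mean:.1f} person-days (range {low:.1f}-{high:.1f})")
```

Note how thin or biased history makes this fragile: with only a handful of data points, the standard deviation itself is unreliable, which is the weakness the article calls out.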
The figure below shows the reliability of the different methods.
6.2 Estimations in Waterfall
The Waterfall software estimation technique lists all the significant tasks in a project and breaks them into smaller pieces for easier management. Afterwards, an estimate is provided for each task in person-days.
A critical path is then identified.
The critical path method (CPM) is a project modelling technique developed in the late 1950s. The critical path is the longest sequence of dependent tasks, and its duration determines the shortest possible time the project needs from start to finish.
The critical path is delicate, and any delay in tasks on the critical path will eventually delay the whole project.
Project managers usually put significant effort into getting the estimates right to mitigate such risks.
This emphasis on correct estimation helps absorb potential delays and makes the critical path more resilient.
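Since the critical path is simply the longest chain of dependent tasks, it can be computed mechanically once tasks, durations, and dependencies are known. The task names and durations below are hypothetical, used only to illustrate the technique:

```python
# Minimal critical-path sketch: tasks map to (duration in person-days,
# list of dependencies). Names and numbers are hypothetical.
tasks = {
    "design":   (5, []),
    "backend":  (8, ["design"]),
    "frontend": (6, ["design"]),
    "testing":  (4, ["backend", "frontend"]),
}

def critical_path(tasks):
    """Return (path, duration): the longest dependency chain and its length,
    which is the minimum possible project duration."""
    memo = {}

    def finish(name):  # earliest possible finish time of a task
        if name not in memo:
            dur, deps = tasks[name]
            memo[name] = dur + max((finish(d) for d in deps), default=0)
        return memo[name]

    end = max(tasks, key=finish)  # task that finishes last
    # Walk back along the predecessors that determine each finish time.
    path = [end]
    while tasks[path[-1]][1]:
        path.append(max(tasks[path[-1]][1], key=finish))
    return list(reversed(path)), finish(end)
```

Here the chain design → backend → testing (5 + 8 + 4 = 17 days) is the critical path; a one-day slip in "frontend" is harmless, but a one-day slip in "backend" delays the whole project.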
6.3 Estimations in Agile
Estimating software effort in Agile works somewhat differently. Like Waterfall, Agile breaks big chunks of work into Epics, User Stories, and Tasks, but it uses story points instead of person-days.
One of the fundamental concepts of Agile is accepting uncertainty and changing requirements, even late in the project.
This concept makes effort estimation in a precise number of days inherently tricky as it shifts the effort from planning and design into faster, more frequent deliveries. Both planning and meticulous design are prerequisites for better estimations.
Story points are more flexible in that A) there is no rigid equivalency between story points and person-days (this is dynamically calculated per team and project and constantly updated throughout the implementation), and B) as the estimates grow, so does the safety margin in the story points (ensured by using a Fibonacci series).
Story points, however, have their own problems, and so does the entire estimation process in Agile.
With Agile, you lock the scope of each sprint but loosen the overall schedule; if a feature is not completed in this sprint, it will be finished in the next. This trade-off makes the need for estimations for project planning questionable.
7. Step-by-Step Guide to Better Software Estimation
We propose the following software estimation technique. Its main advantage is precision, while its weakness is that it applies only to Hybrid software delivery models.
This limitation, however, is not a serious one in practice, both because of the popularity of the Hybrid model vis-a-vis pure Agile or Waterfall and because software estimation is less critical in pure DevOps or Agile settings.
The diagram below shows the different steps involved in the estimation process.
7.1 Step 1: Break Down Major Tasks
At this stage, we assume that business requirements are locked, an architectural design has been created, and the major application changes have been identified.
Decomposing significant tasks into small, more manageable ones is a prerequisite for successful implementation. Hence, the first step requires your team to use Epics and User Stories (or any other similar pattern) for this decomposition.
Still using Waterfall?
If you are still using classic Waterfall, perhaps it is time to consider moving to Agile or the Hybrid Model. Either way, the decomposition of major tasks should still be feasible.
At this stage, we do not look at breaking up User Stories into Tasks and Sub-Tasks as that will come later.
7.2 Step 2: Generate a Rough Order of Magnitude
A Rough Order of Magnitude (ROOM) is a number or category assigned to each User Story. As the name suggests, its objective is to provide a rough estimate of the effort required.
When planning your project, you can use expert judgement to generate a Rough Order of Magnitude at the User Story level.
Many teams use T-shirt-size scales when defining the ROOM because of their convenience. The advantages of using Rough Orders of Magnitude are manifold:
- First, we don’t yet have a detailed low-level design, so a precise estimate cannot be made.
- Second, because of the generous margin of error available in ROOMs, it is possible to use expert judgement to estimate. Expert judgement conducted by senior staff like architects and tech leads is quick and easy to obtain, making it convenient for this phase.
- Finally, the Rough Order Of Magnitude can be used afterwards by the project sponsors to decide on the project’s scope, depending on the available budget.
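A T-shirt-size ROOM can be sketched as a simple lookup from size to an effort bracket; summing the brackets across User Stories gives sponsors a rough total for scoping decisions. The sizes and day ranges below are hypothetical assumptions, not values from this article:

```python
# Hypothetical T-shirt ROOM scale; the (low, high) person-day
# brackets are illustrative assumptions.
ROOM_SCALE = {
    "S":  (1, 3),
    "M":  (3, 10),
    "L":  (10, 30),
    "XL": (30, 90),
}

def rough_total(story_sizes):
    """Sum the low and high bounds of each User Story's ROOM size."""
    low = sum(ROOM_SCALE[s][0] for s in story_sizes)
    high = sum(ROOM_SCALE[s][1] for s in story_sizes)
    return low, high
```

For example, a backlog sized `["S", "M", "L"]` totals a 14 to 43 person-day bracket, wide enough for expert judgement yet concrete enough for a scope-versus-budget discussion.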
7.3 Step 3: Create a Solution Design
Now that the scope is ready, you can create a detailed solution design.
Be sure to invest enough time in preparing a solution design for your project. It will go a long way in helping you generate proper estimates.
A detailed low-level solution design document (LLD) will provide you with enough details on all the changes in scope.
In addition, a bi-directional traceability matrix in your LLD will help you estimate how much testing you need to cover.
Now that you have the solution design ready, you can go back to the user stories and break them up into further Tasks and Subtasks if required.
Each modification in the solution design must be reflected in a Task belonging to a specific User Story. The task should be small enough to ensure the estimation is precise.
A task is small enough to require no further division when one developer can complete it.
7.4 Step 4: Assign Story Points
Now that we have tasks and subtasks defined, you are ready to assign story points to each Task or Subtask.
But before we can do that, one prerequisite is defining a scale for our story points to avoid the caveats associated with person-days.
Creating a sequence of 7 or 8 levels is recommended, each level increasing in value non-linearly. You can use the Fibonacci sequence (1, 2, 3, 5, 8, …) or a power sequence (1, 2, 4, 8, …).
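Either scale is trivial to generate programmatically, which also makes the non-linear growth (and hence the growing safety margin at higher levels) easy to see. A minimal sketch:

```python
def fibonacci_scale(levels):
    """Fibonacci-style story-point scale: 1, 2, 3, 5, 8, 13, 21, ..."""
    scale = [1, 2]
    while len(scale) < levels:
        scale.append(scale[-1] + scale[-2])
    return scale[:levels]

def power_scale(levels):
    """Power-of-two story-point scale: 1, 2, 4, 8, ..."""
    return [2 ** i for i in range(levels)]
```

Note how the gap between consecutive levels widens as the values grow; that widening is the built-in safety margin for larger, riskier tasks.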
You may then provide general guidelines on the assignment process. The table below provides some examples:
| Story Points | Level | Category Description |
|---|---|---|
| 1 | Minor | Configuration change or document update. |
| 2 | Very Easy | Minor code updates, such as cosmetic changes. |
| 3 | Easy | Extending an existing feature, like adding a new parameter. |
| 5 | Average | Addition or modification of major components, such as a web page or user report. |
| 8 | Difficult | Requires more expertise or time. Example: modifying core components. |
| 13 | Very Difficult | Refactoring large code units. |
| 21 | Major Update | Adding core components, supporting additional platforms, new interfaces… |
7.5 Step 5: Translate Story Points to Person-Days
Based on the table above, you can start with a default one-to-one mapping between story points and person-days.
At this point, the story point to person-day mapping is only a rough estimate, and it is just a starting point that you will use to create your project schedule.
It should be clear to stakeholders that the schedule will be impacted as this mapping is updated.
To update the mapping table, observe the total number of story points closed over 2-3 weeks (or a few Agile sprints, depending on the methodology used). Then divide the story points closed by the person-days consumed; the result is your team's velocity in story points per person-day, and its inverse is your updated point-to-day mapping.
Proceed to update the schedule. Use this procedure every few weeks so your ratio remains current and your schedule is up-to-date.
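The Step 5 arithmetic can be sketched in a few lines: compute the observed velocity over the window, then forecast the person-days needed for the remaining scope. The figures (30 points closed in 45 person-days, 120 points remaining) are hypothetical:

```python
# Sketch of the Step 5 update; all figures are hypothetical.
def velocity(points_closed: float, working_days: float) -> float:
    """Story points closed per person-day over the observation window."""
    return points_closed / working_days

def forecast_days(remaining_points: float, v: float) -> float:
    """Person-days needed for the remaining scope at the observed velocity."""
    return remaining_points / v

v = velocity(points_closed=30, working_days=45)       # ~0.67 points/day
days_left = forecast_days(remaining_points=120, v=v)  # 180 person-days
```

In this example the team burns 1.5 person-days per story point rather than the default 1:1 mapping, so the remaining 120 points forecast to 180 person-days, and the schedule should be revised accordingly.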
8. Final Words
Getting the estimations right is vital for proper planning and stakeholder management; it sends the client a message of confidence, optimism, and professionalism.
These signals make collaboration easier and project management smoother. Most importantly, it allows everybody to plan and prepare adequately, leading to better deliveries and efficient performance.
The proposed guide described earlier is not airtight and may not be popular with project managers and senior management, since it requires a detailed design before final estimates are produced. It is, however, an excellent tool for complex systems and a significant departure from the old ways, which is why, in our view, it warrants a genuine discussion.
As with any production process, good governance (monitoring, updating, and validation) is essential to remain ahead of the curve.
9. Further Reading
A great talk by Bob Martin on the subject.