EVM implementation accelerator for small settings


Though Earned Value Management is considered one of the most effective performance measurement and feedback tools for managing projects, its use, especially in projects involving small settings, is very limited. In order to change this situation, it was decided to find ways to accelerate the introduction of these techniques in small settings. This paper presents the systematic approach followed to solve this problem, based on three main axes: the Practice, Factor and Influence Modifier axes. Each EVM project for a small setting is characterized by a chosen set of practices and subpractices, by a given set of Factors that apply to the particular project and small setting, and by a given set of Influence Modifiers that can be used to accelerate the implementation. Practices, Factors and Influence Modifiers are related to one another, and the user chooses the right combination with the aid of a schema, called the “Acceleration Cube”, based on QFD-like techniques.

Keywords: SPI – Software Process Improvement, CMMI – Capability Maturity Model Integration, Factor, Influence Modifier, QFD - Quality Function Deployment, EVM - Earned Value Management, SME – Small and Medium Enterprises, IPRC – Improvement of Processes Research Consortium, IPSS – Improvement of Processes in Small Settings.


The objective

“Earned Value Management (EVM) has proven itself to be one of the most effective performance measurement and feedback tools for managing projects” (PMI, 2005). EVM usage at a global level shows relatively slow but steady growth, both in the public and private sectors. Higher growth could be expected, given the simplicity of the theoretical base and the benefits it produces. It is important to note that EVM is a technique recommended by the most authoritative management practitioners in the world and by prestigious institutions such as the PMI, the Project Management Institute.

There are several reported reasons for this slow growth in the usage of EVM, among them the extra cost and time needed in projects, which is more relevant for small settings than for big ones.

As small settings represent more than 85% of the whole market, institutions at any level, whether local, national or international, are very interested in giving these small settings the innovation and competences they need to compete in the global market. This justifies the interest in this part of the market.

It is necessary to distinguish between using EVM in a particular project and implementing (or deploying) EVM practices in a small setting so that these practices form part of the processes for managing projects. See Exhibit 1, Process and Practices deployment.

The objective of this work is to find simplified ways of applying EVM practices to small settings. This poses the problem of selecting the appropriate set of practices and subpractices to be applied.

Even within a small setting, projects differ from each other, and small settings also differ from one another. These differences between projects and/or small settings give rise to the concept of Factors, which characterise the differences.

Once a set of EVM practices and subpractices has been chosen for a particular small setting and project with given Factors, it is still possible to introduce specific actions that can help the EVM implementation by reducing the time and cost involved. This gives rise to a new concept: the Influence Modifier, a kind of lever that, if used appropriately, helps accelerate the implementation.

This paper presents the systematic method used for selecting the coordinates of the three axes (Practices, Factors and Influence Modifiers), how the axes relate to one another, and how practical criteria are used to produce applicable results.

The Small Settings picture

In this work the SEI (SEI, 2006) definition of Small Setting has been adopted. It considers as such:

  • small businesses with fewer than 100 people
  • small organizations, within a larger organization, with fewer than 50 people
  • small projects with fewer than 20 people

Exhibit 1. Process/Practice Deployment

Think small first

How Do You Eat An Elephant? One Bite at a Time (Hogan, 2000), or as one International Process Research Consortium (IPRC) sponsor put it, “all improvement happens through small groups.”

Both the cost and the time of implementing current SPI (CMMI) or Project Management processes, or just EVM, are hard for most companies in southern Europe to handle, especially Small Settings.

Carlos III University of Madrid is deeply involved in Software Process Improvement, working in close collaboration with leading Spanish SPI companies like Progresion SMP (www.progresion.net) and Zonnect Redes de Ingeniería (www.zonnect.com). The objective of this research is to apply proven methodologies to small settings, as part of an overall strategy that ultimately also helps the big ones: think small first; eat an elephant one bite at a time.

The orientation of Carlos III research in this field is taken from:

  • The policy of the EU through the Framework Programs that deeply promotes R&D in SMEs.
  • The recommendation and results of the SEI IPRC Project Charter that is promoting the Improvement of Processes in Small Settings (IPSS), planned to start October 1, 2006.
  • The best practices recommended by the PMI set of standards.
  • The accumulated experience and results of the research group.

This research is oriented to provide approaches, tools, techniques, and guidance for applying methodology and best practices in Small Settings both for Process Improvement and Project Management.

The particular project described in this paper addresses only an EVM Implementation Accelerator for Small Settings, taking into account the methodologies, models and best practices in the industry and their modifying factors (culture, sector, size, etc.). The research moves along a previously defined roadmap, and the model is refined through iteration and experimentation results.

Organization of the paper

First there is a definition of terms, with some examples of factors found in projects. This is followed by a description of the method used, some facts on the validation method, and the current state of the research. Finally there is a summary of conclusions.

Definitions and Examples


Some concepts used in this work are:


Acceleration

  • The effect of reducing the average time and/or cost needed to deploy a practice or process.


Practice

  • A subset of specific practices, as defined in CMMI (Solomon, 2002; SEI, 2002) and/or other standards related to EVM (PMI, 2004; PMI, 2005), that are going to be deployed at a particular customer. The practices can be further subdivided into subpractices. Practices and subpractices produce Work Products.


Factor

  • Environmental parameters, characteristics and constraints that characterise a particular project. Some examples: budget, personnel skills, language barriers, time to finish, organization constraints, etc. Factors are catalogued, analysed and documented. Factors and the deployment of practices are correlated in relation to their accelerating impact. A distinction is made between cost- and time-accelerating factors (Heales, 2002; Kemerer, 1999; Khosrow-Pour, 2005).

Influence Modifier

  • Influence Modifiers are the variables that can be used to modify the effect of the Factors. An Influence Modifier can either boost or inhibit the effect of a Factor in a deployment process. Influence Modifiers are catalogued, analysed and correlated in relation to Factors and Practices.

Acceleration Cube

  • Practices, Factors and Influence Modifiers are correlated with one another in a multidimensional structure called the Acceleration Cube, with Practice, Factor and Influence Modifier axes. Customer weightings, objectives, and other data used to correlate variables and obtain useful information are also considered.
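As an illustration of how such a cube might be represented, the sketch below models it as a sparse mapping from (practice, factor, influence modifier) triples to correlation strengths, and ranks the modifiers for one practice under a customer's factor weighting. All names, weights and the scoring rule are invented for illustration; they are not data from the actual tool, which is under construction.

```python
# Hypothetical sketch of the Acceleration Cube: a sparse mapping from
# (practice, factor, influence_modifier) triples to correlation weights.
# All names and 1/3/9-style weights below are invented for illustration.
cube = {
    ("Plan budget at work-package level", "company size", "management sponsorship"): 3,
    ("Plan budget at work-package level", "budget", "external consultant"): 9,
    ("Collect actuals weekly", "personnel skills", "tool support"): 9,
    ("Collect actuals weekly", "company size", "tool support"): 3,
}

def best_modifiers(cube, practice, customer_factor_weights):
    """Rank influence modifiers for one practice, weighting each cube
    entry by how strongly the customer rates the associated factor."""
    scores = {}
    for (p, factor, modifier), weight in cube.items():
        if p == practice:
            scores[modifier] = scores.get(modifier, 0) + \
                weight * customer_factor_weights.get(factor, 0)
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# A customer who rates personnel skills (2) above company size (1):
ranking = best_modifiers(cube, "Collect actuals weekly",
                         {"personnel skills": 2, "company size": 1})
print(ranking)  # tool support scores 9*2 + 3*1 = 21
```

The sparse-dictionary form keeps the cube manageable even though most (practice, factor, modifier) combinations have no recorded correlation.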

Implementation Roadmap

  • With the practice data related to the desired improvement and the factors that affect the deployment scenario (all weighted according to the customer), the proposed implementation roadmap for maximum acceleration is produced, in Gantt chart form. Both the Acceleration Cube and the Implementation Roadmap tool are currently under construction.

Some examples of Factors

Below are some examples of Factors that can be considered when trying to implement EVM within a given firm:


Culture

  • It is assumed, as a hypothesis, that there are big differences in the way a project is executed in the small-setting context, depending on cultural constraints. Working hours, language, communication, teamwork, food habits, national holidays, idiosyncrasy, climate impact, siesta time: an identical project could have different execution parameters (with no judgement on the results) in different European countries.


Sector

  • It is also assumed that the sector in which the project is based affects the way the project is executed. Executing a particular project is different in the car industry, the medical sector, the insurance sector, the public sector or the military sector. Even the different software estimating methods take this type of information into consideration.

Size, Budget, Time

  • The size of the company is another factor that affects the way a project is implemented. Just compare a company of 6 people (the average size of a European SME) with a 100-person company or a larger one. The size of the company has some relationship with the availability of certain resources, budget, skills, etc.

Method used

Model Foundations

The three main sets of variables to be identified are the practices to be deployed, the factors and the influence modifiers. A QFD-like technique is used to gather, classify and correlate the data. The sources of relevant information are: literature, case studies, surveys, experience and work groups. See the model foundation schema in Exhibit 2.

Given the subset of recommended practices and subpractices needed for the process, the factors that affect the deployment and their correlations are identified and analysed.

Then the Influence Modifiers that act on the factors, either accelerating or inhibiting, are identified, analysed and correlated.

With this method:

a)   Practices to be deployed are clearly defined and weighted according to the target customer.

b)   Factors are correlated against Practices in relation to acceleration. The correlation gives information on the accelerating effect of each Factor.

c)   Influence Modifiers are correlated against the Factors in relation to acceleration, thus giving information on which Influence Modifiers to use to reach the acceleration objective.
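Steps b) and c) can be sketched as a QFD-style cascade: customer practice weights are pushed through a Factor-versus-Practice correlation matrix to score the Factors, and those Factor scores are pushed through an Influence-Modifier-versus-Factor matrix to score the Influence Modifiers. The matrices and the usual QFD-style 1/3/9 strengths below are invented placeholders, not the model's real data.

```python
def cascade(item_weights, correlation):
    """QFD-style cascade: score each row item by summing its correlation
    strength with every weighted column item.
    correlation maps row item -> {column item: strength}."""
    return {
        row: sum(strengths.get(col, 0) * w for col, w in item_weights.items())
        for row, strengths in correlation.items()
    }

# Step b): customer-weighted practices -> factor acceleration scores.
practice_weights = {"plan baseline": 3, "collect actuals": 5}  # customer weighting
factor_vs_practice = {
    "company size":     {"plan baseline": 3, "collect actuals": 1},
    "personnel skills": {"plan baseline": 1, "collect actuals": 9},
}
factor_scores = cascade(practice_weights, factor_vs_practice)

# Step c): factor scores -> influence modifier scores.
modifier_vs_factor = {
    "training":    {"personnel skills": 9},
    "sponsorship": {"company size": 3, "personnel skills": 1},
}
modifier_scores = cascade(factor_scores, modifier_vs_factor)

print(factor_scores)    # {'company size': 14, 'personnel skills': 48}
print(modifier_scores)  # {'training': 432, 'sponsorship': 90}
```

The same `cascade` function serves both steps because each is the same operation: weighting one axis of the cube by the scores obtained on the previous axis.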


Exhibit 2. Model Foundations

Research Phases

The research is organised along three main axes: the Practice axis, the Factor axis and the Influence Modifier axis. This is done to make it easier for practitioners to select the best practices applicable to them, together with the Factors that affect their particular small setting, and then to ascertain which Influence Modifiers could be used to accelerate the deployment of EVM processes.

Phase 1 is devoted to the Practice axis, together with a historical comparison base and a catalogue of accelerating approaches in industry:

  • Identify a reasonable number of the best EVM-deployment-related practices and subpractices, and their implementation parameters (number of people involved, duration, effort, cost, etc.). This task consists basically of two things:
    • 1) identify and classify the best practices / subpractices;
    • 2) compile industrial data, which will give a comparison base for speed and practices (data from industry and from previous research programs carried out during the last five years).

Phase 2 deals with Factor selection, analysis, hierarchy and correlation towards acceleration of Practice deployment. The selection of Factors is derived from several sources: industry, literature, surveys, case studies, experience and work groups. Traceability of source is assured. In this phase:

  • Identify, define and analyse the factors that accelerate (positively or negatively) the deployment of the chosen set of practices (some factors are easy to identify, such as company size, cultural habits and budget; others are much more subtle).
  • Correlate factors with practices in relation to acceleration, and factors with factors. QFD-like techniques are used to organize the information.

Phase 3 deals with Influence Modifiers. Proceeding in a similar way as with Factors, a table of Influence Modifiers is built and analysed. The information and data are also derived from industry, literature, surveys, case studies, experience and work groups, assuring their traceability.

  • Identify the Influence Modifiers of implementation speed, the formulae or functions of their influence, their correlation with Factors in relation to mitigating or boosting their effect, the correlations among the Influence Modifiers themselves, and their relationships or working rules with respect to the set of chosen practices to be deployed.

Exhibit 3 shows a summary of the rules used for Practices, Factors and Influence Modifiers.


Exhibit 3. Rules for Practices, Factors and Influence Modifiers

Phase 4 deals with organizing the above information in a manageable form, with the Acceleration Cube and the Implementation Roadmap:

  • The Acceleration Cube receives the input (customer weighting of the factors and the practices to be deployed) and produces an output displayed in the form of a proposed Gantt chart with Practices, Factors and Influence Modifiers. This Gantt chart is an input that the Project Manager of the deployment project takes into consideration when preparing the detailed Project Plan.
  • The validity of the output is tested in pilots. Remember that almost nothing is known empirically about the most effective ways to implement processes.
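A crude way to picture the roadmap step is as an ordering of the chosen practices by their expected acceleration score, laid out sequentially. The task names, durations and scores below are invented, and the real tool emits a Gantt chart rather than a flat list; this is only a sketch of the idea.

```python
from dataclasses import dataclass

@dataclass
class PracticeTask:
    name: str
    duration_weeks: int
    acceleration_score: float  # hypothetical value from the Acceleration Cube

def roadmap(tasks):
    """Order tasks by descending acceleration score and assign sequential
    start weeks -- a stand-in for the proposed Gantt chart output."""
    schedule, start = [], 0
    for t in sorted(tasks, key=lambda t: t.acceleration_score, reverse=True):
        schedule.append((t.name, start, start + t.duration_weeks))
        start += t.duration_weeks
    return schedule

plan = roadmap([
    PracticeTask("Define WBS and baseline", 4, 0.6),
    PracticeTask("Collect weekly actuals", 2, 0.9),
    PracticeTask("Report SPI/CPI", 1, 0.8),
])
for name, s, e in plan:
    print(f"weeks {s}-{e}: {name}")
```

A real deployment plan would also respect dependencies between practices; the Project Manager is expected to refine this proposal into the detailed Project Plan.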

Phases 5 and 6 are for validating the model, feedback and conclusions:

  • A comparison data base is derived from previous projects, experience, literature, case studies, and surveys.
  • The model is used in two pilot projects, producing the resulting Implementation Roadmap with practices, Factors and Influence Modifiers. Results and data are registered.
  • Processing the results against the comparison base helps produce conclusions.


Research situation

At the moment of submitting this paper the research work continues, with a focus on Factor and Influence Modifier analysis and correlation. A first model of the Acceleration Cube is being tested.

Pilot Testing

Two pilot projects have been authorised and are going to be managed following A Guide to the Project Management Body of Knowledge (PMBOK® Guide). The first project deals with General Improvement of Support Services for an Organization, including processes such as “Development of new management applications integrated with an ERP solution” and “Management and Governance of existing IT applications”. The second project deals with the implementation of a sustainable process for measuring project status with two key indicators: the Schedule Performance Index and the Cost Performance Index. Each project has a different mapping against practices and factors.
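The two indicators used in the second pilot derive from the three basic EVM quantities defined in the PMI standards: Planned Value, Earned Value and Actual Cost. A minimal sketch (the figures are illustrative only, not pilot data):

```python
def evm_indicators(pv, ev, ac):
    """Return (SPI, CPI) from the three basic EVM measures.

    pv: Planned Value -- budgeted cost of work scheduled
    ev: Earned Value  -- budgeted cost of work performed
    ac: Actual Cost   -- actual cost of work performed
    """
    spi = ev / pv  # Schedule Performance Index: < 1 means behind schedule
    cpi = ev / ac  # Cost Performance Index:     < 1 means over budget
    return spi, cpi

# Illustrative status: 40k of work planned to date, 36k earned, 45k spent.
spi, cpi = evm_indicators(pv=40_000, ev=36_000, ac=45_000)
print(f"SPI = {spi:.2f}, CPI = {cpi:.2f}")  # SPI = 0.90, CPI = 0.80
```

Keeping the measurement process down to these two indices is what makes it sustainable for a small setting: only the three base quantities need to be collected per reporting period.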

Issues and complexities involved

Gathering an appropriate Deployment Comparison database is very time consuming. Structuring Factors and Influence Modifiers in a user-friendly way is a complex task.

Visibility of improvements

The improvements are measured by comparison with the Deployment Comparison Base, which is under construction.


There is the possibility of extending the Factor-Influence Modifier Model to other processes (the rest of the CMMI practices, Project Management practices). This point is left open for further research.

Model Supporting Tools

A fundamental Supporting Tool is Zonnect (Zonnect, 2007), an experimental product that offers:

  • Services for the realization of, and collaboration in, Software Engineering projects, guaranteeing control of all project steps
  • Best practices and CMMI capabilities “packaged”, adapted and ready to use


Conclusions

The Factor and Influence Modifier analysis gives useful information on how to accelerate the deployment of the chosen EVM practices. The concept of accelerating with Factors and Influence Modifiers seems to be appropriate, and the method used for collecting, analysing, correlating and validating, though time consuming, is producing consistent results.

The Acceleration Cube, using QFD-like techniques, and the Implementation Roadmap could be the basis for the development of professional support tools.

Assembling the Comparison Base is an enormous task; it is very time consuming.

The method used could easily be extended to other areas of Project Management, CMMI and other methodologies. The concepts of Factors and Influence Modifiers are useful for the purpose of accelerating EVM deployment and/or projects; using an Influence Modifier as a lever can produce real acceleration in projects. The results of the pilot projects will be analysed and will provide good feedback to further develop the method and the use of the Factor and Influence Modifier concepts.

The selection, analysis and correlation of Factors and Influence Modifiers are also time-consuming tasks. On its own, the Influence Modifier table is a useful aid for any Project Manager when organizing EVM implementations and deployments.


References

Heales, J. (2002, November). A model of factors affecting an information system's change in state. Journal of Software Maintenance: Research and Practice, 14(6), 409-472.

Hogan, B. (2000). How Do You Eat An Elephant? One Bite at a Time. Retrieved from http://www.llumina.com/store/howdoyoueat.htm

Kemerer, C., & Slaughter, S. (1999). An Empirical Approach to Studying Software Evolution. IEEE Transactions on Software Engineering.

Khosrow-Pour, M. (2002). Advanced Topics in Information Resources Management, Vol. 4. United States of America: Idea Group Publishing.

PMI (2004). A Guide to the Project Management Body of Knowledge (PMBOK® Guide), Third Edition. Newtown Square, PA: Project Management Institute.

PMI (2005). Practice Standard for Earned Value Management. Newtown Square, PA: Project Management Institute.

SEI (2002). Capability Maturity Model® Integration (CMMI), Version 1.1, Continuous Representation. CMU/SEI-2002-TR-028. Pittsburgh, PA: Carnegie Mellon University.

SEI (2006), Improving Processes in Small Settings (IPSS), A White Paper; The International Process Research Consortium (IPRC), Software Engineering Institute. http://www.sei.cmu.edu

Solomon, P. (2002, October). Using CMMI to Improve Earned Value Management. Technical Note CMU/SEI-2002-TN-016. Pittsburgh, PA: Carnegie Mellon University.

Zonnect (2007), Accelerated processes on-line, http://www.zonnect.com ZONNECT rip, S.L., C/ Santiago Grisolía 2; PTM, Tres Cantos, 28760 Madrid

©2007 Luis Cabezas
Originally published as a part of 2007 Global Congress EMEA Proceedings – Budapest, Hungary


