The use, usefulness and support for project management tools and techniques
Project management is often perceived in terms of its set of tools and techniques, or its toolbox. One indication of this is the rich array of tools and techniques that have emerged from practice, as witnessed by the content of the PMBOK® Guide. A great deal of effort has gone into producing and updating the PMBOK® Guide, and thus into identifying a consensus among practitioners as to the set of PM tools and techniques that defines the limits of the profession.
“The primary purpose of the (Guide) is to identify and describe the subset of the PMBOK® that is generally accepted. Generally accepted means that the knowledge and practices described are applicable to most projects most of the time, and that there is widespread consensus about their value and usefulness” (page 3).
The research that is reported here aims to answer the following questions:
• What are the relative frequencies of use of the different tools and techniques?
• Which tools and techniques do practitioners perceive to be the most useful?
• Are these the tools that organizations most often support?
The authors are conducting a survey of PM practitioners in order to answer these questions.
The present paper reports on the methodology being used and on preliminary results. The present research is focused on tools and techniques that are:
• Related to the project management profession, rather than tools of general management
• Specific as opposed to general means and processes
• Well defined in the literature
• Well known within the PM community of practitioners
• Associated with relatively uniform practices throughout the profession.
The Literature Review
Several papers have been published in recent years on the use of PM tools. We found four of these articles to be particularly relevant to our study. The article entitled “PMBOK: A guide for Project Management in the Next Century” (Hargrave & Singley, 1998) presents a comprehensive study of the use of PM tools, techniques and processes in a specific context. The research surveyed project managers in the Army Corps of Engineers on the use of the 37 processes and 116 techniques and tools identified in the PMBOK® Guide. The survey measured the “importance to PM” and the “excellence in practice.” The analysis highlighted the tools, techniques and processes that show a gap between importance and excellence in use. An extension of the survey also identified the top 10 tools.
A study of the use of risk management tools in Israeli high-tech industries investigated the frequency of use, the perceived contribution of usage to project success and extent to which usage was associated with high performance (Raz & Michael, 2001). Thirty-eight tools were included in the survey, which focused on risk management.
Thamhain published a study of the use and perceived value of 29 PM tools and techniques (Thamhain, 1998). The study further investigated the barriers to their use. Thamhain concluded that the contribution of PM tools and techniques to project performance is conditional upon their integration into PM processes and acceptance by the project team.
A paper published by Zeitoun presents the ease of use of 14 tools and processes and investigated implementation issues (Zeitoun, 2000). He argues that the usefulness of tools and techniques depends on the quality of the implementation process and the training that accompanies their implementation.
Drawing from the PMBOK® Guide, the work cited above and other sources, the authors identified what they feel is the set of tools and techniques that composes the PM toolbox. The sources consulted most often include both very general concepts and processes (e.g., training programs, performance measurement) and very specific tools (e.g., WBS, PDM). The research cited above investigated the use of both specific tools and techniques and very general processes, without making an explicit distinction between tools and processes. The present research investigates only tools and techniques that are project specific, not general processes. The authors considered tools and techniques to be those things that project management practitioners use to “do the job,” to “execute a process.” Metaphorically, the tools are used to “execute the recipe” or to “play the score.” They are concrete and specific means of applying rules and principles, and their use and skillful practice require practical know-how.
A list of 72 PM tools and techniques was prepared in line with the approach described above and in the introduction. The tools were then sorted to approximately follow the project life cycle, but in order to help respondents make clear distinctions, tools with similar names or related meanings were placed next to each other in the list. For example, “critical path” was placed next to “critical chain.” The complete list is presented in Exhibit 1. A definition of each of the tools and techniques was provided. The primary sources of definitions were the PMBOK® Guide and Max Wideman's Comprehensive Glossary of Project Management Terms, which can be found at http://www.pmforum.org/library/glossary/index.htm-Index_Section. These were supplemented with definitions written by the authors.
The research also required the authors to identify the dimensions of use that would be investigated. The first was the extent of use, which the authors decided to measure by asking about the frequency of usage. The authors also studied the perceived usefulness of tools and techniques. The latter measure had to avoid confusion between the usefulness of present practices and potential usefulness. To do this, questions were developed on the perceived contribution that better or more extensive use of a particular tool would make to project success.
Tools and techniques are used within an organizational context, and one thing that context can do is provide support for the use of some of them. The level of support was measured both by a specific question on support for each tool or technique and by general questions on the organizational context, including questions on the existence of well-documented methodologies and on the level of PM maturity within the organization. The study evaluated the level of support for two reasons: first, to control for the effects of support on usage, and second, to gather information to return to organizations on the perceived level of support being provided.
The survey was, in fact, designed as part of a benchmarking instrument on competence. The authors are collaborating with Lynn Crawford on an extension of her well-known research into PM competency (Crawford, 1997, 1998, 1999, 2000). The organizational environment questions used in the questionnaire were drawn from her studies. The aim of the overall research program is to enlarge the sample while adding the tool-usage component. One of the central elements of Lynn Crawford's research is the Australian National Competency Standard, which evaluates the frequency and the autonomy with which a person uses project management processes and practices. The capacity to use tools and apply techniques is considered to be a technical skill, which is part of a larger competency construct. By adding a tool-usage questionnaire to this study, a link can be observed between higher-level competencies and the use of concrete PM tools and techniques. In the basic design of this research, both are linked to project performance.
Socio-demographic questions completed the survey instrument. The questions relative to tools and techniques are presented in Exhibit 2. As a test of the instrument and in order to gather preliminary data, project management personnel from the Montreal area were invited to answer the web-based questionnaire.
The Survey Results
Only 39 people responded to this preliminary survey. This is sufficient to test the instrument, but it can give only very preliminary indications of the results the complete study might produce. Those who responded come from a variety of industries, have extensive PM experience, and work in companies that represent a wide variety of project management contexts. Responses to all the project environment questions were well distributed.
The primary questions relative to PM tools and techniques within the questionnaire deal with (1) “frequency of usage,” (2) “organizational support for usage,” and (3) “potential contribution to project performance of more or better usage.” An examination of the responses to the questions dealing with tools usage, support and potential improvement reveals important variance along these three dimensions. It, therefore, seems we have much to learn about the relationships among the use, the support and the value of the different tools and techniques of our toolbox. This is an encouraging result for the pursuit of the research project. With a larger sample, the analysis of correlations between the organizational context variables and the tools usages variables will most likely produce interesting results. At the present stage of the research, we will limit our data analysis to the examination of patterns among the tools usage variables.
A variable was developed to measure the intrinsic value of tools, as perceived by respondents. This variable was created by adding the present frequency of use to the potential contribution to project performance of more or better use. This yields a measure of the overall potential of the tool to contribute to project success or its intrinsic value.
Present frequency of use + Potential improvement = Intrinsic value
Another measure of the perceived usefulness was created by subtracting the level of organizational support from the measure of present usage. This gives an indication of the extent to which respondents would use the tool in the absence of organizational support.
Present frequency of use – Organizational support for usage = Autonomous usage
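The two derived scores are simple arithmetic on the per-tool response values. As a minimal sketch (the tool names, the numeric scale and the scores below are invented for illustration, not the study's data), the computation might look like:

```python
# Hypothetical per-tool survey responses:
# (present frequency of use, potential improvement, organizational support)
responses = {
    "Gantt chart": (5, 4, 4),
    "WBS": (5, 4, 4),
    "Monte-Carlo simulation": (1, 2, 1),
}

def derived_scores(frequency, improvement, support):
    """Compute the two derived variables defined in the text."""
    intrinsic_value = frequency + improvement   # overall potential to contribute
    autonomous_usage = frequency - support      # usage beyond organizational support
    return intrinsic_value, autonomous_usage

for tool, (freq, imp, sup) in responses.items():
    value, autonomy = derived_scores(freq, imp, sup)
    print(f"{tool}: intrinsic value={value}, autonomous usage={autonomy}")
```

Ranking tools on these two derived scores, rather than on the raw responses, is what allows highly used but fully supported tools to be distinguished from tools whose use reflects practitioners' own initiative.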
Exhibit 3 presents the tools that received the highest and the lowest scores for each of these variables.
There are many similarities and some differences among the top scoring tools in the different lists. Both similarities and differences are worthy of comment. Scores for “intrinsic value” and “autonomous usage” are very similar. This is quite understandable. If someone considers a tool intrinsically valuable, that person would try to use it even if it were not supported by the organization. These two scores are so similar that in the rest of the discussion we will refer to both as the perceived value of the tool.
There are tools that could be called “super tools.” The Gantt chart and the WBS are examples of tools that are among the most frequently used, are supported by organizations, and score very high on value. Despite their high present usage, these tools also score very high on potential for increased contribution to project success: even with extensive usage, they could still contribute more to performance if more or better use were made of them. Another set of “super tools” also shows very high scores for usage, support and value, but not for potential improvement. Kick-off meetings, progress reports and bid documents are examples of tools that are very valuable but are already being used at close to their full potential.
A different set of tools was shown to be very much under-exploited. These tools scored very high on potential increases in their contributions to project success through more or better usage, but showed below-average organizational support and low present usage. Databases of historical data, of risks and of lessons learned are examples: they scored among the very lowest on usage and support but very high on potential increased contribution. These are the tools that organizations are not presently supporting but probably should invest in. For tools like databases, the correlation between usage and support is very strong: such tools have the potential to be among the highest valued of all, but an individual cannot easily improvise their usage on his or her own, so organizational support is a precondition to usage. This can be seen in Exhibit 3, both in their low rankings on usage and support, and in their low scores on “autonomous usage,” the difference between present usage and organizational support. All the tools associated with the use of a database (on risks, costs, lessons learned, etc.) belong to this set.
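The screening described in this paragraph can be sketched as a simple filter: flag tools whose potential contribution is high while both present usage and organizational support sit below average. The scores, scale and threshold below are invented for illustration, not the study's data.

```python
# Hypothetical per-tool scores: (name, usage, support, potential improvement)
tools = [
    ("Gantt chart",                 5, 4, 4),
    ("Database of lessons learned", 2, 1, 5),
    ("Database of historical data", 1, 1, 5),
    ("Decision tree",               1, 1, 2),
]

avg_usage = sum(usage for _, usage, _, _ in tools) / len(tools)
avg_support = sum(support for _, _, support, _ in tools) / len(tools)

# "Under-exploited": high potential but below-average usage and support.
# The threshold of 4 is an arbitrary illustrative cutoff.
under_exploited = [
    name for name, usage, support, potential in tools
    if potential >= 4 and usage < avg_usage and support < avg_support
]
print(under_exploited)  # flags the two database tools only
```

Note how the filter separates the database tools (high potential, low support) from both the well-supported super tools and the tools that score low on every dimension.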
Examining the list of the lowest scoring tools, some stand out as the lowest on all variables. These include Monte-Carlo simulations, life-cycle costing, decision trees, and probabilistic duration estimates (PERT analysis). There might be some concern that low scores indicate a tool was not applicable in a respondent's context, or not well enough understood for the respondent to have an informed opinion. The survey invited participants to indicate if a tool was not applicable to their work, and also to indicate if they had insufficient knowledge of the tool or technique to judge the effect of more extensive or better use. Since each participant used the insufficient-knowledge option at least once, we take the low intrinsic-value scores for these tools to be a real indication of the participants' perceptions rather than of inadequate knowledge of the tools themselves.
As discussed above, low scores on usage and support are more interesting when the same tool scores high on potential contribution, as do the databases of historical data, lessons learned and risks.
There is considerable variation in the use and perceived usefulness of PM tools and techniques. These have been related to levels of organizational support for their use. The approach developed here has been shown to have the potential of being a powerful instrument for diagnosing the use of PM tools and techniques in an organizational context, and for guiding future investments in the tools provided to project management personnel. The authors are pursuing a program to evaluate PM competency in collaboration with Lynn Crawford of the University of Technology Sydney.
References
Crawford, Lynn H. 1997. Project management competence for the next century. In the Proceedings of the 28th Annual PMI Seminars and Symposium, Chicago, Illinois.
Crawford, Lynn H. 1998. Standards for a global profession—Project management, In the Proceedings of the 29th Annual PMI Seminars and Symposiums, Long Beach, California.
Crawford, Lynn H. 1999. Assessing and developing project management competence. In the Proceedings of the 30th Annual PMI Seminars and Symposiums, Philadelphia, Pennsylvania.
Crawford, Lynn H. 2000. Profiling the competent project manager. In the Project Management Research at the Turn of the Millennium: Proceedings of PMI Research Conference, 21-24 June 2000, Paris, France, pp. 3–15.
Hargrave, Bernard L., & Singley, John. 1998. PMBOK: A guide for Project Management in the Next Century. In the Proceedings of the 29th Annual PMI Seminars and Symposiums, Long Beach, California, USA.
Raz, T., & Michael, E. 2001, January. Use and benefits of tools for project risk management. International Journal of Project Management.
PMI Standards Committee. 2000. A Guide to the Project Management Body of Knowledge. Newtown Square, PA: PMI.
Thamhain, H. J. 1998. Integrating Project Management Tools with the Project Team. In the Proceedings of the 29th Annual PMI Seminars and Symposiums, Long Beach, California, USA.
Zeitoun, Al. 2000. Raising the Bar in Project Management Awareness and Application. In the Proceedings of the 31st Annual PMI Seminars and Symposiums, Houston, Texas, USA.
Proceedings of the Project Management Institute Annual Seminars & Symposium
October 3–10, 2002 • San Antonio, Texas, USA