Framework for delivering higher quality statements of work for outsourced software development projects

Abstract

In 2009, a study performed by AMR Research Inc. showed that roughly 80% of enterprises planned to increase or maintain their level of IT outsourcing. Despite this continuing trend, statements of work (SOW) are often approved without an objective assessment of their quality. This increases risk to the contracting company, including receipt of low-quality products, cost overruns, and schedule delays.

This study defines what constitutes the ideal SOW for IT software development projects (“benchmark SOW”), develops a scoring algorithm to assess how any given SOW (“target SOW”) fares against the benchmark, and develops a mechanism to devise a set of recommendations to bridge the gap between the “target SOW” and “benchmark SOW.” Field data (publications, journals, interviews with practitioners) were used to establish the benchmark SOW, and to validate the effectiveness of the framework.

The outcome of this project delivers a process that ultimately answers the question: “How robust is my SOW?” This insight enables the contracting company to make a conscious decision as to whether risk must be reduced by further refining the SOW before contractually committing the work to the selected vendor.

Introduction

Purpose

The intent of this research is to design and develop a process to determine the quality of statements of work created to procure IT software development services. Once the quality is established, the process determines the relative level of risk the project is carrying, and proposes areas where the SOW can be improved before the work is committed to the selected vendor.

Background

Internal and external factors influence whether best practices in procurement management are leveraged in developing a strong SOW for IT software development services. Such factors include, for example, the project manager's experience, whether the SOW is developed by the vendor or by the contracting party, the maturity of the company in adhering to project management best practices, and the time frame and/or urgency under which the SOW must be completed before the work can commence.

When statements of work are developed without much attention to best practices, the impact on the contracting company can be substantial in a myriad of ways: from delayed (or reduced) revenue resulting from late market entry, to escalating unplanned project costs, to compliance and legal exposures.

Regardless of the factors that ultimately drive the quality of the SOW, the project manager needs to be well aware of how much risk the sponsors are carrying before contractually committing the work to the vendor.

Literature Review

Michael G. Martin (2010) created a methodology to develop statements of work that served as the foundation for this paper. The methodology is referred to as the “F3 Methodology” and consists of three phases: Foundation, Framing, and Finalizing.

Foundation

The Foundation phase involves establishing the foundation for the development of the statement of work document (i.e., gathering as much information as possible so the SOW is accurate, detailed, and complete). The steps involved in this phase include Requirements Definition (gathering as much information as possible about the client's needs and requirements), Due Diligence Analysis (assessing the accuracy and completeness of the requirements), and WBS Structure (an outline of all the work — and only the work — that must be performed to deliver the final product).

Framing

The Framing phase addresses how the SOW should be structured and what type of content will be included in the document. The following standard baseline SOW framework (and related definitions) was used as the benchmark for this research:

Table of Contents. This lists all the primary sections of the SOW and should be included if the SOW is more than two pages long.

Statement of Confidentiality. This addresses the confidentiality of the SOW (i.e., all parties involved in the contract understand that duplication or distribution of the information is not permitted unless authorized).

Introduction. The introduction consists of “purpose” (why the scope of work needs to be performed), “description of work” (what products need to be delivered), “assumptions” (all information assumed to be true and applicable to the overall SOW), and “constraints” (what may prevent any party involved in the contract from meeting its obligations).

Products Provided. This section addresses in detail each product to be provided to the client. A subsection should be included for each product, covering a “Detailed Description” of the product, “Key Assumptions” (all information assumed to be true and applicable to the product), “Roles and Responsibilities” of the vendor, client, and any other third party involved in the delivery of the product, “Change Enablement” (what is required to support the implementation of the product), “Key Requirements” (specific requirements and product characteristics requested by the client), and “Deliverables” (assets that will be delivered to the client, including acceptance criteria).

Roles and Responsibilities. This section addresses roles and responsibilities not covered in the “Products Provided” section that apply to the service provider, the client, and any third parties.

Management Procedures. This section describes the processes and procedures required to manage the SOW (e.g., change control process, dispute resolution process, reporting, etc.).

Hours of Operation. This section specifies what constitutes a business day, including definitions of overtime, holidays, and after-hours work.

Facilities/Tools/Equipment Requirements. This section describes the facilities, tools, and equipment the vendor needs to deliver the product (e.g., laptops, cubicles, a dedicated “war” room, etc.).

Schedule. This section presents the schedule (milestones and planned completion dates) for delivering the product. This is based on the work breakdown structure (WBS).

Pricing. This section includes the pricing structure for delivering the product. The price will include not only the base fees and/or time and materials, but also other costs such as ramp-up and one-time fees and travel expenses.

Signature Block. This section captures the signatures of the client and vendor representatives responsible for approving the SOW.

Glossary of Terms. This presents the descriptions of the acronyms and definitions of the concepts, words, and phrases used in the SOW.

Attachments. This section contains all additional information referenced in the SOW (e.g., detailed project schedule, change order form with instructions, sample reports, etc.).

Finalizing

The Finalizing phase primarily involves writing the content of the SOW following the specified framework. It also includes rounds of review and revision between the client and the vendor until the SOW is complete. Several best practices should be considered when writing the SOW content (a sketch of an automated check for the wording rules below follows the list):

• Use simple and direct language for clarity (e.g., use “shall” to indicate mandatory and binding requirements, and “will” to indicate a declaration; avoid “might,” “may,” and “should”).

• Use active voice (e.g., “Vendor will develop the training module”) instead of passive voice (e.g., “A training module will be developed by the vendor”), and words that convey action (e.g., analyze, attend, audit, build, compare, design, estimate, evaluate, implement, etc.).

• Use positive words (e.g., use “impossible” instead of “not possible” or “unknown” instead of “not known”).

• Use technical language sparingly. If technical language is required to convey the intent of the work, describe it in terms that can be understood by any person reviewing the document.

• Define acronyms and abbreviations. Spell out acronyms and abbreviations when they first occur in the SOW and describe them in detail in the glossary of terms.

• Make the SOW easy to read by following general writing considerations (e.g., use letter-size paper, allow sufficient margins, use an easy-to-read font type and size, etc.).

• Avoid vague or obscure words and phrases with multiple meanings (e.g., “best effort,” “adequate,” “high quality,” “similar,” “unless otherwise directed by client,” “to the satisfaction of the client,” etc.) or legal meanings (e.g., “partnership,” “assure,” “insure,” etc.) that can lead to ambiguity.

• Don't hide behind words or refuse to commit to a direct answer (if requirements are unclear, continue working with the project team to gain the clarity needed to successfully deliver the product).

• Avoid terms that are specific to an industry or organization and that may not be understood universally.

• Avoid redundancy, unless it is necessary to state certain requirements in multiple sections of the SOW.

• Avoid non-specific words and phrases (e.g., “any,” “and/or,” “as applicable/as necessary,” “as required,” “as directed,” “best effort,” “best practice,” “including but not limited to,” “either,” and “et cetera”).

• Avoid catch-all and open-ended phrases (e.g., “use a method common in the services industry”).

• Avoid big or complex words.

• Avoid extraneous material and requirements (if the content does not add value, remove it).

• Avoid bias. Do not include requirements that only a favored product can meet, as this dilutes the value of the competitive bidding process.

• Avoid words or statements that are difficult or impossible to quantify (e.g., “all,” “best,” “every,” “greatest,” “least,” “worst,” or vague commitments such as “improvement” and “increased competitiveness”).

• Avoid words that imply perfection like “always,” “never,” “minimum,” “maximum,” and “optimum.”
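
Many of these wording rules are lexical and can be checked automatically. As referenced above, the following minimal Python sketch flags a few of the vague and non-specific phrases listed in this section; the flag list is abbreviated for illustration and would need to be extended for real use.

```python
import re

# A minimal sketch of an automated check for the wording rules above.
# The flag list is abbreviated from the bullets in this section.
FLAGGED_PHRASES = [
    "best effort", "adequate", "high quality", "as applicable",
    "as required", "including but not limited to", "et cetera",
    "always", "never", "optimum", "might", "may", "should",
]

def flag_vague_wording(sow_text: str) -> list[str]:
    """Return the flagged phrases found in the SOW text."""
    found = []
    for phrase in FLAGGED_PHRASES:
        # Whole-phrase match, case-insensitive
        if re.search(rf"\b{re.escape(phrase)}\b", sow_text, re.IGNORECASE):
            found.append(phrase)
    return found

print(flag_vague_wording("Vendor shall use best effort to deliver high quality code."))
# ['best effort', 'high quality']
```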

Research and Approach to Analysis

As discussed in the abstract, this study:

• Defines what constitutes the ideal SOW for IT software development projects, or the “Benchmark SOW”

• Develops a scoring algorithm to assess how any given SOW, or “Target SOW,” fares against the Benchmark SOW

• Develops a mechanism to devise a set of recommendations to bridge the gap between the “Target SOW” and the “Benchmark SOW.”

The approach taken to research, develop, and validate each of these components is summarized in the process flow shown in Exhibit 1.

Exhibit 1 – Research and Approach to Analysis.

4-1 BENCHMARK SOW

Literature, journals, and examples of statements of work were the foundation for defining the “Benchmark SOW,” or what “good” looks like. As part of the research, it became evident that the quality of the output is influenced by the process used to create the SOW document: “Without the process, the data supporting the SOW may be corrupted or not of high enough quality. Following this process will help form and build a solid foundation for the SOW, which will contribute significantly to the ultimate success of the project” (Martin, 2010, p. 89). As a result, the “Benchmark SOW” in this paper addresses not only the “ideal output,” but also the “ideal process” used to create a SOW.

4-1.1 > Process Step 1.1 | Benchmark SOW: Frame Decomposition Level 2

The “process” to develop a SOW is composed of three main components: Foundation (the information gathered before developing the SOW), Framing (the SOW structure and type of content), and Finalizing (how the SOW is written). The “output” essentially consists of the “content,” i.e., what is included in the SOW. See Exhibit 2.

Exhibit 2 – Benchmark SOW Decomposition Level 2.

4-1.2 > Process Step 1.2 | Benchmark SOW: Frame Decomposition Level 3–4

Once the high-level frame of the “Benchmark SOW” was in place, additional research helped determine the detailed components within each category (i.e., levels 3 and 4). See Exhibit 3.

Exhibit 3 – Benchmark SOW Decomposition Levels 3 and 4.

4-1.3 > Process Step 1.3 | Benchmark SOW: Questionnaire

Since the goal of this research was to assess the quality of a SOW, it was essential to develop a mechanism to capture information about the “Target SOW” so that a comparison between target and benchmark could be made.

The mechanism developed to capture information from the “Target SOW” was a questionnaire. The approach to developing the questionnaire included identifying the “Questions,” the multiple-choice “Answers” for each question, and the “Baseline” (i.e., the value, weight, and maximum score for each question).

The example below illustrates the thought process used to identify the questions included in the questionnaire:

• For “Scope Baseline” (Exhibit 4):

Exhibit 4 – Decomposition to Scope Baseline.

A total of seventy-four questions were identified for the elements listed in Exhibit 3, and the thought process described above was applied to each of them.

Once the questions were identified, the next step was to determine the set of answers available for user selection. The number of answers varied by question, but each set always included one unique choice considered “best practice.”

“Best practice” was derived from the same process used to identify the questions included in the questionnaire. Leveraging the “Scope Baseline” example provided previously, the best-practice answer to the question “What components [of the Scope Output] are relevant to the SOW?” is “Project Scope Statement: product scope description, product acceptance criteria, project deliverables, project exclusions, project constraints, project assumptions.”

Less complete answers were added as additional choices because they enable a proper assessment of the gap between best practice and the option selected by the project manager.

The draft questionnaire including the set of questions and answers was submitted to IT managers, procurement practitioners, and IT project managers for feedback on format, content, and organization; their inputs were incorporated into a revised version of the questionnaire.

After the questionnaire was created, each answer was assigned a “value” (i.e., how many points the answer is worth), and each question was assigned a “weight” (i.e., how important the question is in the context of the overall questionnaire) and a “maximum score” (i.e., the weight of the question multiplied by the highest-value answer). These would be used in a later step to quantify the quality of the SOW, identify strengths and weaknesses, and derive a set of recommendations for improvement.
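
To make these scoring mechanics concrete, the following minimal Python sketch models one questionnaire item with its values, weight, and maximum score. The class layout, answer texts, and numbers are illustrative assumptions, not the study's actual data.

```python
from dataclasses import dataclass

@dataclass
class Question:
    """One questionnaire item; all names and numbers are illustrative."""
    text: str
    answers: dict[str, int]  # answer text -> value (points for that answer)
    weight: int              # importance of the question in the questionnaire

    @property
    def max_score(self) -> int:
        # Maximum score = weight of the question x highest-value answer
        return self.weight * max(self.answers.values())

    def score(self, selected: str) -> int:
        # Points earned for the answer the respondent selected
        return self.weight * self.answers[selected]

# Hypothetical item modeled on the "Scope Baseline" example above
q = Question(
    text="What components [of the Scope Output] are relevant to the SOW?",
    answers={
        "Full project scope statement (all six components)": 3,  # best practice
        "Product scope description and deliverables only": 2,
        "High-level scope description only": 1,
        "Scope not documented": 0,
    },
    weight=5,
)

print(q.score("High-level scope description only"), "of", q.max_score)  # 5 of 15
```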

4-2 MAPPING

This sub-section focuses on the process step executed to define the “Target to Benchmark – Mapping” (Exhibit 5).

4-2.1 > Process Step 2.0 | “Target” to “Benchmark” – Mapping

The next step of the research was to determine what would be considered strengths and weaknesses in the “Target SOW” when compared to the “Benchmark SOW.” The purpose of reporting out strengths and weaknesses is to provide a high level summary to the project manager of which main sections of the SOW are well defined, versus which ones need special attention. The process/algorithm created to identify “Target SOW” strengths and weaknesses is shown in Exhibit 5.

Exhibit 5 – Algorithm to identify “Target SOW” strengths and weaknesses.
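
Exhibit 5 presents the algorithm graphically. As a rough illustration only, the Python sketch below shows one plausible reading of such a classification rule; the 70% threshold, subsection names, and point totals are assumptions made for this example, not values taken from the study.

```python
# One plausible reading of the classification rule in Exhibit 5. The 70%
# threshold is an assumption for illustration, not the study's value.
STRENGTH_THRESHOLD = 0.70

def classify(scores: dict[str, tuple[int, int]]) -> tuple[list[str], list[str]]:
    """scores maps a subsection name to (actual points, maximum points)."""
    strengths, weaknesses = [], []
    for name, (actual, maximum) in scores.items():
        ratio = actual / maximum if maximum else 0.0
        (strengths if ratio >= STRENGTH_THRESHOLD else weaknesses).append(name)
    return strengths, weaknesses

# Hypothetical subsection scores
strengths, weaknesses = classify({
    "Requirements Definition": (12, 15),  # 80% -> strength
    "Change Enablement": (4, 10),         # 40% -> weakness
})
print(strengths, weaknesses)
```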

This algorithm was not only used to determine whether a subsection was a “strength” or “weakness,” but also to determine what “recommendation” would be reported out after assessing “Target SOW” against “Benchmark SOW.” It is in the “Recommendations” section of the SOW analysis results that this process reports out the detailed actions that can be taken to improve the quality of the statement of work. The algorithm used to identify recommendations for SOW quality improvement is described in Exhibit 6.

Exhibit 6 – Algorithm to select recommendations for “Target SOW” improvements.

More than 100 recommendations for improvement were identified; the applicable subset is selected based on the information gathered from the “Target SOW.”
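
Exhibit 6 likewise defines the selection rules graphically. The sketch below illustrates one plausible mechanism, assuming that each question/answer pair falling short of best practice maps to a pre-written recommendation; the identifiers and recommendation text are hypothetical.

```python
# One plausible selection mechanism for Exhibit 6: whenever the selected
# answer falls short of the best-practice answer, report the recommendation
# pre-written for that question/answer pair. All data here is hypothetical.
RECOMMENDATIONS = {
    ("Q17", "High-level scope description only"):
        "Add product acceptance criteria and explicit project exclusions "
        "to the project scope statement.",
}

def select_recommendations(responses: dict[str, str],
                           best_practice: dict[str, str]) -> list[str]:
    selected = []
    for question_id, answer in responses.items():
        if answer != best_practice.get(question_id):
            recommendation = RECOMMENDATIONS.get((question_id, answer))
            if recommendation:
                selected.append(recommendation)
    return selected

recs = select_recommendations(
    responses={"Q17": "High-level scope description only"},
    best_practice={"Q17": "Full project scope statement (all six components)"},
)
```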

4-3 RESULTS

This sub-section focuses on the process step executed to define the “Target to Benchmark – Results.”

After running the “Target SOW” through the “Rules Engine,” one report per “Target SOW” is submitted back to the respondent to communicate the results, i.e., how their SOW fared against the “Benchmark SOW.” The components of the report include “Summary Performance,” “Strengths and Weaknesses,” and “Recommendations.”

4-3.1 Summary Performance

The “Summary Performance” is presented in one page and communicates concisely and visually how the “Target SOW” fares against the “Benchmark SOW.” This page is split into two main components:

Score: The score is a percentage calculated from how many points the “Target SOW” earned against the maximum number of points possible (i.e., the number of points in the “Target SOW” divided by the maximum number of points in the “Benchmark SOW”). For example: “Your Score: 52%.”

Graphs: Two graphs are presented in the summary performance page. One shows “SOW Performance in Points,” and the other shows the “SOW Performance in Percentage.” Performance in Points (example in Exhibit 7) is calculated based on how many points are possible in each main category assessed, and how the “Target SOW” fared in each of them. Performance in Percentage (example in Exhibit 8) is calculated based on how many points the “Target SOW” scored against the maximum number of points possible in each category.

Exhibit 7 – SOW Performance (in Points).

Exhibit 8 – SOW Performance (in Percentage).
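
The arithmetic behind the score and both graphs can be expressed in a short sketch. The category names and point totals below are invented for illustration (chosen so that the overall score works out to the 52% used in the example above); they are not figures from the study.

```python
# Sketch of the summary-performance arithmetic described above. Category
# names and point totals are illustrative, not taken from the study.
categories = {
    # category: (actual points, maximum points)
    "Foundation": (40, 80),
    "Framing":    (55, 90),
    "Finalizing": (30, 60),
    "Content":    (83, 170),
}

total_actual = sum(a for a, _ in categories.values())
total_max = sum(m for _, m in categories.values())
print(f"Your Score: {total_actual / total_max:.0%}")  # overall score, e.g., 52%

for name, (actual, maximum) in categories.items():
    # "Performance in Points" plots actual vs. maximum per category;
    # "Performance in Percentage" plots actual / maximum per category.
    print(f"{name}: {actual}/{maximum} points ({actual / maximum:.0%})")
```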

4-3.2 Strengths and Weaknesses

The “Strengths and Weaknesses” section is essentially a table with two columns (example in Exhibit 9), one listing the SOW categories deemed “strengths” and the other those deemed “weaknesses.”

Exhibit 9 – Strengths and Weaknesses.

Finally, the “Recommendations” section (example in Exhibit 10) contains a list of detailed recommendations the project manager should use to improve the specific areas identified as weaknesses.

Exhibit 10 – Example of the Recommendations section.

4-4 VALIDATION

This sub-section focuses on the process steps executed to perform research “Validation.”

The next step in the research was to verify the validity of the process designed to assess the quality of the “Target SOW.” This was accomplished following the process described in Exhibit 11.

Exhibit 11 – Process to determine validity of the process developed to assess the quality of the SOW.

5 Applying the Process to Real Scenarios

As described in Section 2-1, the purpose of this research is “to design and develop a process to determine the quality of Statements of Work created to procure IT software development services. Once the quality is established, the process determines the relative level of risk the project is carrying, and proposes areas where the SOW can be improved before the work is committed to the selected vendor.”

Exhibits 12 and 13 summarize the outcome of executing the process designed and developed in this research against a total of nine “Target SOW.” The quality of each SOW analyzed is crisply articulated by its “Score.” For example, the project manager for SOW “F” (Score = 52%) is carrying a high risk of facing challenges in the execution of his SOW: there are 11 areas (i.e., weaknesses) where the SOW can be improved, and 45 detailed recommendations can be implemented to mitigate the risk associated with the SOW.

Conversely, the project manager for SOW “C” (Score = 70%) is carrying a lower risk of facing challenges in the execution of her SOW: “Process” and “Content” showed consistent results (i.e., better process to create a SOW leads to better SOW content), many areas were identified as strengths (17 in total), and the number of recommendations for improvement was the lowest across all SOW analyzed (22).

Exhibit 12 – Consolidated view of the score and performance (process and content) of statements of work analyzed.

* Process is the sum of actual points for “Foundation,” “Framing,” and “Finalizing” divided by the sum of possible points.

Exhibit 13 – Consolidated view of Strengths and Weaknesses and Recommendations identified for SOW analyzed.

After completing the assessment for SOW “A” to “I,” a personalized Results Report was generated for each Statement of Work, and the results were sent back to the respondents for validation. Exhibit 14 shows the results of the validation.

Exhibit 14 – Results of the validation.

Additional feedback received from respondents during the validation process was encouraging:

• “These guidelines are very helpful.”

• “The guidelines show, correctly, that the SOW was not as good as it needed to be.”

• “We had issues with the SOW because it was vague, just as you mentioned in your assessment.”

• “I am convinced that these issues could have been avoided if we had implemented the recommendations listed in this report before signing the agreement.”

• “Several good points were addressed in your analysis, and [I] think they can be applied immediately.”

• “The categories selected were easy to follow and provided a road map for others, and me when tasked to develop a SOW.”

• “I liked the table about the strengths and weaknesses that are pointed out, as well as the recommendations that are provided for providing support for improvement.”

• “I also think that this is helpful for outgoing statements of work from organizations. I think this tool has merit for helping our procurement officers working with the divisions on their RFP statements of work, or for the PMO to have a standard tool to assess them before they go out to bid.”

• “The weaknesses are very representative of the SOW that was used for this: clarity, specificity, and articulation were all issues.”

• “Overall, the feedback document is very helpful – many of these items are things I wish I'd known in advance of my current project— only I had discovered them after the fact and am now experiencing the pain of not having revised the SOW accordingly.”

• “It [results report] is specific, well written, actionable and appropriate.”

• “I wish I had included acceptance criteria and performance standards so that there would be no doubt as to what exactly I am expecting along with what constitutes good (enough) versus unacceptable.”

• “[What] I enjoyed the most from the report” was “the attribution of a score, including the use of charts, since that in turn, immediately told me with a single number how good or bad my SOW was, along with areas where it excelled and failed.”

A subsequent step in the next phase of this project is to automate the process so the analyses and reports can be generated easily and quickly. Once that is in place, the process will be published to the broader public so project managers and procurement practitioners can easily determine whether and how the quality of their statements of work can be improved before contractually committing the work to the vendor.

Conclusion

A process has been provided for project managers, IT managers, procurement practitioners and sponsors to understand the risk they are carrying when entering into a contractual agreement with a vendor for the development of software products. The foundation for this process was the development of a “Benchmark Statement of Work (SOW)” that includes best practices related to the “process” to create a robust SOW, as well as the expected “content” of such SOW. Based on the information available in the “Target SOW” (i.e., the SOW to be assessed for quality), the process reports out the overall quality score, areas of strength and weakness, and detailed recommendations on what needs to be done to mitigate the negative impact a lower quality SOW can have on the success of the project.

Initial feedback received from stakeholders who participated in this project was very positive and confirmed that this process is helpful in determining the quality of a Statement of Work, and what can be done to improve it.

References

Amaral Jr., A. (2008). The challenge of preparing enterprise resource planning (ERP) implementation proposals solicited through request for proposals (RFPs) in Latin America (LA). Project Management Institute: 2008 PMI Global Congress Proceedings —Sao Paulo, Brazil. Retrieved from www.PMI.org

AMR Research Inc. (2009). RTTS: The software quality experts. Retrieved from http://www.rttsweb.com/outsourcing/statistics/

Putnam, A. (2007). Factors influencing project success, challenge or failure. Retrieved from http://www.pmierie.org

Burek, P. (2009). Closing the gap between project requirements, RFPs, and vendor proposals. Project Management Institute: 2009 PMI Global Congress Proceedings – Orlando, Florida. Retrieved from http://www.PMI.org

DISC types. Retrieved from http://changingminds.org/explanations/preferences/disc.htm

Dow, W., & Taylor, B. (2008). Project management communications Bible. Indianapolis, IN: Wiley Publishing, Inc.

Federal Acquisition Regulation (FAR). Subpart 8.4 – Federal Supply Schedules. Retrieved from https://www.acquisition.gov/comp/far/current/html/Subpart%208_4.html

Fleming, W., & Koppelman, J. (2006). The first transition contract type is critical for success. Contract Management, September 2006, 50–55. Retrieved from www.ncmahq.org

Haapio, H. (2004). An ounce of prevention… Contracting for project success and problem prevention. Project Management Institute: 2004 PMI Global Congress Proceedings – Buenos Aires, Argentina. Retrieved from http://www.PMI.org

Hammad, M. (2006). Schedule improvement through innovative procurement strategies. Project Management Institute: 2006 PMI Global Congress proceedings – Santiago, Chile. Retrieved from http://www.PMI.org

Haugan, G. (2008). Work breakdown structures for projects, programs and enterprises. Vienna, VA: Management Concepts.

Homer, J. (1998). A project procurement vision statement. PM Network, March 1998. Retrieved from http://www.PMI.org

Information Services Procurement. Retrieved from http://projekte.fast.de/ISPL/

Kumar, S. (n.d.). Managing procurement and the associated risks. Retrieved from http://www.PMI.org

Martin, M. G. (2010). Delivering project excellence with the statement of work (2nd ed.). Vienna, VA: Management Concepts.

Naoum, J. (2011). Information Services Procurement Library. Information Technology, Procurement, Risk Management. Beau Bassin, Mauritius: Duc.

National Aeronautics and Space Administration (1997). NASA Guidance for Writing Statements of Work, NPG5600.B, December 1997. Retrieved from http://www.hq.nasa.gov/office/procurement/newreq1.htm

Project Management Institute (2004). A guide to the project management body of knowledge (PMBOK® Guide)— Third edition. Newtown Square, PA: Author.

Santos, F. (2004). Applying PMI Best Practices to Proposal Development Projects. Project Management Institute: 2004 PMI Global Congress Proceedings – Buenos Aires, Argentina. Retrieved from http://www.PMI.org

SWOT Analysis. Discover new opportunities. Manage and eliminate threats. Retrieved from http://www.mindtools.com/pages/article/newTMC_05.htm

The Standish Group International (2009). CHAOS Summary 2009. Retrieved from www.governor.state.pa.us/portal/server…/chaos_summary_2009_pdf and www1.standishgroup.com/newsroom/chaos_2009.php

Zuberi, S. (1986). Contract/procurement management. Project Management Journal, Special Summer Issue, August 1986, 89–95. Retrieved from www.PMI.org

This material has been reproduced with the permission of the copyright owner. Unauthorized reproduction of this material is strictly prohibited. For permission to reproduce this material, please contact PMI or any listed author.

© 2012, Sandra Hueb Previde
Originally published as part of the 2012 PMI Global Congress Proceedings - Vancouver, BC, Canada
