Effective earned value management
Earned value is simple in concept and can be powerful in effect, but unless it is used as a management tool rather than a scoring mechanism, it will not have the desired impact. There is renewed recognition of the effectiveness of earned value management, and increasingly pointed guidance from the United States Federal Government and Department of Defense regarding its implementation.
This paper provides an overview of the current guidance relating to earned value in the United States, a “gut-level” understanding of earned value, and describes how information from an earned value management system (EVMS) can be used to effectively manage a program. It concludes with a look into the future of earned value and a caution against emphasizing its use as a scoring mechanism.
Acquisition processes are in a constant state of change. Mechanisms put in place to correct or prevent problems eventually accumulate to the point that they are viewed as cumbersome and reform initiatives that seek to streamline or eliminate them gain favor. If these reforms go too far, problems recur and the pendulum swings in the other direction, resulting in increased oversight, and the cycle repeats. The challenge is to dampen this oscillation and strike the proper balance between oversight and insight. The United States acquisition system is currently at one of these inflection points with regard to earned value management, and the opportunity to achieve this balance is before us.
Earned value management (EVM) is broadly recognized as a best practice and there are numerous examples of its successful use. Programs such as the FA-18 Engineering and Manufacturing Development contract used EVM with great success in the mid 1990s. However, the use of EVM was not what made that program successful; it was just one of the many tools used by both contractor and government program managers, control account managers (CAMs), and integrated product team (IPT) leaders. Formal EVM reports were submitted monthly, but the data were informally compiled and provided to both government and contractor managers weekly. This openness and transparency with regard to EVM and other information about the program gave managers some of the information they needed to actively manage the program; however, EVM should not be seen as more than it is: one of the many tools useful to a knowledgeable program management team.
There is a large ecosystem to support earned value management, including international cooperation on standards. The Project Management Institute (PMI)® has a College of Performance Measurement and publishes a Practice Standard for EVM, and the Project Management Professional (PMP)® examination tests for EVM knowledge. The National Defense Industrial Association (NDIA) Program Management Systems Committee (PMSC) has been particularly active in working with both government and industry, providing guidance to support successful implementation of an EVMS and publishing multiple documents that cover the full spectrum of an EVMS implementation; numerous documents are available free of charge at the NDIA PMSC website (NDIA PMSC, n.d.). The Association for Project Management (APM) in the United Kingdom published an EVM Guide in May 2002, and on 1 July 2004 the APM and NDIA signed a Standard Equivalence Agreement (APM and NDIA, 1 July 2004), formally recognizing the equivalence of the APM standard with the ANSI/EIA-748-A standard.
However, even with this broad recognition and support, the rigorous application of EVM began to lose traction in the late 1990s, and by the mid 2000s it was clear to both government and industry that changes were needed. These changes came in the form of statutory and regulatory guidance intended to strengthen and broaden the implementation of EVM.
An Overview of Current Department of Defense and Federal Guidance
In 1967, the United States Department of Defense (DoD) issued the Cost/Schedule Control System Criteria (C/SCSC) to industry, which established the minimum criteria that a contractor's management control system must satisfy. There were two primary objectives for issuing these criteria (Fleming, 1992, p 25):
- “For contractors to use effective internal cost and schedule management control systems, and
- For the government to be able to rely on timely and auditable data produced by those systems for determining product-oriented contract status.”
In 1991, a singularly important event occurred in the history of EVM: Dick Cheney, who was at the time the U.S. Secretary of Defense, cancelled the A-12 Program. Even though a “compliant” control system was in place, it was apparently not being used by management (Fleming, 1992, p ix). This caused an immediate increase in EVM training for government acquisition personnel and emphasis on its implementation in the early through mid 1990s.
In 1995, the NDIA began work on developing a re-worded version of the Cost/Schedule Control System Criteria, reducing them from 35 to 32. In December 1996, the U.S. DoD accepted them verbatim and in August 1999, the DoD accepted ANSI/EIA 748 as replacement for C/SCSC, cancelling the C/SCSC (Fleming and Koppelman, 2000, pp 157–158). The shift of EVM from government requirement to industry best practice had begun. The U.S. DoD reaffirmed this recognition of the ANSI/EIA 748 standard again in July 2007 (Krieg, 6 July 2007).
Challenges with EVM Implementations
By the early 2000s, though, emphasis on EVM and related skills had begun to decline. Between October 2003 and June 2004, the National Defense Industrial Association's (NDIA) Program Management Systems Committee (PMSC) conducted a survey on the integration of earned value management and risk management (RM). One of the more interesting conclusions from this survey was the identification of a “lack of RM and EVM process maturity and a lack of knowledge/skills” (NDIA PMSC, 2005, p 8). Not surprisingly, problems with EVM implementations began to occur and were highlighted in various audits; the comment below is typical.
“In particular, earned value management, which is a means for determining and disclosing actual performance against budget and schedule estimates, has not been implemented effectively, and oversight entities have not had the visibility into the program needed to affect its direction.” (GAO, Dec 2005, p 2)
Changes to Federal Guidance
In an effort to correct this situation, changes began to be made to regulations at all levels. In 2006, the Civilian Agency Acquisition Council and the Defense Acquisition Regulations Council amended the Federal Acquisition Regulations (FAR) to “…standardize EVM use across the government…[for]…developmental effort under a procurement contract…” (Federal Register, 5 July 2006, p 38238).
In 2008, the Defense Federal Acquisition Regulation Supplement (DFARS) was modified to provide the following specific guidance about when implementation of an EVMS was required. For cost or incentive contracts valued at $20 million or greater, the EVMS had to comply with the ANSI/EIA-748 standard. If these contracts were valued at $50 million or greater, then compliance with the standard had to be determined by the appropriate Federal Agency. (DFARS 234.2, 23 April 2008, p 234.2-1).
The Office of the Under Secretary of Defense for Acquisition, Technology and Logistics (OUSD (AT&L)) issued a series of memoranda in 2007 and 2008, emphasizing the importance of proper implementation of earned value management and directing its proper implementation.
“Each DoD component will be accountable for effective implementation of EVM on its programs. […] Correctly imposing the EVM requirements on contract and establishing the baseline are critical prerequisites to the successful implementation of EVM. […] In addition, the Components should establish and maintain realistic, executable performance measurement baselines against which to measure contract performance.” (OUSD AT&L, 3 July 2007, p 2)
“It is imperative that all contracts use the appropriate solicitation provision and contract clause, as prescribed at DFARS 234.203. While program managers have ultimate responsibility for ensuring EVM requirements are correctly included in statements of work…contracting officers can help…by working more closely with program managers and EVM subject matter experts. […] In addition, contracting officers must implement appropriate remedial actions in the event of contractor non-compliance.” (OUSD AT&L, 27 August 2008, p 2)
Congress added its own guidance in the form of specific language in Section 887 of the “Duncan Hunter National Defense Authorization Act for Fiscal Year 2009,” requiring a report to Congress on the implementation of EVM. This was amplified by language in Section 302 of the “Weapon System Acquisition Reform Act of 2009” (WSARA), which provided specific direction regarding the content to be included in the report to Congress. In addition, the WSARA directed organizational changes within the Department of Defense, creating, among other organizations, the Cost Assessment and Program Evaluation (CAPE) organization and the Performance Assessments and Root Cause Analyses (PARCA) organization.
Earned Value 101 – What are Earned Value and Earned Value Management?
All this emphasis on EVM, combined with tools and a vibrant supporting ecosystem, makes it seem that effective implementations would be the norm, not the exception. However, for earned value management to really work, those who receive the reports and use the information have to understand it, and REALLY understand it. It is far more than another type of financial report. So, then what, exactly, IS earned value?
Earned value is quite literally, the “value earned.” Changing the order of those words helps emphasize their individual meaning.
We assign value to things every day. We make a list of products and services we need, check our budget, and go shopping. If the cost of an item is less than or equal to what we think it is worth, it has value to us and we may buy it. If not, we either look for an acceptable alternative or do without. This concept applies whether we are shopping from a store catalogue, the GSA catalogue, or negotiating the price for deliverables on a data accession list.
The “budgeted cost” is earned when the work (the thing we value) has been performed. The terms “earned value” and “budgeted cost of work performed” represent the same concept: how much the work that was actually performed is worth. This “earned value” can be earned in pieces over time.
Earned Value Management
Earned value management compares this “value earned” with two things: the value of the work that was planned over time (the planned value, or budgeted cost of work scheduled) and the actual cost of performing the work (the actual cost of work performed). It is important to keep track of the periods during which the work was planned to be performed, when it was actually performed, and when the costs were incurred. The differences (variances) and ratios (performance indices) between these quantities are some of the more basic information provided by an EVM System (EVMS).
A Very Simple Example
Let's say it is 10 a.m. Saturday morning and you need to get the lawn cut by 6 p.m. You offer your teenaged son $20.00 to cut the front and back lawns and he agrees. After cutting the front lawn, he comes in to have lunch. While he's having lunch, some friends come over to invite him to go to a movie. He convinces you that the movie will be over by 3 p.m., leaving him plenty of time to come home and finish the lawn before 6 p.m. He then asks you for $10.00 to cover the cost of the movie, making the argument that since he has already completed the front yard, he has “earned” one half of the agreed on cost of cutting the lawn. He returns from the movie and starts cutting the back lawn, but stops when the lawn mower runs out of gas. You give him $2.00 to pay for the additional gas and he finishes cutting the back lawn before 6 p.m.
So, how did you do from an earned value perspective? The value you had assigned to cutting the lawn, or the cost you had budgeted to pay for this work, was $20.00. It was scheduled to be completed by 6 p.m. Therefore, the budgeted cost of the work scheduled, also known as the planned value, is $20.00. Since this work was performed prior to 6 p.m., the budgeted cost of work performed, also known as the earned value, is also $20.00; therefore, the schedule variance (SV) is zero, and the schedule performance index (SPI) is 1.0.
So far, so good; however, you had an unanticipated cost of $2.00 for gas, so the actual cost of the work performed is $22.00, not the $20.00 you had budgeted. Therefore, you have a negative cost variance of $2.00, and a cost performance index (CPI) of $20.00/$22.00 or 0.91. If all you are concerned about is cutting one lawn, this won't break the bank, but if this is a three-year contract to cut all the lawns in the township, this would identify, very early on in the contract, that there is a serious problem.
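The arithmetic above reduces to four standard formulas. The sketch below computes them for the lawn example; the function and variable names are mine, not taken from any particular EVMS tool:

```python
# Basic EVM metrics.
# pv: planned value (BCWS), ev: earned value (BCWP), ac: actual cost (ACWP).

def ev_metrics(pv, ev, ac):
    """Return the schedule/cost variances and performance indices."""
    return {
        "SV": ev - pv,   # schedule variance
        "CV": ev - ac,   # cost variance
        "SPI": ev / pv,  # schedule performance index
        "CPI": ev / ac,  # cost performance index
    }

# Lawn example: $20.00 planned and earned, $22.00 actually spent.
m = ev_metrics(pv=20.00, ev=20.00, ac=22.00)
print(m)  # SV = 0.0, SPI = 1.0, CV = -2.0, CPI is roughly 0.91
```

A CPI of 0.91 means each budgeted dollar of work cost about $1.10 to perform; projected over a multi-year contract, that gap compounds.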
The Two Key Assumptions
EVM uses performance relative to a performance measurement baseline (PMB) to predict how the program will perform in the future, which relies on two fundamental assumptions:
- The PMB is as accurate (or flawed) in the future as it was in the past.
- Task efficiency will be the same in the future as it was in the past.
A control account that was poorly planned or is being poorly executed will have poor EVM performance, but this is an issue with either the baseline or those doing the work; it should not be seen as a risk. On the other hand, a control account that was well planned and estimated and is being well executed may encounter unforeseen problems. As additional time and money are spent to address these problems, they will begin to show up as variances in the EVM System (EVMS). However, recognize that this is a trailing indicator. A good risk management program identifies risks before they impact the program and provides management with enough time to mitigate them.
Relationship of Risk Management to Earned Value Management
Risk management considers how likely it is that future events will impact the program; it is a forward-looking activity. Earned value management applies performance factors calculated using historical data to plan future activities. By the time earned value metrics start to show a problem, it is too late to manage the unknown that caused the additional work; this is now an issue that must be dealt with, but the time for managing the risk is long past.
EVM does not “predict” where risks will occur in the program; it only applies past performance to future estimates of cost and schedule. At best, EVM metrics will identify where prior risks, whether known or unknown, have impacted the program. One of the challenges faced by all program managers is deciding which risks to mitigate, which to absorb, and then being able to justify the resources necessary to account for both.
The most essential concept in earned value management is that of the control account. This is where control account managers (CAMs) estimate how much time and money are required to produce whatever they are required to deliver. A control account includes the following four things (Fleming and Koppelman, 2000, p 160):
- Scope of work
- Time frame (schedule)
- Resource requirements (cost)
- Responsible organization (Organizational Breakdown Structure, OBS)
The control account is essentially a mini-contract: the CAM (OBS) agrees to deliver something (scope) in an agreed amount of time (schedule), using only the resources that can be obtained within a certain budget (cost). The sum of all control accounts, along with undistributed budget and summary planning packages, laid out over time based on their logical dependencies, makes up the performance measurement baseline (PMB) against which earned value performance is measured; a realistic PMB is therefore essential to successful EV performance.
Control accounts are further subdivided into individual work packages, which is where costs are accumulated and progress measured. A work package is also like a mini-contract and each has its own period of performance. It will be “opened,” a charge number activated, and costs will be accrued. When the work has been completed, the work package will be “closed,” and no further charges can be made to it. This requires a period-based accounting system that provides a way to “earn value” for work performed during the same accounting period in which the costs for that work were accrued.
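The hierarchy described above, in which work packages roll up into control accounts whose budgets (plus undistributed budget and summary planning packages) form the PMB, can be pictured with a minimal sketch. The class and field names are illustrative, not taken from any standard or tool, and a real PMB is time-phased over the schedule; this sketch totals budgets only:

```python
from dataclasses import dataclass, field

@dataclass
class WorkPackage:
    name: str
    budget: float           # budgeted cost for this package
    earned: float = 0.0     # value earned to date
    actual: float = 0.0     # actual costs accrued to date
    is_open: bool = False   # charges may accrue only while the package is open

@dataclass
class ControlAccount:
    scope: str              # statement of work (the "mini-contract" deliverable)
    owner: str              # responsible organization (OBS element)
    packages: list = field(default_factory=list)

    @property
    def budget(self) -> float:
        # A control account's budget is the sum of its work packages.
        return sum(wp.budget for wp in self.packages)

def pmb_total(accounts, undistributed=0.0):
    """Total PMB value: all control accounts plus undistributed budget."""
    return sum(ca.budget for ca in accounts) + undistributed

# Hypothetical control account with two work packages.
avionics = ControlAccount(scope="Deliver avionics unit", owner="EE Dept")
avionics.packages += [WorkPackage("Circuit design", 5.0), WorkPackage("Firmware", 7.5)]
print(pmb_total([avionics], undistributed=2.5))  # 15.0
```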
The EVMS – Financial Accounting and Project Performance
Organizations with automated systems to support this combination of financial accounting and project performance have a distinct advantage. By providing accurate and timely information to the people who can act on it, in a form that they can easily understand, and in time for their actions to be effective, they create a significant competitive advantage. Simple EVMS solutions work for simple programs, but as the scope of the effort increases, they become increasingly labor intensive, error prone, and fragile; simple solutions just don't scale well.
As data are collected over time, cost-estimating relationships can be developed that both cost estimators and CAMs can use to reduce uncertainty in future estimates. Using standard WBS structures while capturing other attributes of the program facilitates the development of more accurate cost estimates and supports examining where cost efficiencies can be gained.
The flow of information through an EVMS is shown at a high level in Exhibit 1. Manual processes are time consuming and error prone, so automated data transfer should be used wherever possible. A single, authoritative source for each data element should be established and used by all consumers.
To help improve situational awareness about acquisition execution, the Office of the Secretary of Defense created a central repository (CR) for key acquisition information on acquisition category I (ACAT I) programs. The repository initially focused on the contract performance report (CPR), which contains earned value information, but later added the contract funds status report (CFSR). Based on the success of the initial pilot project, the CR is now implemented for all ACAT I programs.
“…effective immediately, all new ACAT I programs will structure their contracts with EVM requirements so that the addressee in the Contract Data Requirements List for the CPR, CFSR, and Integrated Master Schedule (IMS) data is the CR.” (OUSD AT&L, 11 July 2007, p 1)
The repository will be the source of information for analysts supporting the program managers, as well as multiple more senior executive levels of government oversight. The Central Repository will allow the consistency that a “single source of truth” provides, but this is also where the balance between insight and oversight must be established. Each level of management should only receive the level of detail appropriate for their use, and not attempt to manage details that are appropriately the purview of the program manager.
Using EVMS Information to Manage Your Program
Assuming complete, accurate information is being provided from an EVMS rapidly enough to be useful, how do you use it? There are as many answers to that question as there are program managers, but here are a few suggestions.
Top Level Performance
Exhibit 2 shows three examples of a simple plot of cost variance and cumulative actual cost over time, along with some key “at complete” values. The latest revised estimate (LRE) and the contractor's variance at complete (VACc) are provided by the contractor; the estimate at complete (EAC) and the Agency's variance at complete (VACa) are developed by the Agency. The budget at complete (BAC) and the ceiling cost are static.
Example A shows a program that is proceeding almost according to plan. The cost variance is slightly negative, but actual expenditures are on a trajectory toward the Agency EAC, which is well below the allowable ceiling cost. The contractor's LRE is slightly optimistic relative to the Agency EAC, but the difference is not excessive.
Example B shows a program with an unfavorable cost variance and associated rise of EAC toward the ceiling cost. The contractor LRE and variance at complete appear optimistic relative to Agency estimates, but at least the contractor acknowledges the trend. The final cost is rapidly trending toward the allowable contract ceiling. Since both the contractor and the agency recognize there is a problem, they can focus attention on corrective actions well before the ceiling cost is reached.
Example C shows a program with the same variances and estimates that were shown in example B. However, in this case, the contractor estimates are unrealistically optimistic and not in line with historical performance. There is high potential for significant contract cost overrun and worse, the contractor fails to acknowledge the problem; fortunately, both of these facts are evident while there is time to correct the situation or take alternative action.
If the contractor LRE and the Agency EAC start to differ significantly then cost analysts on both sides need to be engaged to find out what is causing the difference between estimates. It's not just a question of who's right and who's wrong; the information and assumptions being used by the two groups need to be shared and examined.
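One common way an Agency analyst develops an independent EAC is the CPI-based formula sketched below, which assumes remaining work will be performed at the historical cost efficiency. The numbers are hypothetical, and analysts typically weigh several such formulas rather than relying on one:

```python
def independent_eac(bac, ev, ac):
    """CPI-based estimate at complete: actual costs to date plus the
    remaining budgeted work (BAC - EV) priced at the historical CPI."""
    cpi = ev / ac
    return ac + (bac - ev) / cpi

# Hypothetical contract: $100M budget, $40M earned so far at a cost of $50M.
eac = independent_eac(bac=100.0, ev=40.0, ac=50.0)
print(eac)  # CPI is 0.8, so the remaining $60M of work projects to $75M, for an EAC of $125M
```

If the contractor's LRE were, say, $105M against this $125M projection, that $20M gap is exactly the kind of difference the two sides' analysts need to reconcile.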
The contractor may assert that poor EV metrics were caused by spending more time in the preliminary design phase due to changes in the requirements, but that now that the requirements are stable and preliminary design is complete, the rest of the effort will proceed “as planned” (i.e., CPI from now on will be 1.0). If all of that is true, then the contractor's view may have some merit. However, if those changes will impact the rest of the detailed design, build, integration, and testing, then the impact to downstream activities may mean that the program has no chance of completing as originally planned and really needs to be restructured. Only transparency and sharing of information—good and bad—between the contractor and the government will determine which of these scenarios is more likely to be correct.
Control Account Performance
Digging a little deeper into the EVM data, how are the individual work packages doing? Summary data can mask developing problems, and the more granular data provide insight into root causes. Exhibit 3 shows what a plot of CPI for all open or completed work packages, sorted by CPI, might look like. Ideally this would be a flat line (all work packages at a CPI of 1.0), but reality is seldom that clean. There will typically be some work packages (shown on the right) that do better than planned, while others on the left show significant over-run; in between are the work packages that are essentially “on plan.” If a control account has several work packages in the right-hand region that are both small in size and nearing completion, they may be masking poor performance in work packages that are much larger and just getting started; the program could be about to run off a proverbial cliff. Similarities and differences between over- and under-performing work packages should be examined for root causes. What attributes do the under-performing work packages share that the better performers do not? Do they all come from one organization, involve the same functional discipline, fall under the same government oversight, have cost estimates prepared by the same group, draw on the same workforce, or work the same shift?
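The masking pattern described above can be screened for mechanically. This sketch sorts work packages by CPI, as in Exhibit 3, and flags the case where over-performers are small and nearly finished while under-performers are large and barely started; the data and thresholds are illustrative only:

```python
def cpi(wp):
    return wp["ev"] / wp["ac"]

def masking_risk(packages):
    """True if under-performing, barely-started packages hold more budget
    than the over-performing, nearly-complete ones propping up the summary."""
    good = [wp for wp in packages
            if cpi(wp) > 1.0 and wp["ev"] / wp["budget"] > 0.9]  # small wins, almost done
    bad = [wp for wp in packages
           if cpi(wp) < 1.0 and wp["ev"] / wp["budget"] < 0.1]   # big losses, just started
    return sum(wp["budget"] for wp in bad) > sum(wp["budget"] for wp in good)

# Illustrative data ($M): one small finished winner, one large early loser.
packages = [
    {"name": "WP-1", "budget": 5.0,  "ev": 4.8, "ac": 4.0},  # CPI 1.2, 96% earned
    {"name": "WP-2", "budget": 60.0, "ev": 3.0, "ac": 4.5},  # CPI 0.67, 5% earned
]
packages.sort(key=cpi)         # worst CPI on the left, as in Exhibit 3
print(masking_risk(packages))  # True: $60M of early trouble hides behind a $5M win
```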
Both contractor CAMs and corresponding Agency IPT leaders should share a complete and common understanding of the work content of the control accounts for which they are responsible. When CAMs brief their accounts, they should demonstrate an understanding of what the EVM data say about their control accounts and have good explanations for any variances. They should also be looking ahead for uncertainties that may impact future performance and have plans to mitigate them. One of the most effective metrics to track during program reviews is the TCPI(LRE): the CPI the control account must achieve for the remainder of the work in order to meet the latest revised estimate (LRE). When a control account starts to fall behind and the TCPI(LRE) starts to rise, even the most optimistic manager will recognize the reality of the situation.
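The to-complete performance index against the LRE has a simple closed form: the budgeted work remaining divided by the money remaining under the LRE. A sketch with hypothetical numbers:

```python
def tcpi_lre(bac, ev, ac, lre):
    """CPI the remaining work must achieve for the account to finish at the LRE."""
    return (bac - ev) / (lre - ac)

# Hypothetical account: $10M budget, $4M earned at a cost of $5M, LRE of $11M.
t = tcpi_lre(bac=10.0, ev=4.0, ac=5.0, lre=11.0)
print(t)  # 1.0: $6M of budgeted work must be done for $6M
```

An account that has been running at a CPI of 0.8 but needs 1.0 for the rest of the work to hit its LRE is a hard number for even an optimistic CAM to argue with; as the required TCPI climbs further above demonstrated performance, the LRE loses credibility.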
In the earlier example, in which the contractor asserted that over-runs caused by changing requirements were coming to an end and future performance would be “on plan,” that assertion should be supported by the data in a plot like Exhibit 3. The work packages causing the over-runs should sit primarily in the region to the left, be nearly complete in terms of the “value earned,” and differ significantly in work content from the work packages yet to be opened. If the under-performing work packages were merely under-scoped or under-estimated, then more unpleasant surprises can be expected until the remainder of the work packages are thoroughly reviewed.
What Does The Future Hold?
The National Defense Authorization Act for 2009, as amended by the WSARA, requires that the Department of Defense submit an annual report to Congress on the implementation of earned value management. In the initial report, which was submitted to Congress in September 2009, the future of EVM in the DoD seems assured:
“After examining the topics identified in Section 887, the Department has concluded that the DoD EVM process is the best tool available to the program management community and senior leaders for effectively managing large, complex acquisitions. No other alternative exists that can match the benefits of EVM.” (Assad, September 1 2009, p iii)
It is clear from the increasingly pointed statutory and regulatory guidance that greater emphasis on earned value management is forthcoming, if not already here; however, the quality movement of the late 1980s provides a cautionary tale.
In the preface to their book “Lean Thinking,” the follow-up to their landmark study of quality and lean manufacturing, “The Machine That Changed the World,” authors Womack and Jones highlight the limitations of focusing on techniques rather than on a more sophisticated, holistic understanding:
“…we met many managers who had drowned in techniques as they tried to implement isolated bits of a lean system without understanding the whole.” (Womack and Jones, 1996, p 10)
“…it was clear that the quality movement of the 1980s had gone badly wrong. Quality Assurance had become the classic…nagging nanny, checking up on production employees to make sure they hadn't taken shortcuts…This, of course, created a very negative, reactive reputation for Quality Assurance.” (Womack and Jones, 1996, p 181)
“What you really need is value-stream/product-based costing … so that all participants in a value stream can see clearly whether their collective efforts are adding more cost than value, or the reverse. […] Then ask the simple question: What kind of management accounting system would cause our product team leaders to always do the right (lean) thing?” (Womack and Jones, 1996, p 262)
In March of 2010, the House Armed Services Committee Panel on Defense Acquisition Reform concluded in its findings and recommendations that, while there are issues with contractor implementation and data quality, its primary concern is that an EVMS measures only the performance of the contractor, not of the program.
“Furthermore, EVMS would generate no negative information about a contractor performing on cost, on schedule, and meeting all contract requirements even if (or perhaps especially if) the contract in question had a wildly inflated price or a schedule or set of contract requirements that utterly failed to meet warfighter needs.” (Andrews and Conaway, 23 March 2010, p 27).
This is an accurate, if somewhat troubling, portrayal of the information that data from an EVMS should be expected to provide. EVMS uses performance factors based on performance relative to a baseline, and applies those factors to future work. It can show if a program is “on cost, on schedule, and meeting all contract requirements” or, if performance is not as planned, show where problems are occurring and provide high confidence estimates of the cost on completion early enough to take meaningful action. Price and schedule estimates are reviewed when creating program independent cost estimates, during multiple program reviews prior to each milestone decision, and during the integrated baseline review (IBR), so if they are “wildly inflated,” that is a problem with the overall management of the program, not a limitation of an EVMS.
The joint capabilities integration and development system (JCIDS) process is part of a framework of acquisition decision support processes and is specifically designed to “…ensure capabilities required by the warfighter are identified with their associated operational performance criteria…” (CJCSI 3170.01G, 1 March 2009, p A-2). If some key performance parameters (KPPs) are driving cost growth, the Joint Requirements Oversight Council (JROC) will re-evaluate them. “The JROC will assess whether the cost growth is the result of the validated KPPs and if so whether or not an adjustment to the KPPs is appropriate to mitigate the cost growth.” (CJCSI 3170.01G, 1 March 2009, p B-3).
Therefore, if contract requirements “…utterly failed to meet warfighter needs,” then this is a problem with the larger decision-making process, not a limitation of an EVMS.
At the time of this writing, legislation is pending before the U.S. Congress (Library of Congress H.R. 5013, 29 April 2010) that further reinforces congressional interest in improving the performance of the acquisition system, with specific requirements intended to improve the use of earned value management. In a Congress with no shortage of partisan disagreements, this bill passed nearly unanimously in the House of Representatives on 28 April, with a vote of 417 to 3. It has now been received by the Senate and referred to the Committee on Armed Services. By the time of the PMI Congress, these changes may have become law.
If an EVMS is expected to provide more information than it was ever designed to do, or is allowed to be used merely as a scoring mechanism, then we will have failed. If, on the other hand, earned value management is properly seen in the broader context as one of many useful techniques for project managers to use with their teams to actively manage programs, then we can be successful in improving our ability to not only deliver “on cost, on schedule, and achieving all performance requirements,” but also to improve our competitive advantage.
Establishing a baseline of planned contract performance and providing accurate and timely EVM information to program managers and their teams will allow them to better understand program status and anticipate where management attention is needed. Gathering and maintaining this information over time will provide an historical basis for improving the confidence in future estimates, and data upon which to base process improvements. Higher confidence estimates reduce the risk premium, allowing more competitive bids. Improved processes yield even greater efficiency, and automating these improved processes can provide an extreme competitive advantage.
Andrews, R. (Chairman), Conaway, K.M. (Ranking Member), “House Armed Services Committee Panel on Defense Acquisition Reform Findings and Recommendations”, March 23, 2010. Retrieved from http://armedservices.house.gov/ on 11 July 2010.
Assad, S.D. (September 1, 2009), “Earned Value Management: Performance, Oversight, and Governance, Report to Congress In Response to Section 887 of the Fiscal Year 2009 National Defense Authorization Act”, Department of Defense, Office of the Deputy Under Secretary of Defense, Acquisition and Technology (A&T). Retrieved from https://acc.dau.mil/CommunityBrowser.aspx?id=329976 on 14 July 2008.
Association for Project Management (APM) and National Defense Industrial Association (NDIA) (2004, July). Earned value management systems standard equivalence agreement.
Chairman of the Joint Chiefs of Staff Instruction (CJCSI) 3170.01G, (1 March 2009). “Joint Capabilities Integration and Development System”, Retrieved from http://www.dtic.mil/cjcs_directives/cdata/unlimit/3170_01.pdf on 14 July 2010
Defense Federal Acquisition Regulation Supplement (DFARS) Subpart 234.2 “Earned Value Management System”, April 23 2008, Downloaded from http://farsite.hill.af.mil/reghtml/regs/far2afmcfars/fardfars/dfars/dfars234.htm on 18 July 2010
Federal Register/ Vol.71, No. 128, (July 5, 2006) “Federal Acquisition Regulation; FAR Case 2004–019, Earned Value Management System (EVMS)”, Downloaded from http://regulations.justia.com/view/49004/ on 18 July 2010
Fleming, Q.W. (1992). Cost/schedule control/system/criteria: The management guide to C/SCSC–Revised Edition, Irwin: Chicago – London – Singapore.
Fleming, Q.W., & Koppelman, J. (2000). Earned value project management, Newtown Square, PA: Project Management Institute.
Krieg, K.J. (USD AT&L) (6 July 2007). Memorandum to LTGEN Lawrence P. Farrell, Jr., USAF (Ret) (NDIA), Washington, DC.
GAO, “DOD SYSTEMS MODERNIZATION: Planned Investment in the Naval Tactical Command Support System Needs to Be Reassessed”, GAO-06-215 (Washington, D.C. December 2005)
National Defense Industrial Association (NDIA) Program Management Systems Committee (PMSC) Risk Management Working Group (2005), Integrating risk management with earned value management, retrieved on 21 June 2008 from https://acc.dau.mil/CommunityBrowser.aspx?id=17784&lang=en-US.
NDIA PMSC. (n.d.). Retrieved on 5 July 2010 from: http://www.ndia.org/Divisions/Divisions/Procurement/Pages/Program_Management_Systems_Committee.aspx
OUSD (AT&L) MEMORANDUM (Jul 03 2007), “Use of Earned Value Management (EVM) in the Department of Defense”, United States Department of Defense, Office of the Under Secretary of Defense
The Library of Congress H.R. 5013, “Implementing Management for Performance and Related Reforms to Obtain Value in Every Acquisition Act of 2010”, April 29, 2010 downloaded from http://www.thomas.gov/cgi-bin/bdquery/z?d111:HR05013:@@@R on 18 July 2010.
Womack, J.P. and Jones, D.T. (1996) “Lean Thinking — Banish Waste and Create Wealth in Your Corporation”, Simon and Schuster: New York
© 2010, Bill Shepherd
Originally published as a part of 2010 PMI Global Congress Proceedings – Washington, DC