Grow your own cost tracking system


Given the focus on the “holy trinity” of Cost, Schedule, and Quality in the delivery of corporate capital and maintenance projects, Program and Project Managers naturally want to capture and forecast project costs accurately and roll them up consistently to the program and enterprise levels. Some PMOs are even fortunate enough to have dedicated Project Management Information Systems (PMIS) that support their information-gathering and reporting needs. But what to do if there is no automated system available for capturing “actual” and “committed” costs and forecasting EAC in a consistent manner across all projects and programs, let alone at the enterprise level? The unfortunate result is always the same: panicked Project Managers divert scarce assets, each invents his or her own cost control and tracking system, and the program- and enterprise-level data that trickle up to Management show no consistency in report design, data structure, or efficacy of results.

One acceptable alternative to this chaos of disinformation may be to design, construct, and implement your own “homegrown” interim cost reporting and tracking system. Note the word “interim,” because such a system should truly be temporary. It should be designed and implemented as a “stop-gap” measure, to be used only for as long as it takes to convert (or modify) the Corporate accounting system, on an urgent basis, into a PM-compatible format with a fully implemented deliverables-based Work Breakdown Structure (WBS) that integrates estimating, scheduling, and cost tracking into a cohesive whole and imposes an integrated data structure and consistent reporting format across the entire enterprise. This paper tracks the development of one such “interim” system, relatively inexpensive and quickly deployed yet fully automated and “user-friendly,” from conception through to utilization. The asset requirements to design and build such a system? One consultant with many other “irons in the fire,” a very talented and knowledgeable part-time programmer (and “super Mom”) working out of her home, plus some hard agreements concerning user requirements and expectations, locked down right from the start. Yet there were some mistakes…

How It All Got Started

This program was undertaken because the author-consultant was faced with Client PMO cost systems that could provide the PMs nothing more than once-monthly “actuals” from the Corporate accounting system. Further, these “actual” costs were broken down by function, not by “deliverable” as they would be in an accounting system organized around a standardized WBS. Although captured in the Corporate accounting system, no “live” committed costs were being provided to the PMs. To cover this shortcoming, each PM ran his or her own queries against the various Client database systems and built an individual version of an Excel “commitment ledger.” Additional rudimentary and incomplete “identified but not committed” costs were inconsistently captured manually from various Corporate systems and/or by telephone or FAX from other Project Team members. This mass of sometimes conflicting data was then manually collated and manipulated by the individual PMs or their assistants (at the program level) and reported to Management with little or no consistency in either content or reporting format between individual projects or across programs. Compiling an Enterprise view of cost status from such individual area reports required a great deal of monthly repetition, interpretation of conflicting data, and considerable guesswork to organize and collate the widely varying and inconsistent inputs from the various PMs. And the compiled Enterprise reports never matched up to the “corporate view” provided independently by the business analysts.

The author, tasked to assist the Client Project Management Office (PMO) in implementing “best in class” practices, and being told that a Corporate accounting overhaul was still years away, used his prior experience and training in IT and MIS to design a system for consistent data capture and manipulation. The design was charted in a process flow diagram (Exhibit 1) that was then converted into a Statement of Work (paraphrased for brevity in the “Details of the Design” section of this paper). This SOW was used by the Programmer to design and construct a series of standard database queries into the Client's corporate accounting systems and to “map” the cost transaction results to the User's requirements. The result was the construction and rollout of a “stand-alone” database of time-phased project estimates (cashflows) and actual and committed costs, linked to project-specific Authorizations For Expenditure (AFE) and further expandable into Program- and Enterprise-level data structures. By “tagging” data records and logically linking the individual AFEs into whatever hierarchical structure the situation required, the data could be further sorted and/or “bundled” for display at whatever level (by Program, by State, by area, by individual Manager, by entire enterprise, etc.) the User might require.

Exhibit 1


Developing User Requirements

Virtually every publication or source available on the subject today makes it clear that it is mandatory to “lock in” user requirements and expectations early in the design process. We did exactly that, just not for all the user requirements, and this basic deficiency rose up to hurt us, big time, later on. With the potential divestiture of a part of the business looming, the decision was made to pursue the system(s) along two semi-independent paths: one system for the business group that would remain with the current Corporate entity (which we shall call “Delta”) and another for the business group that was expected to be divested (which we shall call “Tango”). The Directors of each Business Unit named their respective Managers of Project Services to champion the dual-pronged efforts at developing the two “interim cost tool” versions.

Design, engineering, writing of the software, construction of the “stand-alone” databases, and initial testing went as well as could be expected. Where we ran into trouble was the assumption that the Project Engineers (major users) on the “Delta” path could also speak for the Project Engineers on the divesting (“Tango”) side; to our intense dismay, this was simply not true. In the meantime, and in the absence of an effective Business Unit cost-gathering system, each “Tango” project engineer had developed his or her own set of database queries and had become absolutely convinced that their own way was the “only” way. They simply refused to replace the results of their own commendable and independent efforts (i.e., to utilize the new system) when it was rolled out, no matter how many times it was shown to be at least as accurate, and certainly more consistent, than their own query and display systems. In short, “It wasn't invented here, so we aren't going to use it.”

Exhibit 2


The end result was that the new “interim” system was fully implemented within the “Delta” business group and is still being used. The “Tango” business group eventually put the fully-functional and tested “Interim Cost Tool” system into “standby” and is even now reinventing the wheel by designing and engineering a very similar PM cost tracking system, but this time with the active engagement of the Project Engineers so that User involvement is unquestioned.

Why No WBS?

As Project Managers, the first thing we should look for from corporate accounting is a system that allows for cost coding by Work Breakdown Structure (WBS) deliverables, and not by corporate functions. Only when Project costs are broken down by deliverables can we consistently understand how the costs are incurred and how the cost of a project element (deliverable) may vary from one project to the same element in the next. That is why one of the first things we did in this consulting project was to design a fully integrated WBS for the Client. That the WBS was available but not implemented was largely a result of pending corporate decisions regarding the potential spin-off of certain business units and the lack of firm direction regarding what business systems might be employed after divestiture. Since those decisions were not yet finalized, and in fact were changed or delayed several times during the course of this project, it was impossible to determine with any degree of finality what corporate accounting systems were likely to evolve, and so the cost of restructuring the existing data systems to accommodate a WBS was postponed several times. As a result, we adopted the name “Interim Cost Tool” for this system, which was constructed without the benefit of WBS alignment but with the capability for future WBS emplacement, should the future of the corporate accounting systems be determined.

Details of the Design

The intent of the project was to automate data retrieval and standardize its display across all projects, programs, and the enterprise, using queries into the Corporate business system(s) database(s) and then imposing standard formats on the query results so that project cost data could be consistently displayed by individual project (AFE). By “linking” AFEs into summary “programs” (a program being a group of AFEs/projects centered on a common purpose and fiscal appropriation) and then summarizing at the enterprise level, the data could be displayed in other useful ways.

Retrieved data was to be broken down (sorted and displayed) by:

•   Past year (totals only)

•   Current year (every two weeks or by month, at user option).

Resulting data was to be exported from our “stand-alone” database system on call and converted into a format that could easily be displayed in both graphical and tabular (i.e., “flat file”) formats, as shown in the attached sample graphical report entitled “Port Jervis Pumping Upgrades” (Exhibit 2).

It was decided that the output data was to be sorted into three sections (“Estimated,” “Actual,” and “Committed”) and then graphed as shown in the attached “Port Jervis” example. In addition, it was specified that the available “Committed” data electronically captured from the Client business systems would be combined (merged) with a set of “Identified but not captured” data to be manually entered in the same format by the Operator. A combined table of “Total Committed” (Client business system data plus manually-entered “Identified but not captured” data) would result in the best possible fiscal picture of the project-to-date.

And this brings up a basic point. We at PMCC are totally convinced that AC (actual cost) alone doesn't give enough warning of where a project is heading. For years, we have kept our own Lotus or Excel “commitment ledgers” that manually tracked each individual commitment from the time it was first conceived through the moment it cleared the books and truly became an “actual cost.” So we coined the term “AC*” to represent “actual plus committed costs,” and we routinely calculate and display this combined figure as well as the more common “AC.”
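The AC* idea can be sketched in a few lines. This is an illustrative example only (the actual tool was built in Access/VBA); the ledger fields and amounts are invented:

```python
# Illustrative sketch (not the original VBA): computing AC and AC*
# ("actual plus committed") from a simple commitment ledger.
# The field names and amounts below are hypothetical.

def cost_summary(ledger):
    """Sum actual and committed dollars from a list of transactions."""
    ac = sum(t["amount"] for t in ledger if t["status"] == "actual")
    committed = sum(t["amount"] for t in ledger if t["status"] == "committed")
    return {"AC": ac, "AC*": ac + committed}

ledger = [
    {"afe": "AFE-101", "amount": 12000.0, "status": "actual"},     # invoice paid
    {"afe": "AFE-101", "amount": 8000.0,  "status": "committed"},  # open PO
    {"afe": "AFE-101", "amount": 5000.0,  "status": "committed"},  # open contract
]
print(cost_summary(ledger))  # {'AC': 12000.0, 'AC*': 25000.0}
```

The point of the combined figure is visible even in this toy case: AC alone reports $12,000 spent, while AC* shows that $25,000 of the budget is already beyond the PM's discretion.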

The basic philosophy for gathering “committed costs” for material in this program was to query the Client database that houses material information (not all of which is “tagged” by AFE until the material is withdrawn from the warehouse). The intent was to pull from that database and show that $XXX of material is committed to project Y. As a piece of material is paid for, it is eliminated from the “committed” database. The logic, therefore, was that as the dollar value of material COMMITTED decreases, the material ACTUAL increases proportionally.
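That committed-to-actual flow for material can be sketched as follows, with hypothetical item names (the real system worked through database queries, not in-memory dictionaries):

```python
# Hypothetical sketch of the material logic described above: when a
# material item is paid for, it leaves the "committed" bucket and the
# same dollar value appears in "actual", so AC* stays constant.

def pay_material(committed, actual, item_id):
    """Move one material item from the committed ledger to actuals."""
    item = committed.pop(item_id)
    actual[item_id] = item
    return committed, actual

committed = {"valve-7": 3500.0, "pipe-12": 9000.0}
actual = {}
pay_material(committed, actual, "valve-7")
# committed total fell by 3500 and actual rose by 3500; AC* is unchanged
print(sum(committed.values()), sum(actual.values()))  # 9000.0 3500.0
```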

With regard to contracts, the same data-capture philosophy applied, although it was made even more difficult because some Client contracts were found to span more than a single project (AFE). In those cases, it became necessary to intervene manually to modify Not-To-Exceed (NTE) dollar values to reflect the funds actually committed to the specific project being reported on.

The overall intent was that, on demand and using a simple Desktop icon, our “stand-alone” database would be queried by Access and the results output into an Excel data sheet and automatically graphed (PV, AC, AC*, EAC, EV, etc.) in conformance with “best-in-class” project management practices. Both the “baseline” estimate and the most recent “Current Estimate” would be retrieved and graphed for comparison and analysis. The only required manual inputs from the Operator (Project/Program Manager, Analyst, etc.) when calling for a “standard” cost status report would therefore be:

•   Provision of a time-phased estimate (cashflow) as a one-time entry

•   Selection of the report parameters:

•   Project (by name or by AFE)

•   Program (group of projects)

•   Enterprise.

•   Entry of any “identified but not committed” cost data that could not be electronically captured

•   Correction of “Not To Exceed” (NTE) dollar values for certain contract transactions lacking that data

•   Entry of EV (Still pending is the link for automatic capture of this parameter from the Scheduling system).

In addition to the Project (AFE), Program, and Enterprise-level sorts and displays, the users decided that data stored in the “data warehouse” (already identified by AFE) should be further capable of being classified, sorted, and summarized by the following:

•   Project Engineer

•   Program Manager

•   Program number

•   Program name

•   Area

•   State

•   Legal entity.

Since these fields were not already contained within the data retrieved from the corporate cost system, a separate database had to be built to map these fields to the AFE (project authorization document number), and manual “tagging” was required to link this data to the list of AFEs.
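The tagging mechanism amounts to a lookup table keyed by AFE, joined to the cost records at report time. A sketch (all names and amounts are invented):

```python
# Sketch of the manual "tagging" table described above: a separate
# lookup keyed by AFE that supplies the sort fields (engineer, program,
# state, ...) absent from the corporate cost records. Names are invented.

tags = {
    "AFE-101": {"engineer": "Smith", "program": "Pumping Upgrades", "state": "NY"},
    "AFE-102": {"engineer": "Jones", "program": "Pumping Upgrades", "state": "PA"},
}

costs = [
    {"afe": "AFE-101", "amount": 12000.0},
    {"afe": "AFE-102", "amount": 4000.0},
    {"afe": "AFE-101", "amount": 3000.0},
]

def rollup(costs, tags, field):
    """Sum costs grouped by a tagged attribute (e.g., 'state')."""
    totals = {}
    for rec in costs:
        key = tags[rec["afe"]][field]
        totals[key] = totals.get(key, 0.0) + rec["amount"]
    return totals

print(rollup(costs, tags, "state"))  # {'NY': 15000.0, 'PA': 4000.0}
```

The same cost records can thus be rolled up by engineer, program, state, or legal entity simply by changing the grouping field, which is exactly what the separate tagging database enabled.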

It was decided that the data-extraction process from the Client business database(s) could be preprogrammed (queries written and stored for later use) but that the queries should be manually initiated to take advantage of perceived lulls in the host LAN/WAN and server activity. It was later found that this retrieval and storage operation to extract and update current “Actual Costs” (“AC”) and “Committed Costs,” whether run biweekly or monthly, would require from two to four hours. Fortunately, this became a once-monthly operation and required little more than initiation of the preformatted queries by the supporting analyst. For perspective, we are describing a database that was intended to encompass about 1,000 AFEs (individual projects) and required about 60 megabytes of server storage for transaction data running back, in some cases, up to three years.

The cost data was pulled from the Corporate “Financial Data Warehouse,” an Oracle 7.3 database residing on a SQL server. This file contained the current-year actuals and summarized data, including journal information and the transaction date needed to group records biweekly. It also contained a transaction number that made each record unique and allowed us to ensure data integrity by counting records.

Our update database contained links to this table (which was renamed “Current Detail”). The programmer also wrote a function (stored in the module called “WhichYear”) that created an SQL statement and then executed it. Depending on the year selected, a different function was called. The tables with past-year data were called P1 Detail (for Prior year one), P2 Detail, etc. The Operator (in this case the Analyst updating the data) was given the ability to pull in one project or all active projects. Due to the size of the source table, only one month for one project was normally pulled in at a time. This means that if the user selects all projects for an entire year, then for 600 projects this statement would run 7,200 times (600 projects × 12 months). This was not a problem for the “Delta” customers, who had only 60 active projects, but the “Tango” customers (with about 600 active projects at the time) were advised to accomplish this update in partial steps.
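The year-to-table mapping and the one-project, one-month query pattern can be sketched as follows. The table and column names here are guesses at what the Access queries looked like, not the actual schema:

```python
# Rough sketch of the per-year query generation described above. The
# actual tool was Access/VBA; table and column names are assumptions.

def detail_table(year, current_year=2002):
    """Map a fiscal year to its source table (Current Detail, P1 Detail, ...)."""
    if year == current_year:
        return "Current Detail"
    return f"P{current_year - year} Detail"

def build_queries(projects, year, months=range(1, 13)):
    """One SELECT per project per month, as the tool issued them."""
    table = detail_table(year)
    return [
        f"SELECT * FROM [{table}] WHERE afe = '{p}' AND month = {m}"
        for p in projects for m in months
    ]

# 600 projects x 12 months = 7,200 statements, matching the text
queries = build_queries([f"AFE-{i}" for i in range(600)], 2001)
print(len(queries))  # 7200
```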

After all records were imported, between five and seven update queries were normally required to “tag” the records according to the needs of the customer. The final step was to take the detail data and load it into a summary table to enhance performance when running reports.
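The summary-table step is ordinary aggregation: collapse the detail records to one row per project, period, and category so that reports need not rescan the full detail table. A sketch with invented field names:

```python
# Illustrative sketch of the summary-table step described above.
# Field names are hypothetical, not the tool's actual schema.

from collections import defaultdict

def summarize(detail):
    """Collapse detail rows to one total per (project, period, category)."""
    summary = defaultdict(float)
    for rec in detail:
        key = (rec["afe"], rec["period"], rec["category"])
        summary[key] += rec["amount"]
    return dict(summary)

detail = [
    {"afe": "AFE-101", "period": "2002-04", "category": "Material", "amount": 500.0},
    {"afe": "AFE-101", "period": "2002-04", "category": "Material", "amount": 250.0},
    {"afe": "AFE-101", "period": "2002-04", "category": "Payroll",  "amount": 900.0},
]
print(summarize(detail))
# {('AFE-101', '2002-04', 'Material'): 750.0, ('AFE-101', '2002-04', 'Payroll'): 900.0}
```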

Our programmer's prior experience indicated that most of her customers prefer their reports/graphs in MS Excel because it is a program that they generally know and understand and can attach to an email. Accordingly, and on call by the User/Operator (PM or financial analyst), the Interim Cost Tool exports the results of a query that summarizes the data in the cost categories into an Excel spreadsheet. A series of functions in the modules “Excel Report” or “Excel Actuals” (depending on the customer selection) are then called. These functions look for an existing file with the same name and ask the user whether he or she wishes to overwrite it (to protect archived reports that may have been given the same name). They then copy and format the data into a separate spreadsheet and create a chart. All these functions were written by our “super Mom” programmer/analyst in Visual Basic within MS Access 97.
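The overwrite guard can be illustrated in a few lines. The original functions were VBA in Access 97 writing to Excel; this Python/CSV version only mirrors the check-then-write flow, and the file names are made up:

```python
# Sketch of the overwrite guard described above (the original was VBA
# in Access 97 exporting to Excel; this CSV version mirrors the flow).

import csv
from pathlib import Path

def export_report(rows, path, confirm=lambda p: True):
    """Write report rows to CSV, asking before clobbering an existing file."""
    path = Path(path)
    if path.exists() and not confirm(path):
        return False  # user declined: the archived report is preserved
    with path.open("w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["category", "amount"])
        writer.writerows(rows)
    return True

# first write succeeds; a second write only proceeds if confirm() says yes
export_report([["Material", 750.0], ["Payroll", 900.0]], "port_jervis_demo.csv")
```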

Here's how our “super Mom” programmer structured the data (These server path names would be for the “Tango” business unit, with similar names but a different prefix used for the “Delta” business unit):

•   TcostTracking.mdb: This database is used by operators (analysts, project engineers, etc.) to execute reports, enter estimates, and analyze project “actual costs” (AC). It stores the projects being tracked, the estimates for those projects, and information tables such as activity codes, organizations, and other data necessary to produce reports. In addition to the data, this database stores all the forms and program modules that relate to the execution of reports and screens.

•   Tdata.mdb: This stores only one table called Data1. This table contains the detailed information pulled from the data warehouse.

•   Note 1: To prevent data corruption, Users were allowed “read only” access to this file.

•   Note 2: When this table is linked to the TcostTracking database, it is renamed to Data Detail.

•   Tsummary.mdb: This stores the summary of the Data1 table in the Tdata.mdb file, summarized by Project Code, Responsibility, Accounting Period, Fiscal Year, and Main and Subcategories. It also stores the Outstanding Material and Contract Commitment tables.

•   Tupdate.mdb: This is the database used to import actual costs and the outstanding contract and material commitments from the data warehouse. It is linked to Tdata, Tsummary, and the tables in the data warehouse.

•   File Folder TCReports: This is the default location (file name/path) where the reports generated from TcostTracking are stored.

The lack of a solid EV basis led to some shortcomings in the usefulness of this interim cost reporting and forecasting mechanism. The “bottom line” is that we always want to know where our project is going to end up financially (Estimate At Completion, or “EAC”). The normal way to calculate EAC is to use the formulae:

•   EAC = AC + ETC, where

•   ETC = BAC - EV; substitution therefore yields

•   Simple EAC = AC + (BAC - EV), and

•   Statistical EAC = AC + [(BAC - EV)/CPI]. But

•   CPI = EV/AC, so further substitution yields:

•   Statistical EAC = AC + [(BAC - EV)/(EV/AC)].

As indicated by the formulae above, the “bottom line” is that we simply cannot compute an accurate forecast of Estimate At Completion (EAC) without some non-subjective idea of “how we are doing,” or EV (what the PMBOK® Guide calls “value of work completed.”)
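A worked example with invented numbers makes the difference between the two forecasts concrete: a project that is 32% earned but 40% spent carries its overrun into the statistical EAC.

```python
# Worked example of the EAC formulae above; all dollar values are invented.
BAC = 100000.0   # Budget At Completion
AC = 40000.0     # Actual Cost to date
EV = 32000.0     # Earned Value to date

simple_eac = AC + (BAC - EV)             # assumes remaining work proceeds on budget
cpi = EV / AC                            # Cost Performance Index = 0.8
statistical_eac = AC + (BAC - EV) / cpi  # scales remaining work by past performance

print(round(simple_eac, 2))       # 108000.0
print(round(statistical_eac, 2))  # 125000.0
```

With a CPI of 0.8, the statistical forecast predicts a $25,000 overrun that the simple formula largely hides, which is precisely why a credible EV figure matters so much.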

Like the other values (AC, PV, etc.), we wanted to plot EV in terms of the dollar value of work earned, but we had no consistent source of EV data. To get around this shortcoming, we worked the problem backwards and compromised by using a “calculated” EV($) = EV(%) × final PV($), where EV(%) was generated from field observations. This method produced an acceptable curve for both the statistical EAC and for EV (refer to Exhibit 2). However, this method of determining EV(%) is potentially inconsistent and highly subjective. Since use of this subjective Earned Value might be counterproductive and actually erroneous, the choice of whether to calculate and chart the “Earned Value” (EV) function was left to the individual User (PM/Analyst).
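The calculated-EV compromise is a one-line formula; the percent complete comes from field observation rather than from a scheduling system. The values below are illustrative only:

```python
# Sketch of the "calculated EV" workaround described above: a field-
# observed percent complete applied to the final PV dollar value.

def calculated_ev(ev_percent, final_pv):
    """EV($) = EV(%) x final PV($)."""
    return (ev_percent / 100.0) * final_pv

# a field walk-down judges the job 32% complete against a $100,000 final PV
print(round(calculated_ev(32.0, 100000.0), 2))  # 32000.0
```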

As a further comment, it was initially hoped to be able to link EV from the User scheduling system (MS Project 2000) directly to the “Interim Cost Tool” but this ability has not yet materialized.

We also designed the system so that the original “baseline” estimate was always plotted, as well as the most recent update of PV. All calculations using PV were based on the most recent “present estimate” (manual entry).

PV, or “time-phased cashflow,” data was protected so that any change prompts the User to enter an explanation of why he or she is making the change, establishing an “audit trail.” This screen is used to enter Change Order information and data relevant to rebaselining the project. In addition to establishing the “audit trail,” provision is made for version tracking (if desired).
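The audit-trail rule, that a PV change cannot be saved without an explanation and never overwrites the prior version, might be sketched like this (the class and field names are ours, not the tool's):

```python
# Hypothetical sketch of the PV audit trail described above: every change
# to the time-phased estimate must carry an explanation and is appended,
# never overwritten, so prior versions remain recoverable.

import datetime

class CashflowHistory:
    def __init__(self, baseline):
        self.versions = [{"pv": baseline, "reason": "baseline",
                          "when": datetime.date(2002, 1, 1)}]

    def revise(self, new_pv, reason, when):
        if not reason:
            raise ValueError("an explanation is required for the audit trail")
        self.versions.append({"pv": new_pv, "reason": reason, "when": when})

    @property
    def current(self):
        return self.versions[-1]["pv"]

h = CashflowHistory([10000.0, 20000.0, 30000.0])
h.revise([10000.0, 25000.0, 30000.0], "CO-17 scope add", datetime.date(2002, 6, 1))
print(h.current)        # [10000.0, 25000.0, 30000.0]
print(len(h.versions))  # 2
```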

A variety of other preformatted reports were also designed and implemented to allow for such routine tasks as database maintenance, viewing project-specific material and labor commitments, etc.

System Operation

It is presumed that the User/Operator will have already entered PV data (the time-phased cash flow estimate) as a basis against which to measure project progress. This was a one-time effort for each AFE and required amendment only if the Project (AFE) was rebaselined.

When a cost report is required from the “stand-alone” database, the Interim Cost Tool is designed such that the Operator (usually defined as a Project Manager/Project Engineer, financial analyst, or the equivalent) enters the system (via desktop icon) with a request to “generate reports.” The operator interface screen allows “Project or Portfolio Specific Information” to be entered, providing for sorting and running of reports by Project/AFE, Program (grouped AFEs), State, area, legal entity, or entire Business Unit (defined as the “enterprise” level).

The initiation screen was purposely designed to allow the User to select whether or not to report the Earned Value or “EV” for each report (rationale provided earlier). “Cash flow” (the “time-phased project estimate,” or PV), “Actual Cost” (“AC”), and “Actuals plus Committed” (previously defined as “AC*”) are always reported and graphed, if transaction data is available.

If the decision was to report/plot EV, the next step in the process would be to prompt the User/Operator to enter “Earned Value” (EV%) for the same cost categories as corresponding entries for “PV.”

The User/Operator is then required to select the mode he or she wants the data displayed:

•   “Responsibility” view (no taxes, overhead or AFUDC)

•   “Total project” view and select which of the following are to be included:

•   Taxes

•   AFUDC, and/or

•   Overheads

Note: “AFUDC” means “Allowance for Funds Used During Construction” (accrued prior to project completion and therefore prior to the start of the depreciation process).

Clicking on “Go” starts the query/report process and the “interim cost tool” automatically runs its queries into the stand-alone database, outputs into Excel, calculates the various values and plots the requisite graphs.

The User/Operator next would review the data and manually enter those estimated (or known) costs that are “Identified but not committed” to this project(s). Note: “Committed” in this context is limited to mean “capable of being electronically captured from the Client's accounting system.” This “identified but not committed” information will have been gathered offline and may include requisitions and/or contracts that have not yet been approved and converted to electronic Purchase Orders.

Using a similar manual process, the User/Operator then corrects any contract data (contracts that span multiple AFEs) and enters NTE data specific to the current project being reported on. This category might include some forms of “blanket” construction contracts that are not wholly specific to the project in question.

The results of the above are then entered into the Excel data sheet and added to “actual costs” (AC) to produce a revised “AC*.” This combined figure represents the best available estimate of the project's cost status at this point, reflecting all known committed (i.e., electronically captured) funds plus all the “identified but not captured” funds no longer available to the Project Manager's discretion.

At this point, five processed data files would normally be displayed:

•   PV, or “time-phased cashflow estimate”

•   AC, or “total actual costs”

•   Statistical (or simple) EAC

•   AC* or “total of actuals plus committed/identified”

•   EV or “work performed” (if called).

It would be expected that the User/Operator would first look at the entire program (combination of AFEs), determine that there was a problem, and then “drill down” to the individual project (AFE). Using the system's “micro-view” capability, the User can then further drill down within the suspect AFE to the category that might be in question (e.g., Material, Contracted Engineering, or Construction Payroll) and compare actual costs incurred against the estimated costs. We also made provision for viewing detailed material and contract transaction data, in both actual and committed forms.


There are generally many ways to resolve every business problem. The question is how to balance an acceptable level of resource expenditure against the anticipated benefit. This project probably stayed close to the “minimum resource” side of the decision matrix while producing acceptable results for one customer and unacceptable results for the other. But the point is that both versions of the system actually worked well (and still do), and they were designed, engineered, and rolled out with minimum resource utilization.

The “major lessons learned” from this project are probably six in number:

•   We originally wanted to avoid a summary table because it was felt that it would be redundant, take up unnecessary server space and add time to the function of updating data, which was running several hours for the “Tango” customer (600 AFEs). However, we found that creation of a summary table greatly enhanced system performance, especially for those users located on the same server.

•   We should have better estimated the volume of data that would run through the LAN and the impact of the number of users. Users initially encountered very poor response times (some of which were due to the volume of data running over the LAN). However, use of the summary table concept negated many of these initial frustrations.

•   Another problem was the level of detail. “Delta” (with a relatively small database) had requested a greater level of detail, but “Tango” was ambivalent. In hindsight, and due to the volume of data involved, “Tango” would have been better served if we had limited the level of detail displayed, since they had not specifically requested it.

•   Full and uncompromising advance and continuing buy-in by all the users that will actually operate the proposed system is mandatory.

•   Unanimous buy-in by Managers to mandate an effective rollout and then enforce use of the system across the entire business unit is equally as important as total User buy-in.

•   The “bottom line” is that you really can “grow your own” interim cost tracking and forecasting system.

This paper has discussed only the major facets of the design and implementation process, as well as difficulties that were encountered en route. The presentation at the PMI Seminar/Symposium 2002 will include a demonstration of the system's functionality and reports, which is intended to supplement discussion of the written paper.


Willems, Cathy (MC Data Services, Searcy, AR)

This material has been reproduced with the permission of the copyright owner. Unauthorized reproduction of this material is strictly prohibited. For permission to reproduce this material, please contact PMI or any listed author.

Proceedings of the Project Management Institute Annual Seminars & Symposium
October 3–10, 2002 · San Antonio, Texas, USA


