The meaning of the Cobb-Douglas function and its use in applications for services and outsourcing

Characteristics of “Estimation”

Service operation is an activity that an IT professional performs in order to effectively execute their assigned work. In this context, estimating the time, cost, resources, and quantity of deliverables must be accomplished before contracting for such a service or before service operations begin.

A model to be used for the estimation needs to have the ability to accurately predict future service operation performance such that the assigned work will be completed within the estimation range. In this sense, the model must reflect the process to be followed, the performance to be achieved, and the probable range of deviation from the estimate.

The model should help monitor performance in a predictable and reliable manner, so the measurability of the quantities that the model predicts is important. It is also critical that the model accommodate changes, such as shifting business objectives, IT technology advancement, and skilled-staff availability, during the life cycle of service operations by helping improve the way the services are provided in flight.

Last year, Akiyama et al. proposed a phenomenological model to estimate the workload spent on the system operation of information systems (Akiyama, Ohki, Ohkubo 2000). That study described a model that is a hyperplane in the log-log space of the workload and the quantity of deliverables.

It is a coincidence that the workload model form and the Cobb-Douglas function form are exactly the same. About 70 years ago, Paul H. Douglas found the function empirically through his study on the theory of wages. Cobb studied the mathematical structure of Douglas's function in the mid-1930s. Since then, the function has been called the Cobb-Douglas function and has been applied to studying economic growth in many countries.

Exhibit 1. Tape Mount Operation Data Plots


Exhibit 2. Process and Parameters


In this paper, we describe the basic characteristics of the Cobb-Douglas function in detail. We also describe possible ways of understanding the Cobb-Douglas function in terms of workload estimation, its contribution to developing a WBS in support of project or program management, an empirical risk-estimation method, and process improvement.

Basic Definitions

Exhibit 1 is an example of the study results and shows the workload distribution of tape operation at 11 information system sites. We call this kind of diagram a multi-site diagram for short, because the multiple data points show the performance of multiple sites. The red and blue groups of plots show the sites using a manual-only process and a mixture of manual and automated operations, respectively. The blue (automated) operation shows a nonlinear relation, while the red line has a slope close to one. The wider distribution may result from larger differences in the productivity achieved at different sites.

It can be assumed that the workload distribution pattern for a single site will be similar to the pattern shown in Exhibit 1 if the site follows the same process. More precisely, the diagram of the tape operation workload of a site may show a slope similar to the red line's if the site follows the same manual operation process. If the site follows a process with the same mixing ratio of manual and automated operations, then the diagram will be similar to the blue line. This kind of diagram, which shows the performance of a single site, is called a single-site diagram for short.

As we will show later, the Cobb-Douglas function takes the single-site diagram form and is expected to represent the relation between the workload and the amount of delivered service for one site.

Let's give several definitions that we use in this paper. A group of deliverables of the same or a similar kind is defined as a deliverable type. For each deliverable type, a process is defined that describes what is to be done, i.e., the activities, in the form of a procedure, deliverables, tools, and the staff who execute the activities using the tools to create deliverables of that type.

The process is generic and implies a template to specify how the work is provided. Exhibit 2 shows a high-level pictorial view.

The following five parameters give the high-level description of a process:

• Workload: W

• Quantity of deliverables or amount of service: N

• Process complexity: α

• Productivity: a

• Variation on W: ΔW

Here the complexity parameter α of a process is defined as the degree of overall interdependency among the activities; appropriate historical data is required to determine its value. Similarly, the value of the productivity parameter a is determined using actual data.

Assuming one activity is executed at a time, we have α = 1 if each activity receives no influence from any previous or future activity. If an activity receives influence from other activities, we have α > 1. If multiple activities are grouped and handled as one macro activity, then we may have α < 1. If the grouping results in more influence among activities or more failed activities, we may have α > 1 (the existence of a hidden factory). In this way, the amount of service requested is, in general, not linearly proportional to the workload necessary for the service.

The simplest form of the Cobb-Douglas function is given by the formula:

W = a · N^α    (1)

where a and α are constant parameters. It is easily understood that a is the inverse of the productivity and that the level of effort (LOE) is not included in W. This form of the Cobb-Douglas function will be examined in further detail in the next section.
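As an illustration, both parameters can be determined from historical data by fitting a straight line in log-log space, since log W = α log N + log a. The following is a minimal sketch in Python; the (N, W) data values and variable names are hypothetical, not taken from the original study.

```python
import numpy as np

# Hypothetical historical data for one site: amount of service N
# (e.g., tape mounts per day) and the workload W spent (person-hours).
N = np.array([100, 200, 400, 800, 1600], dtype=float)
W = np.array([12, 26, 57, 120, 265], dtype=float)

# Equation 1, W = a * N**alpha, is a straight line in log-log space:
# log W = alpha * log N + log a.  A least-squares fit of that line
# yields the process complexity alpha (slope) and the constant a.
alpha, log_a = np.polyfit(np.log(N), np.log(W), deg=1)
a = np.exp(log_a)

print(f"process complexity alpha = {alpha:.3f}")
print(f"constant a = {a:.4f}")
print(f"predicted W for N = 1000: {a * 1000**alpha:.1f}")
```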

It is also noted that the Cobb-Douglas function shows the input and output in the symmetric format as shown in the next expression:

N = (W / a)^(1/α)

An extension of Equation 1 is given next. The following equation is an example of the Cobb-Douglas function with three independent parameters N1, N2, and N3:

W = a · N1^α1 · N2^α2 · N3^α3    (2)

Similarly, we will be able to derive the inverse function on Ni (i = 1, 2, 3).
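The following minimal sketch evaluates Equation 2 and recovers one Ni through the inverse function; all parameter values are illustrative assumptions.

```python
import numpy as np

# Illustrative parameter values for Equation 2 (hypothetical).
a = 0.05
alphas = np.array([0.9, 1.2, 0.7])

def workload(n):
    """Equation 2: W = a * N1^alpha1 * N2^alpha2 * N3^alpha3."""
    return a * np.prod(np.asarray(n, dtype=float) ** alphas)

def invert_on(i, target_w, n):
    """Solve Equation 2 for N_i, holding the other N_j fixed."""
    n = np.asarray(n, dtype=float)
    others = np.prod(np.delete(n, i) ** np.delete(alphas, i))
    return (target_w / (a * others)) ** (1.0 / alphas[i])

n = [120.0, 40.0, 15.0]
w = workload(n)
print(f"W = {w:.2f}")
print(f"recovered N2 = {invert_on(1, w, n):.2f}")  # prints 40.00
```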

The Mathematical Structure of the Cobb-Douglas Function

Before we discuss several possible applications of the Cobb-Douglas function, some of its important structures need to be examined. First, Equation 1 derives the next equation:

ΔW / W = α · ΔN / N    (3)

where ΔN is a small change in N. Equation 3 means that the process complexity α is obtained by changing N slightly and observing the resulting ΔW.

Assuming that the complexity α does not change, we have the following equation showing the relationship between the deviation ranges on the quantities W and N:

ΔN = (1/α) · (ΔW / W) · N    (4)

Assuming that a fixed percentage of the workload W is allowed for variation, as specified by rC, Equation 4 gives a fixed amount of variation, ΔN = rC × N / α, for the entire service N as well.
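A brief numeric check of Equations 3 and 4, with hypothetical values for a, α, N, and rC:

```python
# A minimal numeric check of Equations 3 and 4 (all values hypothetical).
a, alpha = 0.05, 1.15
N = 1000.0
W = a * N**alpha                          # Equation 1

# Equation 3: alpha = (dW / W) / (dN / N) for a small change dN.
dN = 1.0
dW = a * (N + dN)**alpha - W
print(f"estimated alpha = {(dW / W) / (dN / N):.4f}")   # ~1.15

# Equation 4: a fixed allowable ratio r_C on W gives a fixed variation
# on the service side as well: dN = r_C * N / alpha.
r_C = 0.10
print(f"total workload W + dW = {(1 + r_C) * W:.2f}")
print(f"service variation dN  = {r_C * N / alpha:.1f}")
```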

Exhibit 3 shows Equation 1 pictorially with the values of ΔW and rC. The thick straight line shows Equation 1 itself. The dotted line shows the line of the workload W + ΔW. We call this diagram the Cobb-Douglas function framework diagram.

For handling risks that are known but whose events are identified later, the workload W needs to be increased by a fixed ratio of W; a larger value of W may include more risk. For the planned service N, shown by the dotted line, the total workload is given by W + ΔW = (1 + rC) × W.

In this example, we are interested in the scalability of W and N. For example, the workload of one month should be given by the one-day workload multiplied by the number of working days per month. Assuming that W0 and N0 are the one-day workload and service respectively, the linear relation is given by the next formula:

W = n · W0 and N = n · N0    (5)

Exhibit 3. The Cobb-Douglas Function Framework Diagram

where n is the number of days worked per month. By substituting Equation 5 into Equation 1, we have the next equation:

W0 = a0 · N0^α    (6)

where a0 = a · n^(α−1). Now we know that Equation 1 derives Equation 6. Conversely, by substituting Equation 5 into Equation 6, Equation 1 is derived from Equation 6.

It is interesting to note that a0 = a holds if α = 1. From this, we understand that the Cobb-Douglas function reduces to the very familiar linear equation relating the workload and the service amount when all the work is done by executing one activity at a time with no rework at all.

The Cobb-Douglas function states that the workload and the amount of service of one month can be predicted from the workload and the amount of service of one day, as long as the complexity α and the constant a0 are unchanged for the month.
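The following short sketch checks this day-to-month scaling numerically, with hypothetical values; Equations 5 and 6 together reproduce Equation 1 applied to the whole month.

```python
# Day-to-month scaling check for Equations 5 and 6 (values hypothetical).
a, alpha = 0.05, 1.15
n = 20                                  # working days per month
N0 = 50.0                               # one-day amount of service

a0 = a * n**(alpha - 1)                 # the constant of Equation 6
W0 = a0 * N0**alpha                     # one-day workload (Equation 6)

W_month_scaled = n * W0                 # Equation 5: W = n * W0
W_month_direct = a * (n * N0)**alpha    # Equation 1 for the whole month

print(f"{W_month_scaled:.3f} == {W_month_direct:.3f}")  # identical
```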

Equation 2 gives the next relations, although we do not show here how the equations are derived:

ΔW / W = αi · ΔNi / Ni,  i = 1, 2, 3 (with the other Nj held fixed)    (7)

Here, it is noted that the parameters N1, N2, and N3 need to be independent to be consistent with Equation 7.

Applications of the Cobb-Douglas Function

Understanding how and why the Cobb-Douglas function shows up in the workload estimation model:

Why is the log-log space so convenient and useful for graphically visualizing the relationship between the workload and the amount of service at a service-providing site?

Exhibit 4A. WBS Example


Exhibit 4B. One-Week Progress Diagram


A possible explanation for the above question is as follows. The process complexity is not always one in general, but the parameter stays unchanged unless the process is changed. Most information systems may introduce automated operations, i.e., grouping multiple operations or activities into a few macro-level activities, which may give a smaller value of the process complexity α, as Exhibit 1 shows. If more rework is expected for failed jobs or operations, then a larger value of the parameter α is expected.

In this way, a wide variety of process complexities exists across different sites; however, only one effective value of the complexity shows up for each service component and is described by the parameter α. These cases are represented by the Cobb-Douglas function, Equation 1 or 2, or an extended form with different process complexity values.

Contribution of WBS to Project Management

We want to understand how the Cobb-Douglas function helps demonstrate that the WBS is critical to successful project management. Let's consider a simple WBS example as shown in Exhibit 4A. The workload W is assumed to complete the entire service N in one month. The entire work is broken down into four identical components. This means that each component requires a workload of W/4 and delivers N/4 of the service in one week.

Exhibit 4B shows how the one-week activity progresses and is called the one-week progress diagram. The thick, straight black line shows the planned progress of the service. The straight dotted line is the upper limit, showing the fixed percentage of allowable workload over the planned workload. The non-straight solid line shows the actual progress during the week and is called the progress line in this paper. The thick vertical line, shown in the lower-left area of the diagram, is the inverse of the productivity; it is constant, i.e., it does not depend on the variable N. Thus the Cobb-Douglas function provides a model to measure the productivity ratio of the team of one or more staff working on an assigned WBS item.

It is noted that the first half of the progress line shows a relatively steep curve. This means that the process seems to be more complicated than expected. There are several possible reasons for this: for example, the original estimation of the workload may be wrong, or a hidden factory of rework on failed operations may have been triggered. The rest of the line shows a mild curve, verifying that the performance becomes stable as expected.

Exhibit 4C shows the one-month progress diagram, which is created from the week diagrams as follows. The first week diagram, shown in Exhibit 4B, is moved to the area indicated by “1st Week.” Since we assumed identical WBS items, the second week diagram is the same as Exhibit 4B; it has the same productivity ratio, but it may have a different progress curve. It is moved to the area indicated by “2nd Week.” This time, the width of the diagram becomes narrower because of the log-scaled horizontal axis. We move the narrower diagram to the right end of the progress line of the first week, shown in the area “1st Week,” and connect the two. Again, because of the log-scaled vertical axis, the height of the week progress diagram becomes shorter. We similarly move the third and fourth week diagrams to the areas “3rd Week” and “4th Week” (a very small box), respectively.
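The shrinking week boxes can be reproduced numerically. The following sketch assumes a purely synthetic one-week progress record; it concatenates four identical weeks into a one-month progress line and prints the width of each week's box on the log-scaled N axis.

```python
import numpy as np

# Synthetic one-week progress record (hypothetical): cumulative amount
# of service delivered and cumulative workload at the end of each day.
week_N = np.array([8, 18, 30, 44, 60], dtype=float)
week_W = np.array([4, 8, 13, 18, 24], dtype=float)

# Four identical WBS items, one per week: the month's progress line is
# the weekly record shifted by the totals already accumulated.
month_N, month_W = [], []
done_N = done_W = 0.0
for _ in range(4):
    month_N.extend(done_N + week_N)
    month_W.extend(done_W + week_W)
    done_N += week_N[-1]
    done_W += week_W[-1]

# Width of each week's box along the log-scaled N axis: it shrinks
# week by week, which is why the "4th Week" box is a very small box.
total = week_N[-1]
for k in range(1, 5):
    start = week_N[0] if k == 1 else (k - 1) * total
    width = np.log10(k * total) - np.log10(start)
    print(f"week {k}: log10 width = {width:.3f}")
```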

It may be said from experience that if we look at a project during the initial 10–15% of its life cycle, its overall performance can most likely be predicted. This important initial phase is emphasized in the left half areas of Exhibits 4B and 4C, respectively.

The slope of a progress line shows how the process complexity varies while the service is provided. If a steeper slope is found along the progress line of the progress diagram for a WBS work item, there exists a particular workload bottleneck in working on that item. When the problem is fixed, the steep curve goes away.

In a WBS that includes WBS items of different sizes, W and N are handled in exactly the same way. First, the progress diagram is created for each WBS item. Then, integrating them creates the entire progress diagram. Furthermore, the progress diagram concept can be extended to the case where multiple WBS items are executed in parallel.

Range of the Applicability

It is very important to find the range of the parameters W and N over which the Cobb-Douglas function with a specific parameter value is applicable.

Exhibit 4C. One-Month Progress Diagram


A good example is COCOMO-II. For creating a small-size program, α = 1 may hold. If the program size goes up to a larger number of KLOC, then α > 1 is definitely expected. This is because the process to build small-scale software is different from the process to develop large-scale software in terms of the software size, the number of programmers or users involved, the tools to be used, and the number of requirement items. Given this, it will be very useful to find the best value of the process complexity based on experience.

Empirical Risk Impact Estimation

The Cobb-Douglas function can be used in risk management. Risk is contained in human and irregular system processing, such as program errors, incorrect data, and wrong operations or replies, as is the case in the system operation service. Since there are many origins of risk, and since we do not claim zero defects and zero variation in human activities, we recognize that system operation services may include many types of risk sources.

Let's assume that Exhibit 1 shows the historical data of system operation service performance. Each diagram will then show the maximum increases experienced during past operation services. It is most likely that the next service planned with the same process will be completed somewhere between the two dotted lines. This is a quick way to “estimate” the maximum value of the risk impact for a planned amount of the same service.

To compute the risk impact in a more detailed manner, the distribution of the points along the log W axis is obtained by projecting the data points onto the axis. This results in the workload distribution of the performance data points. Assuming the distribution model is correct, we can calculate the risk impact based on the standard risk calculation method.
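A minimal sketch of this projection, assuming hypothetical multi-site data in the style of Exhibit 1; the fitting method, the residual distribution, and the 95th-percentile choice are our assumptions, not the original method.

```python
import numpy as np

# Hypothetical multi-site performance data in the style of Exhibit 1.
N = np.array([120, 300, 450, 800, 1500, 2600, 4000], dtype=float)
W = np.array([15, 40, 52, 110, 160, 330, 420], dtype=float)

# Fit Equation 1 in log-log space, then project each point onto the
# logW axis: the residuals form the workload distribution.
alpha, log_a = np.polyfit(np.log(N), np.log(W), deg=1)
residuals = np.log(W) - (alpha * np.log(N) + log_a)

# Assuming the residual distribution also holds for the next service,
# a high percentile of it bounds the risk impact on the workload.
planned_N = 2000.0
expected_W = np.exp(log_a) * planned_N**alpha
worst_case = expected_W * np.exp(np.percentile(residuals, 95))

print(f"expected workload : {expected_W:.1f}")
print(f"95th-pct workload : {worst_case:.1f}")
```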

It is noted that an organization at a higher maturity level of the Capability Maturity Model® achieves a smaller ΔW. This means that the organization can consistently deliver more accurate estimates of the deliverables, workload, and risk.

Process Improvement

As noted previously, the process complexity varies over a wide range. In order to monitor and stabilize the process execution as required by the desired service level, it is necessary to measure the complexity range for each process. The two relevant parameters are as follows:

• The process complexity (α), and

• The productivity (a).

The first parameter indicates how simply the service activities are executed as a procedure. The second parameter implies how quickly the service is provided upon request.

It is important to know how the quality of service is managed based on the two parameters. The quality of the service may be considered from the next two aspects:

• Stability of the service delivery

• Stability of the activities.

If we assume that the process includes activities that verify the quality of the delivered services, then the parameter α gives control (understandability, traceability, etc.) over the delivery. The productivity parameter a defines the workload needed for one service. By definition, the stability of the activities is realized by having the lowest value of α and a high value of the productivity at the same time.

Conclusion

The Cobb-Douglas function describes the relation between the log values of the workload and the quantity of deliverables or services in terms of the process complexity, and it supports the visualization of progress.

The Cobb-Douglas function framework diagram has been proposed, and the progress diagram has been developed based on that framework. The progress diagrams given for a set of WBS items are integrated into an entire progress diagram that corresponds to the entire WBS.

Accordingly, the following conclusions are reached and understood:

1. Monitoring the performance for a WBS work item is made possible by monitoring the process complexity.

2. Project historical data is visualized in the progress diagram format, which can be used to empirically estimate the risk impact.

3. Process improvement is made possible by managing the parameters of the process complexity and productivity.

Finally, it is noted that the workload parameter W can be replaced by the earned value (EV) parameter without losing consistency. In other words, the process complexity can be measured in the log-log space of the earned value and the quantity of deliverables (service). This viewpoint will be very convenient for project management.

Acknowledgment

One of the authors (Y.A.) would like to thank David Frame, who provided the information on the existence of the Cobb-Douglas function. He also thanks Akira Tominaga for his support during the study on project management.

References

Akiyama, Y., Ohki, T., and Ohkubo, T. 2000. A Practical and Quantitative Approach for Improving Program Management for Service and Outsourcing. Proceedings of the 31st Annual Project Management Institute 2000 Seminars & Symposium. Newtown Square, PA: Project Management Institute.

http://cepa.newschool.edu/~het/essays/product/elastic.htm#cobb: Cobb-Douglas Production Functions. This Web page briefly describes the mathematical aspects of the Cobb-Douglas function.

http://cepa.newschool.edu/~het/home.htm: The History of Economic Thought Website.

http://cepa.newschool.edu/~het/profiles/douglas.htm: This page briefly describes the major works of Paul Howard Douglas.

Jergeas, George, and Hartman, Francis. 1998. Outsourcing Projects for Success. Proceedings of the 29th Annual Project Management Institute 1998 Seminars & Symposium. Newtown Square, PA: Project Management Institute.

Lawler, P. Thomas, and Dieterle, Karen. 1999. Leveraging Project Management for Process Improvement. Proceedings of the 30th Annual Project Management Institute 1999 Seminars & Symposium. Newtown Square, PA: Project Management Institute.

Paulk, Mark, Weber, Charles V., Curtis, Bill, and Chrissis, Mary Beth. 1994. The Capability Maturity Model: Guidelines for Improving the Software Process. Reading, MA: Addison-Wesley.

Rubin, A. Howard. 1999. An Effective Strategy for Managing Outsourcing With Measurement. Cutter IT Journal (October): 17–24.

Smith, B. Max. 1998. Anatomy of Service and Outsourcing (SAO) Projects. Proceedings of the 29th Annual Project Management Institute 1998 Seminars & Symposium. Newtown Square, PA: Project Management Institute.

Uyttewaal, Eric. 1999. Take the Path that is Really Critical. PM Network (December): 37–39.

This material has been reproduced with the permission of the copyright owner. Unauthorized reproduction of this material is strictly prohibited. For permission to reproduce this material, please contact PMI or any listed author.

Proceedings of the Project Management Institute Annual Seminars & Symposium
November 1–10, 2001 • Nashville, Tenn., USA
