Quality assurance for engineering computer programs in the project environment

Some comments on their use in an R&D organization

Bechtel Power Corporation


When the author of this article mentioned to a longtime friend that he was working on engineering computer programs, this friend related the following events:

In the past, hand calculations were used successfully for designing pressure vessels. Recently, a sophisticated computer program was used, and the designed pressure vessel failed. Damage claims, litigation, and expenses were incurred.

Ever since, the author has focused attention on the verification of computer programs. In recent publications, Coaker [1], Fong [3], and Sheron and Rosztoczy [8] described some serious problems with engineering computer programs (ECPs). Examples of the substantial cost effects of problems with ECPs are given by Coaker [1] and by Richter [11]. To develop an understanding of the need for, and the effort involved in, ECP verification, the following topics relevant to the reliable performance of ECPs are discussed:

• Considerations requiring the verification of ECPs

• Defects in ECPs

• Definition of verification

• Verification report (VR) for ECPs

• Verification status for an ECP

• Error reporting and corrective action

• Integrated verification effort

• Overcoming restraining forces to verification

Figure 1 illustrates schematically how the different company entities, such as Quality Assurance, Engineering, Data Processing, and Projects, contribute to an integrated effort to ensure the use of a properly verified ECP.


Figure 1 Integrated Verification Effort

Need for the Verification of ECPs

The need for the verification of ECPs is derived from several considerations:

• Is the project application of an ECP safety related?

• Every engineer is ultimately responsible for the accuracy of the calculated values that he uses for analysis and design.

• The individual engineer is not expected to know all the details of each ECP used.

• An inspector may check every detail of a safety-related calculation.

These considerations require the careful examination of the performance of ECPs that are used for safety-related applications. In his papers on the control of computer programs in the project environment [10] and the comparison of software development methodologies [9], the author has examined the productivity and reliability aspects of the performance of ECPs in greater detail. This article addresses the reliability aspects of performance, and particularly the steps required to assure the use of verified ECPs not only for safety-related applications, but for all applications. The review of the verification status of ECPs has identified the following areas that need improvement:

• Standard or public-domain ECPs may be inadequately verified.

• Verification of ECPs is variously interpreted, accomplished, and documented.

• Use of ECPs may be inadequately controlled and reported.

These areas can affect safety-related applications and may be subject to external review and inquiry by regulatory agencies. To avoid potential liabilities and to reduce budget and schedule overruns, ECPs need to be verified properly and perform reliably. ECPs can be finite-element programs that are used for structural analysis and design, heat transfer programs, and programs used for many different technical and scientific applications. Safety-related applications may refer to areas such as seismic analysis of nuclear power plants.

Defects in ECPs

Several potential defects can affect the reliable performance of ECPs:

• Incorrect technical models and equations

• Inadequate numerical or logic algorithms

• Incorrect coding and compilation

• Incomplete documentation and verification

• Incorrect use of ECPs

• Defective computer system

A computer system is composed of hardware, software, and communications. There have been cases in which the technical models and equations used in an ECP were incorrect. Numerical algorithms sometimes do not converge to an accurate value. A logic algorithm, such as a sorting module, may not sort on the proper characters, or constants and variables may not have been properly initialized. Errors in the coding may not be detected by a compiler, and data may not be transferred correctly between subroutines. When the documentation of an ECP is incomplete, the program may be used for the wrong application. When the verification of an ECP is inadequate, the values calculated by the ECP may also be inadequate, and the user may be unaware of this fact. Finally, a defective computer system can compromise the reliable performance of an otherwise correct ECP.

The accuracy of the results from an ECP can be influenced by the selection of the input model and the input data. Such an example is given by the selection of mesh sizes for a finite-element ECP in critical regions [2].

Verification of ECPs

To avoid unsatisfactory results from an ECP, government agencies, societies, and companies are actively producing guidelines and standards for the verification of ECPs. In his paper on the need and challenge for computer standards, Schuster [7] reported the existence of 60 standards and over 300 projects to develop them. These efforts are sometimes limited to a series of definitions or brief directives.

Other efforts, pursued mainly in universities, try to test each line of source code or each control path and node. According to Huang [5], these efforts require a testing logic that may itself be affected by errors. Source code testing can be expensive, and it is not expected to yield practical results for business and engineering programs in the near future. When computer vendor services are used, the source code may not even be available for testing by the individual user. In addition, source code testing ignores the testing of the requirements, equations, parameters, convergence of numerical algorithms, and technical models that are used in ECPs and in their input. Such testing is, however, very important for the verification of ECPs.

Sometimes, complexity metrics, as discussed in reference [4], are misinterpreted to test the reliability of an ECP. Complexity metrics test for the redundancy of program control paths and associated information flow. Consequently, a program may be made more efficient by reducing the number of control paths without affecting its reliability. However, the manpower costs to improve the efficiency of the program may far outweigh the savings in run costs on mainframe computers.

Many standards issued by professional societies are procedures-oriented. They require a test plan for the development of an ECP and emphasize the specification of acceptance criteria and expected results for tests. Industry, however, requires a different approach, because test plans of this type may not have been prepared for an ECP and are not available when computer vendor services are used. Furthermore, the results of a hand calculation, and the size of the deviations of a finite-element analysis from a theoretical solution, are usually not known in advance and cannot be specified. Therefore, industry has to rely on results-oriented documentation and verification, as detailed in references [10] and [12]. An improved definition of verification is given below.

The terms validation, qualification, and verification are variously defined in the literature. In practice, only one type of verification report is prepared and issued for an ECP. Therefore, a definition of verification is required that includes the definitions of validity and qualification and yet remains practical. Such a comprehensive and practical definition for verification follows.

The verification of an engineering computer program is the process that establishes:

• Use of valid technical (mathematical) models, and numerical and logic algorithms

• Range of valid applications and parameters

• Reproducible and accurate results

Reproducible and accurate performance of the computer system (hardware, software, and communications) is essential for the verification of a computer program, and data processing departments have established procedures for the certification (verification) of the computer system.

The range of valid applications and parameters refers to the technical program capabilities, the range of values for parameters or variables, and boundary conditions.

To obtain a reliability metric (a measure of the reliable performance of an ECP), the accuracy of the results is measured by comparing the results from the computer program for a set of test samples with the results from a comparison method such as:

• Hand calculations

• Independently verified solutions

• Mathematical solutions

• Empirical (experimental) data

• Data from the technical literature

The results may also be compared for compliance with government and industry codes and standards.

When comparing results, deviations are calculated and their significance is determined for valid applications. While comparing results, it may become necessary to examine if the ECP and the comparison method use the same valid technical models, numerical algorithms, and equations. This examination is essential when the results do not agree within an acceptable relative deviation. The mere agreement of the results does not signify adequate verification, because both programs may contain the same defective algorithm such as an equation solver. This is the reason for requiring the comparison of results of an ECP with those from an independently verified program.
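The deviation-based comparison described above can be sketched as a small check. All result values, the tolerance, and the function names below are hypothetical; a real verification would draw the reference values from one of the comparison methods listed earlier.

```python
def relative_deviations(ecp_results, reference_results):
    """Relative deviation of each ECP result from the comparison method."""
    return [abs(e - r) / abs(r) for e, r in zip(ecp_results, reference_results)]

def check_deviations(ecp_results, reference_results, tolerance):
    """Flag test samples whose relative deviation exceeds the acceptance tolerance."""
    devs = relative_deviations(ecp_results, reference_results)
    failures = [(i, d) for i, d in enumerate(devs) if d > tolerance]
    return devs, failures

# Hypothetical results for three test samples: ECP output versus an
# independently verified hand calculation.
ecp = [101.2, 48.7, 250.4]
ref = [100.0, 50.0, 250.0]
devs, failures = check_deviations(ecp, ref, tolerance=0.02)
# Sample 1 deviates by about 2.6% and fails the assumed 2% tolerance.
```

Recording the numerical deviation itself, rather than a verbal judgment, is what makes the comparison auditable later.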

By defining a reliability metric (the measurement of accuracy by deviations), qualitative and sometimes misleading statements such as the following are avoided:

• Compares well

• Excellent agreement

• Good correspondence

• Does not compare well, etc.

Such qualitative statements can conceal large deviations.

The approach to measuring the reliable performance of an ECP by deviations is related to the “bottom-up” testing described in the book, The Art of Software Testing [6].

Verification Report for an ECP

The verification report (VR) for an ECP is the end product of the verification effort. To organize an effective and efficient verification effort, knowledge of the content of the report can save time and money.

The content of the verification report described here was developed and improved by the author over a number of years and has been published in slightly different forms [10, 12]. The latest version has this outline:



Summary

Description of Verification
    Program Application and Capabilities
    Technical Methods
    Verified Program Sections

Methods Used for Comparison
    Independently Verified Computer Programs
    Hand Calculations
    Mathematical Solutions
    Empirical Data
    Technical Literature

Description of Test Samples and Program Capabilities

Evaluation of Test Results
    Comparison of Results
    Deviations and their Significance
    Compliance with Codes and Industry Standards

Appendix Containing the Detailed Calculations for each Test Sample

The summary in the outline describes the highlights of the sections for easy user orientation. The VR must be modified to account for every subsequent verification contribution and revision. The term, “Technical Methods,” in the outline refers to the technical model and equations used in the comparison methods. If another computer program is used for comparison, it must be an independently verified ECP. A table that relates program capabilities (options), test samples, comparison methods, deviations, their significance, etc. is helpful to convey an overview of the verification of an ECP and for a quick reference. This table should be included in the user manual for reference or in a data base, and it can be used by projects for matching the technical project applications with the verified program capabilities.
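Such a cross-reference table lends itself to a simple machine-readable form. The sketch below is for illustration only; every capability name, test-sample label, comparison method, and deviation value is invented.

```python
# Hypothetical cross-reference table for one ECP, relating each program
# capability to its test sample, comparison method, and measured deviation.
verification_table = [
    {"capability": "static beam analysis", "test_sample": "TS-01",
     "comparison": "hand calculation", "deviation": 0.004,
     "significant": False},
    {"capability": "modal analysis", "test_sample": "TS-07",
     "comparison": "mathematical solution", "deviation": 0.011,
     "significant": False},
    {"capability": "nonlinear buckling", "test_sample": "TS-12",
     "comparison": "technical literature", "deviation": 0.085,
     "significant": True},
]

def verified_capabilities(table):
    """Capabilities whose deviations were judged insignificant."""
    return {row["capability"] for row in table if not row["significant"]}
```

Kept in a user manual or a data base in this form, the table lets a project match its intended applications against the verified capabilities at a glance.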

The computer system must be certified for project and verification runs. Project and verification runs must be made with the same program version and on the same computer or a computer certified to give the same results.

The ECP must be verified on each computer on which it is run, because a different computer, compiler, and particularly the number of significant digits of the computer, may cause inaccurate results.
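The significant-digits concern can be illustrated with a small assumed experiment: summing the same series in single and in double precision, as two computers with different word lengths would, yields totals that agree only in the leading digits. The series and iteration count below are arbitrary choices for illustration.

```python
import struct

def f32(x):
    """Round a Python float to the nearest single-precision (32-bit) value."""
    return struct.unpack("f", struct.pack("f", x))[0]

# Sum 1/n for n = 1..100000, once in double precision and once as a
# machine with single-precision words would accumulate it.
single = 0.0
double = 0.0
for n in range(1, 100001):
    term = 1.0 / n
    double += term
    single = f32(single + f32(term))

# The two "computers" disagree in the trailing significant digits.
diff = abs(single - double)
```

The same program, compiled and run on machines with different word lengths, can therefore produce results that differ enough to matter in a tight design margin, which is why each target computer needs its own verification runs.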

The validity of the input data and the input model has to be established for each test sample. Appendices containing the detailed calculations for a test sample should be organized according to the outline of the verification report as far as is feasible, and should contain sufficient information for reconstructing the calculations.

The location of input and output, interface data for each test sample, program version, and compilation listing should be included in the VR. The printout from an ECP should identify the program version and test sample, and should show a date and the computer on which the program was run.

The format of the VR may differ for different institutions and companies. However, the following common items are required:

• Table of contents

• List of effective pages

• History of revisions

• Program name and version, release date, and report revision number on each page

• Approval signatures of the preparer of the report, specialist who performed the verification, and independent reviewer

• Title page that identifies the specific verification report, name of computer program and version, release date, and approvals

• Input data listings and output printouts identified by program name and version, test sample, run date, and computer (preferably on each page)

To clarify the efforts of the independent reviewer, a paragraph describing the reviewer's activities should be included in the VR.

If the described VR does not exist, and the responsible engineer is no longer available, the verification may have to be repeated or a new analysis may have to be made. Such duplicate efforts may become likely for longer lasting projects, and they may cause budget and schedule overruns.

Verification Status Report for an ECP

Due to limited budgets for developing and testing a larger ECP, not all program capabilities can be verified initially. When combinations of capabilities are considered, the number of runs for testing all of them becomes very large (the 10 capabilities of a modest program, for example, can already be exercised in 10! = 3,628,800 different orders). Developers like to see their programs used by as many users as possible. Some developers declare their program a "public domain" or "standard" program that can be used without ascertaining the verification status of the capabilities used. The sections on defects and verification of ECPs reveal that such a declaration may be quite misleading. Reviews have often revealed program capabilities that were not verified. If such a verification deficiency is discovered after the applicable project work is completed, budget and schedule overruns may occur, since it may be necessary to repeat the analysis with a verified program version or another program, or to verify the version used. To avoid such overruns, the following activities need to be carried out at the start of the analysis and design calculations:

• Identify the ECP capabilities to be used for project applications

• Examine the verification report of the ECP and determine if the used program capabilities are verified

• If one or more capabilities are not verified, decide on an action for verification or on the use of another ECP

To determine the verification status of program capabilities, the test samples for the capabilities used need to be identified. To establish that the verification of the used capabilities is complete, the following questions need to be answered affirmatively:

• Are the used comparison methods valid?

• Are the deviations insignificant?

• Does the VR relate to the program version used by the project?

After it has been found that the used program capabilities are adequately verified, project work may proceed.
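The three questions above amount to a gate that must pass before project work proceeds. A minimal sketch follows; all capability names, report fields, and version strings are hypothetical.

```python
def verification_gate(used_capabilities, report, project_version):
    """Return unresolved issues; an empty list means project work may proceed."""
    issues = []
    # Does the VR relate to the program version used by the project?
    if report["program_version"] != project_version:
        issues.append("verification report covers a different program version")
    for cap in used_capabilities:
        entry = report["capabilities"].get(cap)
        if entry is None:
            issues.append(cap + ": no test sample in the verification report")
        elif not entry["valid_comparison"]:
            issues.append(cap + ": comparison method is not valid")
        elif entry["significant_deviation"]:
            issues.append(cap + ": deviation is significant")
    return issues

# Hypothetical verification report for program version 3.1.
report = {
    "program_version": "3.1",
    "capabilities": {
        "response spectrum analysis": {"valid_comparison": True,
                                       "significant_deviation": False},
    },
}
issues = verification_gate(
    ["response spectrum analysis", "thermal transient"], report, "3.1")
# One capability needed by the project is not covered by the report.
```

Any nonempty result would trigger the decision described earlier: verify the missing capability or select another ECP.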

Error Reporting and Corrective Action

ECPs are “living” entities that are corrected, changed, and enhanced over a period of time. Project managers must be informed of program errors that have been detected and decide on corrective action. Activities and responsibilities need to be specified for error reporting and correction. In his paper on the control of computer programs in the project environment, the author recommended that a qualified specialist be assigned to each ECP used in a discipline. This specialist receives the error notices for the ECP, evaluates their impact, and informs engineering management, quality assurance, and users of the following specifics:

• Affected program capabilities

• Significance and impact of program error for project applications

• Plan for error correction

• Alternative for avoiding the effects of the program error

After the project has received the error notification, engineers evaluate the impact of the reported program error on the project applications and take the required corrective action.

Integrated Verification Effort

To assure the use of a properly verified computer program on projects, several entities in a company have to cooperate, and their responsibilities must be clearly defined. The section on the potential defects of an ECP shows that a computer program is a complex software product that combines effort and expertise from these areas:

• Technical applications

• Numerical and logic algorithms

• Coding and compilation of programs

• Verification and documentation

• Use of computer programs

• Computer system

The technical models and equations are supplied by the engineering disciplines, which are also concerned with verification, documentation, error reporting, and corrective action. Often, the disciplines also decide on the numerical and logic algorithms to be used. They assign program specialists who participate in the preparation of status reports for the verified program capabilities.

Data processing departments usually provide the coding of ECPs and administer the program files and documentation. They prepare the contracts with computer service vendors and for purchasing hardware and software. They also verify (certify) the reliable performance of the computer system.

Engineers on projects use computer programs, supply information on the used capabilities of ECPs, and evaluate their verification status by consulting program verification reports and program specialists. They take corrective action if a program error or a verification deficiency impacts project work.

Procedures and guidelines for ECP related activities such as verification, error reporting, and correction are prepared in cooperation with the engineering disciplines, data processing, and quality assurance (QA) groups. QA monitors and audits compliance with approved procedures.

Quality assurance groups may work with the engineering disciplines to monitor and accomplish compliance or they may merely audit compliance.

Without the cooperation and integration of quality related activities by the different entities, the use of a properly verified ECP may be difficult to achieve. Figure 1 illustrates the integrated effort.

Overcoming Restraining Forces to Verification

There are some potential restraining forces that may delay the implementation of the approaches discussed in this article.

ECPs are usually developed in response to a perceived need, and when the program is completed, the engineers or programmers who did the work are eager to see it used as soon as possible. The time needed to prepare proper documentation interferes with this goal, and the documentation effort is deferred.

Some engineers are confident that their efforts were honest and complete, and may neglect adequate documentation without realizing that details are easily forgotten and need to be written down.

Engineers sometimes feel that a smaller program does not need documentation. When some time has passed or the writer of the program has left, the evidence of verification, essential for safety-related applications, may be lost and may cause duplication of the verification effort. Smaller programs do not need a voluminous verification report; a few pages of appropriate documentation may be adequate. Reviews of verification documentation have shown that the problems found in smaller programs are, in principle, the same as those in larger programs, although the extent of the detail is less. Sometimes engineers simply may not know what adequate documentation requires.

Most of the restraining forces concerning adequate documentation and verification can be overcome by preparing the documentation while the program is planned, written, and tested. Education in good documentation skills is an important part of removing misunderstandings and restraining forces.

The acceptance of a procedure by the different entities may be slow and cumbersome. Several approaches may be helpful in such a situation:

• Procedures should be developed by a specialist who has extensive practical experience in the respective area, can express procedures clearly and concisely, and can integrate details into a whole

• Avoid too many definitions, which often prove to be inadequate

• Keep procedures concise; voluminous procedures are seldom accepted and followed

• Avoid too much emphasis on format (sometimes format is used as a substitute for content)

• Use effective group discussion techniques to clarify the subject matter and to arrive at consensus (see the author's paper on problem solving for details [13])

• Improve procedures and guidelines by observing the findings from reviews of the verification status of ECPs

The outlined approaches for improving cooperation among departments and individuals, and for reducing misunderstanding of the ECP verification requirements and efforts, will hopefully lead to the use of properly verified ECPs.


Although the procedures in this article were presented for the verification of engineering computer programs, they also apply to business programs and to programs that are not safety related; numerical methods are usually less complex in business applications. By examining the potential defects of an ECP, the need for verification was demonstrated. A performance-oriented definition for the verification of an ECP was derived. Based on this definition, the content of an effective verification report was outlined. The evaluation of the verification status of an ECP was discussed for project applications. Error reporting and corrective action by projects were described. Finally, the integrated verification effort of the company entities was considered, and approaches for overcoming the restraining forces were outlined. Observing the recommendations in this article can lead to a better understanding of the verification of an ECP and to savings of money and time.

The following benefits can be derived and realized by project managers from this article:

• Familiarity with the verification requirements for an ECP

• Reduction of project effort by communicating with relevant corporate entities

• Identification of required ECP capabilities and their verification in an early project stage to avoid later delays

• Awareness of effective comparison methods and criteria for accurate verification results

• Importance of relating the program capabilities to be used on a project to those documented in the verification report valid for the proper ECP version

• Steps to be taken when an error is reported for a specific ECP used on a project

• Inclusion of budget and schedule items for potential verification activities for ECPs to be used on a project

• Assignment of responsibilities for the use of verified program capabilities at an early project stage


1. Coaker, J.W. Software Management: Caveat Emptor, Computers in Engineering 1982, New York: ASME, 4.

2. Dunder, V.F. Need for Adaptive Methods in Partial Differential Equations, Proceedings of the Workshop on Adaptive Methods for Partial Differential Equations, SIAM, Philadelphia, 1983. (Note: This paper and conference consider the accuracies obtained with different engineering input models using finite-element meshes.)

3. Fong, H.H., An Evaluation of Eight U.S. General Purpose Finite-Element Computer Programs, Paper No. 82-0699-CP, 23rd AIAA/ASME/ASCE/AHS Structures, Structural Dynamics and Materials Conference, New Orleans, LA, 1982.

4. Henry, S., Kafura, D., & Harris, K. On the Relationships Among Three Software Metrics, 1981 ACM Workshop/Symposium on Measurement and Evaluation of Software Quality, Association for Computing Machinery, College Park, MD, 1981.

5. Huang, J.C. An Approach to Program Testing, ACM Computing Surveys, 1975, Reprinted in IEEE Tutorial on Software Methodology, IEEE Catalog No. EHO 142-0.

6. Myers, G.J. The Art of Software Testing, New York: John Wiley & Sons, 1979.

7. Schuster, D.J. Computer Standards — The Need and Challenge, Mechanical Engineering, 1981, 40.

8. Sheron, B.W., & Rosztoczy, Z.R. Report on Nuclear Industry Quality Assurance Procedures for Safety Analysis Computer Code Development and Use, Office of Nuclear Reactor Regulation, U.S. Nuclear Regulatory Commission, Washington, D.C., (NUREG-0653), 1980.

9. Richter, H.P. A Comparison of Software Development Methodologies, Computers in Engineering 1982, ASME, 4, 147-152.

10. Richter, H.P., Control of Computer Programs in the Project Environment, Computers in Engineering 1982, ASME, New York 4, 127-135.

11. Richter, H.P. Effective Computer Program Development and Use, Proceedings IEEE COMP/SAC, Chicago, 1979.

12. Richter, H.P. Effective Computer Program Documentation, Fourteenth Asilomar Conference on Circuits, Systems & Computers, IEEE Computer Society, Los Alamitos, CA, 1980, 275-286.

13. Richter, H.P. Problem Solving for Conflict Management, 1979 Proceedings of the Project Management Institute, 11th Annual Seminar/Symposium, Atlanta, GA, 275-286.

Dr. Horst P. Richter is with the Bechtel Power Corporation in computer program development, modification, and user support. He also serves as chairman of the ASME Committee on Computer Systems.



This material has been reproduced with the permission of the copyright owner. Unauthorized reproduction of this material is strictly prohibited. For permission to reproduce this material, please contact PMI.


