# Using knowledge elicitation techniques in the risk assessment of space projects

**Nicola DI PLACIDO, Managerial Engineer, *stageur* c/o Alcatel Alenia Space Italia S.p.A.**

**Corrado LO STORTO, University of Naples “Federico II”**

**Abstract**

Risk Management is increasingly gaining the attention of industry as an essential and integral part of program management. This paper deals with the program Risk Assessment phase, which aims at evaluating both the probability of occurrence of an undesired event and its impact in terms of costs, schedule and performance.

Information about the likelihood of an undesirable event occurring and its consequences is often unavailable from historical data, forecasting methods or the literature. In many circumstances a frequency analysis cannot be performed, or may be extremely costly. In such cases, the judgment of experts is the only source of information through which a quantification of uncertainty may be obtained.

The elicitation of knowledge is a formal process of inquiry aimed at obtaining information or answers to specific questions about determined quantities. It is performed by interviewing the experts after having informed them about the problem investigated and the goals and results expected from the interview. In the literature, many methods and techniques of expert knowledge elicitation have been proposed. Most of them have been developed to meet specific risk analysis requirements of specific programs or organizations. These methods, however, may require a long implementation time or a complex mathematical framework. Moreover, if the implementation environment differs from the one in which they were developed, they may be hard to implement or even unsuccessful.

The purpose of this article is to present a methodology for the elicitation of expert knowledge in risk management processes which can be easily implemented in various circumstances. To that purpose, many different knowledge elicitation techniques for extracting probability distributions from the interviewed experts have been analyzed and compared. Among them, the following techniques have been selected as the most suitable for implementing the method: the direct technique, the diagrammatic technique, the decision tree technique, and the modified Churchman/Ackoff technique.

The choice of the technique to be employed depends largely on the type of information the expert possesses. Accordingly, a procedure has been developed to select the most appropriate technique to apply.

**Introduction**

This paper presents a methodology which provides a framework for the analysis phase of complex programmes, applicable both to planning and to reviewing and monitoring the programme itself. This methodology is based on the adoption of a systematic process of encoding the expert's knowledge.

The paper is structured in four sections, the first of which introduces the basics of quantitative risk analysis. The second section deals with how to elicit the expert's knowledge and describes the techniques employed in the proposed method. The third section breaks down the method with the help of a flow chart. The fourth section provides an application of the method to a programme for a payload of a commercial satellite. The conclusions sum up the most significant aspects of the method's application.

**RISK**

Risk is described as “the likelihood of an undesirable event occurring multiplied by its consequences” (Pritchard 1997); it is an uncertain event or condition which may have a positive or adverse impact on a programme objective.

Risk analysis requires a preliminary identification phase, in which all events that might adversely affect the program are highlighted. Once the identification phase has been completed, the analysis determines the magnitude of the identified risks and thus allows:

- the determination of preventive and mitigation actions;
- the determination of contingency funds in case the undesirable events occur anyway.

**The quantitative analysis**

Let *R* be the risk variable, *p* the probability of the event occurring, and *c* the event consequences; then we may assume:

$$R = p \cdot c$$

The consequence of an undesirable event is a random variable, which cannot be determined in advance and which may be either continuous or discrete. When continuous, it is characterised by a probability density function (pdf) *f(c)* and by a cumulative distribution function (CDF) *F(c)*, related by:

$$F(c) = \int_{-\infty}^{c} f(x)\,dx$$

If it is discrete, instead, it is characterised by a set of probability values P(c_{i}) and is represented by a cumulative bar chart, which gives a step function.

Considering then that the time *t* at which the event occurs during the program is random, the probability *p* of its occurrence by the end of the program is given by:

$$p = \int_{t_0}^{t_f} f(t)\,dt$$

where *f(t)* is the pdf of the random variable “time of occurrence”, while t_{0} and t_{f} are the program starting time and the end-of-program time.

The risk probability distribution has the same shape as *f(c)* in the continuous case, but is scaled by the parameter *p*. The risk value *V_{r}* to be estimated corresponds to the expected value of this function, namely:

$$V_r = p \int c\, f(c)\,dc = p \cdot E[c]$$

On changing the time window within which the event might occur, the risk value *V_{r}* also changes.

If *n* is the number of risk events identified in the preceding phase and *V_{ri}*, with i = 1, 2, …, n, is the risk value attributed to the i-th of the *n* events, then the programme total risk *R_{P}* is given by:

$$R_P = \sum_{i=1}^{n} V_{ri}$$
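As a minimal numerical illustration of the relations above (all event probabilities and cost figures are invented), each risk value is the occurrence probability times the expected consequence, and the programme risk is their sum:

```python
def risk_value(p_occurrence, expected_consequence):
    """V_r = p * E[c] for a single undesirable event."""
    return p_occurrence * expected_consequence

def programme_risk(events):
    """R_P: sum of the n individual risk values V_ri."""
    return sum(risk_value(p, c) for p, c in events)

# Three illustrative risk events as (probability, expected cost in kEUR).
events = [(0.30, 500.0), (0.10, 1200.0), (0.05, 2000.0)]
print(programme_risk(events))  # 0.3*500 + 0.1*1200 + 0.05*2000 = 370 kEUR
```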

Consequences of an undesirable event occurring along a programme can be estimated in terms of costs by attributing a monetary value to each variable *c*. The contingency provision to be set aside for each undesirable event may be assumed to coincide with the relative risk value *V_{ri}*, while the provision set aside for the whole programme coincides with *R_{P}*.

The forms of the probability density functions depend on the event concerned and usually range from the simple “pulse” distribution, to a “uniform” distribution, or to a “triangular” distribution. More complex functions can be obtained through the “Weibull”, “Gamma” or “Beta” functions.
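All of the named distribution shapes except the degenerate “pulse” are available in the Python standard library; the sketch below (parameter values invented, in kEUR) draws one consequence sample from each:

```python
import random

random.seed(42)  # reproducible draws

# One consequence sample (kEUR) from each candidate shape.
samples = {
    "pulse":    250.0,                                 # degenerate/impulse: a constant
    "uniform":  random.uniform(100, 500),
    "triangle": random.triangular(100, 500, 250),      # low, high, mode
    "weibull":  random.weibullvariate(300, 1.5),       # scale alpha, shape beta
    "gamma":    random.gammavariate(2.0, 150.0),       # shape alpha, scale beta
    "beta":     100 + 400 * random.betavariate(2, 5),  # rescaled from [0,1] to [100,500]
}
for name, value in samples.items():
    print(f"{name:8s} {value:8.1f}")
```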

Quantifying risk as above requires the availability of information. An accurate analysis of the risk quantification techniques proposed in the literature shows that most of them require inputs which are not normally available to those who will manage the risk. These inputs may include data deriving from previous programmes, special skills required to process complex models, knowledge about unknown phenomena uniquely determined by the specific program and, above all, availability of time and money.

When this occurs, how can we perform a consistent risk quantification?

The answer is: by eliciting expert judgement.

Experts, through knowledge accumulated over years of experience and observation, are able to understand and assess phenomena, quickly providing data and information which could not be obtained otherwise. When deep analyses, data or theories are not available, expert judgement becomes the most important instrument of risk analysis.

**Expert Knowledge Elicitation**

Most people are unable to reason in terms of probability distributions, so an expert may provide useful information about such distributions only if an analyst, or “elicitor”, helps him to extract them by means of adequately structured interviews.

The elicitation procedure may be broken down into five steps:

**Step 1**: Expert Motivation

The aim of this step is for the elicitor to establish a relationship of trust with the expert. Before starting the discussion, it is useful for the elicitor to obtain information about the expert's past work experience and to carry out the interview in a place where the expert can access materials relevant to the investigation.

At the beginning, the elicitor explains the problem to be investigated and how the analysis will be conducted. A key sub-step of this phase is the identification of any motivational biases (e.g. personal involvement of the expert in the risky activity). If such biases are identified, they may be minimised by disaggregating the risk or by restructuring the way the risk is presented.

**Step 2:** Structuring of the elicitation process

The elicitation structuring phase consists of four sub-steps: (1) definition of the variable; (2) identification of a range of outcomes; (3) disaggregation of variables if required; (4) selection of an adequate unit of measurement. The purpose of the elicitor is to obtain an unambiguous specification of the quantity to be assessed. The quantity must be specified clearly enough to allow the expert to forecast future scenarios and identify the values which this quantity might assume.

**Step 3:** Conditioning the expert's judgement

In this phase the elicitor induces the expert to take into consideration all the relevant information he possesses about the uncertain variable. The expert is induced to consider both the information directly relevant to the risk concerned and the information he possesses about similar cases.

The expert might be requested to react to different scenarios proposed by the elicitor, or to forecast those scenarios in extreme situations. Or else the same problem might be proposed in different forms.

**Step 4:** Encoding the probability distribution.

The purpose of encoding the event probability distribution is to obtain a quantitative description of the subjective probability distribution which best reflects the range of values identified by the expert. The encoding phase is best started from the extreme values of the distribution; to this purpose, the first question asked of the expert concerns which values may be regarded as extreme.

Encoding techniques are manifold and fall into several categories:

#### Direct techniques

Through direct techniques, experts are asked to answer the elicitor's questions with numerical values. The expert is therefore asked to carry out a frequency analysis, though restricted to a few scenarios. The answers obtained identify points through which a diagram of the subjective probability distribution of the variable identified in step 2 can be plotted. This is obtained by interpolation, so the more answers obtained from the expert, the greater the accuracy of the distribution.
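A minimal sketch of this interpolation step (all names and figures invented): the expert supplies a few (value, cumulative probability) pairs, and intermediate CDF values are filled in by linear interpolation.

```python
def interpolate_cdf(points, x):
    """Piecewise-linear CDF through expert-elicited (value, probability) points."""
    points = sorted(points)
    if x <= points[0][0]:
        return points[0][1]
    if x >= points[-1][0]:
        return points[-1][1]
    for (x0, p0), (x1, p1) in zip(points, points[1:]):
        if x0 <= x <= x1:
            return p0 + (p1 - p0) * (x - x0) / (x1 - x0)

# Expert's answers: 5% chance below 100 kEUR, median 250, 95% below 600.
elicited = [(100, 0.05), (250, 0.50), (600, 0.95)]
print(interpolate_cdf(elicited, 425))  # halfway between 250 and 600
```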

#### Diagrammatic procedures

In a diagrammatic procedure, the expert selects the pdf diagram that best represents the behaviour of the variable concerned. Starting from a uniform probability distribution, it is possible, through subsequent steps, to model the probability distribution more and more accurately thanks to the answers the expert provides to specific questions asked by the elicitor. The diagrammatic procedure is best applied when the expert is familiar with probabilistic risk assessment theory, which helps in understanding and selecting the parameters concerned.

#### Modified Churchman/Ackoff procedure

The modified Churchman/Ackoff procedure consists of first comparing different scenarios and then attributing a score to each of them. In this way, once a priority order of the probable events has been set, relative probability values may be assigned to them, allowing the probability density function sought to be drawn. The expert must be able to distinguish among the different possible scenarios and compare them with one another.
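The scoring step of this procedure can be sketched as follows (scenario names and scores invented): after ranking, the expert assigns each scenario a relative score, and normalising the scores yields the discrete probabilities.

```python
def scores_to_probabilities(scores):
    """Normalise expert-assigned relative scores into probability values."""
    total = sum(scores.values())
    return {scenario: s / total for scenario, s in scores.items()}

# Expert ranks three delay scenarios, then scores them consistently.
ranked_scores = {"no delay": 60, "1-month delay": 30, "3-month delay": 10}
probs = scores_to_probabilities(ranked_scores)
print(probs)  # {'no delay': 0.6, '1-month delay': 0.3, '3-month delay': 0.1}
```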

#### Event tree breakdown techniques

Tree graphical representations of events are constructed from information obtained by forward reasoning. Starting from the main risk event, the tree is developed through subsequent branches identifying specific sub-events whose probabilities of occurrence lead directly to the occurrence probability of the main event.

It is a graphical technique providing a systematic description of the combinations of possible sub-events which may lead to the occurrence of the “top event” investigated. The result sought in applying this technique is the logic determining the risk scenario and, especially, the probability of occurrence of the event. Connections between the sub-events are expressed using Boolean logic, with “AND” blocks representing the intersection of sub-events and “OR” blocks representing their union. This technique requires knowledge of the causes and/or effects generated by the event. One of the preceding encoding techniques is then applied to each sub-event in order to identify its probability distribution.
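Assuming independent sub-events, the gate arithmetic can be sketched as follows (all probability figures invented): an AND gate multiplies probabilities, while an OR gate uses the complement rule.

```python
def gate_and(probabilities):
    """P(all sub-events occur), assuming independence."""
    result = 1.0
    for p in probabilities:
        result *= p
    return result

def gate_or(probabilities):
    """P(at least one sub-event occurs) = 1 - P(none occurs)."""
    none_occurs = 1.0
    for p in probabilities:
        none_occurs *= (1.0 - p)
    return 1.0 - none_occurs

# Hypothetical top event: "late antenna delivery" =
#   (supplier delay OR test failure) AND no schedule margin.
p_top = gate_and([gate_or([0.2, 0.1]), 0.5])
print(round(p_top, 4))  # (1 - 0.8*0.9) * 0.5 = 0.14
```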

**Expert's different types of knowledge**

The first aspect to consider in designing a method for the elicitation of expert knowledge is identifying the experts and evaluating their level of expertise. When the assessment parameter employed is single and objective, it is easy and effective to determine a person's expertise on the basis of a score. Unfortunately, in most cases expertise boundaries are unclear: using a single parameter to measure expertise may not be justified, so it is necessary to adopt several criteria. The best way to assess a group of experts is to attribute to each a relative weight, obtained by pairwise comparison over the same parameters through the AHP^{1} method.
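A sketch of deriving expert weights from AHP pairwise comparisons (comparison values invented, on Saaty's 1-9 scale): the row geometric mean, a standard approximation of the principal eigenvector, gives the priority vector after normalisation.

```python
import math

def ahp_weights(matrix):
    """Priority vector from a reciprocal pairwise-comparison matrix
    (row geometric-mean approximation of the principal eigenvector)."""
    geo_means = [math.prod(row) ** (1.0 / len(row)) for row in matrix]
    total = sum(geo_means)
    return [g / total for g in geo_means]

# Illustrative judgements: expert 1 is rated 3 times more expert than
# expert 2 and 5 times more than expert 3; expert 2 twice expert 3.
comparisons = [
    [1.0,     3.0,     5.0],
    [1 / 3.0, 1.0,     2.0],
    [1 / 5.0, 1 / 2.0, 1.0],
]
weights = ahp_weights(comparisons)
print([round(w, 3) for w in weights])
```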

The expert's knowledge of the undesirable events and their consequences depends highly on the type of event. For this reason, knowledge about undesirable events may be classified into four categories, as follows:

1) *Quantitative knowledge.* This is the most comprehensive type of knowledge; it allows the expert to distinguish among the different predictable scenarios and to carry out a frequency analysis of them. It can arise from the availability of a risk database.
2) *Qualitative knowledge.* In this case the expert, though familiar with the undesirable event and its consequences, is unable to retrieve any data relevant to them.
3) *Partial knowledge.* In this case the expert can distinguish the various possible scenarios but cannot determine exact likelihood values, probably because he has observed the event and its consequences only a few times. Yet, thanks to his experience, he is able to compare and assess alternative scenarios.
4) *Relative knowledge.* In this case the expert is unable to determine the likelihood of the undesirable event and its consequences, but is capable of determining the probability values of other events which are causes and/or effects of the main event. This type of knowledge refers to events which, though frequent, manifest themselves in many different modes.

Each of the elicitation techniques presented above requires a specific type of input data. These data consist of information obtained from the experts and so depend on the experts' knowledge; this implies that the technique to be adopted depends on the type of expert knowledge available:

direct techniques require quantitative knowledge; the diagrammatic procedure, qualitative knowledge; the modified Churchman/Ackoff procedure, partial knowledge; tree breakdown techniques, relative knowledge.

The method for the elicitation of expert knowledge will be based on the following assumptions:

1) each hazard is known differently (both in terms of type of knowledge and quantity of information) by the persons involved in a programme, who will provide different information on it;
2) knowledge may be broken down into four categories: quantitative, qualitative, partial, relative;
3) the applicable elicitation techniques depend essentially on the type of information which experts can retrieve, and thus on the type of expert knowledge.

**A Framed Methodology for the Elicitation of Expert Knowledge**

The development of the proposed method is represented as a flow chart in Exhibit 1. The method includes four macro-phases: A) identification of risks and of the relevant experts; B) a process of knowledge elicitation; C) explicitation of the single risk; D) explicitation of the program risk.

**A. Risk identification.** Members of the program team, including the elicitor, meet in a brainstorming session. By analysing the WBS (work breakdown structure), the processes and the check lists available, they identify potential hazards, listing them in a watch list where each hazard is matched to the names of the relevant experts. At the end of this phase the elicitor fills in a matrix M containing the *n* risks r_{i} (with i = 1, 2, …, n) on the rows and the *m* experts e_{ij} in the columns (with j = 1, 2, …, m), where *m* varies according to the risk concerned.

**B. Process of elicitation of the single risk**

**B.1.** *Preliminary information:* first, the elicitor is provided with some preliminary information about the event, as specified below:

- description of the event generating the hazard;
- the business area at risk (engineering, manufacturing, integration, management, contractual, financial);
- the activity or process originating the risk;
- personal data about the expert involved (years of experience in the area concerned, degree of relation between the position held and the hazard concerned), which will be described and scored.

**Exhibit 1.** Flow chart.

**B.2.** *Motivating:* the elicitor explains the process and how the analysis will be conducted, illustrating the probabilistic methodology and the applicable elicitation techniques.

**B.3.** *Structuring:* the quantities to be estimated are identified in terms of time of occurrence and consequences, choosing the appropriate units of measurement. In this phase a range of possible values is devised both for the time of occurrence of the event and for the relevant consequences.

**B.4.** *Conditioning:* the elicitor induces the expert to recall all his knowledge about the uncertain variables (time of occurrence and consequences), helping him think about the problem from different points of view so as to minimise any possible bias.

**B.5.** *Knowledge questionnaire:* starting from the conditioning phase, the expert is asked a set of explicit or implicit questions aimed at classifying his knowledge about the occurrence of the event and its consequences. On the basis of his answers, the appropriate techniques for estimating the variables concerned are identified.

**B.6.** *Application of the elicitation procedure to determine the event occurrence probability distribution:* depending on the type of knowledge the expert has of the undesirable event, the elicitor applies the most appropriate technique, obtaining the occurrence probability distribution of the undesirable event as a function of the progress of the programme.

**B.7.** *Verifying the elicitation procedure:* the elicitor verifies the effectiveness of the technique applied by showing the expert the distribution obtained and asking whether he agrees with the meaning of certain values, such as the mode, the mean, or scenario pairs having the same degree of likelihood. If the expert does not agree with the results obtained, the application of the technique is repeated until the resulting curve wholly reflects his opinion.

**B.8.** *Application of the elicitation procedure to determine the consequences probability distribution:* depending on the type of expert knowledge about the consequences of the undesirable event, the elicitor applies the most appropriate technique so as to obtain a consequence probability distribution.

**B.9.** *Verification of the elicitation procedure:* the elicitor verifies the efficacy of the technique applied.

At this point, the expert knowledge elicitation process has been completed. A new expert is then questioned, considering the same row of matrix M, so as to collect further data about the same risk. The process is repeated until the last expert in the row has been interviewed. All the information concerning the risk is then available, and the following phase can begin.

**C. Explicitation of the single hazard.**

**C.1.** *Assignment of weights to the experts:* the experts' different assessments are treated individually and then aggregated after having been elicited. To this purpose the best technique to apply is the linear non-Bayesian approach, which implies a weighted mean of the distributions. The weights measure each expert's expertise about the risk concerned; they are obtained through the AHP method, taking some objective features of the experts as indicators.

**C.2.** *Aggregation of the occurrence probability distributions:* the m_{j} occurrence probability distributions of the undesirable event are each multiplied by the relative weight of the expert whose opinion they represent. They are then combined in a weighted sum using a Monte Carlo simulation, obtaining a single occurrence distribution *f_{i}(t)* for the i-th risk.

**C.3.** *Aggregation of the consequence probability distributions:* the m_{j} consequence probability distributions of the undesirable event are each multiplied by the weight of the expert they stand for. They are then combined in a weighted sum, resulting in a single consequence distribution *f_{i}(c)* for the i-th risk.

**C.4.** *Risk distribution:* the distribution of the risk concerned is obtained by multiplying the probability of the event occurring by the end of the programme by the aggregated consequence probability distribution. This probability *p_{i}* is given by the integral, between the starting time *t_{0}* and the end time *t_{f}*, of the aggregate occurrence probability density *f_{i}(t)*:

$$p_i = \int_{t_0}^{t_f} f_i(t)\,dt$$

The risk probability distribution of *r_{i}* is therefore a function of the consequences:

$$f_{r_i}(c) = p_i \cdot f_i(c)$$

**C.5.** *Risk curve:* the aggregate distribution of the probabilities of occurrence of the undesirable event shows how the probability changes over time, that is, as the programme develops, starting from time *t_{0}* when the event might first occur. The risk curve therefore represents the variation of risk *r_{i}* as a function of the programme state of progress:

$$r_i(t) = \bar{c}_i \int_{t}^{t_f} f_i(\tau)\,d\tau$$

where the consequence $\bar{c}_i$ is a parameter determined by the expected value of the consequences probability distribution:

$$\bar{c}_i = \int c\, f_i(c)\,dc$$
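The weighted aggregation of steps C.2/C.3 can be sketched as a Monte Carlo mixture: each sample is drawn from an expert's elicited distribution chosen with probability equal to that expert's AHP weight. All distributions and weights below are invented triangular consequence estimates in kEUR.

```python
import random

def aggregate_mixture(expert_draws, weights, n_samples, seed=0):
    """Weight-proportional mixture of the experts' elicited distributions."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n_samples):
        draw = rng.choices(expert_draws, weights=weights)[0]
        samples.append(draw(rng))
    return samples

# Two experts gave triangular consequence estimates; AHP weights 0.6 / 0.4.
expert_draws = [
    lambda r: r.triangular(100, 400, 200),
    lambda r: r.triangular(150, 600, 300),
]
samples = aggregate_mixture(expert_draws, [0.6, 0.4], 10_000)
mean = sum(samples) / len(samples)
print(round(mean))  # expected near 0.6*233 + 0.4*350, i.e. roughly 280
```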

By repeating the procedure for the *n* risks of matrix M, it is possible to make the program total risk explicit, that is, to represent it with a single probability distribution and a single risk curve.

**D. Explicitation of the program risk**

**D.1.** *Updating of the watch list:* it is now possible to attach a quantitative value to the risks that were qualitatively prioritised when included in the watch list and in matrix M; the quantified risk values *V_{ri}* are calculated as described above. On the basis of these values, risks are reclassified in the list according to their quantitative value; in this way it is possible to establish accurately which risks require treatment actions.

**D.2.** *Distribution of the program risk:* the program total risk is obtained by combining all the *n* risk distributions through a Monte Carlo simulation.

**D.3.** *Programme risk curve:* in order to obtain a complete picture of the risk, the last step is to observe its forecast variation over time. This is done by drawing the programme risk curve, i.e. the curve aggregating all the curves of the single risks.

**An Application of this Methodology**

The case reported here refers to a proposal for a Payload for the international market. This Payload consists of two sections, operating in the Ku band and in the Ka band respectively. Both sections include a repeater and the transmission/reception antennas. In addition to the Payload, the supply includes a Telemetry Command and Ranging system (TCR), through which the satellite receives commands from the ground and sends back telemetry (information about the satellite status).

Twenty outstanding risks were identified through the initial brainstorming. For each of them, during the elicitation process, both the cost consequences and the delay with respect to the scheduled dates were considered. The Monte Carlo simulation on cost risks resulted in the distribution reported in Exhibit 2. The risk of a delay with respect to the schedule, also obtained through Monte Carlo simulation, was instead given a financial value by introducing a “delay” random variable into the provision of a penalty for delayed delivery. The outcome was the penalty risk distributed as shown in Exhibit 3.

Exhibit **2.** Programme cost risk with no penalty.

**Exhibit 3.** Risk of penalty.

Now it is possible to take into account all the aspects of the identified risks. Through a Monte Carlo simulation the program total risk shown in Exhibit 4 is obtained, whose expected value is approx. 560 K€.

A very important result provided by such an analysis is that, by assuming the risk mean value as the contingency provision for the programme, only a 61% probability is obtained of covering all the costs generated by adverse events (Exhibit 5). To obtain a 75% or an 85% risk coverage, for example, larger funds should be set aside, approximately 830 K€ and 1070 K€ respectively.
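The coverage calculation can be sketched as reading quantiles off the simulated risk distribution: the fund covering the adverse events with probability q is the q-quantile of the Monte Carlo samples. The gamma samples below are synthetic stand-ins (not the paper's actual simulation output), tuned only to a mean near 560 kEUR.

```python
import random

def quantile(samples, q):
    """Empirical q-quantile of a list of Monte Carlo samples."""
    ordered = sorted(samples)
    index = min(int(q * len(ordered)), len(ordered) - 1)
    return ordered[index]

rng = random.Random(1)
# Synthetic programme-risk samples (kEUR), mean = 2.0 * 280 = 560.
risk = [rng.gammavariate(2.0, 280.0) for _ in range(20_000)]

mean_fund = sum(risk) / len(risk)
coverage_at_mean = sum(s <= mean_fund for s in risk) / len(risk)
print(round(mean_fund), round(coverage_at_mean, 2), round(quantile(risk, 0.75)))
```

For a right-skewed risk distribution such as this, the mean covers well under 75% of outcomes, which is why percentile-based provisions exceed the expected value.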

**Exhibit 4**. Programme risk.

**Exhibit 5**. The probability of obtaining a programme risk lower than the mean value is 60.85%.

As concerns the financial risk assessment, it is possible to plot its trend over the whole life cycle of the programme. This trend of risk over time is shown as the red curve representing the program risk in Exhibit 6. The program risk curve aggregates four curves representing four separate risk subsets, as follows: 1) risks of penalties for the repeater; 2) risks of penalties for the antennas; 3) risks for the repeater; 4) risks for the antennas.

At T_{0} all the programme risks converge, as their potential occurrence is very high at that time. After this time, the risk depending on the availability of the equipment required at T_{0} no longer exists, and the programme risk curve decreases accordingly.

In the first constant phase of the curve, all the remaining risk potentialities aggregate; at its end, the risks connected to the *engineering phase*, for both the antennas and the repeater, may disappear or materialise within a three-month period. After this time, the second constant phase, Manufacturing and Test, starts. In this phase the curves of the risks to the repeater remain above those affecting the antennas. Risks in this phase have a higher probability of occurrence, mainly due to delays generating penalties. For this reason, at the end of this phase, within six months at the latest, they disappear completely, sharply reducing the programme total risk.

**Exhibit 6.** Elements of the programme risk curve.

At this time both the Payload consignments will have been delivered to the Customer and the only risk left will be relative to maintenance.

**Conclusions**

The application of the methodology suggested here has proved the approach effective.

The method, in fact, with very short implementation times, succeeds in eliciting from the expert relevant data and information about the real trend of the risk which the expert was not aware of possessing.

Each interview takes about twenty minutes to perform. This short duration also demonstrates the flexibility of the method and its capacity to adapt to any risk environment.

In order to guarantee that the final results truly represent the trend of the risk, the method proposes to elicit the knowledge of several experts. In this way the assumptions made about the trend of the risk may be confirmed.

**References**

Abrahamsson, M. (2002). *Uncertainty in Quantitative Risk Analysis – Characterisation and Methods of Treatment.* Retrieved on November 09, 2004 from: www.brand.lth.se/bibl

Ayyub, B. M. (2000 December). *Methods for expert-opinion elicitation of probabilities and consequences for corps facilities.* Retrieved on May 02, 2004 from www.iwr.usace.army.mil/iwr/pdf

DoD (2001 February). *Risk management guide for DOD acquisition.* Retrieved on November 29, 2004 from http://ax.losangeles.af.mil/axl

Federal Aviation Administration (December 20, 1996). *Acquisition and program risk management guidance.* Retrieved on October 25, 2004 from: http://nasdocs.faa.gov/nasiHTML/risk-mgmt

Frey, H. C. (1992 September). *Quantitative analysis of uncertainty and variability in environmental policy making.* Department of Transportation. Retrieved on November 09, 2004 from www4.ncsu.edu/~frey

Frey, H. C. (1998). *Introduction to Uncertainty Analysis.* Retrieved on November 03, 2004 from www4.ncsu.edu/~frey/freytech.html

Garthwaite, P. H., Kadane, J. B. & O'Hagan, A. *Elicitation.* Retrieved on May 23, 2004 from www.jouy.inra.fr/unites/miaj/public/matrisq/Contacts

Loveridge, D. (2002 June). *Experts and Foresight: Review and experience.* Retrieved on May 23, 2004 from www.mbs.ac.uk/research/centres/engineeringpolicy/publications/documents

Kerzner, H. (1998). *Project management: a systems approach to planning, scheduling, and controlling.* New York: John Wiley & Sons, Inc.

Pritchard, C. L. (1997). *Risk management: concepts and guidance.* USA: ESI International.

USACE (1992 March). *Guidelines for risk and uncertainty analysis in water resources planning (Volume I).* Retrieved on July 13, 2004 from: www.iwr.usace.army.mil/iwr/pdf

Yoe, C. (2000 December). *Risk Analysis Framework for Cost Estimation.* Retrieved on July 13, 2004 from http://www.iwr.usace.army.mil/iwr/pdf

_______________________________

^{1} AHP (Analytic Hierarchy Process) is a method supporting multi-criteria decisions (MCDA, Multi-Criteria Decision Aid) developed by Thomas Saaty in the late 1970s (Saaty 1977 and 1980). The method, in general, allows the identification of priorities among different alternatives (in this case, the experts) by pairwise comparisons.

© 2006, Gerosa, Di Placido, Lo Storto

Originally published as a part of 2006 PMI Global Congress Proceedings – Madrid, Spain
