Predicting the Impact of Multiple Risks on Project Performance
A Scenario-Based Approach
Victor A. Bañuls, Department of Management and Marketing, Universidad Pablo de Olavide, Seville, Spain
Cristina López, Department of Management and Marketing, Universidad Pablo de Olavide, Seville, Spain
Murray Turoff, New Jersey Institute of Technology, Newark, New Jersey, USA
Fernando Tejedor, MSIG Smart Management, Seville, Spain
This article suggests a scenario-based approach to properly managing risks during the lifetime of a project. Our proposal aims at giving managers a structured process for predicting the impact of the occurrence of multiple risks that can affect project performance. This is achieved by combining Cross-Impact Analysis (CIA) and Interpretive Structural Modeling (ISM), which improves the predictive capacity of existing risk analysis techniques. In order to validate the risk predictions, we compare them with a sample of real projects carried out in an engineering company. The findings show a high explanatory capacity to forecast project risk influences.
KEYWORDS: scenarios; Cross-Impact Analysis (CIA); Interpretive Structural Modeling (ISM); risk events; project management
Each project developed in a business environment faces numerous difficulties and uncertainty. A report indicates that only 56% of projects meet their targets and business intent; the results are far better in high-performing firms than in low-performing ones (Project Management Institute, 2014). Incorrect assessments and misjudgments may give rise to risks, which can provoke significant damage if practitioners do not proactively monitor and deal with them. In fact, proper risk management helps practitioners to be aware of the real status of their project, its problematic aspects, and the potential causes of project failure (Iversen, Mathiassen, & Nielsen, 2000), thus enabling them to resolve project threats more efficiently.
To provide support to project managers in this endeavor, the scientific literature has focused its efforts on identifying, assessing, and resolving project risks for many years (Ahmed, Kayis, & Amornsawadwatana, 2007; Bannerman, 2008). Thus very diverse approaches have until now been used.
Empirical studies help managers to better understand how the different risk constructs influence each measure of project performance. Some publications have undertaken a multivariate analysis of variance to detect recurring patterns in risk dimensions across low-, medium-, and high-risk projects (Wallace, Keil, & Rai, 2004b), as well as levels of project performance (Han & Huang, 2007). In the same vein, one article explores how these may be affected by risk management strategies using a Structural Equation Model (Na, Simpson, Li, Singh, & Kim, 2007). Another investigation applies a cluster analysis to get a better understanding of how project duration can modify the level of risk exposure in each risk dimension (Huang & Han, 2008).
Multi-Criteria Decision Making has also been shown in the literature to be appropriate for calculating risk exposure. To this end, the Analytical Hierarchy Process is by far the most used technique to prioritize project risks (Luthra, Mangla, Xu, & Diabat, 2015; Salmeron & Lopez, 2010; Zayed, Amer, & Pan, 2008) and has even been combined with other methods such as TOPSIS (Taylan, Bafail, Abdulaal, & Kabli, 2014) to gain robustness. These studies provide a risk ranking that can help practitioners develop response actions in a more effective way, although they do not consider the existing interdependencies between project risks.
There is wide agreement in the literature that project risks are closely related (Kwan & Leung, 2011). Bearing in mind that project risk behavior can be represented as a complex network of interactions, Fang and Marle (2013) developed a matrix-based method to re-evaluate project risk propagation effects based on mathematical calculations. In the same vein, Marle (2014) provides a structured management process to identify, assess, and analyze project risks and the interactions between them, which will help project managers make response actions more accurately. Indeed, projects have become increasingly complex means of achieving a unique objective, which leads to the existence of uncertain events strongly connected to each other. This fact encourages the appearance of different dependency models in order to analyze risk influences on project outcomes.
Qualitative models specifically represent a simplified view of reality through a set of variables and their causal relations (Brandenburg, Govindan, Sarkis, & Seuring, 2014). Interpretive Structural Modeling (ISM) provides a systematic process to identify direct and indirect connections between project risks. ISM is a process that transforms ill-defined mental models of systems into graphical and structured models. In a project risk management context, this is a technique used to develop a multilevel hierarchy model among a set of identified risk factors based on its driving and dependence power (Aloini, Dulmin, & Mininno, 2012b; Samantra, Datta, Sankar, Bikash, & Ranjan, 2016); however, ISM is not capable of forecasting project risk effects on project performance by itself.
Accordingly, this article proposes a scenario-based approach combining ISM with the Cross-Impact Analysis (CIA) method. CIA is the general name given to a family of techniques designed to evaluate changes in the probability of occurrence of a given set of events as a consequence of the actual occurrence of one of them. In this case, we will merge CIA with ISM to account for the interactions between a set of risk factors and forecast their probability of occurrence. The combination of both methodologies will permit the creation of a complete risk network based on subjective project manager evaluations as well as on historical data. In this way, the model may contain a large number of events and causal relationships between them.
Furthermore, CIA-ISM will enable the foreseeing of the project's outcomes in accordance with the probability of occurrence of source events (before the project begins), dynamic events (during the project's development), and outcome events (at the end of the project). This enhances the capacity to predict risk scenarios by not constraining the propagation of the model to an established pattern; nor is the model's evolution limited to successive times, since a directed graph with cycles represents the risk network. These issues reinforce the reliability of the findings of CIA-ISM with regard to other modeling techniques applied to project risk analysis. Indeed, CIA-ISM has been applied with very good results in other areas, such as emergency management (Bañuls, Turoff, & Hiltz, 2013), critical infrastructure interaction analysis (Turoff, Bañuls, Plotnick, & Hiltz, 2014), and operational risk analysis (Ramírez de la Huerga, Bañuls, & Turoff, 2015). This is the first time, however, that it has been applied to project risk analysis. In order to verify the validity of CIA-ISM in project risk management, we compare the results attained from 22 real cases.
Based on the results obtained with CIA–ISM, practitioners will be able to apply measures aimed at successfully finishing their projects. Recent work defines the most suitable strategy for dealing with each risk (López & Salmeron, 2012; Zhang & Fan, 2014). Nonetheless, there is scant research that provides the mechanics to proactively manage unforeseen events throughout the entire risk process. CIA–ISM enables project managers to manage risk events throughout the entire life cycle of their enterprise projects in compliance with the specifications established in ISO 31000:2009 (International Standardization Organization, 2009).
According to A Guide to the Project Management Body of Knowledge (PMBOK ® Guide) – Fifth Edition (Project Management Institute, 2013), a project is defined as “a temporary endeavor undertaken to create a unique product, service or result.” Multiple needs, problems, and opportunities, such as developing a new product or service, improving business processes, changing organizational structures, or incorporating emerging technologies and systems, act as triggers for initiating additional projects.
Over the past couple of decades, many standards, tools, and methodologies have been proposed to guide the activities of project managers (Raz & Michael, 2001; Samad & Naveed, 2006). The International Organization for Standardization (ISO) presented the ISO 31000:2009 (International Standardization Organization, 2009), which provides principles and guidelines to support firms in the whole risk management process. Figure 1 depicts the main steps proposed in this standard.
We can observe that the risk management process is structured around its support and core elements. The first support element aims to ensure that all parties involved pursue the same target (SE1) and control all critical aspects of the risk management process (SE2). These remain active throughout the entire risk management process to properly guide core procedures development.
The core procedures represent the backbone of the risk management process (Purdy, 2010). This begins through defining the scope, limits, and goals pursued in the project, as well as exploring those external and internal factors that make it difficult to achieve them (CE1). Having established the context, project managers then assess the risks their project faces. According to the ISO 31000:2009 (International Standardization Organization, 2009), risk assessment consists of identifying (CE2), analyzing (CE3), and evaluating risks (CE4).
Figure 1: The risk management process from ISO 31000:2009.
Project managers first identify which risks might arise during project development and how, when, and why (CE2). The ISO 31000:2009 (International Standardization Organization, 2009) identifies in its Annex B some techniques—such as brainstorming, structured or semi-structured interviews, or the Delphi method—as very appropriate for carrying out project risk identification. Brainstorming is a group creativity technique by which efforts are made to find a conclusion for a specific problem by gathering a list of ideas spontaneously contributed by its members. Structured or semi-structured interviews are conversations in which specific questions are asked in a specified order, oriented toward identifying risks. The Delphi method is also a structured communication technique, originally developed as a systematic, interactive forecasting method that relies on a panel of experts. The experts answer questionnaires in two or more rounds to develop models of complex situations (Linstone & Turoff, 1975).
Earlier studies provide risk identification checklists and taxonomies to support this essential step in successful project risk management (Carr, Konda, Monarch, Ulrich, & Walker, 1993; Hillson & Simon, 2007; Hwang, Zhao, See, & Zhong, 2015; Schmidt, Lyytinen, Keil, & Cule, 2001; Zhou, Vasconcelos, & Nunes, 2008). These highlight the different types of project risks that may arise during the project's lifetime. For example, Wallace, Keil, and Rai (2004a) identified the following risks: organizational environment risks, user risks, requirements risks, project complexity risks, planning and control risks, and team risks. Empirical research shows that these types of risks impact project performance measures with varying levels of intensity, with requirement risks being the most critical type (Han & Huang, 2007).
It is, therefore, necessary to conduct a comprehensive analysis of the risks identified, bearing in mind the causes and consequences of each one, as well as the interdependencies between them (CE4). This permits determination of the risk level of each element based on its probability of occurrence and the impact estimated (Boehm, 1991).
During the evaluation stage, project managers compare the level of risk estimated in the analysis stage with the level of tolerance established in the contextualization stage. In those cases where the threshold is surpassed, they should consider the possible treatment steps (CE5) and prioritize them.
In line with this process, numerous studies have applied multicriteria decision-making methods for statically evaluating project risks in different domains (Rodríguez, Ortega, & Concepción, 2016; Taylan et al., 2014). Unfortunately, they are not capable of representing interactions between project risks and predicting their evolution over time. With this shortcoming in mind, other studies have developed instruments based on diverse modeling techniques.
System dynamics has been applied to forecast the impacts of incentive factor allocation on project risks (Yi & Xiao, 2008). This tool represents the behavior of the real problem by means of a mathematical expression, which is extremely difficult to define in the project management domain because there are no widely accepted risk measures. This limitation would reduce the number of events included in the model and, as a result, the possibility of determining which events actually affect project development.
By contrast, a Bayesian belief network enables the representation of expert knowledge under conditions of uncertainty. This technique has been widely used to evaluate project risks in different fields (Fan & Yu, 2004; Yet et al., 2016). Its structure, however, takes the form of a directed acyclic graph (Lee, Park, & Shin, 2009), which constrains model evolution at successive times. Other studies evaluate project risks using neural networks (Chen & Hartman, 2000; Neumann, 2002), although the dynamic behavior of the model is also constrained.
Fuzzy cognitive maps incorporate certain characteristics from neural networks to forecast the impact of risk on project performance (Lopez & Salmeron, 2014) and provide mechanics to study the evolution of risk scenarios at successive times. An important strength of fuzzy cognitive maps is that, unlike neural networks, their propagation does not follow an established pattern. However, they are not capable of quantifying the probability of occurrence of project risks, which is required to determine the critical level of each one (Boehm, 1991). The Petri net technique, in turn, permits calculating the likelihood associated with each event but not its impact on project performance.
In the same vein, recent work presents an optimization model for selecting a risk response strategy considering risk interdependence (Zhang, 2016). Nonetheless, practitioners frequently have trouble understanding the use of these tools (López & Salmeron, 2012). Moreover, they do not support the entire risk management process during project execution.
With these issues in mind, a study proposes an integrated decision support system to model, analyze, resolve, and control project risks (Fang & Marle, 2012); yet it does not provide insight into the effects of risks on business performance, which is critical to improving the success of a project (Wang, Lin, & Huang, 2010). There is, therefore, an apparent lack of specific methods for successfully supporting the whole risk management process. This study, hence, aims to close the research gap by proposing a new mechanism.
Risks and project performance are complex, closely related concepts. Moreover, their representation is complicated, unstructured, and not readily quantifiable; hence, the modeling technique selected must fulfill these specific requirements:
Requirement 1. The modeling technique should be capable of representing all possible connections between project risks. These are closely interrelated (Büyüközkan & Ruan, 2010; Fang, Marle, & Xie, 2016; Kwan & Leung, 2011; Zhang & Fan, 2014), without necessarily sticking to a linear pattern.
Requirement 2. The modeling technique should not ignore what is uncertain. Indeed, projects are undertaken in conditions of uncertainty (Pugh & Soden, 1986; Costa, Barros, & Travassos, 2007), the degree of which may obviously vary in accordance with the singularity of each one. The tool should be of great value when considering this degree of uncertainty.
Requirement 3. The modeling technique should be capable of representing real project events under conditions of uncertainty through a causal, directed graph with cycles. Otherwise, a directed acyclic graph would limit the model's evolution at successive times (Baldi & Rosen-Zvi, 2005) and its forecasting capability would therefore worsen.
Requirement 4. The dynamic behavior or propagation of the risk network should not follow an established pattern. In fact, the conditioned propagations would limit the feedback dynamic between closely related events (Wu, Kefan, Gang, & Ping, 2010).
Requirement 5. The modeling technique should allow for operating with scarce information, which is not normally available at the time of planning and/or executing the project (Dey, 2001). There are no widely accepted risk measures in the project domain either.
Requirement 6. The modeling technique should be capable of quantifying both the probability of occurrence and the severity of each project risk. These indicators enable the determination of each event's risk of exposure or critical level, which is required to better treat them. Notwithstanding, dependencies between events often hinder estimations of probability and risk impacts when these indicators are not directly measurable (Aloini, Dulmin, & Mininno, 2012a).
Table 1 depicts a comparison between modeling tools in accordance with the above-mentioned requirements. The symbol ♦ indicates a specific technique meeting a requirement (Rq). As shown in Table 1, CIA-ISM (Bañuls & Turoff, 2011) addresses the limitations of current methods regarding modeling complexity in project risk management. This technique has emerged as a useful tool for generating and analyzing scenarios using CIA. This approach allows project managers to work with large sets of related risks without requiring great computational infrastructures, because it is a graphical representation of complex systems following a simplified structured process. Moreover, it enables a set of plausible snapshots of the future, along with an analysis of the interactions between critical events in the time frame specified.
|Modeling Technique||Rq1||Rq2||Rq3||Rq4||Rq5||Rq6|
|Fuzzy cognitive maps||♦||♦||♦||♦||♦|| |
Table 1: Comparison of the modeling techniques used to manage project risks.
The scenario generation models can be integrated with other predictive models designed to estimate the evolution of particular risks (such as conflict and non-cooperation between project team members), and provide a broader view of effects that could occur in critical situations (Bañuls et al., 2013). In fact, CIA-ISM has been successfully applied in fields as different as emergency management (Lage, Bañuls, & Borges, 2013) and operational risk management in industrial environments (Ramírez de la Huerga et al., 2015).
CIA is a methodology developed to help determine how relationships between events may impact the resulting events and reduce uncertainty in the future. Due to its ability to analyze complex contexts with various interactions, CIA is one of the most commonly used techniques for generating and analyzing scenarios, both historically (Turoff, 1971) and currently (Bañuls & Turoff, 2011).
The analytical approach proposed by Turoff (1971) was developed specifically for restructuring cross-impact formalisms in a manner suitable for use at an interactive computer terminal. This requires that users be able to modify or iterate their estimates until they consider that the conclusion inferred from them is consistent with their views. Furthermore, this method is based on the idea that an event may be unique in that it can only happen once (e.g., the development of a particular discovery or the outbreak of a specific war).
Following Turoff (1971), for this type of event there is usually no statistically significant history of occurrence that would allow the probability of occurrence to be inferred. The cross-impact problem, thus, is to infer causal relationships from the relationships among the different world views. This is established by perturbing the participant's initial view with assumed certain knowledge, such as the outcomes of individual events; that is, changes in a subject's probability estimates reveal the causality the subject implicitly assumes. Analytically, by asking subjects about the probabilities, the correlation coefficients (Cij) can be calculated using a variation of the Fermi-Dirac distribution function (Equation 1):

Pi = 1 / (1 + exp(−(Gi + Σk≠i Cik Pk)))

where:
- Pi represents the probability of occurrence of the i-th event.
- The coefficient Cik represents the impact of the k-th event on the i-th event.
- Gi (the gamma factor) aggregates the effect of all events not explicitly specified in the model into a single term.
- A positive Cik means that event k enhances the occurrence of event i; a negative Cik detracts from it.
- Cik = 0 means event k has no impact on event i; its range runs from minus infinity to plus infinity.
- The product Cik Pk represents the actual impact of event k upon event i.
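As a minimal sketch of how these terms combine, the modified Fermi-Dirac form Pi = 1/(1 + exp(−(Gi + Σk≠i Cik Pk))) can be evaluated directly. The function name and nested-list layout below are illustrative assumptions, not the authors' implementation:

```python
import math

def event_probability(i, C, P, G):
    """Probability of event i under the modified Fermi-Dirac form:
    P_i = 1 / (1 + exp(-(G_i + sum over k != i of C[i][k] * P[k]))).
    C[i][k] is the impact of event k on event i, P holds current
    probabilities in [0, 1], and G[i] is the gamma factor."""
    s = G[i] + sum(C[i][k] * P[k] for k in range(len(P)) if k != i)
    return 1.0 / (1.0 + math.exp(-s))
```

With all impacts and the gamma factor at zero, the form yields a probability of 0.5; a positive Cik raises the probability of event i when event k is likely, matching the sign conventions listed above.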
Given the linear influence factors, we can show estimators of the consistent relative relationships between any event and those that influence it by plotting these relationships on a linear scale. We can then use a different modeling method, such as ISM, to analyze the complexity of the resulting weighted influence graph (Warfield, 1973). This extension allows individuals to receive a graphical visualization of their judgments and improves their ability to refine them.
Interpretive Structural Modeling
ISM is a methodology used for dealing with complex issues such as societal systems (Warfield, 1976). The starting point of the ISM methodology is a system made up of a set S of n elements (Equation 2):

S = {s1, s2, …, sn}
The relationship between the elements in set S is a binary relation, which can be represented by an n×n binary matrix A, called the adjacency matrix of the relation. An entry of 1 in A indicates that one element can reach another through a path of length 1. Since every node can also reach itself, adding the identity matrix I to A yields a matrix whose entries indicate reachability through paths of length 0 or 1 (Equation 3):

N = A + I
Matrix N is known as the element connection matrix. From this matrix we can obtain the reachability matrix (M), a square, transitive, reflexive, binary matrix that identifies which elements of the model are antecedent to each one. Suppose si and sj are elements of the set S. If M(si, sj) = 1, there is a path from node si to node sj; if M(si, sj) = 0, it is impossible to go from node si to node sj. Every element in set S can thus be treated as a node and analyzed through graph theory.
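Under these definitions, deriving M from A amounts to a generic transitive closure: start from N = A + I (every node reaches itself), then keep composing paths until the matrix stabilizes. The sketch below is an illustrative implementation, not code from the original study:

```python
def reachability(A):
    """Reachability matrix M from adjacency matrix A: initialize
    M = A + I, then repeatedly add any path i -> k -> j found through
    an intermediate node k until no new entries appear."""
    n = len(A)
    # N = A + I: the diagonal marks paths of length 0
    M = [[1 if i == j else A[i][j] for j in range(n)] for i in range(n)]
    changed = True
    while changed:
        changed = False
        for i in range(n):
            for j in range(n):
                if not M[i][j] and any(M[i][k] and M[k][j] for k in range(n)):
                    M[i][j] = 1
                    changed = True
    return M
```

For a simple chain s1 → s2 → s3, the closure adds the indirect path s1 → s3, which is exactly the kind of indirect risk connection ISM is meant to surface.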
The transformation from the CIA results into the ISM model is a perturbation process in which one takes the largest absolute-value impacts between any two events, converts them to 1, and sets the rest to 0 to form a standard binary network of 0/1 relationships (Bañuls et al., 2013). One examines the results and keeps converting more of the Cij values to 1 to see how the graph gradually changes and to determine a good point to use as the final conceptual model of the overall scenario. In this way, factors are not randomly incorporated, but rather incorporated on the basis of their |Cij| values: the M(si, sj) entries are ranked in descending order of their |Cij| values.
The order of inclusion is found by arranging the |Cij| values in decreasing order and then reviewing the resulting digraph for all Cij values greater than or equal to a given absolute value. The process finishes when the model reaches the limit of the forecasted scenario (Bañuls & Turoff, 2011); this is the |Cij| value at which an event si has, at the same time and as predecessor or successor, both the occurrence and the nonoccurrence of an event sj. The results provided by the ISM extension thus enable managers to receive a graphical visualization of the risk interaction maps in their projects.
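The binarization step described above (keeping only the event pairs whose impact meets the current |Cij| cutoff) can be sketched as follows; the function name and matrix layout are our own assumptions:

```python
def binarize(C, threshold):
    """Convert a cross-impact matrix C into a binary adjacency matrix:
    an edge (1) is kept where |Cij| meets the cutoff, the diagonal and
    weaker impacts become 0. Lowering the threshold admits more edges,
    which is how the digraph is grown impact by impact."""
    n = len(C)
    return [[1 if i != j and abs(C[i][j]) >= threshold else 0
             for j in range(n)] for i in range(n)]
```

The resulting 0/1 matrix is exactly the adjacency matrix A that the ISM reachability analysis takes as input, so the two methods chain naturally.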
Validating the Predictability of the CIA-ISM Method in Project Risk Management
This section describes an illustrative application of CIA-ISM for supporting project risk management. The proposed method was implemented in a European industrial engineering company. This company is an SME focused on high-value projects for oil, chemical, and critical infrastructure industries. These projects are delivered exclusively by senior consultants, with a volume of approximately 25 projects per year. The survival of this company, therefore, relies on its capability to manage complex projects.
The task for validating the proposed approach was to design a structural model to analyze the key risk factors for the proper performance of the consulting projects, complying with the indications of the ISO 31000:2009 standard for risk management (International Standardization Organization, 2009). With this in mind, the working model must be able to support decision-making processes (before and during real projects) by generating different scenarios. It also had to be dynamic, in the sense of being able to adjust the probabilities of the outcome events as soon as new information about the source/dynamic events of a project becomes available.
In accordance with these requirements and the modeling techniques comparison described in the Methodological Background section, the project team decided to apply the CIA-ISM method. The application of CIA-ISM in this case study consisted of five different phases or steps, which are described in detail as follows:
STEP 1: Definition of Event Set and Initial Probabilities
The first step in this study consisted of defining the working model and its components; that is, the initial event set and the initial probabilities. The event set and the initial probabilities were developed by both the CEO and the CIO of the company (by means of structured interviews). The data are based on the company's historical records and the considerable experience of both (more than 20 years managing large engineering projects). Due to the specific area of activity of the company (organizational engineering), all projects share a common set of core risks; thus a set of 13 events was defined: four source events, five dynamic events, and four outcome events. In other, more complex contexts, working with this set of risks might not be fully representative; in this specific case, however, it describes the most representative organizational risks of the company's activity, as demonstrated in the validation of the method (see Step 5).
At this point it is important to note that they had total freedom to define the event set and initial probabilities. With this structure, they aimed at assessing the impact of different risks in the outcome of a project, measuring the outcome of a project in terms of image, customer satisfaction, delivery, and benefits. We present each of these items in the next section.
Table 2 depicts the source events identified and their initial probability. Following CIA-ISM terminology, source events are events that might happen before the triggering event of the scenarios, which in our case is the beginning of the project.
In this case, the initial probability represents a ratio of occurrence of the event based on the company's historical records. Experts also indicated the probability of occurrence of each element. For this purpose, they can assign a score between 0 and 100, where 100 means that the occurrence of the event is extremely probable and 0 extremely improbable. For instance, a value of 70 in “Project Profitability” (see Table 4) means that 70% of the projects of the company are profitable. Each source item represents a risk that might become a real problem before starting the project.
|Event Number||Label||Description||Initial Probability|
|1||Suitable profile||The profile of the project team (project manager and consultants) selected is adequate in terms of experience and abilities.||80|
|2||Requirements identification||Customer needs have been identified and analyzed in detail before starting the project.||70|
|3||Appropriate approach||The project's goals have been clearly established and a suitable methodological approach has been proposed for fulfilling them.||75|
|4||Internal relevance||The project is relevant to the client's decision makers. It has internal support from someone relevant in the company.||70|
Table 2: Source events: Events that might happen before starting the project.
|Event Number||Label||Description||Initial Probability|
|5||Customer collaboration||The client responds to requests from project managers at all times.||60|
|6||Adequate planning||Planning fits with the project's progress on schedule with minimal deviations.||60|
|7||Proper project management||The project is managed properly both in managerial and technical terms.||75|
|8||Proper execution by the consultants||The project is correctly executed by the consultants.||70|
|9||Changes in customer requirements||The client incorporates new requirements into the final product/service during the project, which are significant enough to affect it.||40|
Table 3: Dynamic events: Events that might occur during the project's lifetime.
Table 3 shows each dynamic event identified and its initial probability. A dynamic event represents all relevant risks that might arise during the project's lifetime. These elements only have an impact on outcome elements. Finally, Table 4 indicates the outcome events and their initial probabilities; these represent key performance indicators of projects. A previous investigation likewise includes project performance parameters as risk events in its model (Lee et al., 2009). The outcome events might be affected by the behavior of source and dynamic events.
STEP 2: Cross-Impact Analysis
The second step of this study was to apply the CIA method to the event set in order to calculate the impact values for the impact matrix. Note that some values are zero when there is no influence relationship between the two events. Once the set of events in the model had been established, we built the cross-impact matrix through the inputs elicitation process. Following the CIA methodology—by asking subjects about the initial and the conditional probabilities—the correlation coefficients (Cij) can be calculated using (1). Based on the resulting estimations, we obtained the cross-impact matrix represented in Table 5.
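A simplified way to back out a single coefficient Cik from the elicited probabilities is to compare logits, assuming the conditional estimate perturbs only the Cik·Pk term of the linear sum. The full CIA elicitation solves for all coefficients jointly, so this is only a first-order sketch with names of our own choosing:

```python
import math

def logit(p):
    """Log-odds of a probability in (0, 1)."""
    return math.log(p / (1.0 - p))

def cross_impact(p_i, r_ik, p_k):
    """First-order estimate of C_ik from the initial probability p_i of
    event i, the conditional probability r_ik = P(i | k occurs), and the
    initial probability p_k of event k. Conditioning on k moves its term
    from C_ik * p_k to C_ik * 1, so the logit shifts by C_ik * (1 - p_k),
    assuming all other terms of the linear sum stay fixed."""
    return (logit(r_ik) - logit(p_i)) / (1.0 - p_k)
```

The sign behaves as the method requires: a conditional estimate above the initial probability yields a positive impact, one below it a negative impact, and an unchanged estimate yields Cik = 0.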
|Event Number||Label||Description||Initial Probability|
|10||Delivery on time||The product/service is delivered on time.||60|
|11||Customer satisfaction||The customer is satisfied with the product/service.||75|
|12||Project profitability||The project has been profitable in the margin initially planned.||70|
|13||Professional image||The company's corporate image has benefited in professional terms due to the results of the project.||75|
Table 4: Outcome events: Events that might occur after the project has been completed.
The rows (i) and columns (j) of the matrix are the events; the cells are the influence factors Cij, the diagonal being the overall probabilities (OPV). Note that the cross-impact matrix is associated with the G vector, which represents the influence of external events (not explicitly specified in the model) on each i-th event. The G vector represents the weight of data uncertainty on the scenarios, which in this case amounts to 35.06% of the model. This G vector is included in the simulation as an external event (the impact of the environment on final outputs).
To read the Cij components of this matrix, we proceed in the following way: given that requirements identification (source event 2) occurs, this generates an impact of +2.56 on customer collaboration (dynamic event 5). In this way, we can identify, categorize, and sort the greatest impacts and determine which of them are globally more important. In order to estimate the impact of source events (arising before the beginning of the project) and dynamic events (emerging during its development), we apply the following linear sums of Cij over the cross-impact matrix (Equation 4 and Equation 5):
Table 6 summarizes the influences of the source and dynamic events. Customer collaboration (event 5) is the dynamic event most impacted by source events, with a score of 9.08 (in this example, we add the absolute Cij values of all source events—columns 1 to 4—in row 5). That is, 28.7% of the total impact of source events influences customer collaboration.
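The row-wise sums behind Table 6 can be sketched as below; the matrix entries are hypothetical, chosen only so that the row for event 5 reproduces the 9.08 total mentioned above.

```python
# Impacts received from the four source events (columns 1-4), per dynamic event.
# The row for event 5 is chosen to reproduce the 9.08 total; all values are
# illustrative, not the article's matrix.
impacts_from_sources = {
    5: [2.10, 1.95, 2.47, 2.56],   # customer collaboration
    6: [1.20, 0.80, 1.50, 1.10],   # another dynamic event (hypothetical)
}

# Linear sum of |Cij| per row, then each event's share of the total impact.
influence = {e: sum(abs(c) for c in row) for e, row in impacts_from_sources.items()}
total = sum(influence.values())
share = {e: round(100.0 * v / total, 1) for e, v in influence.items()}
```

With the article's full matrix, the same computation would yield the 28.7% share of source-event impact reported for customer collaboration.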
The findings also reveal that customer satisfaction (event 11) is the outcome event that receives the strongest impact from source events. Hence, we can conclude that if managers seek to avoid customer problems in their projects, they should proactively manage events arising before the beginning of the project. On the other hand, the findings show that dynamic events have a greater influence on delivery on time (event 10) than on the rest of the outcome events; yet it can also be observed that their influence on the outcome events is fairly similar.
Table 5: Cross-impact matrix and G vector.
STEP 3: Graphical Analysis
By applying the CIA approach described in the Methodological Background section, we can represent the forecasted scenario by means of a digraph. To do so, we applied the ISM method described in the Interpretive Structural Modeling section. Figure 2 represents the interrelations between the risks, with |Cij| ≥ 2.03 as the limit of this forecasted scenario (Bañuls & Turoff, 2011). This means that the digraph only represents the pairs of risks with impacts higher than or equal to 2.03 in absolute weight. The digraph includes 84.06% of the linear sums of |Cij|, which is a good indicator of its level of representativeness. In addition, the digraph takes percentile 40 as the cutting point of the |Cij| distribution (Table 7); this indicates that the output of the CIA-ISM includes the highest 60% of the values of the distribution. The project team also considered this digraph to be the most representative in accordance with their mental model of the problem.
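A minimal sketch of this thresholding step, with hypothetical arc weights; the |Cij| ≥ 2.03 cutoff and the coverage computation follow the procedure described above:

```python
# Keep only arcs whose absolute impact reaches the percentile-40 cutoff, then
# measure what share of total |Cij| the retained digraph represents.
THRESHOLD = 2.03
C = {  # (i, j) -> Cij; weights are illustrative, not the article's matrix
    (5, 4): 2.56, (7, 4): 3.10, (7, 9): -2.20,
    (5, 1): 1.10, (6, 2): 0.90,
}
arcs = {pair: c for pair, c in C.items() if abs(c) >= THRESHOLD}
coverage = 100.0 * sum(abs(c) for c in arcs.values()) / sum(abs(c) for c in C.values())
```

The retained arcs define the digraph's adjacency; a coverage close to the article's 84.06% would indicate a similarly representative boundary.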
The graph shows the sequence and potential cascading effects between risks. It shows Node 4 (internal relevance) to be a triggering event in the risk scenario; hence, in order to avoid failures in the project's execution and unsuccessful outcomes, practitioners should carefully mitigate its potential consequences.
In addition, we can observe that all the events have positive impacts on each other, except for event 9 (changes in customer requirements). This highlights that a change in customer requirements is the only event that directly detracts from proper project management (event 7). Both the rest of the dynamic events (5, 6, and 8) and the outcome events (10, 11, 12, and 13) are indirectly affected by the cascading effects between risks. Project managers should therefore consider any change in customer requirements critical; hence, after approving a change request, they have to carefully monitor its consequences.
Table 6: Total influences of source and dynamic events.
STEP 4: Scenario Analysis
Once this first scenario has been created as a starting point, it is possible to run simulations using the predictions system developed by Turoff (1972). In order to calculate the probabilities (Pi), or predictions, associated with project risk scenarios, we applied a variation of the Fermi-Dirac distribution function (1). Since this formula considers all possible cross-impacts between events (Cij), the propagation of project risk scenarios is not limited; CIA thus allows the forecasting of all possible feedback between events. Table 8 presents an illustrative scenario-based analysis, which is just one example of the scenario-based capabilities. The findings specifically show how the probability of occurrence of both dynamic and outcome events varies across all possible combinations of source events.
Figure 2: Digraph for percentile 40 (|Cij| ≥ 2.03).
A total of 16 combinations were selected. These 16 scenarios result from combining the four source events, that is, the events that might happen before starting a project. This type of analysis is useful in the planning stage of projects, as it helps forecast potential risks that might occur during the project and propose actions to mitigate potential negative outcomes.
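Enumerating the 16 combinations amounts to clamping each of the four source events to occurs (1) or does not occur (0); a minimal sketch:

```python
import itertools

SOURCE_EVENTS = [1, 2, 3, 4]
# Each scenario clamps every source event to 1.0 (occurs) or 0.0 (does not occur),
# giving 2**4 = 16 scenario hypotheses.
scenarios = [dict(zip(SOURCE_EVENTS, combo))
             for combo in itertools.product([1.0, 0.0], repeat=len(SOURCE_EVENTS))]
```

Each dictionary can then be fed to the simulation as the set of clamped initial probabilities for one scenario.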
Table 7: Percentile of highest to lowest absolute values of |Cij|.
Please note that, in order to make it simpler and more coherent, we have reversed the sense of event 9 from “There are changes in customer requirements” to “There are no changes in customer requirements.” As can be observed in Figure 2, it was the only event formulated in the negative; thus, when an event is supposed to happen (high probability), this means something positive (likely), and when it is not (low probability), it can be interpreted as something negative (possibly not).
We used a verbal scale to clarify the matter for the experts; Table 9 provides the probability scale used. This table is especially interesting for intermediate scenarios in which we assume that the project starts with certain threats, which helps control specific risks in the planning stage and handle them later. For instance, in scenario 3 we assume no risk in the source events except for the approach not being appropriate. This mostly raises the risk of failures in the planning stage and threatens the project's profitability.
In order to illustrate the simulation method and the construction of Table 8, we build one of these scenarios (scenario 3) step by step. The initial hypothesis of scenario 3 is that source events 1, 2, and 4 occur and event 3 does not occur. In probability terms, we change the initial probabilities of events 1, 2, and 4 to 1 (100%) and the initial probability of event 3 to 0. Table 10 depicts the initial and new probabilities of the source events for scenario 3.
Based on these hypotheses, we can recalculate the probabilities of the remaining events by applying (1). Since the formula does not constrain the multiple, changing impacts between project risks, the findings reveal the updated probabilities considering all the changes produced. Table 11 shows the new probabilities based on the simulation, as well as the interpretation of each probability value based on Table 9.
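A sketch of this recalculation, assuming the logistic (Fermi-Dirac) form of (1) and iterating it to a fixed point with the scenario hypotheses clamped; the three-event model and coupling values below are hypothetical, not the article's matrix:

```python
import math

def simulate(P0, C, G, clamped, iters=100):
    """Iterate P_i = 1 / (1 + exp(-(G_i + sum_j C[(i, j)] * P_j))) until it
    settles, holding the scenario-hypothesis (clamped) events fixed."""
    P = dict(P0)
    P.update(clamped)
    for _ in range(iters):
        for i in P0:
            if i in clamped:
                continue
            s = G[i] + sum(C.get((i, j), 0.0) * pj for j, pj in P.items() if j != i)
            P[i] = 1.0 / (1.0 + math.exp(-s))
    return P

logit = lambda p: math.log(p / (1.0 - p))
P0 = {3: 0.70, 6: 0.60, 10: 0.60}   # e.g. approach, planning, on-time delivery
C = {(6, 3): 2.0, (10, 6): 1.5}     # positive couplings (hypothetical)
# Calibrate G so the unclamped model reproduces the initial probabilities.
G = {i: logit(P0[i]) - sum(C.get((i, j), 0.0) * P0[j] for j in P0 if j != i)
     for i in P0}
scenario3 = simulate(P0, C, G, clamped={3: 0.0})  # hypothesis: event 3 does not occur
```

Clamping event 3 to non-occurrence propagates through the positive couplings and lowers the downstream probabilities, mirroring the cascading decrease discussed for scenario 3.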
Under these initial hypotheses, we obtain scenario 3, in which failures in the establishment of the project's goals and the methodological approach impact negatively on adequate planning, on-time delivery, and project profitability. This negative impact decreases the probability of occurrence of these events to the point of forecasting their non-occurrence, while the probability of proper execution by the consultants remains close to 50% (unknown). We repeated this process 16 times to obtain all the scenarios contained in Table 8. We validate these results in the next step.
STEP 5: Validation of the Method
In order to validate CIA-ISM for supporting project risk management, we compared predictions of the method with information obtained from 22 real finished projects (all projects developed by the company during one year). The simulations have been made using CIASS software (www.ciass.org).
Through this validation method, we introduce partial information into the model for each of the 22 projects and then compare the expected values with the real values. Table 12 presents a comparative analysis of the real and expected results. Column S (%) shows the deviation, in percentage, between the expected and real values of all the dynamic and outcome events when only the four source events were introduced into the simulator. Column S&D (%) shows the deviation between the expected and real values when both dynamic and source events were introduced into the simulator. The value represents the failure of the model in predicting a risk: a value of 100% would mean that the model had failed in all the forecasts concerning an event, whereas a value of 0% would mean that the model correctly predicted everything forecasted concerning an event.
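The deviation reported in columns S (%) and S&D (%) reduces to a per-event error rate across the 22 projects; a sketch with hypothetical forecasts:

```python
def deviation(predicted, real):
    """Percentage of projects in which the model's forecast for one event
    disagreed with the real outcome (0% = always right, 100% = always wrong)."""
    wrong = sum(p != r for p, r in zip(predicted, real))
    return 100.0 * wrong / len(real)

# Hypothetical forecasts for one event across 22 projects: right in 21 of 22.
predicted = [True] * 22
real = [True] * 21 + [False]
err = deviation(predicted, real)
```

Averaging this rate over all dynamic and outcome events would yield aggregate figures comparable to the 6.35% and 3.57% reported below.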
Table 8: Risks scenarios.
Table 9: Probability scale.
|Event Number||Label||Initial Probability||New Probability|
Table 10: Initial probabilities of source events for scenario 3.
|Event Number||Label||Initial Probability||New Probability||Interpretation of New Probability|
|7||Proper project management||75||68.41||Likely|
|8||Proper execution by the consultants||70||50.06||Unknown|
|9||No changes in customer requirements||60||79.08||Probable|
|10||Delivery on time||60||42.58||Possibly not|
|12||Project profitability||70||41.96||Possibly not|
Table 11: Simulation of the new probabilities for dynamic and outcome events for scenario 3.
Focusing on the deviation for source events, the results show that the model failed in 6.35% of the forecasts, while the failure when introducing both the dynamic and the source events was 3.57%. In light of these results, we can conclude that the model presents a fairly high explanatory capacity; thus, it may be considered valid for project risk management.
Illustrative Real Cases of CIA-ISM Applications
Now that CIA-ISM has been validated, the following section demonstrates how the proposed method can be applied in real project risk management. In Table 8 we presented, as an illustrative example, 16 of the millions of different risk scenarios that can be generated for 13 risk events using the CIA-ISM methodology. This is just an example of how to create scenarios based on the 16 potential combinations of four source risks; as previous studies have noted (Bar-Yam, 2003; Yassine, Joglekar, Braha, Eppinger, & Whitney, 2003), reality is much more complex and difficult to anticipate. With the proposed CIA-ISM methodology we could forecast other potential scenarios, including combinations of initial conditions and dynamic events. With this in mind, we explain in detail three real projects carried out in the company to which we are applying the methodology (see the Appendix for summaries of these descriptions). The project managers used CIA-ISM to support their decisions, generating risk scenarios across the whole project life cycle and allowing corrective decisions to avoid project failures.
|Event||S (%)||S&D (%)|
Table 12: Comparison between the method's predictions and real results.
It is important to underline that, based on the proposed model, an infinite number of scenarios might be generated, not just the 16 shown in Table 8. In these cases, we start by illustrating the initial stage of each project (T0) and then the sequential events that happened (Tn), comparing the final forecast with the final result of the project (Tf). All simulations were carried out using the CIASS software (www.ciass.org), and the results of the projects had been transcribed beforehand in an interview. ISM generates a representation of the most likely project risk scenario; looking at Figure 2, we can thus observe all possible paths, making it representative of sequential events across the entire project life cycle.
Project A – EAC
At the beginning of the project, all source events had probabilities equal to the initial probabilities of the model, except for internal relevance (event 4), which was slightly higher (80%). Figure 3 represents the initial scenario. Nonetheless, an adverse contextual change dramatically reduced the project's level of internal relevance from 80% to 1%. This negatively affected most of the remaining events and all result events except project profitability, which was unknown. In order to mitigate the potential negative consequences, the company authorized the project management team to use more hours, raising the probability of appropriate project management to 90%.
Figure 3: Project A – T0.
If we include all these changes in the simulation model (Figure 4), we can see that the main risk (in terms of results) is late delivery, which was what happened in reality: with a forecast probability of on-time delivery of only 38.43%, the project was delivered after the planned deadline. The company's image was not damaged and the customer was not dissatisfied, and in neither case were there losses. Table 13 presents a comparison between the scenario forecasted with CIA-ISM and the real findings.
Project B – EUN
When project B started, all source events had probabilities equal to the initial probabilities of the working model (Figure 5 and Figure 6). During project development, both internal relevance (event 4) and customer collaboration (event 5) saw a moderate decrease of 20%. These new circumstances negatively influenced most of the remaining events. Regarding the result events, customer satisfaction and project image changed from probable to possible, whereas project profitability became unknown. The most negatively affected event was delivery on time, which evolved from likely (60%) to improbable (30.57%). Given these difficulties, the firm decided to carry out two response actions: it adapted the project planning to the new context, raising the probability from 60% to 70% (event 6), and it reinforced the actions of the consultants, raising the probability from 70% to 90% (event 8).
Table 14 presents the result events forecasted with CIA-ISM given the above-mentioned response actions. The method specifically indicated that project B would be completed successfully, which matched what happened in reality: project B finished on time and without losses, with a positive image for the firm and a satisfied customer.
Figure 4: Project A – Tf.
|Event Number||Label||Probability Tf||Probability Interpretation||Real Result|
|10||Delivery on time||38.43||Unlikely||The project was not delivered on time|
|11||Customer satisfaction||48.98||Unknown||The customer was not dissatisfied|
|12||Project profitability||61.27||Likely||The project did not cause losses|
|13||Professional image||54.55||Unknown||The image was not damaged|
Table 13: Comparison of the project A forecast with the real final results.
Figure 5: Project B – T0.
Project C – EAP
Project C began without changes in the probabilities of almost all source events; only the suitable profile (event 1) had a slightly lower probability (76.67%) (Figure 7). This impacted slightly and negatively on all the dynamic and result events, yet these still represented an optimistic scenario. The initial planning of the project, the assignment of consultant profiles, and the sizing of resources and deadlines had been made with the information available at that time, which later proved not to reflect reality. Once activity had started, the probability of adequate planning fell from 60% to 40%. Furthermore, project management and customer collaboration were also damaged. These changes severely affected all result events. In order to reverse these negative consequences, the enterprise authorized the project management team to use more hours, raising the probability of appropriate project management to 80%. Likewise, it replaced the consultants, increasing the probability of their proper execution to 85%. The final scenario is shown in Figure 8.
After including these modifications in the simulation model, we see that the main risk was project profitability. This forecast again matched what happened at the end of project C (Table 15): the project's profitability was marginal; nonetheless, the remaining result events were successfully achieved.
Implications for Project Management
This case study shows how project managers can successfully apply the CIA-ISM method to manage risks in their projects. The proposed methodology enhances the contribution of the CIA-ISM combination to the literature: it improves on the results of traditional risk analysis by offering the possibility of making predictions, forecasting scenarios, and generating event maps, giving a better view of the big picture of risk analysis in this area. CIA-ISM thus provides project managers with the support required for project risk management in accordance with ISO 31000:2009 (International Standardization Organization, 2009).
Figure 6: Project B – Tf.
|Event Number||Label||Probability Tf||Probability Interpretation||Real Result|
|10||Delivery on time||55.53||Unknown||The project was delivered on time|
|11||Customer satisfaction||82.14||Highly probable||The customer was satisfied|
|12||Project profitability||74.13||Probable||The project generated profits|
|13||Professional image||82.14||Highly probable||The image was improved|
Table 14: Comparison of the project B forecast with the real final results.
Figure 7: Project C – T0.
In this case study, the methodology provided project managers with an initial working model, which consisted of a set of four source events, five dynamic events, and four outcome events. A previous study also recognized different risks occurring in each project stage (Salmeron & Lopez, 2010). The project managers thus knew the sources of risks, the events that might occur during project execution, and their potential consequences. This feedback is required in establishing the project context stage (SE1) and the risk identification stage (SE2) indicated in ISO 31000:2009 (International Standardization Organization, 2009).
With the working model in mind, the methodology required that project managers analyze risks (SE3) in accordance with the standard. They then considered the causes and sources of risks and their positive and negative influences on project outcomes, and obtained the probability of occurrence of each element in the working model.
Subsequently, the project managers applied the CIA method to calculate the influence of each internal event (specified in the model) on the remaining, as well as external events (not explicitly specified in the model) represented by the G vector. This offers an excellent measure of the validity of the working model. In fact, the higher the G values, the less representative of the phenomena the model will be. The G value obtained in the case studied was low enough to accept the working model; subsequently, the project manager applied the ISM based on these outcomes. In this way, the working model was represented by means of a digraph, which contains both the sequence and the potential cascading effects between risks. Supporting decision making in project risk management based on graphical analysis is indeed a widespread practice (Kwan & Leung, 2011; Zhang & Fan, 2014).
Figure 8: Project C – Tf.
Based on the outputs of the risk analysis, project managers evaluated (CE5) the level of risk of each event and prioritized them. The findings supported them in decision making during the risk treatment stage (CE6). These findings highlight that source events impact more strongly on customer-related events, whereas dynamic events have the greatest influence on project time-related events. Therefore, depending on whether the project leans toward one or the other goal, managers should focus their efforts on managing events arising before the beginning of the project or during its development. The results also reveal that Node 4 (internal relevance) may act as a trigger event in the risk scenario. This bears out the findings of previous studies (Belassi & Tukel, 1996; Pinto & Slevin, 1987), which indicate that continuous support from someone in a position of authority in the company plays a significant role in project success; project managers should therefore encourage the support of the parties involved. This ought to be carried out from the establishing the context stage (SE1) according to ISO 31000:2009 (International Standardization Organization, 2009).
|Event Number||Label||Probability Tf||Probability Interpretation||Real Result|
|10||Delivery on time||53.17||Unknown||The project was delivered on time|
|11||Customer satisfaction||58.24||Possible||The customer was not dissatisfied|
|12||Project profitability||45.09||Unknown||The project profitability was extremely low|
|13||Professional image||62.23||Likely||The image was improved|
Table 15: Comparison of the project C forecast with the real final results.
Additionally, CIA-ISM provides project managers with a predictions system to simulate different risk scenarios. This specifically permits participants to simulate different situations in order to detect which treatment options are required. It also makes the control of risk (SE2) easier throughout the course of the project, as the standard suggests (International Standardization Organization, 2009).
Finally, the new simulation analysis capabilities permit us to compare our results with historical results in order to validate the model and the methods. The comparison highlighted a fairly high explanatory capacity of CIA-ISM in the project risk management context.
The CIA-ISM technique aims at helping project managers to handle and measure cascading effects. It also addresses the difficult problem of enabling large numbers of experts to collaborate on building complex models that contain a large number of events, allowing experts to work with a broad range of events. It specifically models risk effects on the key performance indicators of projects over time and even allows differentiating between the influence of source and dynamic events.
Despite the progress made, some limitations of the CIA-ISM approach should be considered. During the definition stage, practitioners describe the event set with their initial probabilities. An incorrect statement of these inputs risks producing findings that are unrealistic for the context; hence, the accuracy of the CIA-ISM method is strongly dependent on the practitioners’ judgments. In order to execute CIA-ISM properly, the panel of participants must be carefully selected, and other instruments must also be added to validate the results provided by the method.
With a view to validating this approach, the risk scenarios obtained in the case study were compared with real finished projects. The comparison highlights that the proposed method has a high predictability rate; hence, we can conclude that CIA-ISM will help practitioners to manage their projects more effectively. Future studies should explore alternative ways to corroborate findings when historical information is not available; it would also be interesting to validate the different risk scenarios included in Table 8 with more real cases.
Moving forward, it should also be stressed that the CIA-ISM method is easily generalizable and adaptable to meeting the specifics of a wide range of projects. In this sense, working with a 13-event risk model may not be representative of the actual complexity of some large-scale projects, even at the management level, which is the case in industrial, mining, and civil engineering projects. Indeed, large engineering projects should be managed taking into account a network of interrelated risks, which is why extended research using CIA-ISM to foresee risk scenarios including other factors that impact on safety, environment, or social outputs would be very appropriate.
The authors would like to thank the participants for contributing their experience, knowledge, and practical understanding of this case study. This is a revised and expanded version of a paper that originally appeared in the Proceedings of the 13th International Conference on Information Systems for Crisis Response and Management 2016, held in Rio de Janeiro, Brazil.
Ahmed, A., Kayis, B., & Amornsawadwatana, S. (2007). A review of techniques for risk management in projects. Benchmarking: An International Journal, 14(1), 22–36.
Aloini, D., Dulmin, R., & Mininno, V. (2012a). Modelling and assessing ERP project risks: A Petri Net approach. European Journal of Operational Research, 220(2), 484–495. doi.org/10.1016/j.ejor.2012.01.062
Aloini, D., Dulmin, R., & Mininno, V. (2012b). Risk assessment in ERP projects. Information Systems, 37(3), 183–199. doi.org/10.1016/j.is.2011.10.001
Baldi, P., & Rosen-Zvi, M. (2005). On the relationship between deterministic and probabilistic directed graphical models: From Bayesian networks to recursive neural networks. Neural Networks, 18(June 2005), 1080–1086. doi.org/10.1016/j.neunet.2005.07.007
Bannerman, P. L. (2008). Risk and risk management in software projects: A reassessment. Journal of Systems and Software, 81(12), 2118–2133. doi.org/10.1016/j.jss.2008.03.059
Bañuls, V. A., & Turoff, M. (2011). Scenario construction via Delphi and cross-impact analysis. Technological Forecasting and Social Change, 78(9), 1579–1602. doi.org/10.1016/j.techfore.2011.03.014
Bañuls, V. A., Turoff, M., & Hiltz, S. R. (2013). Collaborative scenario modeling in emergency management through cross-impact. Technological Forecasting and Social Change, 80(9), 1756–1774. doi.org/10.1016/j.techfore.2012.11.007
Bar-Yam, Y. (2003). When systems engineering fails—Toward complex systems engineering. IEEE International Conference on Systems Man and Cybernetics, 2, 2021–2028. doi.org/10.1109/ICSMC.2003.1244709
Belassi, W., & Tukel, O. I. (1996). A new framework for determining critical success/failure factors in projects. International Journal of Project Management, 14(3), 141–151.
Boehm, B. W. (1991). Software risk management: Principles and practices. IEEE Software, (January), 32–40. Retrieved from http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=62930
Brandenburg, M., Govindan, K., Sarkis, J., & Seuring, S. (2014). Quantitative models for sustainable supply chain management: Developments and directions. European Journal of Operational Research, 233(2), 299–312. doi.org/10.1016/j.ejor.2013.09.032
Büyüközkan, G., & Ruan, D. (2010). Choquet integral based aggregation approach to software development risk assessment. Information Sciences, 180(3), 441–451. doi.org/10.1016/j.ins.2009.09.009
Carr, M. J., Konda, S. L., Monarch, I., Ulrich, F. C., & Walker, C. F. (1993). Taxonomy-based risk identification. Technical Report CMU/SEI-93-TR-6 ESC-TR-93-183.
Chen, D., & Hartman, F. T. (2000). A neural network approach to risk assessment and contingency allocation. AACE International Transactions, RI71–RI76.
Costa, H. R., Barros, M. de O., & Travassos, G. H. (2007). Evaluating software project portfolio risks. Journal of Systems and Software, 80(1), 16–31. doi.org/10.1016/j.jss.2006.03.038
Dey, P. K. (2001). Decision support system for risk management: A case study. Management Decision, 39(8), 634–649. doi.org/10.1108/00251740110399558
Fan, C., & Yu, Y.-C. (2004). BBN-based software project risk management. The Journal of Systems and Software, 73(2), 1–23. doi.org/10.1016/j.jss.2003.12.032
Fang, C., & Marle, F. (2012). A simulation-based risk network model for decision support in project risk management. Decision Support Systems, 52(3), 635–644. doi.org/10.1016/j.dss.2011.10.021
Fang, C., & Marle, F. (2013). Dealing with project complexity by matrix-based propagation modelling for project risk analysis. Journal of Engineering Design, 24(4), 239–256. doi.org/10.1080/09544828.2012.720014
Fang, C., Marle, F., & Xie, M. (2016). Applying importance measures to risk analysis in engineering project using a risk network model. IEEE Systems Journal, PP(99), 1–9. doi.org/10.1109/JSYST.2016.2536701
Han, W.-M., & Huang, S.-J. (2007). An empirical analysis of risk components and performance on software projects. Journal of Systems and Software, 80(1), 42–50. doi.org/10.1016/j.jss.2006.04.030
Hillson, D., & Simon, P. (2007). Practical project risk management: The ATOM methodology (1st ed.). Tysons Corner, VA: Management Concepts.
Huang, S. J., & Han, W. M. (2008). Exploring the relationship between software project duration and risk exposure: A cluster analysis. Information & Management, 45(3), 175–182. doi.org/10.1016/j.im.2008.02.001
Hwang, B.-G., Zhao, X., See, Y. L., & Zhong, Y. (2015). Addressing risks in green retrofit projects: The case of Singapore. Project Management Journal, 46(4), 76–87. doi.org/10.1002/pmj
International Standardization Organization. (2009). ISO 31000:2009, Risk management: Principles and guidelines. Retrieved from https://www.iso.org/obp/ui/#iso:std:iso:31000:ed-1:v1:en
Iversen, J. H., Mathiassen, L., & Nielsen, P. (2000). Managing risk in software process improvement: An action research approach. MIS Quarterly, 28(3), 395–433.
Kwan, T. W., & Leung, H. K. N. (2011). A risk management methodology for project risk dependencies. IEEE Transactions on Software Engineering, 37(5), 635–648. doi.org/10.1109/TSE.2010.108
Lage, B. B., Bañuls, V., & Borges, M. (2013). Supporting course of actions development in emergency preparedness through cross-impact analysis. In Proceedings of the 10th International ISCRAM Conference (pp. 714–723). Baden-Baden, Germany.
Lee, E., Park, Y., & Shin, J. G. (2009). Large engineering project risk management using a Bayesian belief network. Expert Systems with Applications, 36(3 PART 2), 5880–5887. doi.org/10.1016/j.eswa.2008.07.057
Linstone, H. A., & Turoff, M. (1975). Delphi method: Techniques and applications. Glenview, IL: Addison-Wesley Educational Publishers Inc.
López, C., & Salmeron, J. L. (2014). Dynamic risks modelling in ERP maintenance projects with FCM. Information Sciences, 256, 25–45. doi.org/10.1016/j.ins.2012.05.026
López, C., & Salmeron, J. L. (2012). Risks response strategies for supporting practitioners decision-making in software projects. Procedia Technology, 5, 437–444. doi.org/10.1016/j.protcy.2012.09.048
Luthra, S., Mangla, S. K., Xu, L., & Diabat, A. (2015). Using AHP to evaluate barriers in adopting sustainable consumption and production initiatives in a supply chain. International Journal of Production Economics. doi.org/10.1016/j.ijpe.2016.04.001
Marle, F. (2014). A structured process to managing complex interactions between project risks. International Journal of Project Organisation and Management, 6(1–2), 4–32.
Na, K.-S., Simpson, J. T., Li, X., Singh, T., & Kim, K.-Y. (2007). Software development risk and project performance measurement: Evidence in Korea. Journal of Systems and Software, 80(4), 596–605. doi.org/10.1016/j.jss.2006.06.018
Neumann, D. E. (2002). An enhanced neural network technique for software risk analysis. IEEE Transactions on Software Engineering, 28(9), 904–912.
Pinto, J. K., & Slevin, D. P. (1987). Critical factors in successful project implementation. IEEE Transactions on Engineering Management, EM-34(1), 167–190. doi.org/10.1002/9780470172353.ch20
Project Management Institute (PMI). (2013). A guide to the project management body of knowledge (PMBOK® guide) – Fifth edition. Newtown Square, PA: Author. doi.org/10.1002/pmj.20125
Project Management Institute (PMI). (2014). PMI's Pulse of the profession®: The high cost of low performance. Newtown Square, PA: Author. Retrieved from www.pmi.org/~/media/PDF/Business-Solutions/PMI_Pulse_2014.ashx
Pugh, L. A., & Soden, R. G. (1986). Use of risk analysis techniques in assessing the confidence of project cost estimates and schedules. International Journal of Project Management, 4(3), 158–162. doi.org/10.1016/0263-7863(86)90047-5
Purdy, G. (2010). ISO 31000:2009—Setting a new standard for risk management: Perspective. Risk Analysis, 30(6), 881–886. doi.org/10.1111/j.1539-6924.2010.01442.x
Ramirez, M., Bañuls, V. A., & Turoff, M. (2015). A CIA–ISM scenario approach for analyzing complex cascading effects in operational risk management. Engineering Applications of Artificial Intelligence, Volume 46 (Part B), 289–302. doi.org/10.1016/j.engappai.2015.07.016
Raz, T., & Michael, E. (2001). Use and benefits of tools for project risk management. International Journal of Project Management, 19(1), 9–17.
Rodríguez, A., Ortega, F., & Concepción, R. (2016). A method for the evaluation of risk in IT projects. Expert Systems with Applications, 45, 273–285. doi.org/10.1016/j.eswa.2015.09.056
Salmeron, J. L., & Lopez, C. (2010). A multicriteria approach for risks assessment in ERP maintenance. Journal of Systems and Software, 83(10), 1941–1953. doi.org/10.1016/j.jss.2010.05.073
Samad, J., & Naveed, I. (2006). Managing the risks: An evaluation of risk management processes. In Multitopic Conference, 2006. INMIC '06. IEEE (pp. 281–287), Islamabad, Pakistan.
Samantra, C., Datta, S., Sankar, S., Bikash, M., & Ranjan, D. (2016). Interpretive structural modelling of critical risk factors in software engineering project. Benchmarking: An International Journal, 23(1), 2–24.
Schmidt, R., Lyytinen, K., Keil, M., & Cule, P. (2001). Identifying software project risk: An international Delphi study. Journal of Management Information Systems, 17(4), 5–36. doi.org/10.1080/07421222.2001.11045662
Taylan, O., Bafail, A. O., Abdulaal, R. M. S., & Kabli, M. R. (2014). Construction projects selection and risk assessment by fuzzy AHP and fuzzy TOPSIS methodologies. Applied Soft Computing Journal, 17, 105–116. doi.org/10.1016/j.asoc.2014.01.003
Turoff, M. (1972). An alternative approach to cross impact analysis. Technological Forecasting and Social Change, 3, 309–339. doi.org/10.1016/S0040-1625(71)80021-5
Turoff, M., Bañuls, V. A., Plotnick, L., & Hiltz, S. R. (2014). Development of a dynamic scenario model for the interaction of critical infrastructures. In ISCRAM 2014 Conference Proceedings—11th International Conference on Information Systems for Crisis Response and Management (pp. 414–423). Retrieved from http://www.scopus.com/inward/record.url?eid=2-s2.0-84905841716&partnerID=40&md5=cd2880920bfff697218ed18243da76d3
Wallace, L., Keil, M., & Rai, A. (2004a). How software project risk affects project performance: An investigation of the dimensions of risk and an exploratory model. Decision Sciences, 35(2), 289–321. doi.org/10.1111/j.0011-7315.2004.02059.x
Wallace, L., Keil, M., & Rai, A. (2004b). Understanding software project risk: A cluster analysis. Information & Management, 42(1), 115–125. doi.org/10.1016/j.im.2003.12.007
Wang, J., Lin, W., & Huang, Y.-H. (2010). A performance-oriented risk management framework for innovative R&D projects. Technovation, 30(11–12), 601–611. doi.org/10.1016/j.technovation.2010.07.003
Warfield, J. (1973). Binary matrices in system modeling. IEEE Transactions on Systems, Man, and Cybernetics, SMC-3(5), 441–449. doi.org/10.1109/TSMC.1973.4309270
Warfield, J. N. (1976). Societal systems: Planning, policy and complexity. New York, NY: John Wiley and Sons.
Wu, D. D., Kefan, X., Gang, C., & Ping, G. (2010). A risk analysis model in concurrent engineering product development. Risk Analysis, 30(9), 1440–1453. doi.org/10.1111/j.1539-6924.2010.01432.x
Yassine, A., Joglekar, N., Braha, D., Eppinger, S., & Whitney, D. (2003). Information hiding in product development: The design churn effect. Research in Engineering Design, 14(3), 145–161. doi.org/10.1007/s00163-003-0036-2
Yet, B., Constantinou, A., Fenton, N., Neil, M., Luedeling, E., & Shepherd, K. (2016). A Bayesian network framework for project cost, benefit and risk analysis with an agricultural development case study. Expert Systems with Applications, 60, 141–155. doi.org/10.1016/j.eswa.2016.05.005
Yi, T., & Xiao, G. (2008). Applying system dynamics to analyze the impact of incentive factors allocation on construction cost and risk. In International Conference on Machine Learning and Cybernetics (Vol. 2, pp. 676–680). doi.org/10.1109/ICMLC.2008.4620490
Zayed, T., Amer, M., & Pan, J. (2008). Assessing risk and uncertainty inherent in Chinese highway projects using AHP. International Journal of Project Management, 26(4), 408–419. doi.org/10.1016/j.ijproman.2007.05.012
Zhang, Y. (2016). Selecting risk response strategies considering project risk interdependence. International Journal of Project Management, 34(5), 819–830. doi.org/10.1016/j.ijproman.2016.03.001
Zhang, Y., & Fan, Z.-P. (2014). An optimization method for selecting project risk response strategies. International Journal of Project Management, 32(3), 412–422. doi.org/10.1016/j.ijproman.2013.06.006
Zhou, L., Vasconcelos, A., & Nunes, M. (2008). Supporting decision making in risk management through an evidence-based information systems project risk checklist. Information Management and Computer Security, 16(2), 166–186. doi.org/10.1108/09685220810879636
Dr. Victor A. Bañuls is Associate Professor of Management Information Systems at the Universidad Pablo de Olavide (UPO), Seville, Spain. He has also served as visiting research scholar at the UFRJ, Rio de Janeiro, Brazil; CIEM, Agder, Norway; the New Jersey Institute of Technology, Newark, NJ, USA; and Tilburg University, Tilburg, The Netherlands. His research has been published in journals, including Technological Forecasting and Social Change, Technovation, EAAI, IEEE SMC, and Futures, among others, and he is also the editor of two books. He is the chair of several postgraduate programs, including an Executive Master program on Integrated Management Systems and an Executive Security Management program. His current research efforts are focused on foresight and emergency management, e-learning, management science, and information systems assessment. Dr. Bañuls is co-founder and Research Director of the safety engineering company, MSIG, and assessor of the Regional Spanish Government of Andalusia in Security Policy. He has been co-chair of the track “Foresight, Planning and Risk Analysis in Emergency Management” at the ISCRAM conference since 2010; he is a member of the board of ISCRAM and chair of the Publications and Academic Standards Committee since 2014; and he recently became co-editor in chief of the International Journal of ISCRAM. He can be contacted at firstname.lastname@example.org
Dr. Cristina López received MSc (2007) and PhD (2011) degrees in business from the University of Seville, Spain. She is currently an Associate Professor in the Faculty of Business at the Universidad Pablo de Olavide, Seville, Spain, and has been a visiting scholar at the Portsmouth Business School of the University of Portsmouth (United Kingdom). Dr. López's papers have been published in the Journal of Systems and Software, IEEE Transactions on Software Engineering, Information Sciences, Computer Standards & Interfaces, Journal of Applied Research and Technology, and others. Her major research interests include enterprise systems, artificial intelligence techniques, multi-criteria methods, and risk management. She can be contacted at email@example.com
Murray Turoff is a Distinguished Professor Emeritus at the New Jersey Institute of Technology, Newark, New Jersey, USA. He is a co-editor of a recent book on Emergency Management Information Systems (M. E. Sharpe, 2010). In addition to his early and continuing work with the Delphi Method, he has spent most of his academic research career in the design and evaluation of computer-mediated communication systems. After 9/11, he turned his attention back to his early work in emergency management and, in 2004, became co-founder of the international organization ISCRAM (Information Systems for Crisis Response and Management). Along with Starr Roxanne Hiltz, he co-authored a prizewinning book, published in 1978, that predicted the World Wide Web as we know it today: The Network Nation: Human Communication via Computers (reprinted by MIT Press in 1993). Professor Turoff's basic research area has always been the design of collaborative systems on computers, based upon the nature of the group and the nature of the problem; in recent years, he has been focusing on emergency management information systems. He can be contacted at firstname.lastname@example.org
Dr. Fernando Tejedor is co-founder and Consultancy Director of the safety engineering company, MSIG, and assessor of the Regional Spanish Government of Andalusia in Security Policy. He is a member of the Technical Committee CTN 66 of the Spanish Body of Standardization AENOR, and has also held the position of Director of Certification of ISO Management Systems in AENOR for Andalusia. His current research efforts are focused on safety management systems and advanced tools, decision support systems, critical mission management systems, and emergency response support systems. He can be contacted at email@example.com
Appendix: Summary of three associated projects supporting this article.
Project Management Journal, Vol. 48, No. 5, 95–114
© 2017 by the Project Management Institute
Published online at www.pmi.org/PMJ