RISK OPTIMIZATION IN PROJECT MANAGEMENT

Y. Mordecai and Y. Gerchak
Tel Aviv University

ABSTRACT

Projects, systems and complex processes are characterized by the presence of multiple risks to their objectives, e.g. duration, cost, and specification. Risks emanate from various sources, such as technological, natural, socio-economic, organizational and informational uncertainties. Risks may often be traded off among objectives. For instance, a technical issue may be dealt with by schedule buffers, by additional resource allocation, by specification compromise, or by any combination thereof. Therefore, risk modeling should incorporate all available risk alternatives. Nevertheless, defining preference relations among objectives is a challenge, for which we apply a consolidated risk measure, the Expected Disutility. We tackle the decision-making problem with a novel Risk Optimization Model, which minimizes a stochastic, convolution-based Expected Disutility through binary programming.

1. INTRODUCTION

Risk is a measure of undesired results and of the negative impact of uncertainty. Risk Management is the process of identifying, assessing and mitigating risks as they emerge and evolve throughout the lifecycle of processes, systems and projects. It is a multi-objective, multivariate and resource-constrained effort which aims at overall risk minimization. Risk is also the key to new business opportunities and future profit sources; therefore, risk should not necessarily be handled solely by avoidance, but also by elimination, reduction (of likelihood or damage), sharing (with other parties), absorption, or any creative combination of these strategies [8].

Risk Management has recently been studied as an optimization problem. A "Risk Cost" minimization model was offered by Ben-David et al. [1]; however, their model ignores duration and specification, which are fundamental project objectives, and does not address the selection of counter-risk actions. Fan et al. [3] propose an analytical model for selecting a risk-handling strategy, but they deal with a single risk and the optimization of its handling cost, and wrongly assume that costs cannot be both financial and schedule-related. Other risk optimization models are available in the literature, yet most are too narrow, partial, theoretically wanting, or oversimplified.

Quantitative and probabilistic risk assessment is essential in Risk Management and poses a serious challenge, as does the combination of several assessments. Since objective data is usually unavailable in complex and unprecedented situations and systems, analysis draws on the subjective opinions of experts and professionals. The use of expert opinion is common in science and practice, and has been thoroughly studied [2]. Experts' assessments may often be biased.

The most popular assessment technique, the famous 5x5 likelihood/severity matrix, is used in numerous cases and studies and is widely implemented ([5],[7],[8],[9]). Other NxN variations are also in use. Figure 1 displays the common 5x5 benchmark. The expected impact, i.e. the product of the likelihood index and the severity index, is also classified into impact ranges with a color pattern.

[Figure 1. Risk Impact Matrix: a 5x5 grid with likelihood indices 1-5 on one axis and severity indices 1-5 on the other; each cell holds the product of the two indices, color-coded by impact range.]
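To make the benchmark concrete, the following short Python sketch builds the 5x5 expected-impact grid of Figure 1 and bins each cell into a range, as the color pattern does. The helper name and the range thresholds are illustrative assumptions, not values taken from this paper or from any standard.

    # Build the 5x5 "expected impact" grid: each cell is the product of a
    # likelihood index and a severity index, both running 1..5.
    grid = [[lik * sev for sev in range(1, 6)] for lik in range(1, 6)]

    # Classify each cell into an impact range (the matrix's color pattern).
    # The thresholds below are illustrative assumptions only.
    def impact_range(score):
        return "low" if score <= 4 else "medium" if score <= 12 else "high"

    for row in reversed(grid):  # print with likelihood 5 at the top
        print([f"{v:2d} {impact_range(v)}" for v in row])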
Modeling through discrete NxN matrices has many limitations and mathematical flaws. First, only one pair of likelihood and severity is displayed, which means that events are modeled either as Bernoulli events or by a representative estimate (e.g. mean, median, mode, maximum). Hence, the documented estimate is biased. Second, multiplying values taken from such grids is mathematically wrong, since these numbers do not represent aligned ordinal values, and therefore standard algebraic operators do not apply to them straightforwardly [6]. Bi-dimensional consideration of the two measures has been advised, rather than the misleading expected value reflected by their product [10].

2. THE PROJECT RISK OPTIMIZATION MODEL

We offer a novel, rich and useful framework for Risk Optimization, consisting of quantitative risk assessment by continuous or quantile probability distributions, which are used to weight a (dis)utility function, and of the calculation of joint risk by Expected Disutility. To create a joint risk distribution we employ serial convolution operations; since the joint distribution is also affected by the risk alternatives selected, the framework is essentially a convolution-based optimization over the binary selection of risk-handling alternatives.

Consider risk as a random force applied by a source to a target. The distribution of the risk depends on the risk-handling alternatives applied to the risk sources. Let R_{st} denote the risk applied by source s to target t. Generally, the risk is distributed over the real line and represents the effect relative to the original target value c_t. The distribution of the total risk applied by all sources to a target t is the convolution of the distributions of the individual risks. Let R_t = \sum_{s=1}^{S} R_{st}; then the cumulative distribution function (CDF) of the risk is defined in (1), and the probability density, derived from it, is defined in (2), where C denotes repeated convolution:

    R_t \sim F_t(x) = (F_{1t} * F_{2t} * \cdots * F_{st} * \cdots * F_{St})(x) = \mathop{C}_{s=1}^{S} F_{st}(x)    (1)

    f_t(x) = \frac{d}{dx} F_t(x) = \frac{d}{dx} \mathop{C}_{s=1}^{S} F_{st}(x)    (2)

We consider undesired effects with non-linear implications through a utility-theoretic approach. The disutility of a target, D_t(X), is the rate of damage that the target's relative deviation from its target value inflicts on the main objectives of the project and the organization. It also reflects the attitude towards risk, i.e. risk aversion or risk seeking. The Expected Disutility incorporates both the disutility and the risk probability, and expresses the assessed result pattern more precisely than the expected value does. The objective function for each target t is therefore its Expected Disutility, as shown in (3):

    Y_t = E[D_t(X)] = \int_{x \ge 0} D_t(x) f_t(x) \, dx    (3)

Target value distributions are affected by the selected risk alternatives, which are themselves modeled through distribution functions, reflecting the trade-off among objectives. Let j = 1,..,J denote the possible mitigation alternatives. The decision variables Z_1,..,Z_j,..,Z_J are binary, Z_j \in \{0,1\} for all j, indicating the selection of alternatives {1,..,j,..,J}. The risk probability distribution underlying the Expected Disutility is convolved from all risks affecting the specific target, each of which is in turn convolved from the initial risk assessment and the distributions of the subset of mitigation alternatives selected for that risk. The Expected Disutility can thus be expressed directly as the convolution of the initial assessments with the selected mitigation alternatives' distributions, where F^{Z_j} is a convolution power: the factor enters the convolution when Z_j = 1 and degenerates to the identity (a unit step at zero) when Z_j = 0:

    E[D_t(X)] = \int_{x \ge 0} D_t(x) \frac{d}{dx} \left( [F_{1t0} * F_{1t1}^{Z_1} * \cdots * F_{1tJ}^{Z_J}] * \cdots * [F_{St0} * F_{St1}^{Z_1} * \cdots * F_{StJ}^{Z_J}] \right)(x) \, dx
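As a numerical illustration of (1)-(3), the following Python sketch discretizes each risk density on a common grid of step DX starting at zero, convolves the densities, and integrates the disutility against the result. The names, the grid resolution, the example densities and the disutility function are all our own assumptions for illustration; this is a sketch of the computation, not the authors' implementation.

    import numpy as np

    DX = 0.05  # assumed grid step for discretizing the densities

    def joint_density(densities, dx=DX):
        """Eqs. (1)-(2): density of the total risk R_t = sum_s R_st,
        the serial convolution of the individual risk densities."""
        f = densities[0]
        for g in densities[1:]:
            f = np.convolve(f, g) * dx  # Riemann-sum approximation of (f*g)(x)
        return f

    def expected_disutility(f, disutility, dx=DX):
        """Eq. (3): Y_t = integral over x >= 0 of D_t(x) f_t(x) dx."""
        x = np.arange(len(f)) * dx
        return float(np.sum(disutility(x) * f) * dx)

    # Two risks on one target, each with an (assumed) exponential density,
    # and a convex disutility D(x) = e^x - 1 reflecting risk aversion.
    x = np.arange(0, 2, DX)
    f1, f2 = 3 * np.exp(-3 * x), 5 * np.exp(-5 * x)
    print(expected_disutility(joint_density([f1, f2]), np.expm1))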
This representation may be converted algebraically from a risk-source-oriented convolution to a decision-oriented one, in which the factors are grouped by the problem's decision variables {Z_j}, as expressed in (4), with Z_0 \equiv 1 standing for the always-present initial assessments:

    E[D_t(X)] = \int_{x \ge 0} D_t(x) \frac{d}{dx} \left[ \mathop{C}_{s=1}^{S} F_{st0}(x) * \left( \mathop{C}_{s=1}^{S} F_{st1}(x) \right)^{Z_1} * \cdots * \left( \mathop{C}_{s=1}^{S} F_{stJ}(x) \right)^{Z_J} \right] dx

    Y_t = E[D_t(X)] = \int_{x \ge 0} D_t(x) \frac{d}{dx} \mathop{C}_{j=0}^{J} \mathop{C}_{s=1}^{S} F_{stj}(x)^{Z_j} \, dx    (4)

In order to consider the joint effect of all defined objectives, a joint measure is required. Let \Omega[Y_1,..,Y_T] denote the global optimization objective function, a weighted sum of the per-target expected disutilities, as given in (5):

    \min \Omega = \sum_{t=1}^{T} \omega_t Y_t, \qquad \sum_{t=1}^{T} \omega_t = 1    (5)

3. IMPLEMENTATION

The Risk Optimization Model's complexity is extremely high, O(n^3 (2m)^n), where n is the order of the number of sources, targets and alternatives, and m is the number of quantiles, i.e. the resolution to which X is discretized in continuous cases for the convolution computation, typically 5, 10 or 20. This complexity allows a full search, within reasonable time, only for quite small problems, around n <= 7 and m <= 10.

We have designed and developed a prototype software tool, "Risk Mediator", which implements the model presented here and provides a full-search approach for small-sized cases. We also developed a calculation method, Pattern Proximity, which identifies the closest known solution to each newly evaluated solution and calculates the new solution as a delta to the existing one. For instance, a solution coded as "01010110", in which each bit stands for one decision variable, may be calculated as the convolution of the distribution of the solution coded "01000100" with the distribution functions effected by the selection of decision variables 2 and 5, counting from the right. This mechanism saves considerable time and computational effort; a sketch of the idea follows.
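The following single-target Python sketch illustrates the full-search approach with a Pattern-Proximity-style cache: the risk density of every candidate bit-string is derived from an already-computed neighboring solution by a single additional convolution. It reuses the assumed discretization conventions of the sketch in Section 2; the function names are ours, and none of this is the Risk Mediator code itself.

    import numpy as np

    def full_search(f0, alt_densities, disutility, dx=0.05):
        """Minimize the single-target expected disutility of eq. (4) by
        enumerating all 2^J binary selections Z.  cache[mask] holds the
        risk density under the selection encoded by mask, computed as a
        delta from the cached neighbor that lacks only the lowest
        selected bit, in the spirit of Pattern Proximity."""
        cache = {0: f0}  # mask 0: the initial risk assessment alone

        def density(mask):
            if mask not in cache:
                low = mask & -mask                  # lowest selected bit
                j = low.bit_length() - 1            # its alternative index
                prev = density(mask ^ low)          # closest known solution
                cache[mask] = np.convolve(prev, alt_densities[j]) * dx
            return cache[mask]

        def objective(mask):                        # eq. (3) on density(mask)
            f = density(mask)
            x = np.arange(len(f)) * dx
            return float(np.sum(disutility(x) * f) * dx)

        J = len(alt_densities)
        best = min(range(1 << J), key=objective)
        return [(best >> j) & 1 for j in range(J)], objective(best)

With the cache in place, each of the 2^J candidate distributions costs one convolution beyond its cached neighbor, which is the saving described above; a multi-target version would sum \omega_t Y_t over the targets as in (5).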
4. SUMMARY

Modeling risk in complex systems, processes or projects is neither trivial nor straightforward. It involves the proper consideration of risk sources, risk targets and risk alternatives, and the proper assessment of the risk under each such triplet. With multiple and varied risks present in the system, and the need to optimize the risk-handling effort while achieving goals on multiple objectives, a proper optimization model is required.

This paper summarizes one part of a Master's Thesis in Industrial Engineering, in which we provide a model for the challenges described here. Our model considers all project objectives through a consolidated measure, their incurred risk; it seeks the minimization of the expected disutility of risk, thus taking the non-linear effect of adverse results into consideration; and it builds the joint distribution of the aggregate risk via convolution-based optimization. The model is also implementable, as demonstrated by a prototype software tool.

We intentionally omitted some extensions and complications from this paper, such as hierarchical risk-target modeling, risk sources which are also phantom targets (i.e., the lack thereof), and elaborate proofs. Another aspect of our research considers the proper combination of assessments from several experts, who may incorporate strategic bias into their assessments due to personal cost/benefit considerations; we provide a strategy-proof assessment-combination mechanism, MEDAS, to handle this problem.

Ongoing and future research includes: further development and extension of the "Risk Mediator" tool; the study of large-scale problems, with dozens of risk sources and targets, and of efficient heuristics to solve them; the incorporation of iterative analysis, accumulated information, error correction and Bayesian learning; the replacement of empirical expert assessments with "cookbook" distributions that can be reshaped and calibrated according to the assessor's insight; optimization over efficient frontiers, i.e. multidimensional sets of objective measures that should not necessarily be joined into a weighted aggregate; and more.

REFERENCES

1. Ben-David I, Rabinowitz G, Raz T (2002), Economic Optimization of Project Risk Management Efforts, working paper, Tel Aviv University
2. Cooke R (1991), Experts in Uncertainty: Opinion and Subjective Probability in Science, Oxford University Press
3. Fan M, Lin NP, Sheu C (2008), Choosing a Project Risk-Handling Strategy: An Analytical Model, International Journal of Production Economics, Vol. 112, pp. 700–713
4. Haimes YY (2009), Risk Modeling, Assessment and Management, John Wiley and Sons, 3rd edition
5. IEEE/ISO/IEC (2006), Systems and Software Engineering – Life Cycle Processes – Risk Management, IEEE/ISO/IEC Technical Publication 16085:2006, 2nd edition
6. Pennock MJ, Haimes YY (2002), Principles and Guidelines for Project Risk Management, Systems Engineering, Vol. 5, No. 2, pp. 89–108
7. Project Management Institute (2006), The Project Management Body of Knowledge – PMBOK, PMI
8. Shtub A, Bard JF, Globerson S (2005), Project Management: Processes, Methodologies, and Economics, Prentice-Hall
9. Stoneburner G, Goguen A, Feringa A (2004), Risk Management Guide for Information Technology Systems, NIST Special Publication 800-30 Rev. A
10. Williams TM (1996), The Two-Dimensionality of Project Risk, International Journal of Project Management, Vol. 14, No. 3, pp. 185–186