Managing in the Presence of
Uncertainty and the Resulting Risk
The naturally occurring uncertainties (Aleatory) in cost, schedule, and technical
performance can be modeled in a Monte Carlo Simulation tool. The Event Based
uncertainties (Epistemic) require capture, modeling of their impacts, definition of
handling strategies, modeling of the effectiveness of these handling efforts and of the
residual risks, and assessment of the impacts of both the original risk and the residual
risk on the program.
The management of uncertainties in cost, schedule, and technical performance,
and of the Event Based uncertainty and the resulting risk, are both critical success
factors for programs. Risk Management starts with capturing Event Based
Risks and their impacts, then with the modeling of the statistical uncertainty of
the normal work.
1
“It is moronic to predict without first establishing an error rate for the prediction
and keeping track of one’s past record of accuracy”
— Nassim Nicholas Taleb, Fooled By Randomness
14
V8.5
2
Risk Management is How Adults Manage Projects – Tim Lister, IBM
Aleatory / Epistemic
 Uncertainty creates the opportunity for risk
 Reducing uncertainty may reduce risk
 Two types of uncertainty†
– One that can be reduced
– One that cannot
 A risk informed PMB starts with the WBS
 8 steps are needed to build a risk informed PMB
3
Quick View of How to Manage in the
Presence of Uncertainty and Risk
14. Risk
Risk informed program performance management is the goal
† Distinguishing Two Dimensions of Uncertainty, Craig Fox and Gülden Ülkümen, in Perspectives on Thinking, Judging, and Decision Making
 Lack of precision about the underlying uncertainty
 Lack of accuracy about the possible values in the
uncertainty probability distributions
 Undiscovered Biases used in defining the range of
possible outcomes of project processes
 Natural variability from uncontrolled processes
 Undefined probability distributions for project
processes and technology
 Unknowability of the range of the probability
distributions
 Absence of information about the probability
distributions
4
Sources of Uncertainty
14. Risk
5
Uncertainties are things we cannot be certain about.
Uncertainty is created by incomplete knowledge, not ignorance.
14. Risk
 When we say uncertainty, we speak about a
future state of an external system that is not
fixed or determined
 Uncertainty is related to three aspects of our
program management domain:
– The external world – the activities of the program
– Our knowledge of this world – the planned and
actual behaviors of the program
– Our perception of this world – the data and
information we receive about these behaviors
6
Some words about Uncertainty
14. Risk
 Risk has two dimensions
– The degree of possibility that an event will take
place or occur sometime in the future
– The consequences of that event, once it has
occurred
 The degree of possibility is qualified as the
Probability of Occurrence
 The consequences are usually taken to be
undesirable and qualified as the magnitude of
harm and the remaining probability of a
recurrence of the same risk
7
Some Words About the Risk
Resulting from the Uncertainty
14. Risk
 Naturally occurring uncertainty
and its resulting risk, impacts
the probability of a successful
outcome
What is the probability of
making a desired completion
date or cost target?
8
All Program Activities have
Naturally Occurring Uncertainty
 The statistical behavior of these activities, their
arrangement in a network of activities, and correlation
between their behaviors creates risk
 Adding margin protects the outcome from the impact of
this naturally occurring uncertainty
14. Risk
 Uncertainty is present when probabilities
cannot be quantified in a rigorous or valid
manner, but can be described as intervals within
a probability distribution function (PDF)
 Risk is present when the uncertainty of the
outcome can be quantified in terms of
probabilities or a range of possible values
 This distinction is important for modeling the
future performance of cost, schedule, and
technical outcomes of a program
9
Relationship between
Uncertainty and Risk
14. Risk
TWO TYPES OF UNCERTAINTY IN OUR
PROGRAM MANAGEMENT DOMAIN
Uncertainty about which we can gather more knowledge – Epistemic
 These are Event based uncertainties
 There is a probability that something will happen in the future
 We can state this probability of the event, and do something about
reducing this probability of occurrence
Uncertainty about which we cannot gather more knowledge – Aleatory
 These are Naturally occurring Variances in the underlying processes
of the program
 These are variances in work duration, cost, technical performance
 We can state the probability range of these variances
10
14.1
14. Risk
 Aleatory (stochastic, Type A) uncertainties are
those that are random in nature and are
therefore irreducible
 Epistemic (subjective, Type B) uncertainties
are knowledge-based and are reducible by
further effort
 Separating these classes helps in design of
assessment calculations and in presentation of
results for the integrated program risk
assessment
11
Aleatory and Epistemic Uncertainty
14. Risk
 Nuclear regulatory guidance in the UK makes a
distinction between uncertainties that,
– Can be reliably quantified
– Cannot be reliably quantified
 An uncertainty cannot be reliably quantified if,
– It is not possible to acquire relevant data, or
– If acquiring enough data to evaluate it statistically
could only be done at disproportionate cost
 Quantifiable uncertainties – numerical risk
assessment
 Unquantifiable uncertainties – separate
consideration
12
An Alternative Classification
14. Risk
 Scenario uncertainty
– What might happen in the future?
 Modeling uncertainty
– Have we understood the system correctly, and
have we implemented this understanding
adequately in our numerical model?
 Uncertainty in values assigned to variables
(parameter uncertainty)
– Have we given suitable values to the variables in
our model?
13
Another Perspective On
Uncertainty
14. Risk
 Precision – how small is the variance of the estimates
 Accuracy – how close is the estimate to the actual
values
 Bias – what impacts on precision and accuracy come
from the human judgments (or misjudgments)
14
Measurement Uncertainty
[Figure: four target diagrams illustrating the combinations of high and low accuracy with high and low precision]
14. Risk
 Credible estimates of program variables
require both Accuracy and Precision
15
Precision and Accuracy
14. Risk
 Good measurements are both precise and
accurate
 It is easier to work with data that are
imprecise (broad variance) than with data that
are inaccurate (not close to the actual values)
 It’s the Measurement Bias that is difficult to
detect
16
Measurement Uncertainty
14. Risk
 Variability is an inherent property of natural
systems
 Variability is not always the same as
uncertainty
 We may need a ‘representative’ value for our
calculations – introduces uncertainty
 Statistical techniques can be used to describe
variability
17
Variability
14. Risk
 We cannot be certain about most things on the
program
 Failure to reduce uncertainty has economic costs
that may be very large
 People (government, regulators, and the public)
do not like uncertainty – it has a social cost as
well as time and money
 Response to uncertainty and the resulting risk is
not always rational
 It is not always possible to manage and
communicate something that is not understood
18
Why Start with Uncertainty?
14. Risk
 Cost
 Schedule
 Capacity for
work
 Productivity
 Quality of
results
 Activity
correlation
19
Naturally Occurring Uncertainty
in the IMS Creates Risk
With the naturally occurring uncertainty of -5% to +20% in
our work effort durations, we have an 80% confidence of
completing on or before our target date – PP&C speaking to PM
14. Risk
 Knowing the underlying
statistics of the past, and
a model of the behavior,
we can forecast the
probability of the future
behavior.
20
Events have an Uncertainty of
Occurring and they Create Risk
 Improving our knowledge with better data can
be used for better models,
– Improves the forecast of the probability of impact
– Reduces damage through better preparation at a
lower cost
14. Risk
 Given that each outcome in the sample space Ω
is equally likely, the probability of an event
A is
21
The Probability of the Occurrence
of an Event is …
P(A) = |A| / |Ω|
14. Risk
The Probability of a future event
impacting the project creates risk
There is a 68% probability Hurricane Katrina will strike New Orleans in
the next 24 to 36 hours, with an 85% confidence.
Evacuate Now 22
14. Risk
ELICITING THE NATURALLY OCCURRING AND
EVENT BASED UNCERTAINTY VALUES
Discovering the uncertainties that then create risk is a process of
elicitation.
This process takes on many forms. The first is to look to the past to see
what went wrong before, how it was discovered, how it was handled,
and what we learned – Lessons Learned.
Next is the Subject Matter Expert approach: what can go wrong, if you
know how things work?
SMEs many times ignore the obvious.
23
14.2
14. Risk
 Starting with the WBS Dictionary
– What are we producing?
– What are the impediments to this effort?
– What can go wrong with the produced item?
– What are the responses to those impediments?
 Placing all these in the Risk Register
– What are their probabilities of occurrence?
– What are the impacts?
– What will it cost to handle the risk?
– What is the residual probability of occurrence after
the handling efforts?
24
Looking for Event Based
Uncertainty means …
14. Risk
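A minimal sketch (in Python, with hypothetical field names and assumed values – not the program's actual register format) of a Risk Register entry that captures the questions above: probability of occurrence, impacts, cost to handle, and the residual probability after handling.

```python
from dataclasses import dataclass

@dataclass
class RiskRegisterEntry:
    risk_id: str
    title: str
    wbs: str
    probability: float           # probability of occurrence before handling (assumed)
    cost_impact: float           # cost impact if the risk occurs, $
    schedule_impact_days: float  # schedule impact if the risk occurs, days
    handling_cost: float         # cost of the planned handling effort, $
    residual_probability: float  # probability remaining after the handling effort

    def expected_exposure(self) -> float:
        """Expected cost exposure before any handling effort."""
        return self.probability * self.cost_impact

    def residual_exposure(self) -> float:
        """Expected cost exposure after handling, including the handling cost."""
        return self.residual_probability * self.cost_impact + self.handling_cost


# Example entry; the probability and dollar values are illustrative only
risk = RiskRegisterEntry("038", "Center-of-Gravity Limits", "2.1.5",
                         probability=0.4, cost_impact=2_000_000,
                         schedule_impact_days=30, handling_cost=250_000,
                         residual_probability=0.1)
print(f"Exposure before handling ${risk.expected_exposure():,.0f}, "
      f"after handling ${risk.residual_exposure():,.0f}")
```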
 Staffing
 Funding
 Facilities
 Supply chain
 Regulatory and Government guidance
 Weather
 All the things you don’t have direct control over
25
Looking for Externalities that
create Uncertainty that drive Risk
14. Risk
 Variances in:
– Past performance
– Capacity for work
– Quality of the outcomes
– Performance variances
– Effectiveness variances
 Develop classes of these variances for application
to the IMS as Reference Classes and apply
these to the current work processes
26
Examining the Naturally Occurring
Uncertainties that Drive Risk
14. Risk
 Direct use of historical data
 Direct assignments or estimates
 Use of standard probability distributions –
Rayleigh, Weibull, Poisson – confirmed with
goodness-of-fit tests such as Kolmogorov-Smirnov
 Use of detailed modeling of phenomena and
processes, with event trees, fault trees and
Bayesian belief networks
 Monte Carlo simulation to obtain the
probabilities based on the models
27
Specifying a Probability Distribution for
both Natural and Event Uncertainty†
† Misconceptions of Risk, Terje Aven, University of Stavanger, Norway, John Wiley & Sons, 2010
Classical Inference and the Linear Model, Kendall's Advanced Theory of Statistics, Vol. 2A (6th ed.), Stuart, Ord, and Arnold, 1999.
But this probabilistic view does not capture everything about risk
14. Risk
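A minimal sketch (Python, with stand-in data) of the third bullet above: fit a standard distribution to historical durations, then check the fit with a Kolmogorov-Smirnov test. The use of scipy and the Weibull choice are assumptions, not part of the original material.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Stand-in for historical task durations pulled from past performance data
historical_durations = 5.0 + 10.0 * rng.weibull(2.0, size=200)

# Fit a Weibull distribution (location pinned just below the observed minimum)
shape, loc, scale = stats.weibull_min.fit(historical_durations, floc=4.9)

# Kolmogorov-Smirnov goodness-of-fit test against the fitted distribution
ks_stat, p_value = stats.kstest(historical_durations, "weibull_min",
                                args=(shape, loc, scale))
print(f"Weibull fit: shape={shape:.2f}, scale={scale:.2f}; KS p-value={p_value:.3f}")
```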
Terms used to separate the two
classes of uncertainty and their risks
 Aleatory Uncertainty† of an attribute must be addressed
in the Integrated Master Schedule (IMS) with schedule
and cost margin
 Epistemic Uncertainty‡ of an event must be addressed in
the Risk Register with risk retirement (mitigation) plans
placed in the IMS
 Risk events without planned retirement are assigned to
Management Reserve
 Aleatory risk can be modeled through Reference Class
Forecasting or past performance data to determine the
needed cost and schedule margin
28
† Naturally occurring variances in the underlying processes that cannot be removed
‡ Risk due to the lack of knowledge that can be reduced with further knowledge or specific actions
14. Risk
Clarity of Purpose for the
Risk Management Processes
29
14. Risk
 There are many terms used in risk
management that have common and
overlapping meanings
– Risk
– Uncertainty
– Probability
– Confidence
– Statistical percent
 Many times these words are used without
actually understanding what they mean
30
Terminology in Risk Management
14. Risk
 Not known for sure
 Not a precise value – varies in some way
 Absence of information
 Not possible to know
 Changeable
 Is a probabilistic process
31
What is Uncertainty?
14. Risk
 Why classify?
– Different types of uncertainties may require different
approaches to identify and manage
– Assessment context may require a particular
classification
– Separate assessment and / or presentation of
different types of uncertainty may aid understanding
 Various classifications are available for different
purposes
 Classifications are not unique or exhaustive
– Be aware of overlaps and omissions
32
Classifying Uncertainty
14. Risk
“Probability is the most important
concept in modern science,
especially as nobody has the
slightest notion of what it means.”
– Bertrand Russell, 1929
33
14. Risk
A QUICK PROCESS CHECK
With definitions of Naturally Occurring and Event Based uncertainty and
the related classes of risk they create, let's confirm our
understanding of these concepts before proceeding to put them to work.
34
14.3
14. Risk
A Quick Process Check
35
For example…
The probability of a leakage in a process plant is a risk.
This risk event is subject to uncertainty, but the risk
concept is restricted to the event ‘leakage’ – the
uncertainties and how people judge the uncertainties
constitute a different domain.
Risk Results from both
Natural Uncertainty and Probabilistic Events
14. Risk
The Defense Acquisition Guide
(DAG) says…
36
Risk is the measure of future uncertainties in achieving
program performance goals and objectives within
defined cost, schedule, and performance constraints.
Risk can be associated with all aspects of a program
(e.g., threat environment, hardware, software, human
interface, technology maturity, supplier capability,
design maturation, performance against plan) as these
aspects relate across the work breakdown structure
and Integrated Master Schedule.
14. Risk
1st Notion of Risk†
37
† The works of Alexander Budzier and Bent Flyvbjerg, University of Oxford, 2011
The causes for risks clearly lie in our
incomplete knowledge of the subject
matter, thus if a project establishes all
possible causes of risks they can be
managed away.
“It ain’t what you don’t know that gets you into trouble.
It’s what you know for sure that just ain’t so.”
– Mark Twain
This, of course, is simply not possible.
14. Risk
Some Classes of Risk
Risk Class The Risk Impact
Performance
The ability of a design to meet desired quality criteria and
the consequences of this risk
Schedule
The ability of a project to develop an acceptable design
within a span of time and the consequences of this risk
Cost
The ability of a project to develop an acceptable design
within a given budget and the consequences of this risk
Technology
Capability of technology to provide performance benefits
and the consequences of this risk
Business
Political, economic, labor, societal, or other factors in the
business environment and the consequences thereof
38
14. Risk
2nd Notion of Risk
39
Risk is derived from Uncertainty
There are two classes of uncertainty:
1. Natural variances in the underlying work processes
2. Missing knowledge about something that is going to happen
in the future
These two uncertainties are the source of two types of risk:
1. Aleatory uncertainty – naturally occurring uncertainty
defined in a probability density function (pdf) of possible
values that will impact a process
2. Epistemic uncertainty – event based uncertainty, defined
by a probability of occurrence, which impacts a process
14. Risk
Aleatory Uncertainty Drives Risk
40
Aleatory uncertainty (stochastic or random uncertainty) is the
inherent variation associated with a physical system or
environment under consideration.
Aleatory uncertainties can be singled out from other
uncertainties by their representation as distributed quantities
that take on values in an established or known range. The exact
values will vary by chance from unit to unit or time to time.
This random variability is characterized as an irreducible
uncertainty: new information cannot be obtained to reduce the
uncertainty; only margin can be used to offset these
uncertainties.
This randomness itself may be defined or qualified by the
underlying epistemic assumptions †
† “Ex-post identification and remedies of adverse effects,” Institute of Transport Economics (TØI), Norway, 27 September 2010
14. Risk
Epistemic Uncertainty Drives Risk
41
† Risk-informed Decision-making in the Presence of Epistemic Uncertainty, Didier Dubois and Dominique Guyonnet, International
Journal of General Systems 40(2), 2011, 145–167
Epistemic uncertainty is any lack of knowledge or information in
any phase or activity of the project.
This uncertainty and the resulting epistemic risk can be reduced
through testing, modeling, past performance assessments,
research, comparable systems and processes.
Epistemic uncertainty can be further classified into model,
phenomenological, and behavioral uncertainty.†
The probability of occurrence is the start of Event Based risk
management, but impacts, cost to mitigate, residual risk and its
impact, and cost to mitigate the residual risk must also be
considered before any credible risk management plan can be in place.
14. Risk
 Both Aleatory and Epistemic uncertainty exist for cost,
schedule, and technical performance
 Both these uncertainties create risk for the program
 Determining which type of uncertainty is involved is
straightforward …
– Variances in cost and schedule due to normal fluctuations
of the work processes that cannot be corrected with
management actions are Aleatory
– Event Based risks – a probabilistic occurrence of an
undesirable event and a probabilistic unfavorable
outcome after that occurrence – are Epistemic risks
In Our DoD domain …
Using the term uncertainty is not sufficient.
The resulting risk must be further categorized as being responsive to
new information or simply part of the normal operations of the program
14. Risk
43
Elements of Risk Modeling
 For a future building this is
aleatory
– No additional testing will
reduce the variability
 For an existing building it is
epistemic
– Testing can confirm the strength
of the installed product
Risk arises from Uncertainty in the random
variables of the program
 The compressive strength of concrete has a
range of uncertainty
14. Risk
Sources Of Risk Due To Uncertainty
Type | Description
Parameter | Exact values for experimental models are unknown
Structural | Model bias or model inconsistencies
Algorithmic | Numeric errors or approximations
Parametric | Variability in input values
Experimental | Observation errors
Interpolation | Extrapolation needed for lack of model data
Aleatory | Statistical uncertainty – the natural variability of the processes
Epistemic | Systematic uncertainty – information known in principle but not in practice
44
14. Risk
Risk Driver Relationship Processes
[Diagram: Risk Driver Relationship Processes – elements shown are Sources of Uncertainty, Epistemic Uncertainty (Event Based Risk), Aleatory Uncertainty, Reduce Ambiguity, Reduce Uncertainty, Residual Risk, Remaining Aleatory Uncertainty, Consequence of Uncertainty, and Severity of Consequences]
45
14. Risk
 Epistemic uncertainty results from gaps in
knowledge. For example, we can be uncertain of
an outcome because we have never used a
particular technology before.
– Such uncertainty is essentially a state of mind and
hence subjective.
 Aleatory uncertainty results from variability that
is intrinsic to the behavior of some systems. For
example, we can be confident regarding the long
term frequency of throwing sixes but remain
uncertain of the outcome of any given throw of
the die.
– This uncertainty can be objectively determined.
46
Some more background on
Aleatory and Epistemic risk
14. Risk
 Frequentist probability theory is used to analyze
systems that are subject to aleatory uncertainty
 Bayesian probability theory is used to analyze
epistemic uncertainty
 For most risk assessments there is both epistemic and
aleatory uncertainty
 But epistemic uncertainty is always significant due to
the novelty of the situation under assessment
 Standard Monte Carlo Simulation uses frequentist
probability theory to analyze risk and should only be
used for Aleatory Risks – standard variances in cost,
schedule, and technical performance
We will use both branches of
Probability Theory for Risk Management
The cardinal sin of risk management is applying frequentist (Monte
Carlo Simulation) probability to model epistemic uncertainty 47
14. Risk
 When Monte Carlo Simulation is used to model schedule risk,
the schedule uncertainties are being treated as if they are
aleatory, even though they may be predominantly epistemic
 Using standard Monte Carlo Simulation alone to analyze
schedule risk also requires unrealistic assumptions be made
about the correlations between the probabilities for the
individual outcomes
 In practice, correlations must be considered when analyzing
schedule risk
 These can be both positive and negative correlations
 As a result, Monte Carlo Simulation should be used
with care when the historical data of past performance is
incomplete
48
The core problem with Aleatory
Risk Management of Schedules
14. Risk
Identify the Reference Class variability from:
 Reference classes of similar past work
activities
 Establish the probability distribution for the
selected reference class for the parameter
that is being forecast
 Compare the specific set of activities with the
reference class distribution, to establish the
most likely outcome for the specific durations
assigned in the current project
49
How To Fix This Core Problem
14. Risk
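A minimal sketch (Python, with assumed historical numbers) of these three steps: build the reference class distribution from similar past work, then compare the current planned duration against it.

```python
import numpy as np

# Durations (days) of similar past work activities - assumed reference class data
reference_class = np.array([22, 25, 31, 28, 35, 27, 40, 26, 30, 33])
planned_duration = 24  # current deterministic estimate for the activity

p50, p80 = np.percentile(reference_class, [50, 80])
overrun_frequency = np.mean(reference_class > planned_duration)

print(f"Reference class P50 = {p50:.0f} d, P80 = {p80:.0f} d")
print(f"{overrun_frequency:.0%} of similar past activities exceeded the planned {planned_duration} d")
```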
 Every single thing or event has an indefinite
number of properties or attributes observable in
it, and might therefore be considered as
belonging to an indefinite number of different
classes of things – John Venn (1834 – 1923)†
 If we are asked to find the probability holding for
an individual future event, we must first
incorporate the event into a suitable reference
class. An individual thing or event may be
incorporated in many reference classes, from
which different probabilities will result – Hans
Reichenbach (1891 – 1953)‡
50
Reference Class Forecasting
† J. Venn, The Logic of Chance (2nd ed, 1876), p. 194
‡ H. Reichenbach, The Theory of Probability (1949), p. 374
14. Risk
LET’S BUILD A RISK INFORMED PMB
IN EIGHT STEPS
A Risk Informed PMB means that both Aleatory and Epistemic risk
mitigations are embedded in the PMB. For non-mitigated Epistemic risks,
Management Reserve must be in place to cover risks that are not being
mitigated in the IMS.
While DCMA would object, this Management Reserve needs to be
assigned to specific risks or classes of risk to assure that sufficient MR is
available and use is pre-defined.
51
14.4
14. Risk
Assemble a credible WBS and the Integrated Master
Plan / Integrated Master Schedule (IMP/IMS)
– WBS Dictionary says what will be built
– IMP Narrative says how, where, and what processes
are used to build it
Assess the aleatory uncertainties in the WBS and IMP
Adjust activity durations and sequence to create the
needed margin to handle the aleatory uncertainty
Assign schedule and cost margin to protect end item
deliverables
52
How to Build a Risk Adjusted IMS
in 8 Steps
0
1
2
3
14. Risk
Identify Event Based uncertainties from WBS
Dictionary and IMP Narratives
Assign these uncertainties to the Risk Register
Determine risk retirement plans and place them in
the IMS
Determine cost and schedule impacts of unmitigated
risks and develop Management Reserve
Assemble mitigated aleatory and epistemic
uncertainties with the unmitigated epistemic risk into
the Total Allocated Budget
53
Building a Risk Adjusted IMS in 8
Steps (Concluded)
4
5
6
7
8
14. Risk
Risks Identified with WBS
elements
 Each risk identified in the elicitation process
 WBS contained deliverables assigned to risk
retirement processes
 Risk waterfall defined by Program Event
ID | Risk Title | Initial Risk | Risk at IBR | Risk at PDR | Risk Type | WBS
038 Center-of-Gravity Limits 16 15 10 Technical 2.1.5
006 Gross Liftoff Weight 16 15 10 Technical 2.1.5
090 Flight & Mission-Critical Software Development Effort 16 11 10 Schedule 2.1.4
101 Unattended launch system design 16 12 8 Schedule 6.2.14
082 Achieving Component, Subsystem- & System Quals 15 14 11 Schedule 2.1.7
244 Vehicle Production timing 12 12 10 Schedule 6.5
095 Autonomous Rendezvous flight pattern design 12 10 9 Schedule 6.2.12
017 EMI Anti-Jam Protection System Development 12 10 7 Technical 6.2.5
243 Landing and Impact Attenuation 12 12 6 Technical 6.2.11
098 Recover/Landing System (RLS) Rigging Complexity 12 12 6 Technical 6.2.11
088 Qualification of EEE Parts 12 10 4 Schedule 2.1.9.3
091 Uncertain To Achieve Payload Mounting Limits 12 8 3 Schedule 604604
54
0
14. Risk
 Variances in duration and cost are applied to
the Most Likely values for the work activities
 Apply these variances in the IMS
 Model the outcomes using a Monte Carlo
Simulation tool
 The result is a model of the confidence of
completing on or before a date and at or
below a cost
55
Assess the Aleatory Uncertainties in the
WBS and IMS
1
14. Risk
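A minimal sketch (Python) of this step for a hypothetical three-activity serial network, using the -5%/+20% triangular spread quoted earlier; in practice a Monte Carlo tool is run against the full IMS network.

```python
import numpy as np

rng = np.random.default_rng(7)
most_likely = np.array([20.0, 35.0, 15.0])            # most-likely durations, days (assumed)
low, high = most_likely * 0.95, most_likely * 1.20    # the -5% / +20% spread

trials = 10_000
samples = rng.triangular(low, most_likely, high, size=(trials, len(most_likely)))
completion = samples.sum(axis=1)                      # serial network: durations add

target = 75.0
print(f"P(complete on or before {target:.0f} days) = {np.mean(completion <= target):.0%}")
print(f"80% confidence completion = {np.percentile(completion, 80):.1f} days")
```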
 Using the outcomes from the Monte Carlo
Simulation develop the needed schedule and cost
margin
 Place margin in front of key deliverables to
protect their commitment dates and costs
56
Adjust activity durations and sequence
to create the needed margin
2
[Figure: risk alternatives (Plan A / Plan B) in the IMS, each protected by a 5-day margin task. When Plan B, the identified risk alternative, uses 3 days of margin, downstream activities shift left 2 days and 2 days are added back to the margin task to bring the schedule back on track; Duration of Plan B < Plan A + Margin.]
14. Risk
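Continuing the sketch above with assumed numbers: the schedule margin placed in front of a key deliverable can be sized as the gap between the deterministic finish and the finish at the chosen confidence level.

```python
deterministic_finish = 70.0   # sum of most-likely durations, days (assumed)
p80_finish = 76.4             # 80th-percentile finish from the Monte Carlo run (assumed)
schedule_margin = p80_finish - deterministic_finish
print(f"Place {schedule_margin:.1f} days of margin ahead of the protected deliverable")
```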
 This margin is on baseline in the PMB
 Unused margin should be capable of being
shifted to the right to increase available
margin in future deliverables
57
Assign schedule and cost margin to
protect end item deliverables
3
[Figure: Plan A carries current and future margin; Plan B has a 30% probability of failure and a 70% probability of success. There is 80% confidence of completion with the current margin; Duration of Plan B ≤ Plan A + Margin.]
14. Risk
 These uncertainties are defined in the IMS
 They can be assigned to work activities
 Work can be assigned to reduce or retire the
risk associated with these uncertainties
58
Identify Event Based uncertainties from
WBS Dictionary and IMP Narratives
4
[Risk waterfall chart, DP048-TV-1029 – Risk ID: CEV-038, Center-of-Gravity Limits. The risk score (0–25) burns down from 2005 through 2012 across numbered retirement steps tied to program events: SDR, PDR, CDR, the LAS-1/2/3 test flights, the RRF-1 and RRF-2/3 test flights, and the ISS-1 flight.]
14. Risk
 Risks are connected to the WBS elements in
the IMS
59
Assign these Uncertainties to the Risk
Register
5
14. Risk
 With the identified risks and their mitigations,
create packages of work to reduce the risk
 Treat these risk reduction work activities as
standard work in the IMS
– Budget
– Measures of Performance
– Measures of Effectiveness
 Report progress of the risk retirement or risk
reduction activities in the program
performance measurement process
60
Determine risk retirement plans and
place them in the IMS
6
14. Risk
 For each element in the Risk Register – either
mitigated or unmitigated – have a model of the
impact on cost, schedule, or technical
performance
 Use this information to develop the needed
Management Reserve (MR) to be held outside
the Performance Measurement Baseline (PMB)
 For mitigated Epistemic Risks, model the needed
cost and schedule reserve for the work activities
just like the normal work activities
61
Determine cost and schedule impacts of
unmitigated risks and develop Management
Reserve
7
14. Risk
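A minimal sketch (Python, assumed risks and values) of sizing Management Reserve from unmitigated Risk Register entries: a Monte Carlo over event occurrence lets MR be set at a chosen confidence level rather than at the bare expected value.

```python
import numpy as np

rng = np.random.default_rng(11)
# (probability of occurrence, cost impact in $ if the risk occurs) - assumed values
unmitigated_risks = [(0.30, 1_500_000), (0.15, 4_000_000), (0.50, 600_000)]

trials = 20_000
total_impact = np.zeros(trials)
for probability, impact in unmitigated_risks:
    occurs = rng.random(trials) < probability     # Bernoulli draw per trial
    total_impact += occurs * impact

print(f"Expected-value MR: ${total_impact.mean():,.0f}")
print(f"80% confidence MR: ${np.percentile(total_impact, 80):,.0f}")
```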
 Aleatory risks and their cost and schedule
margins
 Mitigated Epistemic risks with their retirement
or reduction activities
 Unmitigated risks with cost and schedule
margin held in the Management Reserve
register
 All these costs and schedule impacts are rolled
up to the TAB
62
Assemble mitigated aleatory and epistemic
uncertainties with the unmitigated epistemic
risk into the Total Allocated Budget
8
14. Risk
RISK HANDLING STRATEGIES
Handling risk means dealing with the sources of risk and the
consequences of the risk when it comes true. Handling is a better term
than mitigation. Handling covers all the responses to the risk that results
from the underlying uncertainties – both aleatory and epistemic.
Handling plans describe the specific responses to reduce the uncertainties
– where possible – that create the risk. These can be funded on baseline or
held in Management Reserve. The irreducible uncertainties must be
handled through margins – schedule margin or cost margin.
63
14.5
14. Risk
Understanding Inputs is the first step for Risk
Management
 Risk Register Contents
 Probability of occurrence
 Probability of cost and schedule impact
 Impact measures and their variability
 Risk mitigation effectiveness
 Residual risk after mitigation
 Residual cost and schedule impact
64
We can’t Interpret the Results Without
Understanding the Inputs!
14. Risk
Components of Risk
 Risk is comprised of two core components.
– Threat – a circumstance with the potential to produce
loss.
– Consequence – the loss that will occur when a threat
is realized.
 With three Risk Statement structures that connect the Threat
and the Consequence:
Threat → Consequence
Probability → Impact
Cause → Effect
65
14. Risk
IF-THEN
Risk Statement
IF THEN
Risk 1 If we miss our next milestone.
Then the program will fail to
achieve its product, cost, and
schedule objectives.
Risk 2
If our subcontractor is late in
getting their modules
completed on time.
Then the program’s schedule
will slip.
Probability
66
1
14. Risk
CONDITION-CONCERN
Risk Statement
Condition Concern
Risk 1
Data indicates that some tasks
are behind schedule and
staffing levels may be
inadequate.
The program could fail to
achieve its product, cost, and
schedule objectives.
Risk 2
Our subcontractor has not
provided much information
regarding the status of its
tasks.
The program’s schedule could
slip.
Probability
67
2
14. Risk
CONDITION-EVENT-CONSEQUENCE
Risk Statement
Condition Event Consequence
Risk 1
Data indicates that
some tasks are
behind schedule
and staffing levels
may be inadequate.
We could miss our
next milestone.
The program will fail
to achieve its
product, cost, and
schedule objectives.
Risk 2
The subcontractor
has not provided
much information
regarding the status
of its tasks.
The subcontractor
could be late in
getting its modules
completed on time.
The program’s
schedule will slip.
Probability 68
3
14. Risk
Risk Handling Strategies
 Risk handling is the outcome of the risk
management strategy – they are not the same
 Risk Handling consists of:
– Assumption – understand what potential impacts may
occur and have resources available to deal with them
– Avoidance – make a change in the situation that
creates the risk
– Control or Mitigation – develop a proactive
implementation approach to reduce the risk
– Transfer – determine who (internally or externally) can
better handle the risk
69
14. Risk
Risk Analysis
70
[Figure: risk analysis ranking of items A–E, shown in three chart views]
14. Risk
[5×5 risk grid: Likelihood (1–5) vs. Consequence (1–5), shaded Low / Moderate / High; the plotted risks are listed below]
16. GLP compliance at BSL–4
USAMRIID required for The Animal
Rule
1. FDA requires additional toxicology
and/or ADME studies
2. FDA requires PK in pivotal animal
studies
17. Two Segment II tox studies in
non–rodent and/or Segment I and
Segment III studies required for
Category B label
18. FDA demands aerosol exposure
(i.e. viral challenge) experiments
be performed in nonhuman
primate efficacy studies [L/H]
10. Irreversible kidney toxicity is seen
in a subset of healthy volunteers at
therapeutic dose levels
11. Clinical trial enrolls more slowly
than expected.
12. Positive signal in QTc study
13. FDA requests clinical data in
Special Populations pre–licensure
14. FDA requests larger clinical safety
database than initially proposed
19. One of the pivotal animal efficacy
studies fails to achieve primary
clinical efficacy endpoint
20. No Observed Adverse Effect
Level is significantly lower than
expected [L/H]
3. Insufficient subunit purification at
vendor
4. Failure of purification equipment at
J–M
5. New impurities appear as a result of
scale up from 8L to 50L
6. Subunits or API temporarily
unavailable
7. Lot failures of subunits, API or drug
product
8. One or more manufacturers not
cGMP
15. Unsuccessful synthesis
scale–up from 50L to 300L
16. New impurities appear as a
result of scale up
Example Risk Summary Grid
71
14. Risk
 Poor Resolution
– can correctly and unambiguously compare only a small fraction (e.g., less than
10%) of randomly selected pairs of hazards.
– can assign identical ratings to quantitatively very different risks ("range
compression").
 Errors
– can mistakenly assign higher qualitative ratings to quantitatively smaller risks.
– For risks with negatively correlated frequencies and severities, provide no real
information
 Suboptimal Resource Allocation
– Allocation of risk mitigation resources cannot be based on the categories
provided by risk matrices
 Ambiguous Inputs and Outputs
– Categorizations of severity cannot be made objectively for uncertain
consequences.
– Inputs to risk matrices and resulting outputs require subjective interpretation
 Don’t provide time frames for the exposure, mitigations, and impacts
72
The Trouble with Risk Matrices†
† What’s Wrong with Risk Matrices, Tony Cox, Risk Analysis, Vol. 28, No 2, 2008
14. Risk
 Modeling random data is not the same as modeling
random processes
 Data modeling assumes convenient functional forms and
makes best fits to historical data
– Functional forms might be arbitrarily chosen
– Functional forms may have built-in bias
– Goodness of fit is the only criterion (and is not falsifiable)
– No theoretical justification is derived from the nature of the
process
 Data modeling considers only project outcomes; process
modeling considers how we get to the outcomes and
provides testable ideas
– Improve predictability and understanding by using knowledge of
the nature of the process to guide the modeling of random
processes
73
A Core Flaw of Risk Modeling
Actual projects have fat tail distributions†
† Fat Tailed Distributions For Cost And Schedule Risks, John Neatrour, SCEA, Jan 19, 2011
14. Risk
Three Mandatory Steps In
Successful Risk Management†
 A high quality project schedule
– Represents all work
– Logically linked
– No constraints
– Resource loaded
– Unbiased duration estimates
 A contingency-free cost estimate
– Items do not have padding built in to accommodate risk
– No below-the-line contingency included.
 Good quality risk data
– Qualitatively identified risks
– Probability and impact data
74
† Integrated Cost and Schedule Risk Analysis using Monte Carlo Simulation of a CPM Model, AACE International No. 57R-09
14. Risk
 Likelihood the project’s cost and schedule
targets can be met
 Time and cost margin needed to meet the risk
threshold
 Risk priorities to be handled to achieve
schedule and cost estimates
 Joint time and schedule analysis showing the
probability of meeting time and cost targets
jointly – the Joint Confidence Level (JCL)
75
Outputs of a Successful Risk
Management Process
14. Risk
 Risk workshop using a variety of identification
techniques, specific tools for risk categorization
and an explicit step that allocates each risk to a
single risk owner
 Meta‐language for describing risks that clearly
separates cause, risk event and effect
 Major review meetings at the start of every
project phase
 Information on risk status and response actions in
the Risk Register to record the risk status, date
and reason of exclusion
76
Basis for Good Risk Management
Outcomes
14. Risk
 Develop a project‐specific Risk Management
Plan (RMP)
 Plan, allocate and report explicitly on risk
responses and risk treatment actions
 Assign an internal project Risk Champion for
communication, control and monitoring
 Adequate use of range estimates in schedule
and cost forecasting, so that the impact of factors
influencing project forecasts and estimates is
minimized
77
Basis of Good Risk Management
Outcomes (Continued)
14. Risk
 Planning‐based Quantitative Risk Analysis to
support risk response planning, estimate
contingencies, compare alternatives,
optimize resource allocation, and show
the effectiveness of planned responses and
risk treatment actions
 Establish a “mature” risk culture
 Assure top management commitment
 Confirm everyone on the program is trained
78
Basis of Good Risk Management
Outcomes (Concluded)
14. Risk
Both Probabilistic Risk and Statistical
Uncertainty measures are needed
Statistical Uncertainty
 Naturally occurring (stochastic†)
variance in the work efforts or
cost
 Like the weather, these variances
are always there and are always
changing
 Uncertainty can be modeled with
a Monte Carlo Simulation tool
and Reference Class Forecasting
based on past performance
Probabilistic Risk Events
 Probability of an event occurring
in the future that results in an
unfavorable outcome
 When this event occurs the
consequence may be
probabilistic as well.
 Probability of occurrence and
impact are used to model the
cost and schedule
79
The natural statistical variation of the
project activities. Variance and impacts
need cost and schedule margin
There is a probability that something will
happen that impacts cost, schedule, and
technical performance of our deliverables
† Stochastic (from the Greek στόχος for aim or guess) is an adjective that refers to systems whose
behavior is intrinsically non-deterministic, sporadic, and categorically not intermittent (i.e. random).
14. Risk
Risk and Uncertainty
 In 1921 Frank Knight made the
distinction between risk
(randomness with knowable
probabilities) and uncertainty
(randomness with unknowable
probabilities).
 Today, these components of
uncertainty are termed aleatory
and epistemic uncertainties.
 Knight, F. H. (1921). Risk,
Uncertainty, and Profit. Boston:
Houghton Mifflin Company
80
14. Risk
Risk and Uncertainty
Risk stems from unknown
probability distributions
 A probabilistic event that when it
occurs has an unfavorable impact on
cost, schedule, and technical
performance – or some combination
 Risk events can be retired or mitigated
prior to their occurrence
 After mitigation or retirement, risk
events may still have a probability of
occurrence
 Expressed as an expected probability
of occurrence of an event
accompanied by undesirable
consequences
Uncertainty stems from known
probability distributions
 Uncertainty produces variation from
many small influences and yields a
range of cost and schedule values on a
particular activity
– Schedule Perturbations
– Budget Perturbations
– Re–work, and re–test phenomena
that naturally occur in the course of
work
 Uncertainties can be handled with
cost, schedule, and technical
performance Margin
81
Risk is Event Focused
There is a 15% chance our stir welding
process will result in faulty seams in the
combustion chamber of the ascent engine
Uncertainty creates the risk of an Event
In the past, our C&DH box development
efforts have a -5%/+15% variance. We need
a 12% buffer to protect our deliverable
14. Risk
The Meaning of Uncertainty
 Uncertainty in plain English is about the “lack of certainty”
– Uncertainty is about “variability” in relation to performance
measures like cost, duration, or quality
– Uncertainty is about “ambiguity” associated with a lack of this
clarity
 Known and unknown sources of bias and ignorance are about
how much effort it is worth expending to clarify the
situation
– This is the underlying process driving uncertainty
 As well, uncertainty arises from the basic processes of work
– This is Deming uncertainty
– It is the statistical “noise” in the work process
 Both of these sources of uncertainty impact cost and
schedule
– Trying to control the “noise” of this variance adds no value
– Trying to control the “lack of certainty” arising from ambiguity
and lack of clarity does have value
82
14. Risk
Speaking in “Uncertainty” Terms
 When we state a date it needs to be qualified
with one of two phrases
– A range of possible values
• The completion date for software requirements flow down
will be no later than March 13th and no earlier than February
12th
– A confidence on the desired or target value
• The software requirements flow down will be complete by
March 13th with 80% confidence
 The “risk adjusted” vocabulary must be
represented in the IMS as well
 Separating deterministic planning from
probabilistic planning is the starting point for
building a Risk Tolerant IMS 83
14. Risk
Planning in the Presence of
Uncertainty
 In the presence of uncertainty we need to speak
about how we can improve our confidence …
– As time passes the confidence intervals on an
estimate should improve, as shown in the next slide.
– This improvement can represent technical risk
reduction or programmatic risk reduction.
 But “risk tolerance” still needs to address the
unknown and unknowable risks in the
programmatic risk tolerance sense
– The IMS must show how these disruptive activities can
be tolerated without reducing the confidence in the
deterministic plan
84
14. Risk
Epistemic and Aleatory Uncertainty
Both Uncertainties Exist on Programs
 Aleatory – an inherent variation – a stochastic process –
associated with the physical system or an environment:
– For discrete variables – the duration of a work activity – the
randomness is parameterized by the probability of each possible
value
– For continuous variables – the mass of a space craft component –
the randomness is parameterized by the probability density
function
 Epistemic – probabilistic uncertainties that can be reduced by
obtaining knowledge of quantities or processes:
– For discrete random variables – the epistemic uncertainty is
modeled by alternative probability distributions
– For continuous random variables, the epistemic uncertainty is
modeled by alternative probability density functions.
85
14. Risk
Epistemic Uncertainty and Aleatory
Variability are both risk drivers†
Epistemic Uncertainty
 Epistemic uncertainty is the
scientific uncertainty due to limited
data and knowledge in the model
of the process
 Epistemic uncertainty can, in
principle, be eliminated with
sufficient study
 Epistemic (or internal) uncertainty
reflects the possibility of errors in
our general knowledge.
Aleatory Variability
 Aleatory uncertainties arise from
the inherent randomness of a
variable and are characterized by a
Probability Density Function
 The knowledge of experts cannot
be expected to reduce aleatory
uncertainty although their
knowledge may be useful in
quantifying the uncertainty
86
† Uncertainty in Probabilistic Risk Assessment: A Review, A. R. Daneshkhah
Randomness With Knowable Probabilities – Randomness With Unknowable Probabilities
The probability of occurrence can be defined
through a variety of methods. The outcome is
a probability of occurrence of the event
A Probability Density Function (PDF) generates
a collection of random variables used to
model durations and costs
14. Risk
Structure of Program Risks
87
Risk management in small construction projects, Kajsa Simu, Luleå University of Technology Department of Civil and
Environmental Engineering Division of Architecture and Infrastructure
14. Risk
Examples of Aleatory and Epistemic Risks –
both drive unfavorable outcomes on projects
 If a component were required to operate for 17 years with 90%
confidence during a flight to other planets, and it had only been
tested for 1 year, the evaluation of whether it meets the 90%
confidence requirement would have to include both aleatory
uncertainty (e.g., the possibility of a premature failure given a
known mean failure rate) and epistemic uncertainty (e.g.,
uncertainty in the mean failure rate due to the limited test
time).
 It is important to include both types of uncertainty in evaluating
the performance risk.
 It is also important to know the relative contribution of each
type of failure, since the former source of risk could not be
reduced by more testing (without design modification) but the
latter source could. 88
14. Risk
A Word of Caution
 Common approach is to not separate aleatory and
epistemic uncertainties and their resulting risks
– Represent epistemic uncertainty with a uniform probability
distribution
– For a quantity that is a mixture of aleatory and epistemic uncertainty,
use second-order probability theory
 It is slowly being recognized that the above
procedures (especially the first) can underestimate
uncertainty in:
– Physical parameters
– Geometry of a system
– Initial conditions
– Boundary conditions
– Scenarios and environments
The first approach can result in large underestimation of uncertainty in
system responses 89
14. Risk
Why Epistemic Uncertainty is a
major risk driver
 Epistemic uncertainty is presumed to be caused
by lack of knowledge or data
 The lack of knowledge part of the uncertainty can
be represented in the model by auxiliary non-
physical variables
 These variables capture information obtained
through the gathering of more data
 These auxiliary variables define statistical
dependencies – the correlations between the
uncertainties – in a clear and transparent manner
90
14. Risk
A Reminder Again of
Aleatory and Epistemic Risk
 The key difference between aleatory and
epistemic risk
– Aleatory uncertainties arise from possible
variations and random errors in the values of the
parameters and their estimates.
– Epistemic or ontological uncertainty can
potentially be reduced by improving our
knowledge
– Epistemic uncertainties are subjective and are
related to the lack of knowledge of the particular
process.
91
14. Risk
MODELING THE UNCERTAINTY THAT
IS THE SOURCE OF RISK
Many times the term Risk Mitigation is used to represent several actions
that are actually Risk Handling Strategies.
Mitigation is one strategy. Mitigation buys down the uncertainty and
reduces the risk from that uncertainty.
But another handling strategy is to ignore the uncertainty, transfer the
uncertainty and the risk to someone else, or simply accept that the
uncertainty is present and the resulting risk as well.
92
14.6
14. Risk
Taxonomy of Uncertainty
93
[Taxonomy diagram: Uncertainty branches into Aleatory and Epistemic; the elements shown are Natural Variability, Ambiguity, Ontological Uncertainty, Probabilistic Events, Probabilistic Impacts, and Periods of Exposure]
14. Risk
Another Taxonomy of Uncertainty
94
14. Risk
 Unknowns that differ each time the model of
the IMS is assessed
 Uncertainties the program controls staff
cannot do anything about
 Uncertainties that cannot be suppressed or
removed
 Risk is created when we have
– Not accounted for this natural variance in our plan
– Not provided sufficient buffer to protect the plan
from these naturally occurring variances.
95
Aleatory Uncertainty
14. Risk
 Systematic uncertainty
 Caused by things we know about in principle,
but don’t know about in practice
 Risk is created when we have:
– Not measured the quantity sufficiently accurately
– The model neglects certain effects
– The data is not available to quantify the risk
96
Epistemic Uncertainty
14. Risk
Dealing with Aleatory Uncertainty
and the Resulting Risk
 Aleatory uncertainty is expressed as process
variability
– Work effort variance
– Productivity variance
– Quality of product and resulting rework variance
 Aleatory risk is always expressed in relation to a
duration – a percentage of the duration
 The classical response to such variability is to
build a margin that reduces risk over the duration
This is the motivation for short Packages Of Work that
produce defined outcomes on fine grained boundaries 97
14. Risk
Dealing with Epistemic Uncertainty
and the Resulting Risk
 Reducing epistemic risk requires improving
our knowledge of the system of interest or
avoiding implementations that increase this
uncertainty
 Uncertainty introduced by design assumptions
are reduced by making all assumptions an
explicit part of the design – Technical
Performance Measures – and revisiting these
assumptions on a regular basis to confirm
they remain valid or whether they can be
removed and real data substituted
98
14. Risk
Sources of Epistemic Uncertainty
 Epistemic uncertainty is introduced every time an assumption
about the world in which the system is embedded is made
 The assumption could be made because of the lack of data
– Ontological uncertainty
 The assumption can be simplified to make the job easier
– Epistemic uncertainty
 Probability uncertainty – failure rates of components are epistemic
 Subjectivity of evaluation – an Epistemic risk when the likelihood of
a rare event is made with little or no empirical data
 Incompleteness problem – a major hazard or condition not
identified or a causal mechanism remains undetected
 Undetected design errors – introduce an ontological uncertainty
into the system's behavior
99
14. Risk
Monte Carlo Sampling used for
Aleatory Uncertainty Propagation
100
Network of activities → Duration distribution of work in the network → Probability of completing on or before a specific date
14. Risk
Monte Carlo Sampling used for
Epistemic Interval Propagation
101
Possible values of a parameter → Mass model of the vehicle → Possible outcomes from the model
14. Risk
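A minimal sketch (Python) of interval propagation through a toy mass model; the masses, limit, and the 5%–15% growth interval are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(5)
trials = 10_000

# Dry-mass growth factor is known only to lie somewhere between 5% and 15%
growth = rng.uniform(0.05, 0.15, trials)
dry_mass_kg, propellant_kg, mass_limit_kg = 2_400.0, 1_100.0, 3_700.0

total_mass = dry_mass_kg * (1.0 + growth) + propellant_kg
print(f"Possible total mass: {total_mass.min():.0f}-{total_mass.max():.0f} kg")
print(f"Fraction of the interval exceeding the {mass_limit_kg:.0f} kg limit: "
      f"{np.mean(total_mass > mass_limit_kg):.0%}")
```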
Duration uncertainty (Aleatory)
represented in the IMS baseline
 The independence or
dependency of each task
with others in the network,
greatly influences the
outcome of the total project
duration
 Understanding these
dependencies is critical to
assessing the credibility of
the IMS as well as the total
completion time
102
 Any path could be critical depending on the probability distributions
of the underlying task completion probability functions
We must know something about the
probability distributions of the work efforts
14. Risk
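A minimal sketch (Python, hypothetical two-path network) of why "any path could be critical": each Monte Carlo trial records which path drives the finish, giving a criticality estimate per path.

```python
import numpy as np

rng = np.random.default_rng(3)
trials = 10_000

# Path 1: two serial tasks; Path 2: one longer task. Triangular(low, mode, high), days.
path1 = rng.triangular(8, 10, 16, trials) + rng.triangular(12, 15, 22, trials)
path2 = rng.triangular(20, 26, 34, trials)

finish = np.maximum(path1, path2)   # the two paths merge at the finish milestone
print(f"Path 1 drives the finish in {np.mean(path1 >= path2):.0%} of trials")
print(f"Path 2 drives the finish in {np.mean(path2 > path1):.0%} of trials")
print(f"80% confidence finish = {np.percentile(finish, 80):.1f} days")
```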
Uncertainty in the IMS drives cost and
schedule as a Dynamic Network System
 The programmatic and planning dynamics act as a system
 The “system response” is the transfer function between input and
output
Inputs
Outputs
 Understanding this
transfer function is critical
to understanding the
dynamics of the program
– It is part of the stochastic
dynamic response to
disruptions in our plans
– “What if” really means
“what if” at this point in
the response curve of the
system
103
The response
curve is likely non-
linear as well,
requiring further
modeling of the
IMS dynamics
14. Risk
 When Monte Carlo is used to model schedule risk, the
schedule uncertainties are treated as aleatory, even
though they may be epistemic
 This is considered to be unrealistic and is known to give
biased results, but is used anyway
 The analysis of schedule risk requires assumptions to
be made regarding the correlations between the
probabilities for the individual outcomes:
– It is assumed there are no correlations or that they are all
of the same nature
– In practice, there are correlations to be considered when
analyzing schedule risk and they are of both a positive and
negative nature
104
Some More Words of Caution
14. Risk
Probability Distributions used for
modeling uncertainty
Distribution | Application
Uniform | Appropriate for uncertain quantities where the range can be established (maximum and minimum values can be defined) based on physical arguments, expert knowledge, or historical data. If the range of parameter values is large (greater than one order of magnitude), a log uniform distribution is preferred to a uniform one.
Triangular | When little relevant information exists, but extremes and most likely values are known, typically on the basis of subjective judgment. If the parameter values cover a wide range, a log triangular distribution is preferred.
Empirical | Useful when some relevant data exist but cannot be represented by any standard statistical distribution. A piecewise uniform (empirical) distribution is recommended in this case.
Normal | When a substantial amount of relevant data exists. Can represent errors due to additive processes. Useful for modeling symmetric distributions of many natural processes and phenomena. Often used as a “default” distribution for representing uncertainties.
Log normal | Useful as an asymmetrical model for a parameter that can be expressed as a quotient of other variables, so it is useful for representing physical quantities such as concentrations.
Poisson | Useful for describing the frequency of occurrence of random, independent events within a given time interval.
Beta | Often used to represent judgments about uncertainty, and for bounded, unimodal random parameters.
105
14. Risk
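A minimal sketch (Python, illustrative parameters) comparing two of the distributions in the table: a bounded triangular versus a lognormal with roughly the same central value, showing why the unbounded choice carries the fatter right tail noted elsewhere in this material.

```python
import numpy as np

rng = np.random.default_rng(13)
trials = 100_000

tri = rng.triangular(10, 12, 20, trials)                        # bounded: 10-20 days, mode 12
logn = rng.lognormal(mean=np.log(12), sigma=0.35, size=trials)  # unbounded right tail, median ~12

for name, x in (("Triangular", tri), ("Lognormal", logn)):
    print(f"{name:10s} P50={np.percentile(x, 50):5.1f}  "
          f"P95={np.percentile(x, 95):5.1f}  max={x.max():6.1f}")
```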
Deterministic versus Probabilistic
Planning at the Program Level
106
[Figure: program timeline from Aug 05 to Feb 08 through SRR, PDR, CDR, FRR, and ATLO. The deterministic Baseline Plan shows margin and risk margin ahead of the launch period (Oct 07 – Jun 08, with Ready Early / Launch Period / Missed Launch Period zones). The current plan with risks is the stochastic schedule, shown as a probability distribution with its 20%, mean, and 80% confidence points; the distribution varies as time passes.]
14. Risk
 In 1979, Tversky and Kahneman proposed an alternative to
Utility theory. Prospect theory asserts that people make
predictably irrational decisions.
 The way that a choice of decisions is presented can sway a
person to choose the less rational decision from a set of
options.
 Once a problem is clearly and reasonably presented, rarely
does a person think outside the bounds of the frame.
 Source:
– “The Causes of Risk Taking By Project Managers,” Proceedings of
the Project Management Institute Annual Seminars &
Symposium November 1–10, 2001, Nashville, Tennessee
– Tversky, Amos, and Daniel Kahneman. 1981. The Framing of
Decisions and the Psychology of Choice. Science 211 (January
30): 453–458
107
Sobering Facts About Naïve Use of
Three Point Estimates
14. Risk
 Building a risk tolerant IMS
– Explicit technical risk mitigation must be embedded in the IMS
– Explicit schedule margin must be embedded in the IMS
• Margin values identified through Monte Carlo simulations
• Margin assigned in front of gating events
– Technical risks connected to Risk Register in some form
– Cost and Schedule risks connected in the IMS and a modeling
tool
 Assessing the Risk Tolerant IMS – what does risk tolerant
mean?
– Weekly status, monthly Earned Value, forecast of risk impacts
– Weekly Monte Carlo assessment of confidence intervals and
their historical changes – are we getting better or worse?
– Performance forecast based on likelihood outcomes from
Monte Carlo simulations, not just “adding up the numbers”
108
Actionable Outcomes for Credible
Risk Management
14. Risk
 Forward looking – leading indicators reveal
opportunities for corrective actions
 Trending information must forecast outcomes
– Cost trends
– Schedule trends
– Performance trend
– Risk trends
 EAC / ECD driven forecasts from past
performance, trends, and actions to control
trends
109
Risk Register Based Decision
Making processes of the IMP/IMS
14. Risk
 Some simple steps to identifying risk opportunities in the IMS
– Scenario based planning – “what if this happens?”
– Event impact planning – “what inhibits success?”
 Both must focus on the consequences in order to identify the
mitigations
110
Implementing Programmatic Risk
Assessment is Straightforward
Initiating Event Selection → Scenario Development → Scenario Logic Modeling → Scenario Frequency Modeling → Consequence Modeling → Risk Integration
14. Risk
 DoD Guidance
– DAU “Risk Management Guide for DoD
Acquisition”
– Air Force, “Acquisition Risk
Management”
– Air Force “SMC Systems Engineering
Primer and Handbook”
111
Continuous Risk Management
(CRM) is required
CRM Activity IMS Representation
Identify Risk items with IMP/IMS #’s, CA/WP & resource assignments
Analyze Risk management responsibilities assigned
Plan Mitigation plans with durations and resource assignments
Track Status reported from Risk Management to IMS
Control Risk tasks reporting in weekly status process
Communicate IMS status reporting
14. Risk
112
Level | Likelihood
E | Near Certainty
D | Highly Likely
C | Likely
B | Low Likelihood
A | Not Likely

Level | Technical Performance | Schedule | Cost
A | Minimal or no consequence to technical performance | Minimal or no impact | Minimal or no impact
B | Minor reduction in technical performance or supportability | Able to meet key dates | Budget increase or unit production cost increases < **(1% of Budget)
C | Moderate reduction in technical performance or supportability with limited impact on program objectives | Minor schedule slip; able to meet key milestones with no schedule float | Budget increase or unit production cost increase < **(5% of Budget)
D | Significant degradation in technical performance or major shortfall in supportability | Program critical path affected | Budget increase or unit production cost increase < **(10% of Budget)
E | Severe degradation in technical performance | Cannot meet key program milestones; slip > X months | Exceeds budget increase or unit production cost threshold
This matrix must be built for each
category of risk (reference class).
The decision for each dimension
comes from Subject Matter
Experts and the Risk Management
team.
E
D
C
B
A
A B C D E
14. Risk
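A hedged illustration of turning the likelihood and consequence levels above into a risk score. The numeric level mapping, the multiplicative score, and the GREEN/YELLOW/RED bands below are assumptions for illustration only; on a real program these come from the Subject Matter Experts and the Risk Management team:

```python
# Assumed ordinal mapping of the qualitative levels (illustration only)
LIKELIHOOD = {"A": 1, "B": 2, "C": 3, "D": 4, "E": 5}   # Not Likely .. Near Certainty
CONSEQUENCE = {"A": 1, "B": 2, "C": 3, "D": 4, "E": 5}  # Minimal .. Severe

def risk_score(likelihood: str, consequence: str) -> int:
    """Simple multiplicative score; a real matrix uses SME-defined cell values."""
    return LIKELIHOOD[likelihood] * CONSEQUENCE[consequence]

def risk_band(score: int) -> str:
    """Assumed banding thresholds, illustrative only."""
    if score >= 15:
        return "RED"
    if score >= 6:
        return "YELLOW"
    return "GREEN"

score = risk_score("D", "C")    # Highly Likely, Moderate consequence
print(score, risk_band(score))  # 12 YELLOW under these assumed bands
```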
 Two functions of Event Based Risk Management
– Identification, recording, ranking, and reviewing risks,
mitigation, and response plans, and all associated risk
information
– Risk analysis to determine how risks affect cost, schedule, and
technical performance
 Notional categories of risk. If the risk happens …
– Duration and cost – we’re late and over budget
– Safety – an unsafe condition is created
– Legal – a litigation event is created
– Performance – a less than acceptable performance condition
results
– Technical – our product or service is noncompliant
– Environmental – the external environment is placed in an
unfavorable condition
113
Event Based Risk Management
14. Risk
 Known Unknowns – general uncertainties and
uncertain events that were identified and
quantified
 Biases – conscious or subconscious systematic
errors occurring when identifying and quantifying
general uncertainties and uncertain events
 Unknown Unknowns – factors that were missed,
including some types of organizational and
psychological bias when identifying general
uncertainties and uncertain events
114
Build the Event Based Risk Model†
† Chapman, C., Ward, S., 2003. Project Risk Management. Processes, Techniques and Insights, second ed. John Wiley & Sons, England
14. Risk
 It would be a rare occurrence if two risks were
not correlated in some way in a large program
 The correlation coefficient between X and Y is given by the standard definition ρ(X, Y) = Cov(X, Y) / (σ_X σ_Y), where Cov(X, Y) = E[(X − μ_X)(Y − μ_Y)] (a sketch of estimating it from samples follows this slide)
115
Risk Events Are Correlated
14. Risk
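A small sketch of the correlation idea on this slide, assuming two standard-normal uncertainties and a target correlation of 0.7 (both invented values): it generates correlated samples and estimates ρ(X, Y) = Cov(X, Y) / (σ_X σ_Y) from them.

```python
import math
import random

def correlated_pair(rho, n=50_000):
    """Generate n pairs (x, y) of standard normals with correlation rho."""
    pairs = []
    for _ in range(n):
        x = random.gauss(0.0, 1.0)
        z = random.gauss(0.0, 1.0)
        y = rho * x + math.sqrt(1.0 - rho * rho) * z   # induces Corr(X, Y) = rho
        pairs.append((x, y))
    return pairs

def pearson(pairs):
    """Sample correlation coefficient: Cov(X, Y) / (sigma_X * sigma_Y)."""
    n = len(pairs)
    mx = sum(x for x, _ in pairs) / n
    my = sum(y for _, y in pairs) / n
    cov = sum((x - mx) * (y - my) for x, y in pairs) / n
    sx = math.sqrt(sum((x - mx) ** 2 for x, _ in pairs) / n)
    sy = math.sqrt(sum((y - my) ** 2 for _, y in pairs) / n)
    return cov / (sx * sy)

pairs = correlated_pair(rho=0.7)
print(f"Target rho = 0.7, sample rho = {pearson(pairs):.3f}")
```

Ignoring this correlation in a large program understates the spread of joint cost and schedule outcomes.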
 Naturally occurring uncertainty drives cost and
schedule through uncontrolled variance
 Probabilistic events drive disruptions in the planned order of the work
 Both impact the EAC
– Cost and schedule variance can be handled
through margin for naturally occurring uncertainty
– Management Reserve can be used for probabilistic
events that occur within the scope of the program
116
Uncertainty and Risk Drive EAC
14. Risk
 Completion dates move to the right by
naturally occurring variance in work activity
durations
 Completion dates move to the right when
unmitigated uncertainties become issues
117
Uncertainty and Risk Drive ECD
14. Risk
 Break the process flow into small steps of clearly defined activities, modeling predecessors and successors
 Estimate
– Time duration of each step based on probable work time for each type of labor involved
– Yield statistics at each step – what fraction of a product’s output is expected to be compliant
 Define the rework loops if possible
 Combine step durations to obtain an estimate of the total time required to meet specific milestones
 Identify the Critical Path through the network – the path whose slips will delay the program (a minimal sketch of this roll-up follows this slide)
118
Analyzing the IMS for Risk
14. Risk
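A minimal sketch of combining step durations across a small activity network and identifying the driving path. The task names, durations, and dependencies are invented for illustration; a program IMS would supply the real network from the scheduling tool.

```python
# Hypothetical network: task -> (duration in days, list of predecessors)
network = {
    "spec":    (5,  []),
    "design":  (10, ["spec"]),
    "build":   (15, ["design"]),
    "test":    (8,  ["build"]),
    "docs":    (6,  ["design"]),
    "deliver": (2,  ["test", "docs"]),
}

def forward_pass(network):
    """Earliest finish for each task; assumes the network has no cycles."""
    finish = {}
    def ef(task):
        if task not in finish:
            dur, preds = network[task]
            finish[task] = dur + max((ef(p) for p in preds), default=0)
        return finish[task]
    for t in network:
        ef(t)
    return finish

def critical_path(network, finish):
    """Walk back from the latest-finishing task along the driving predecessors."""
    task = max(finish, key=finish.get)
    path = [task]
    while network[task][1]:
        task = max(network[task][1], key=finish.get)
        path.append(task)
    return list(reversed(path))

finish = forward_pass(network)
print("Total duration:", max(finish.values()), "days")
print("Critical path :", " -> ".join(critical_path(network, finish)))
```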
 Weight of components and subsystems
 Power, cooling, attitude control
 Integration and testing
 Data memory
 Number of source lines of code to be written
 Software testing complexity
 Special mission equipment
 Subcontract interrelationships
119
Technical Schedule Drivers
14. Risk
 The most likely estimate of the duration of a task is optimistic
 Tasks done in parallel take longer than planned (the sketch following this slide shows why)
 Task uncertainties are correlated
 Estimates of task duration uncertainty are too narrow
 Risk events are not included
120
Programmatic Schedule Drivers
14. Risk
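A brief sketch of the "tasks done in parallel take longer than planned" driver (merge bias), assuming three independent parallel paths that each have a 50/50 chance of meeting the planned finish; the durations and spread below are illustrative only.

```python
import random

def merge_bias(paths=3, planned=20.0, sigma=3.0, trials=20_000):
    """Probability the join milestone meets the plan when it must wait
    for the slowest of several independent, symmetric paths."""
    hits = 0
    for _ in range(trials):
        finish = max(random.gauss(planned, sigma) for _ in range(paths))
        if finish <= planned:
            hits += 1
    return hits / trials

print("One path on time : ~0.50")
print(f"Join of 3 on time: ~{merge_bias():.2f}")   # roughly 0.5 ** 3 = 0.125
```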
Task Durations Are Correlated†
Even Uncorrelated is Correlated
121
† David Voss, Project Schedule Risk Analysis, VOSE SOFTWARE BVBA
14. Risk
 An integrated tool is needed to connect the Event Based risk (Epistemic) with the variance uncertainty (Aleatory) in the IMS (a minimal sketch of this connection follows this slide)
 Risk Drivers must be modeled as well
 Management Reserve modeling is needed for the unmitigated Epistemic risk
 Schedule and Cost modeling is needed for the Aleatory risks created by duration and cost variances
122
Modeling Uncertainty and Risk
14. Risk
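A minimal sketch of connecting Aleatory duration variance and Epistemic risk events in one model, under assumed triangular durations, probabilities of occurrence, and impacts; none of these values come from an actual program.

```python
import random

# Aleatory: (low, most_likely, high) duration in days for each baseline task
baseline = [(10, 12, 18), (20, 24, 36), (15, 18, 28)]

# Epistemic: (probability of occurrence, schedule impact in days if it occurs)
risk_events = [(0.30, 15), (0.10, 40), (0.25, 8)]

def one_trial():
    # Aleatory variance on the planned work
    duration = sum(random.triangular(lo, hi, ml) for lo, ml, hi in baseline)
    # Epistemic events that may or may not occur in this trial
    for prob, impact in risk_events:
        if random.random() < prob:
            duration += impact
    return duration

results = sorted(one_trial() for _ in range(10_000))
print("50% confidence:", round(results[len(results) // 2]), "days")
print("80% confidence:", round(results[int(0.8 * len(results))]), "days")
```

The gap between the deterministic duration and the percentile values indicates how much schedule margin (for the Aleatory part) and Management Reserve (for the unmitigated Epistemic part) the plan should carry.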
 The least complex elicitation is the uncertainty of an event – its presence or absence
 The next level is when the event is resolved into more than two outcomes
 Sometimes the outcome is a numerical quantity with a large (possibly infinite) number of possible values
 For the last case we need a Probability Density
Function (PDF)
123
Eliciting Probability Distributions
14. Risk
 Eliciting this information is only one method of obtaining probabilities
 Historical data, generated by a stable process, can also be used to develop these probabilities
 Reference Class Forecasting uses such historical data to forecast classes of project activities and their Aleatory variance (a sketch of this follows this slide)
124
Eliciting Probability Distributions
(Concluded)
14. Risk
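A sketch of Reference Class Forecasting under an assumed reference class of past actual-to-planned duration ratios; the ratios below are invented, and a real reference class would come from comparable completed activities.

```python
# Hypothetical reference class: actual/planned duration ratios from past work
reference_class = [0.95, 1.00, 1.05, 1.08, 1.10, 1.15, 1.20, 1.25, 1.40, 1.60]

def percentile(data, p):
    """Nearest-rank percentile of a sample."""
    s = sorted(data)
    return s[min(len(s) - 1, int(p * len(s)))]

planned = 100  # planned days for a new activity in the same reference class
print(f"P50 forecast: {planned * percentile(reference_class, 0.50):.0f} days")
print(f"P80 forecast: {planned * percentile(reference_class, 0.80):.0f} days")
```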
 Probabilities should be informative
– Probabilities closer to 0.0 or 1.0 should be preferred to those closer to 0.5, as the more extreme probabilities provide greater certainty about the outcome of an event
 Probabilities should authentically represent uncertainty
– For events that are given an assessed probability of p, the relative frequency of occurrence of those events should approach p (the sketch following this slide checks this property against a record of outcomes)
125
Probabilities Must Have Desirable
Properties
14. Risk
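A brief sketch of checking whether assessed probabilities authentically represent uncertainty, using a Brier score against a record of outcomes; the assessed probabilities and outcomes below are invented for illustration.

```python
# (assessed probability, did the event actually occur?) -- invented history
history = [(0.9, True), (0.8, True), (0.7, False), (0.3, False),
           (0.2, False), (0.6, True), (0.1, False), (0.5, True)]

def brier_score(history):
    """Mean squared error between assessed probability and outcome (0 is perfect)."""
    return sum((p - (1.0 if occurred else 0.0)) ** 2
               for p, occurred in history) / len(history)

print(f"Brier score: {brier_score(history):.3f}")
```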
 The process of expressing knowledge in terms
of probabilities is not simple and is subject to
repeatable types of errors
 Representativeness heuristic – judging the probability of an event by how closely the evidence resembles the target event
 Availability heuristic – information that is easier to recall is given more weight in forming probability judgments
126
Heuristics and Biases in Forming
Probability Judgments
14. Risk
Risk Chains – Across The WBS
127
14. Risk
Risk Management Processes for
Program Management
 An approach to programmatic and technical risk
14. Risk
Risks in Risk Register connected to WBS
elements provide cost impact analysis
 Risk ID traceable to IMS for schedule impacts
 WBS elements collect cost impact of risk
 Risk handling strategies connected to IMP, IMS, WBS, SOW, and TPM measures (a hedged illustration of this linkage follows this slide)
14. Risk
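A hedged illustration of the Risk Register-to-WBS/IMS linkage described above; the record fields, IDs, and dollar values are assumptions for illustration, not a specific tool's schema.

```python
from dataclasses import dataclass
from collections import defaultdict

@dataclass
class RiskRecord:
    risk_id: str          # Risk Register ID
    wbs_element: str      # WBS element that collects the cost impact
    ims_task_ids: list    # IMS tasks holding the handling / retirement work
    probability: float    # probability of occurrence
    cost_impact: float    # cost impact if the risk occurs

register = [
    RiskRecord("R-012", "1.3.2", ["T-0450", "T-0461"], 0.30, 250_000),
    RiskRecord("R-019", "1.3.2", ["T-0510"],           0.10, 900_000),
    RiskRecord("R-027", "1.4.1", ["T-0720"],           0.25,  80_000),
]

# Expected cost exposure rolled up by WBS element
exposure = defaultdict(float)
for r in register:
    exposure[r.wbs_element] += r.probability * r.cost_impact

for wbs, value in sorted(exposure.items()):
    print(f"WBS {wbs}: expected exposure ${value:,.0f}")
```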
Connecting Risk Retirement with
the work activities in the IMS
130
 “Buying down” risk is planned in the IMS.
 MoE, MoP, and KPP defined in the work package for the critical measure – weight.
 If we can’t verify we’ve succeeded, then the risk did not get reduced. The risk may have gotten worse.
[Risk burn-down chart for Risk CEV-037 – Loss of Critical Functions During Descent. The planned risk score steps down from 24 toward 0 between 31 Mar 2005 and 1 Jul 2011 as the retirement activities complete, including force and moment wind tunnel testing, development and correlation of the analytical model, a focus splinter review, Block 1 and Block 5 wind tunnel testing, in-flight development tests, and a damaged TPS flight test. Legend: Solid = Linked, Hollow = Unlinked, Filled = Complete. Annotations mark the weight risk reduced from RED to YELLOW, then confirmed GREEN – ready to fly.]
14. Risk
Management Reserve Log (MRL) provides
the integrity for all changes to the PMB
 All changes authorized through the BCR process
 All impacts recorded in BCR and Management
Reserve impacts (ups and downs) recorded in the
same meeting
14. Risk
 Complex programs are characterized by uncertainty, non-linearity, and recursiveness, and are best viewed as dynamic and evolving systems.
 So why do we pretend they are predictable, definable, and fixed – and why do we use linear lifecycle models to manage them?
132
Risk in Complex Programs†
† Complexity in Defence Projects How Did We Get Here?, Concept Symposium 2010, Oscarsborg Norway. Mary McKinlay
14. Risk
The Final Notion of Risk
133
The causes for risks clearly lie in our
incomplete knowledge of the subject matter,
thus if a project establishes all possible
causes of risks they can be managed away.
And of course that is simply not possible
This puts the focus on discovering and dealing with Epistemic Risks
Aleatory Risks can be easily modeled with
Reference Class Forecasting using past
performance
14. Risk
Beware the Black Swan
134
14. Risk
  • 1. Managing in the Presence of Uncertainty and the Resulting Risk The naturally occurring uncertainties (Aleatory) in cost, schedule, and techncial performance can be modeled in a Monte Carlo Simulation tool. The Event Based uncertainties (Epistemic) require capture, modeling of their impacts, defining handling strategies, modeling the effectiveness of these handling efforts, and the residual risks, and their impacts of both the original risk and the residual risk on the program. The management of Uncertainties in cost, schedule, and technical performance; and the Event Based uncertainty and the resulting risk are both critical success factors for the programs. Risk Management starts with capturing Event Based Risks and their impacts, then with the modeling of the statistical uncertainty of the normal work. 1 “It is moronic to predict without first establishing an error rate for the prediction and keeping track of one’s past record of accuracy” — Nassim Nicholas Taleb, Fooled By Randomness 14 V8.5
  • 2. 2 Risk Management is How Adults Manage Projects – Tim Lister, IBM AleatoryEpistemic
  • 3.  Uncertainty creates the opportunity for risk  Reducing uncertainty may reduce risk  Two types of uncertainty† – One that can be reduced – One that cannot  A risk informed PMB starts with the WBS  8 steps are needed to build a risk informed PMB 3 Quick View of How to Manage in the Presence of Uncertainty and Risk 14. Risk Risk informed program performance management is the goal † Distinguishing Two Dimensions of Uncertainty, Craig Fox and Gülden Ülkumen, in Perspectives of Thinking, Judging, and Decision Making
  • 4.  Lack of precision about the underlying uncertainty  Lack of accuracy about the possible values in the uncertainty probability distributions  Undiscovered Biases used in defining the range of possible outcomes of project processes  Natural variability from uncontrolled processes  Undefined probability distributions for project processes and technology  Unknowability of the range of the probability distributions  Absence of information about the probability distributions 4 Sources of Uncertainty 14. Risk
  • 5. 5 Uncertainties are things we can not be certain about. Uncertainty is created by Incomplete knowledge; not Ignorance 14. Risk
  • 6.  When we say uncertainty, we speak about a future state of an external system that is not fixed or determined  Uncertainty is related to three aspects of our program management domain: – The external world – the activities of the program – Our knowledge of this world – the planned and actual behaviors of the program – Our perception of this world – the data and information we receive about these behaviors 6 Some words about Uncertainty 14. Risk
  • 7.  Risk has two dimensions – The degree of possibility that an event will take place or occur sometime in the future – The consequences of that event, once it has occurred  The degree of possibility is qualified as the Probability of Occurrence  The consequences are usually taken to be undesirable and qualified as the magnitude of harm and the remaining probability of a recurrence of the same risk 7 Some Words About the Risk Resulting from the Uncertainty 14. Risk
  • 8.  Naturally occurring uncertainty and its resulting risk, impacts the probability of a successful outcome What is the probability of making a desired completion date or cost target? 8 All Program Activities have Naturally Occurring Uncertainty  The statistical behavior of these activities, their arrangement in a network of activities, and correlation between their behaviors creates risk  Adding margin protects the outcome from the impact of this naturally occurring uncertainty 14. Risk
  • 9.  Uncertainty is present when probabilities cannot be quantified in a rigorous or valid manner, but can described as intervals within a probability distribution function (PDF)  Risk is present when the uncertainty of the outcome can be quantified in terms of probabilities or a range of possible values  This distinction is important for modeling the future performance of cost, schedule, and techncial outcomes of a program 9 Relationship between Uncertainty and Risk 14. Risk
  • 10. TWO TYPES OF UNCERTAINTY IN OUR PROGRAM MANAGEMENT DOMAIN Uncertainty that we can gather more knowledge is – Epistemic  These are Event based uncertainties  There is a probability that something will happen in the future  We can state this probability of the event, and do something about reducing this probability of occurrence Uncertainty that we can not gather more knowledge about – Aleatory  These are Naturally occurring Variances in the underlying processes of the program  These are variances in work duration, cost, technical performance  We can state the probability range of these variances 10 14.1 14. Risk
  • 11.  Aleatory (stochastic, Type A) uncertainties are those that are random in nature and are therefore irreducible  Epistemic (subjective, Type B) uncertainties are knowledge-based and are reducible by further effort  Separating these classes helps in design of assessment calculations and in presentation of results for the integrated program risk assessment 11 Aleatory and Epistemic Uncertainty 14. Risk
  • 12.  Nuclear regulatory guidance in the UK makes a distinction between uncertainties that, – Can be reliably quantified – Cannot be reliably quantified  An uncertainty cannot be reliably quantified if, – It is not possible to acquire relevant data, or – If acquiring enough data to evaluate it statistically could only be done at disproportionate cost  Quantifiable uncertainties – numerical risk assessment  Unquantifiable uncertainties – separate consideration 12 An Alternative Classification 14. Risk
  • 13.  Scenario uncertainty – What might happen in the future?  Modeling uncertainty – Have we understood the system correctly, and have we implemented this understanding adequately in our numerical model?  Uncertainty in values assigned to variables (parameter uncertainty) – Have we given suitable values to the variables in our model? 13 Another Perspective On Uncertainty 14. Risk
  • 14.  Precision – how small is the variance of the estimates  Accuracy – how close is the estimate to the actual values  Bias – what impacts on precision and accuracy come from the human judgments (or misjudgments) 14 Measurement Uncertainty  Accuracy  Precision  Accuracy  Precision  Accuracy  Precision  Accuracy  Precision 14. Risk
  • 15.  Credible estimates of program variables require both Accuracy and Precision 15 Precision and Accuracy 14. Risk
  • 16.  Good measurements are both precise and accurate  It is easier to work with data that are imprecise (broad variance) than with data that are inaccurate (not close to the actual values)  It’s the Measurement Bias that is difficult to detect 16 Measurement Uncertainty 14. Risk
  • 17.  Variability is an inherent property of natural systems  Variability is not always the same as uncertainty  We may need a ‘representative’ value for our calculations – introduces uncertainty  Statistical techniques can be used to describe variability 17 Variability 14. Risk
  • 18.  We cannot be certain about most things on the program  Failure to reduce uncertainty has economic costs that may be very large  People (government, regulators, and the public) do not like uncertainty – it has a social cost as well as time and money  Response to uncertainty and the resulting risk is not always rational  It is not always possible to manage and communicate something that is not understood 18 Why Start with Uncertainty? 14. Risk
  • 19.  Cost  Schedule  Capacity for work  Productivity  Quality of results  Activity correlation 19 Naturally Occurring Uncertainty in the IMS Creates Risk With the naturally occurring uncertainty between -5% to 20% in our work effort durations, we have an 80% confidence of completing on or before our target date – PP&C speaking to PM 14. Risk
  • 20.  Knowing the underlying statistics of the past, and a model of the behavior, we can forecast the probability of the future behavior. 20 Events have an Uncertainty of Occurring and they Create Risk  Improving our knowledge with better data can be used for better models, – Improves the forecast of the probability of impact – Reduces damage through better preparation at a lower cost 14. Risk
  • 21.  Given that each outcome in the sample space  is equally likely, the probability of an event A is 21 The Probability of the Occurrence of an Event is …   A P A   14. Risk
  • 22. The Probability of a future event impacting the project creates risk There is a 68% probability Hurricane Katrina will strike New Orleans in the next 24 to 36 hours, with an 85% confidence. Evacuate Now 22 14. Risk
  • 23. ELICITING THE NATURALLY OCCURRING AND EVENT BASED UNCERTAINTY VALUES Discovering the uncertainties that then create risk is a process of elicitation. This process takes on many forms. The first is to look to the past to see what went wrong before, how was that discovered, how as it handled, and what did we learn – Lessons Learned. Next is the Subject Matter Expert approach. What can go wrong if you know how things work. SME’s many times ignore obvious 23 14.2 14. Risk
  • 24.  Starting with the WBS Dictionary – What are we producing? – What are the impediments to this effort? – What can go wrong with the produced item? – What are the responses to those impediments?  Placing all these in the Risk Register – What are their probabilities of occurrence? – What are the impacts? – What will it cost to handle the risk? – What is the residual probability of occurrence after the handling efforts? 24 Looking for Event Based Uncertainty means … 14. Risk
  • 25.  Staffing  Funding  Facilities  Supply chain  Regulatory and Government guidance  Weather  All the thing you don’t have direct control over 25 Looking for Externalities that create Uncertainty that drive Risk 14. Risk
  • 26.  Variances in: – Past performance – Capacity for work – Quality of the outcomes – Performance variances – Effectiveness variances  Develop class of these variance for application to the IMS as Reference Classes and apply these to the current work processes 26 Examining the Naturally Occurring Uncertainties that Drives Risk 14. Risk
  • 27.  Direct use of historical data  Direct assignments or estimates  Use of standard probability distributions: Rayleigh, Weibull, Poisson, or Kolmogorov-Smirov tests  Use of detailed modeling of phenomena and processes, with event trees, fault trees and Bayesian belief networks  Monte Carlo simulation to obtain the probabilities based on the models 27 Specifying a Probability Distribution for both Natural and Event Uncertainty† † Misconceptions of Risk, Terje Aven, University of Stavanger, Norway, John Wiley & Sons, 2010 Classical Inference and the Linear Model. Kendall's Advanced Theory of Statistics. 2A (Sixth ed.), Stuart, Keith, and Steven, 1999. But this probabilistic view does not capture everything about risk 14. Risk
  • 28. Terms used to separate the two classes of uncertainty and their risks  Aleatory Uncertainty† of an attribute must be addressed in the Integrated Master Schedule (IMS) with schedule and cost margin  Epistemic Uncertainty‡ of an event must be addressed in the Risk Register with risk retirement (mitigation) plans placed in the IMS  Risk events without planned retirement are assigned to Management Reserve  Aleatory risk can be modeled through Reference Class Forecasting or past performance data to determine the needed cost and schedule margin 28 † Naturally occurring variances in the underlying processes that cannot be removed ‡ Risk due to the lack of knowledge that can be reduced with further knowledge or specific actions 14. Risk
  • 29. Clarity of Purpose for the Risk Management Processes 29 14. Risk
  • 30.  There are many terms used in risk management that have common and overlapping meanings – Risk – Uncertainty – Probability – Confidence – Statistical percent  Many times these words are used without actually understanding what they mean 30 Terminology in Risk Management 14. Risk
  • 31.  Not known for sure  Not a precise value – varies in some way  Absence of information  Not possible to know  Changeable  Is a probabilistic process 31 What is Uncertainty? 14. Risk
  • 32.  Why classify? – Different types of uncertainties may require different approaches to identify and manage – Assessment context may require a particular classification – Separate assessment and / or presentation of different types of uncertainty may aid understanding  Various classifications are available for different purposes  Classifications are not unique or exhaustive – Be aware of overlaps and omissions 32 Classifying Uncertainty 14. Risk
  • 33. “Probability is the most important concept in modern science, especially as nobody has the slightest notion of what it means.” – Bertrand Russell, 1929 33 14. Risk
  • 34. A QUICK PROCESS CHECK With definitions of Naturally Occurring and Event Based uncertainty and their creation of their related classes of risk, let’s confirm our understanding of these concepts before proceeding to put them to work. 34 14.3 14. Risk
  • 35. A Quick Process Check 35 For example… The probability of a leakage in a process plant is a risk. This risk event is subject to uncertainty, but the risk concept is restricted to the event ‘leakage’ – the uncertainties and how people judge the uncertainties constitute a different domain. Risk Results from both Natural Uncertainty and Probabilistic Events 14. Risk
  • 36. The Defense Acquisition Guide (DAG) says… 36 Risk is the measure of future uncertainties in achieving program performance goals and objectives within defined cost, schedule, and performance constraints. Risk can be associated with all aspects of a program (e.g., threat environment, hardware, software, human interface, technology maturity, supplier capability, design maturation, performance against plan,) as these aspects relate across the work breakdown structure and Integrated Master Schedule. 14. Risk
  • 37. 1st Notion of Risk† 37† The works of Alexander Budzier and Bent Flyvbjerg, University of Oxford, 2011 The causes for risks clearly lie in our incomplete knowledge of the subject matter, thus if a project establishes all possible causes of risks they can be managed away. “It ain’t what you don’t know that gets you into trouble. It’s what you know for sure that just ain’t so.” – Mark Twain This of course that is simply not possible 14. Risk
  • 38. Some Classes of Risk Risk Class The Risk Impact Performance The ability of a design to meet desired quality criteria and the consequences of this risk Schedule The ability of a project to develop an acceptable design within a span of time and the consequences of this risk Cost The ability of a project to develop an acceptable design within a given budget and the consequences of this risk Technology Capability of technology to provide performance benefits and the consequences of this risk Business Political, economic, labor, societal, or other factors in the business environment and the consequences thereof 38 14. Risk
  • 39. 2nd Notion of Risk 39 Risk is derived from Uncertainty There are two classes of uncertainty: 1. Natural variances in the underlying processes work processes 2. Missing knowledge about something that is going happen in the future These two uncertainties are the source of two type of risk 1. Aleatory uncertainty – naturally occurring uncertainty defined in a probability density function (pdf) of possible values that will impact a process 2. Epistemic uncertainty – event based uncertainty, defined by a probability of occurrence, which impacts a process 14. Risk
  • 40. Aleatory Uncertainty Drives Risk 40 Aleatory uncertainty (stochastic or random uncertainty) is the inherent variation associated with a physical system or environment under consideration. Aleatory uncertainties can be singled out from other uncertainties by their representation as distributed quantities that take on values in an established or known range. The exact values will vary by chance from unit to unit or time to time. This random variability is characterized as an irreducible uncertainty, new information can not be obtained to reduce the uncertainty, only margin can be used to offset these uncertainties. This randomness itself, may be defined or qualified by the underlying epistemic assumptions † † “Ex-post identification and remedies of adverse effects,” Institute of Transport Economics (TØI), Norway, 27 September 2010 14. Risk
  • 41. Epistemic Uncertainty Drives Risk 41 † Risk-informed Decision-making In The Presence Of Epistemic Uncertainty, Didier Dubois1, Dominique Guyonnet, "International Journal of General Systems 40, 2 (2011) 145-167 Epistemic uncertainty is any lack of knowledge or information in any phase or activity of the project. This uncertainty and the resulting epistemic risk can be reduced through testing, modeling, past performance assessments, research, comparable systems and processes. Epistemic uncertainty can be further classified into model, phenomenological, and behavioral uncertainty.† The probability of occurrence is the start of Event Based risk management, but impacts, cost to mitigate, residual risk and its impact, and cost to mitigate the residual risk must also be considered, but any credible risk management plan can be in place 14. Risk
  • 42.  Both Aleatory and Epistemic uncertainty exist for cost, schedule, and technical performance  Both these uncertainties create risk for the program  Determining which type of uncertainty is present is straightforward … – Variances in cost and schedule due to normal fluctuations of the work processes that cannot be corrected with management actions are Aleatory – Event Based risks from a probabilistic occurrence of an undesirable event, with a probabilistic unfavorable outcome after the occurrence, are Epistemic risks In Our DoD domain … Using the term uncertainty is not sufficient. The resulting risk must be further categorized as being responsive to new information or simply part of the normal operations of the program 14. Risk
  • 43. 43 Elements of Risk Modeling Risk arises from Uncertainty in the random variables of the program  The compressive strength of concrete has a range of uncertainty  For a future building this uncertainty is aleatory – No additional testing will reduce the variability  For an existing building it is epistemic – Testing can confirm the strength of the installed product 14. Risk
  • 44. Sources Of Risk Due To Uncertainty 44
Type – Description
Parameter – Exact values for experimental models are unknown
Structural – Model bias or model inconsistencies
Algorithmic – Numeric errors or approximations
Parametric – Variability in input values
Experimental – Observation errors
Interpolation – Extrapolation needed because of a lack of model data
Aleatory – Statistical uncertainty – the natural variability of the processes
Epistemic – Systematic uncertainty – information known in principle but not in practice
14. Risk
  • 45. Risk Driver Relationship Processes Reduce Ambiguity Reduce Uncertainty Residual Risk Consequence of Uncertainty Epistemic Uncertainty – Event Based Risk Remaining Aleatory Uncertainty Aleatory Uncertainty Severity of Consequences 45 Sources of Uncertainty 14. Risk
  • 46.  Epistemic uncertainty results from gaps in knowledge. For example, we can be uncertain of an outcome because we have never used a particular technology before. – Such uncertainty is essentially a state of mind and hence subjective.  Aleatory uncertainty results from variability that is intrinsic to the behavior of some systems. For example, we can be confident about the long term frequency of throwing sixes but remain uncertain of the outcome of any given throw of a die. – This uncertainty can be objectively determined. 46 Some more background on Aleatory and Epistemic risk 14. Risk
  • 47.  Frequentist probability theory is used to analyze systems that are subject to aleatory uncertainty  Bayesian probability theory is used to analyze epistemic uncertainty  For most risk assessments there is both epistemic and aleatory uncertainty  But epistemic uncertainty is always significant due to the novelty of the situation under assessment  Standard Monte Carlo Simulation uses frequentist probability theory to analyze risk and should only be used for Aleatory Risks – standard variances in cost, schedule, and technical performance We will use both branches of Probability Theory for Risk Management The cardinal sin of risk management is applying frequentist (Monte Carlo Simulation) probability to model epistemic uncertainty 47 14. Risk
  • 48.  When Monte Carlo Simulation is used to model schedule risk, the schedule uncertainties are being treated as if they are aleatory, even though they may be predominantly epistemic  Using standard Monte Carlo Simulation alone to analyze schedule risk also requires unrealistic assumptions to be made about the correlations between the probabilities for the individual outcomes  In practice, correlations must be considered when analyzing schedule risk  These correlations can be both positive and negative  As a result, Monte Carlo Simulation should be used with care when the historical data of past performance is incomplete 48 The core problem with Aleatory Risk Management of Schedules 14. Risk
  • 49. Identify the Reference Class variability from:  Reference classes of similar past work activities  Establish the probability distribution for the selected reference class for the parameter that is being forecast  Compare the specific set of activities with the reference class distribution, to establish the most likely outcome for the specific durations assigned in the current project 49 How To Fix This Core Problem 14. Risk
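To make the reference class steps above concrete, here is a minimal sketch in Python; the past-duration data and the 24-day planned duration are hypothetical illustrations, not figures from the source:

```python
# Minimal reference class sketch: position a planned duration within the
# empirical distribution of similar, completed activities (hypothetical data).
import numpy as np

past_durations = np.array([18, 22, 25, 27, 30, 31, 34, 40, 46, 55])  # days, similar past tasks
planned = 24                                                          # current deterministic estimate

p50, p80 = np.percentile(past_durations, [50, 80])
percentile_of_plan = (past_durations <= planned).mean()

print(f"Reference class P50 = {p50:.0f} days, P80 = {p80:.0f} days")
print(f"The {planned}-day plan sits at about the {percentile_of_plan:.0%} point of the reference class")
```

The gap between the planned value and the reference class P50/P80 is what drives the duration adjustment for the specific activities in the current project.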
  • 50.  Every single thing or event has an indefinite number of properties or attributes observable in it, and might therefore be considered as belonging to an indefinite number of different classes of things – John Venn (1834 – 1923)†  If we are asked to find the probability holding for an individual future event, we must first incorporate the event into a suitable reference class. An individual thing or event may be incorporated in many reference classes, from which different probabilities will result – Hans Reichenbach (1891 – 1953)‡ 50 Reference Class Forecasting † J. Venn, The Logic of Chance (2nd ed, 1876), p. 194 ‡ H. Reichenbach, The Theory of Probability (1949), p. 374 14. Risk
  • 51. LET’S BUILD A RISK INFORMED PMB IN EIGHT STEPS A Risk Informed PMB means that both Aleatory and Epistemic risk mitigations are embedded in the PMB. For non-mitigated Epistemic risks, Management Reserve must be in place to cover risks that are not being mitigated in the IMS. While DCMA would object, this Management Reserve needs to be assigned to specific risks or classes of risk to assure that sufficient MR is available and use is pre-defined. 51 14.4 14. Risk
  • 52. Assemble a credible WBS and the Integrated Master Plan / Integrated Master Schedule (IMP/IMS) – WBS Dictionary says what will be built – IMP Narrative says how, where, and what processes are used to build it Assess the aleatory uncertainties in the WBS and IMP Adjust activity durations and sequence to create the needed margin to handle the aleatory uncertainty Assign schedule and cost margin to protect end item deliverables 52 How to Build a Risk Adjusted IMS in 8 Steps 0 1 2 3 14. Risk
  • 53. Identify Event Based uncertainties from WBS Dictionary and IMP Narratives Assign these uncertainties to the Risk Register Determine risk retirement plans and place them in the IMS Determine cost and schedule impacts of unmitigated risks and develop Management Reserve Assemble mitigated aleatory and epistemic uncertainties with the unmitigated epistemic risk into the Total Allocated Budget 53 Building a Risk Adjusted IMS in 8 Steps (Concluded) 4 5 6 7 8 14. Risk
  • 54. Risks Identified with WBS elements 54  Each risk identified in the elicitation process  WBS contains deliverables assigned to risk retirement processes  Risk waterfall defined by Program Event
ID | Risk Title | Initial Risk | Risk at IBR | Risk at PDR | Risk Type | WBS
038 | Center-of-Gravity Limits | 16 | 15 | 10 | Technical | 2.1.5
006 | Gross Liftoff Weight | 16 | 15 | 10 | Technical | 2.1.5
090 | Flight & Mission-Critical Software Development Effort | 16 | 11 | 10 | Schedule | 2.1.4
101 | Unattended launch system design | 16 | 12 | 8 | Schedule | 6.2.14
082 | Achieving Component, Subsystem- & System Quals | 15 | 14 | 11 | Schedule | 2.1.7
244 | Vehicle Production timing | 12 | 12 | 10 | Schedule | 6.5
095 | Autonomous Rendezvous flight pattern design | 12 | 10 | 9 | Schedule | 6.2.12
017 | EMI Anti-Jam Protection System Development | 12 | 10 | 7 | Technical | 6.2.5
243 | Landing and Impact Attenuation | 12 | 12 | 6 | Technical | 6.2.11
098 | Recover/Landing System (RLS) Rigging Complexity | 12 | 12 | 6 | Technical | 6.2.11
088 | Qualification of EEE Parts | 12 | 10 | 4 | Schedule | 2.1.9.3
091 | Uncertain To Achieve Payload Mounting Limits | 12 | 8 | 3 | Schedule | 604604
0 14. Risk
  • 55.  Variances in duration and cost are applied to the Most Likely values for the work activities  Apply these variances in the IMS  Model the outcomes using a Monte Carlo Simulation tool  The result is a model of the confidence of completing on or before a date and at or below a cost 55 Assess the Aleatory Uncertainties in the WBS and IMS 1 14. Risk
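A minimal sketch of this step, assuming a tiny illustrative network (A, then B and C in parallel, then D) with hypothetical three-point duration estimates; a real IMS model would use the program's own activities and a commercial simulation tool:

```python
# Minimal Monte Carlo sketch of aleatory duration uncertainty in a small
# network: A, then B and C in parallel, then D. Three-point estimates are hypothetical.
import numpy as np

rng = np.random.default_rng(1)
N = 10_000
three_point = {           # (low, most likely, high) durations in days
    "A": (8, 10, 15),
    "B": (18, 20, 30),
    "C": (12, 15, 28),
    "D": (4, 5, 9),
}
samples = {k: rng.triangular(*v, size=N) for k, v in three_point.items()}

# Project duration: A, then the longer of the parallel B/C branch, then D
total = samples["A"] + np.maximum(samples["B"], samples["C"]) + samples["D"]

for p in (50, 70, 80):
    print(f"P{p} completion duration: {np.percentile(total, p):.1f} days")
print(f"Probability of finishing within 55 days: {(total <= 55).mean():.0%}")
```

The percentile outputs are the confidence levels the next step uses to size the schedule and cost margin.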
  • 56.  Using the outcomes from the Monte Carlo Simulation develop the needed schedule and cost margin  Place margin in front of key deliverables to protect their commitment dates and costs 56 Adjust activity durations and sequence to create the needed margin 2 (Figure: two identified risk alternatives in the IMS, each with a 5-day margin task ahead of Plan A / Plan B; when 3 days of margin are used, downstream activities shift left 2 days and 2 days are added back to the margin task to bring the schedule back on track, keeping the duration of Plan B < Plan A + Margin.) 14. Risk
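Continuing the sketch above, the margin placed in front of the deliverable can be sized as the gap between the deterministic (most likely) path length and the chosen confidence level; the 80% target is an assumption for illustration:

```python
# Margin sizing sketch: gap between the deterministic path and the P80 value.
# The simulated durations are regenerated here so the snippet runs on its own.
import numpy as np

rng = np.random.default_rng(1)
N = 10_000
A = rng.triangular(8, 10, 15, N); B = rng.triangular(18, 20, 30, N)
C = rng.triangular(12, 15, 28, N); D = rng.triangular(4, 5, 9, N)
total = A + np.maximum(B, C) + D

deterministic = 10 + max(20, 15) + 5                  # sum of most-likely values on the longer branch
margin = np.percentile(total, 80) - deterministic
print(f"Schedule margin to hold in front of the deliverable: {margin:.1f} days at 80% confidence")
```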
  • 57.  This margin is on baseline in the PMB  Unused margin should be capable of being shifted to the right to increase available margin for future deliverables 57 Assign schedule and cost margin to protect end item deliverables 3 (Figure: Plan A with a 70% probability of success and 30% probability of failure, with current and future margin tasks placed so there is 80% confidence of completion with the current margin; Plan B's duration is compared against Plan A plus margin.) 14. Risk
  • 58.  These uncertainties are defined in the IMS  They can be assigned to work activities  Work can be assigned to reduce or retire the risk associated with these uncertainties 58 Identify Event Based uncertainties from WBS Dictionary and IMP Narratives 4 (Figure: risk waterfall for Risk ID CEV-038 – Center-of-Gravity Limits – showing the planned risk score stepping down from 2005 through 2012 against program events such as SDR, PDR, CDR, the LAS and RRF test flights, and the ISS-1 flight.) 14. Risk
  • 59.  Risks are connected to the WBS elements in the IMS 59 Assign these Uncertainties to the Risk Register 5 14. Risk
  • 60.  With the identified risks and their mitigations, create packages of work to reduce the risk  Treat these risk reduction work activities as standard work in the IMS – Budget – Measures of Performance – Measures of Effectiveness  Report progress of the risk retirement or risk reduction activities in the program performance measurement process 60 Determine risk retirement plans and place them in the IMS 6 14. Risk
  • 61.  For each element in the Risk Register – either mitigated or unmitigated – have a model of the impact on cost, schedule, or technical performance  Use this information to develop the needed Management Reserve (MR) to be held outside the Performance Measurement Baseline (PMB)  For mitigated Epistemic Risks, model the needed cost and schedule reserve for the work activities just like the normal work activities 61 Determine cost and schedule impacts of unmitigated risks and develop Management Reserve 7 14. Risk
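A minimal sketch of how unmitigated Event Based risks can be rolled into a Management Reserve figure; the register entries (probabilities and three-point cost impacts) are hypothetical placeholders for real Risk Register data:

```python
# Minimal sketch of sizing Management Reserve from unmitigated epistemic risks:
# each register entry is a Bernoulli occurrence times a cost impact (hypothetical data).
import numpy as np

rng = np.random.default_rng(7)
N = 20_000
register = [   # (probability of occurrence, (low, most likely, high) cost impact in $K)
    (0.30, (100, 250, 600)),
    (0.15, (400, 800, 2000)),
    (0.50, (50, 120, 300)),
]

total_impact = np.zeros(N)
for prob, (lo, mode, hi) in register:
    occurs = rng.random(N) < prob
    total_impact += occurs * rng.triangular(lo, mode, hi, size=N)

print(f"Expected value of unmitigated risk: ${total_impact.mean():,.0f}K")
print(f"MR at 80% confidence:               ${np.percentile(total_impact, 80):,.0f}K")
```

Sizing MR at a percentile of the simulated total, rather than at the simple sum of probability times impact, keeps the reserve consistent with the confidence level used elsewhere in the baseline.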
  • 62.  Aleatory risks and their cost and schedule margins  Mitigated Epistemic risks with their retirement or reduction activities  Unmitigated risks with cost and schedule margin held in the Management Reserve register  All these costs and schedule impacts are rolled up to the TAB 62 Assemble mitigated aleatory and epistemic uncertainties with the unmitigated epistemic risk into the Total Allocated Budget 8 14. Risk
  • 63. RISK HANDLING STRATEGIES Handling risk means dealing with the sources of risk and the consequences of the risk when it is realized. Handling is a better term than mitigation. Handling covers all the responses to the risk that results from the underlying uncertainties – both aleatory and epistemic. Handling plans describe the specific responses to reduce the uncertainty – if possible – that creates the risk. These can be funded on baseline or held in Management Reserve. The irreducible uncertainties must be handled through margins – schedule margin or cost margin. 63 14.5 14. Risk
  • 64. Understanding the Inputs is the first step for Risk Management  Risk Register Contents  Probability of occurrence  Probability of cost and schedule impact  Impact measures and their variability  Risk mitigation effectiveness  Residual risk after mitigation  Residual cost and schedule impact 64 We can’t Interpret the Results Without Understanding the Inputs! 14. Risk
  • 65. Components of Risk  Risk is comprised of two core components. – Threat – a circumstance with the potential to produce loss. – Consequence – the loss that will occur when a threat is realized.  There are 3 Risk Statement structures that connect the Threat and the Consequence: Threat → Consequence, Probability → Impact, Cause → Effect 65 14. Risk
  • 66. IF-THEN Risk Statement 66 1
Risk 1 – IF: We miss our next milestone. THEN: The program will fail to achieve its product, cost, and schedule objectives.
Risk 2 – IF: Our subcontractor is late in getting their modules completed on time. THEN: The program’s schedule will slip.
(Probability is associated with the IF condition.)
14. Risk
  • 67. CONDITION-CONCERN Risk Statement 67 2
Risk 1 – Condition: Data indicates that some tasks are behind schedule and staffing levels may be inadequate. Concern: The program could fail to achieve its product, cost, and schedule objectives.
Risk 2 – Condition: Our subcontractor has not provided much information regarding the status of its tasks. Concern: The program’s schedule could slip.
(Probability is associated with the Condition.)
14. Risk
  • 68. CONDITION-EVENT-CONSEQUENCE Risk Statement 68 3
Risk 1 – Condition: Data indicates that some tasks are behind schedule and staffing levels may be inadequate. Event: We could miss our next milestone. Consequence: The program will fail to achieve its product, cost, and schedule objectives.
Risk 2 – Condition: The subcontractor has not provided much information regarding the status of its tasks. Event: The subcontractor could be late in getting its modules completed on time. Consequence: The program’s schedule will slip.
(Probability is associated with the Event.)
14. Risk
  • 69. Risk Handling Strategies  Risk handling is the outcome of the risk management strategy – they are not the same  Risk Handling consists of: – Assumption – understand what potential impacts may occur and have resources available to deal with them – Avoidance – make a change in the situation that creates the risk – Control or Mitigation – develop a proactive implementation approach to reduce the risk – Transfer – determine who (internal or external) can better handle the risk 69 14. Risk
  • 70. Risk Analysis 70 14. Risk
  • 71. Example Risk Summary Grid 71 (Figure: a 5×5 Likelihood / Consequence grid with Low, Moderate, and High bands, populated with the example risks below.) 16. GLP compliance at BSL–4 USAMRIID required for The Animal Rule 1. FDA requires additional toxicology and/or ADME studies 2. FDA requires PK in pivotal animal studies 17. Two Segment II tox studies in non–rodent and/or Segment I and Segment III studies required for Category B label 18. FDA demands aerosol exposure (i.e. viral challenge) experiments be performed in nonhuman primate efficacy studies [L/H] 10. Irreversible kidney toxicity is seen in a subset of healthy volunteers at therapeutic dose levels 11. Clinical trial enrolls more slowly than expected. 12. Positive signal in QTc study 13. FDA requests clinical data in Special Populations pre–licensure 14. FDA requests larger clinical safety database than initially proposed 19. One of the pivotal animal efficacy studies fails to achieve primary clinical efficacy endpoint 20. No Observed Adverse Effect Level is significantly lower than expected [L/H] 3. Insufficient subunit purification at vendor 4. Failure of purification equipment at J–M 5. New impurities appear as a result of scale up from 8L to 50L 6. Subunits or API temporarily unavailable 7. Lot failures of subunits, API or drug product 8. One or more manufacturers not cGMP 15. Unsuccessful synthesis scale–up from 50L to 300L 16. New impurities appear as a result of scale up 14. Risk
  • 72.  Poor Resolution – can correctly and unambiguously compare only a small fraction (e.g., less than 10%) of randomly selected pairs of hazards. – can assign identical ratings to quantitatively very different risks ("range compression").  Errors – can mistakenly assign higher qualitative ratings to quantitatively smaller risks. – For risks with negatively correlated frequencies and severities, they provide no real information  Suboptimal Resource Allocation – Allocation of risk mitigation resources cannot be based on the categories provided by risk matrices  Ambiguous Inputs and Outputs – Categorizations of severity cannot be made objectively for uncertain consequences. – Inputs to risk matrices and resulting outputs require subjective interpretation  They don’t provide time frames for the exposure, mitigations, and impacts 72 The Trouble with Risk Matrices† † What’s Wrong with Risk Matrices, Tony Cox, Risk Analysis, Vol. 28, No 2, 2008 14. Risk
  • 73.  Modeling random data is not the same as modeling random processes  Data modeling assumes convenient functional forms and makes best fits to historical data – Functional forms might be arbitrarily chosen – Functional forms may have built-in bias – Goodness of fit is the only criterion (and is not falsifiable) – No theoretical justification is derived from the nature of the process  Data modeling considers only project outcomes; process modeling considers how we get to the outcomes and provides testable ideas – Improve predictability and understanding by using knowledge of the nature of the process to guide the modeling of random processes 73 A Core Flaw of Risk Modeling Actual projects have fat tail distributions† † Fat Tailed Distributions For Cost And Schedule Risks, John Neatrour, SCEA, Jan 19, 2011 14. Risk
  • 74. Three Mandatory Steps In Successful Risk Management†  A high quality project schedule – Represents all work – Logically linked – No constraints – Resource loaded – Unbiased duration estimates  A contingency-free cost estimate – Items do not have padding built in to accommodate risk – No below-the-line contingency included.  Good quality risk data – Qualitatively identified risks – Probability and impact data 74†Integrated Cost and Schedule Risk Analysis using Monte Carlo Simulation of a CPM Model, AACEI No. 57R-09 14. Risk
  • 75.  Likelihood the project’s cost and schedule targets can be met  Time and cost margin needed to meet the risk threshold  Risk priorities to be handled to achieve schedule and cost estimates  Joint time and schedule analysis showing the probability of meeting time and cost targets jointly – the Joint Confidence Level (JCL) 75 Outputs of a Successful Risk Management Process 14. Risk
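A minimal sketch of the Joint Confidence Level idea, assuming hypothetical cost and duration models in which cost is partly driven by duration; real JCL analyses are run on the program's integrated cost and schedule model:

```python
# Minimal Joint Confidence Level sketch: the fraction of iterations in which
# cost AND schedule both meet their targets. Models and targets are hypothetical.
import numpy as np

rng = np.random.default_rng(3)
N = 20_000
duration = rng.triangular(300, 340, 420, size=N)          # project duration, days
cost = 50 + 0.12 * duration + rng.normal(0, 4, size=N)    # $M, partly driven by duration

schedule_ok = duration <= 365
cost_ok = cost <= 95
print(f"P(meet schedule) = {schedule_ok.mean():.0%}")
print(f"P(meet cost)     = {cost_ok.mean():.0%}")
print(f"JCL (both)       = {(schedule_ok & cost_ok).mean():.0%}")
```

Because cost and schedule are correlated, the joint confidence is not simply the product of the two marginal probabilities.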
  • 76.  Risk workshop using a variety of identification techniques, specific tools for risk categorization, and an explicit step that allocates each risk to a single risk owner  Meta‐language for describing risks that clearly separates cause, risk event, and effect  Major review meetings at the start of every project phase  Information on risk status and response actions in the Risk Register to record the risk status, date, and reason of exclusion 76 Basis for Good Risk Management Outcomes 14. Risk
  • 77.  Develop a project‐specific Risk Management Plan (RMP)  Plan, allocate and report explicitly on risk responses and risk treatment actions  Assign an internal project Risk Champion for communication, control and monitoring  Make adequate use of range estimates in schedule and cost forecasting, so the bias from factors influencing project forecasts and estimates is minimized 77 Basis of Good Risk Management Outcomes (Continued) 14. Risk
  • 78.  Use planning‐based Quantitative Risk Analysis for risk response planning, to estimate contingencies, compare alternatives, optimize resource allocation, and show the effectiveness of planned responses and risk treatment actions  Establish a “mature” risk culture  Assure top management commitment  Confirm everyone on the program is trained 78 Basis of Good Risk Management Outcomes (Concluded) 14. Risk
  • 79. Both Probabilistic Risk and Statistical Uncertainty measures are needed Statistical Uncertainty  Naturally occurring (stochastic†) variance in the work efforts or cost  Like the weather, these variances are always there and are always changing  Uncertainty can be modeled with a Monte Carlo Simulation tool and Reference Class Forecasting based on past performance Probabilistic Risk Events  Probability of an event occurring in the future that results in an unfavorable outcome  When this event occurs the consequence may be probabilistic as well.  Probability of occurrence and impact are used to model the cost and schedule 79 The natural statistical variation of the project activities. Variance and impacts need cost and schedule margin There is a probability that something will happen that impacts cost, schedule, and technical performance of our deliverables † Stochastic (from the Greek στόχος for aim or guess) is an adjective that refers to systems whose behavior is intrinsically non-deterministic, sporadic, and categorically not intermittent (i.e. random). 14. Risk
  • 80. Risk and Uncertainty  In 1921 Frank Knight made the distinction between risk (randomness with knowable probabilities) and uncertainty (randomness with unknowable probabilities).  Today, these components of uncertainty are termed aleatory and epistemic uncertainties.  Knight, F. H. (1921). Risk, Uncertainty, and Profit Boston: Houghton Mifflin Company 80 14. Risk
  • 81. Risk and Uncertainty Risk stems from unknown probability distributions  A probabilistic event that when it occurs has an unfavorable impact on cost, schedule, and technical performance – or some combination  Risk events can be retired or mitigated prior to their occurrence  After mitigation or retirement, risk events may still have a probability of occurrence  Expressed as an expected probability of occurrence of an event accompanied by undesirable consequences Uncertainty stems from known probability distributions  Uncertainty produces variation from many small influences and yields a range of cost and schedule values on a particular activity – Schedule Perturbations – Budget Perturbations – Re–work, and re–test phenomena that naturally occur in the course of work  Uncertainties can be handled with cost, schedule, and technical performance Margin 81 Risk is Event Focused There is a 15% chance our stir welding process will result in faulty seams in the combustion chamber of the ascent engine Uncertainty creates the risk of an Event In the past, our C&DH box development efforts have a -5%/+15% variance. We need a 12% buffer to protect our deliverable 14. Risk
  • 82. The Meaning of Uncertainty  Uncertainty in plain English is about the “lack of certainty” – Uncertainty is about “variability” in relation to performance measures like cost, duration, or quality – Uncertainty is about “ambiguity” associated with a lack of this clarity  Known and unknown sources of bias and ignorance shape how much effort it is worth expending to clarify the situation – This is the underlying process driving uncertainty  As well, uncertainty arises from the basic processes of work – This is Deming uncertainty – It is the statistical “noise” in the work process  Both of these sources of uncertainty impact cost and schedule – Trying to control the “noise” of this variance adds no value – Trying to control the “lack of certainty” arising from ambiguity and lack of clarity does have value 82 14. Risk
  • 83. Speaking in “Uncertainty” Terms  When we state a date it needs to be qualified with one of two phrases – A range of possible values • The completion date for software requirements flow down will be no later than March 13th and no earlier than February 12th – A confidence on the desired or target value • The software requirements flow down will be complete by March 13th with 80% confidence  The “risk adjusted” vocabulary must be represented in the IMS as well  Separating deterministic planning from probabilistic planning is the starting point for building a Risk Tolerant IMS 83 14. Risk
  • 84. Planning in the Presence of Uncertainty  In the presence of uncertainty we need to speak about how we can improve our confidence … – As time passes the confidence intervals on an estimate should improve, as shown in the next slide. – This improvement can represent technical risk reduction or programmatic risk reduction.  But “risk tolerance” still needs to address the unknown and unknowable risks in the programmatic risk tolerance sense – The IMS must show how these disruptive activities can be tolerated without reducing the confidence in the deterministic plan 84 14. Risk
  • 85. Epistemic and Aleatory Uncertainty Both Uncertainties Exist on Programs  Aleatory – an inherent variation – a stochastic process – associated with the physical system or an environment: – For discrete variables – the duration of a work activity – the randomness is parameterized by the probability of each possible value – For continuous variables – the mass of a spacecraft component – the randomness is parameterized by the probability density function  Epistemic – probabilistic uncertainties that can be reduced by obtaining knowledge of quantities or processes: – For discrete random variables – the epistemic uncertainty is modeled by alternative probability distributions – For continuous random variables, the epistemic uncertainty is modeled by alternative probability density functions. 85 14. Risk
  • 86. Epistemic Uncertainty and Aleatory Variability are both risk drivers† Epistemic Uncertainty  Epistemic uncertainty is the scientific uncertainty due to limited data and knowledge in the model of the process  Epistemic uncertainty can, in principle, be eliminated with sufficient study  Epistemic (or internal) uncertainty reflects the possibility of errors in our general knowledge. Aleatory Variability  Aleatory uncertainties arise from the inherent randomness of a variable and are characterized by a Probability Density Function  The knowledge of experts cannot be expected to reduce aleatory uncertainty although their knowledge may be useful in quantifying the uncertainty 86 † Uncertainty in Probabilistic Risk Assessment: A Review, A.R. Daneshkhan Randomness With Knowable Probabilities Randomness With Unknowable Probabilities The probability of occurrence can be defined through a variety of methods. The outcome is a probability of occurrence of the event A Probability Density Function (PDF) generates a collection of random variables used to model durations and costs 14. Risk
  • 87. Structure of Program Risks 87 Risk management in small construction projects, Kajsa Simu, Luleå University of Technology Department of Civil and Environmental Engineering Division of Architecture and Infrastructure 14. Risk
  • 88. Examples of Aleatory and Epistemic Risks – both drive unfavorable outcomes on projects  If a component were required to operate for 17 years with 90% confidence during a flight to other planets, and it had only been tested for 1 year, the evaluation of whether it meets the 90% confidence requirement would have to include both aleatory uncertainty (e.g., the possibility of a premature failure given a known mean failure rate) and epistemic uncertainty (e.g., uncertainty in the mean failure rate due to the limited test time).  It is important to include both types of uncertainty in evaluating the performance risk.  It is also important to know the relative contribution of each type of failure, since the former source of risk could not be reduced by more testing (without design modification) but the latter source could. 88 14. Risk
  • 89. A Word of Caution  A common approach is to not separate aleatory and epistemic uncertainties and their resulting risks – Represent epistemic uncertainty with a uniform probability distribution – For a quantity that is a mixture of aleatory and epistemic uncertainty, use second-order probability theory  It is slowly being recognized that the above procedures (especially the first) can underestimate uncertainty in: – Physical parameters – Geometry of a system – Initial conditions – Boundary conditions – Scenarios and environments The first approach can result in large underestimation of uncertainty in system responses 89 14. Risk
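A minimal sketch of the second-order probability approach mentioned above: the outer loop samples the epistemically uncertain parameter, the inner loop samples the aleatory variability around it, so the output is a spread of percentile values rather than a single number (all values hypothetical):

```python
# Minimal second-order (double-loop) Monte Carlo sketch: outer loop samples an
# epistemically uncertain parameter, inner loop samples aleatory variability.
import numpy as np

rng = np.random.default_rng(11)
outer, inner = 200, 2_000

p80s = []
for _ in range(outer):
    mean = rng.uniform(20, 28)                   # epistemic: the true mean duration is not well known
    durations = rng.normal(mean, 3, size=inner)  # aleatory: natural variation around that mean
    p80s.append(np.percentile(durations, 80))

p80s = np.array(p80s)
print(f"P80 duration ranges from {p80s.min():.1f} to {p80s.max():.1f} days "
      f"depending on the epistemic assumption (median {np.median(p80s):.1f})")
```

Collapsing the two loops into one lumps the epistemic spread into the aleatory distribution, which is exactly the underestimation the slide warns about.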
  • 90. Why Epistemic Uncertainty is a major risk driver  Epistemic uncertainty is presumed to be caused by lack of knowledge or data  The lack-of-knowledge part of the uncertainty can be represented in the model by auxiliary non-physical variables  These variables capture information obtained through the gathering of more data  These auxiliary variables define statistical dependencies – the correlations between the uncertainties – in a clear and transparent manner 90 14. Risk
  • 91. A Reminder Again of Aleatory and Epistemic Risk  The key difference between aleatory and epistemic risk – Aleatory uncertainties arise from possible variations and random errors in the values of the parameters and their estimates. – Epistemic or ontological uncertainty can potentially be reduced by improving our knowledge – Epistemic uncertainties are subjective and are related to the lack of knowledge of the particular process. 91 14. Risk
  • 92. MODELING THE UNCERTAINTY THAT IS THE SOURCE OF RISK Many times the term Risk Mitigation is used to represent several actions that are actually Risk Handling Strategies. Mitigation is one strategy. Mitigation buys down the uncertainty and reduces the risk from that uncertainty. But another handling strategy is to ignore the uncertainty, transfer the uncertainty and the risk to someone else, or simply accept that the uncertainty is present and the resulting risk as well. 92 14.6 14. Risk
  • 93. Taxonomy of Uncertainty 93 Uncertainty Aleatory Epistemic Natural Variability Ambiguity Ontological Uncertainty Probabilistic Events Probabilistic Impacts Periods of Exposure 14. Risk
  • 94. Another Taxonomy of Uncertainty 94 14. Risk
  • 95.  Unknowns that differ each time the model of the IMS is assessed  Uncertainties the program controls staff cannot do anything about  Uncertainties that cannot be suppressed or removed  Risk is created when we have – Not accounted for this natural variance in our plan – Do not have sufficient buffer to protect the plan from these naturally occurring variances. 95 Aleatory Uncertainty 14. Risk
  • 96.  Systematic uncertainty  Caused by things we know about in principle, but don’t know about in practice  Risk is created when we have: – Not measured the quantity sufficiently accurately – The model neglects certain effects – The data is not available to quantify the risk 96 Epistemic Uncertainty 14. Risk
  • 97. Dealing with Aleatory Uncertainty and the Resulting Risk  Aleatory uncertainty is expressed as process variability – Work effort variance – Productivity variance – Quality of product and resulting rework variance  Aleatory risk is always expressed in relation to a duration – a percentage of the duration  The classical response to such variability is to build a margin that reduces risk over the duration This is the motivation for short Packages of Work that produce defined outcomes on fine grained boundaries 97 14. Risk
  • 98. Dealing with Epistemic Uncertainty and the Resulting Risk  Reducing epistemic risk requires improving our knowledge of the system of interest or avoiding implementations that increase this uncertainty  Uncertainty introduced by design assumptions is reduced by making all assumptions an explicit part of the design – Technical Performance Measures – and revisiting these assumptions on a regular basis to confirm they remain valid or whether they can be removed and real data substituted 98 14. Risk
  • 99. Sources of Epistemic Uncertainty  Epistemic uncertainty is introduced every time an assumption is made about the world in which the system is embedded  The assumption could be made because of a lack of data – Ontological uncertainty  The assumption can be a simplification made to ease the job – Epistemic uncertainty  Probability uncertainty – failure rates of components are epistemic  Subjectivity of evaluation – an Epistemic risk when the likelihood of a rare event is assessed with little or no empirical data  Incompleteness problem – a major hazard or condition not identified or a causal mechanism remains undetected  Undetected design errors – introduce an ontological uncertainty into the system’s behavior 99 14. Risk
  • 100. Monte Carlo Sampling used for Aleatory Uncertainty Propagation 100 Duration distribution of work in the network Network of activities Probability of completing on or before a specific date 14. Risk
  • 101. Monte Carlo Sampling used for Epistemic Interval Propagation 101 Possible values of a parameter Mass model of the vehicle Possible outcomes from the model 14. Risk
  • 102. Duration uncertainty (Aleatory) represented in the IMS baseline  The independence or dependency of each task with others in the network, greatly influences the outcome of the total project duration  Understanding these dependencies is critical to assessing the credibility of the IMS as well as the total completion time 102  Any path could be critical depending on the probability distributions of the underlying task completion probability functions We must know something about the probability distributions of the work efforts 14. Risk
  • 103. Uncertainty in the IMS drives cost and schedule as a Dynamic Network System  The programmatic and planning dynamics act as a system  The “system response” is the transfer function between input and output Inputs Outputs  Understanding this transfer function is critical to understanding the dynamics of the program – It is part of the stochastic dynamic response to disruptions in our plans – “What if” really means “what if” at this point in the response curve of the system 103 The response curve is likely non-linear as well, requiring further modeling of the IMS dynamics 14. Risk
  • 104.  When Monte Carlo is used to model schedule risk, the schedule uncertainties are treated as aleatory, even though they may be epistemic  This is considered to be unrealistic and is known to give biased results, but is used anyway  The analysis of schedule risk requires assumptions to be made regarding the correlations between the probabilities for the individual outcomes: – It is assumed there are no correlations or that they are all of the same nature – In practice, there are correlations to be considered when analyzing schedule risk and they are of both a positive and negative nature 104 Some More Words of Caution 14. Risk
  • 105. Probability Distributions used for modeling uncertainty 105
Distribution – Application
Uniform – Appropriate for uncertain quantities where the range can be established (maximum and minimum values can be defined) based on physical arguments, expert knowledge or historical data. If the range of parameter values is large (greater than one order of magnitude), a log uniform distribution is preferred to a uniform one.
Triangular – When little relevant information exists, but extremes and most likely values are known, typically on the basis of subjective judgment. If the parameter values cover a wide range a log triangular distribution is preferred.
Empirical – Useful when some relevant data exists, but cannot be represented by any standard statistical distribution. A piecewise uniform (empirical) distribution is recommended in this case.
Normal – When a substantial amount of relevant data exists. Can represent errors due to additive processes. It is useful for modeling symmetric distributions of many natural processes and phenomena. Is often used as a “default” distribution for representing uncertainties.
Log normal – Useful as an asymmetrical model for a parameter that can be expressed as a quotient of other variables, so it is useful for representing physical quantities, such as concentrations.
Poisson – Useful for describing the frequency of occurrence of random, independent events within a given time interval.
Beta – Often used to represent judgments about uncertainty, and to model bounded, unimodal, random parameters.
14. Risk
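A minimal sketch of drawing samples from most of the distributions in the table with NumPy's random Generator; all parameter values are illustrative placeholders:

```python
# Minimal sketch of sampling the distributions listed above (hypothetical parameters).
import numpy as np

rng = np.random.default_rng(5)
n = 5

print("uniform   ", rng.uniform(10, 20, n))                      # known min/max only
print("triangular", rng.triangular(10, 14, 25, n))               # min, most likely, max
print("normal    ", rng.normal(15, 2, n))                        # symmetric natural variation
print("lognormal ", rng.lognormal(mean=2.7, sigma=0.3, size=n))  # asymmetric, positive quantities
print("poisson   ", rng.poisson(lam=3, size=n))                  # counts of random events per interval
print("beta      ", rng.beta(2, 5, n))                           # bounded judgmental uncertainty on [0, 1]
```

An empirical (piecewise uniform) distribution would instead be resampled from the historical data itself, as in the reference class sketch earlier in this section.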
  • 106. Deterministic versus Probabilistic Planning at the Program Level 106 (Figure: the deterministic schedule – SRR, PDR, CDR, FRR, ATLO milestones from Aug 05 through Feb 08 – shown against the stochastic schedule of the current plan with risks; the completion probability distribution spans roughly Oct 07 through Jun 08, with 20%, mean, and 80% confidence points plotted against the launch period, margin, and risk margin, ranging from ready early to a missed launch period. The probability distribution varies as time passes.) 14. Risk
  • 107.  In 1979, Tversky and Kahneman proposed an alternative to Utility theory. Prospect theory asserts that people make predictably irrational decisions.  The way that a choice of decisions is presented can sway a person to choose the less rational decision from a set of options.  Once a problem is clearly and reasonably presented, rarely does a person think outside the bounds of the frame.  Source: – “The Causes of Risk Taking By Project Managers,” Proceedings of the Project Management Institute Annual Seminars & Symposium November 1–10, 2001, Nashville, Tennessee – Tversky, Amos, and Daniel Kahneman. 1981. The Framing of Decisions and the Psychology of Choice. Science 211 (January 30): 453–458 107 Sobering Facts About Naïve Use of Three Point Estimates 14. Risk
  • 108.  Building a risk tolerant IMS – Explicit technical risk mitigation must be embedded in the IMS – Explicit schedule margin must be embedded in the IMS • Margin values identified through Monte Carlo simulations • Margin assigned in front of gating events – Technical risks connected to the Risk Register in some form – Cost and Schedule risks connected in the IMS and a modeling tool  Assessing the Risk Tolerant IMS – what does risk tolerant mean? – Weekly status, monthly Earned Value, forecast of risk impacts – Weekly Monte Carlo assessment of confidence intervals and their historical changes – are we getting better or worse? – Performance forecast based on likelihood outcomes from Monte Carlo simulations, not just “adding up the numbers” 108 Actionable Outcomes for Credible Risk Management 14. Risk
  • 109.  Forward looking – leading indicators reveal opportunities for corrective actions  Trending information must forecast outcomes – Cost trends – Schedule trends – Performance trend – Risk trends  EAC / ECD driven forecasts from past performance, trends, and actions to control trends 109 Risk Register Based Decision Making processes of the IMP/IMS 14. Risk
  • 110.  Some simple steps to identifying risk opportunities in the IMS – Scenario based planning – “what if this happens?” – Event impact planning – “what inhibits success?”  Both must focus on the consequences in order to identify the mitigations 110 Implementing Programmatic Risk Assessment is Straightforward Initiating Event Selection Scenario Development Scenario Logic Modeling Scenario Frequency Modeling Consequence Modeling Risk Integration 14. Risk
  • 111.  DoD Guidance – DAU “Risk Management Guide for DoD Acquisition” – Air Force, “Acquisition Risk Management” – Air Force “SMC Systems Engineering Primer and Handbook” 111 Continuous Risk Management (CRM) is required
CRM Activity – IMS Representation
Identify – Risk items with IMP/IMS #’s, CA/WP & resource assignments
Analyze – Risk management responsibilities assigned
Plan – Mitigation plans with durations and resource assignments
Track – Status reported from Risk Management to IMS
Control – Risk tasks reporting in weekly status process
Communicate – IMS status reporting
14. Risk
  • 112. 112 This matrix must be built for each category of risk (reference class). The decision for each dimension comes from Subject Matter Experts and the Risk Management team.
Likelihood levels: E – Near Certainty; D – Highly Likely; C – Likely; B – Low Likelihood; A – Not Likely
Consequence levels:
A – Minimal or no consequence to technical performance; minimal or no schedule impact; minimal or no cost impact
B – Minor reduction in technical performance or supportability; able to meet key dates; budget increase or unit production cost increase < **(1% of Budget)
C – Moderate reduction in technical performance or supportability with limited impact on program objectives; minor schedule slip, able to meet key milestones with no schedule float; budget increase or unit production cost increase < **(5% of Budget)
D – Significant degradation in technical performance or major shortfall in supportability; program critical path affected; budget increase or unit production cost increase < **(10% of Budget)
E – Severe degradation in technical performance; cannot meet key program milestones, slip > X months; exceeds budget increase or unit production cost threshold
14. Risk
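A minimal sketch of turning these likelihood and consequence levels into a numeric score and band; the 1-5 mapping, the multiplicative score, and the band thresholds are illustrative assumptions, since the slide notes each reference class needs its own matrix from the SMEs and the risk team:

```python
# Minimal risk matrix scoring sketch. The mapping and thresholds are illustrative only.
LEVEL = {"A": 1, "B": 2, "C": 3, "D": 4, "E": 5}

def risk_score(likelihood: str, consequence: str) -> tuple[int, str]:
    score = LEVEL[likelihood] * LEVEL[consequence]
    band = "High" if score >= 15 else "Moderate" if score >= 6 else "Low"
    return score, band

print(risk_score("D", "C"))   # Highly Likely x Moderate consequence -> (12, 'Moderate')
print(risk_score("B", "E"))   # Low Likelihood x Severe consequence  -> (10, 'Moderate')
```

The second example illustrates the range-compression problem raised earlier in this section: a low-probability, severe-consequence risk can land in the same band as a far less damaging one.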
  • 113.  Two functions of Event Based Risk Management – Identification, recording, ranking, and reviewing risks, mitigation, and response plans, and all associated risk information – Risk analysis to determine how risks affect cost, schedule, and technical performance  Notional categories of risk. If the risk happens … – Duration and cost – we’re late and over budget – Safety – an unsafe condition is created – Legal – a litigation event is created – Performance – a less than acceptable performance condition results – Technical – our product or service is noncompliant – Environmental – the external environment is placed in an unfavorable condition 113 Event Based Risk Management 14. Risk
  • 114.  Known Unknowns – general uncertainties and uncertain events that were identified and quantified  Biases – conscious or subconscious systematic errors occurring when identifying and quantifying general uncertainties and uncertain events  Unknown Unknowns – factors that were missed, including some types of organizational and psychological bias when identifying general uncertainties and uncertain events 114 Build the Event Based Risk Model† † Chapman, C., Ward, S., 2003. Project Risk Management. Processes, Techniques and Insights, second ed. John Wiley & Sons, England 14. Risk
  • 115.  It would be a rare occurrence if two risks were not correlated in some way in a large program  The correlation coefficient between X and Y is given by … 115 Risk Events Are Correlated 14. Risk
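The correlation formula the slide points to (lost in the transcript) is presumably the standard Pearson correlation coefficient:

```latex
\rho_{X,Y} \;=\; \frac{\operatorname{Cov}(X,Y)}{\sigma_X \, \sigma_Y}
        \;=\; \frac{E\big[(X - \mu_X)(Y - \mu_Y)\big]}{\sigma_X \, \sigma_Y},
\qquad -1 \le \rho_{X,Y} \le 1
```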
  • 116.  Naturally occurring uncertainty drives cost and schedule through uncontrolled variance  Probabilistic events drive disruptions in the planned order of the work  Both impact the EAC – Cost and schedule variance can be handled through margin for naturally occurring uncertainty – Management Reserve can be used for probabilistic events that occur within the scope of the program 116 Uncertainty and Risk Drives EAC 14. Risk
  • 117.  Completion dates move to the right by naturally occurring variance in work activity durations  Completion dates move to the right when unmitigated uncertainties become issues 117 Uncertainty and Risk Drives ECD 14. Risk
  • 118.  Break the process flow into small steps of clearly defined activities, modeling predecessors and successors  Estimate – Time duration of each step based on probable work time for each type of labor involved – Yield statistics at each step – what fraction of a product’s output is expected to be compliant  Define the rework loops if possible  Combine step durations to obtain an estimate of the total time required to meet specific milestones  Identify the Critical Path through the network – the path whose delay will delay the program 118 Analyzing the IMS for Risk 14. Risk
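A minimal sketch of those steps, assuming a hypothetical four-step process plus a parallel documentation branch; the expected number of passes through a step with first-pass yield y (independent retries) is 1/y, and the completion time is the longest path through the small network:

```python
# Minimal sketch: step durations adjusted for rework yield, then the longest
# (critical) path through a small activity network. All data are hypothetical.
steps = {             # name: (duration in days, first-pass yield)
    "spec":  (5, 1.00),
    "build": (12, 0.80),   # 20% of output loops back for rework
    "test":  (6, 0.90),
    "doc":   (4, 1.00),
    "integ": (8, 0.95),
}
preds = {"spec": [], "build": ["spec"], "test": ["build"], "doc": ["build"], "integ": ["test", "doc"]}

# Expected passes through a step with yield y (independent retries) = 1/y
effective = {name: d / y for name, (d, y) in steps.items()}

finish = {}
for task in ("spec", "build", "test", "doc", "integ"):   # topological order
    start = max((finish[p] for p in preds[task]), default=0.0)
    finish[task] = start + effective[task]

critical_end = max(finish, key=finish.get)
print({k: round(v, 1) for k, v in finish.items()})
print(f"Expected completion: {finish[critical_end]:.1f} days (critical path ends at '{critical_end}')")
```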
  • 119.  Weight of components and subsystems  Power, cooling, attitude control  Integration and testing  Data memory  Number of source lines of code to be written  Software testing complexity  Special mission equipment  Subcontract interrelationships 119 Technical Schedule Drivers 14. Risk
  • 120.  The most likely estimate of the duration of a task is optimistic  Tasks done in parallel take longer than planned  Task uncertainties are correlated  Estimates of task duration uncertainty are too narrow  Risk events are not included 120 Programmatic Schedule Drivers 14. Risk
  • 121. Task Durations Are Correlated† Even Uncorrelated is Correlated 121† David Voss, Project Schedule Risk Analysis, VOSE SOFTWARE BVBA 14. Risk
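A minimal sketch of sampling two positively correlated task durations with a Gaussian copula (correlated standard normals, mapped to uniforms, then to triangular marginals via the inverse CDF); the correlation value and three-point estimates are hypothetical:

```python
# Gaussian copula sketch for correlated task durations (hypothetical data).
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
N = 10_000
rho = 0.6                                                   # assumed correlation between the tasks
z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=N)
u = stats.norm.cdf(z)                                       # correlated uniforms on (0, 1)

# scipy's triangular: c = (mode - loc) / scale, support [loc, loc + scale]
task_a = stats.triang.ppf(u[:, 0], c=0.2, loc=10, scale=10)             # 10 / 12 / 20 days
task_b = stats.triang.ppf(u[:, 1], c=(15 - 12) / 13, loc=12, scale=13)  # 12 / 15 / 25 days

total = task_a + task_b
independent = rng.permutation(task_a) + task_b              # same marginals, correlation broken
corr, _ = stats.spearmanr(task_a, task_b)
print(f"Induced rank correlation: {corr:.2f}")
print(f"P80 of the sum: {np.percentile(total, 80):.1f} days correlated "
      f"vs {np.percentile(independent, 80):.1f} days if treated as independent")
```

The comparison at the end shows why ignoring positive correlation understates the upper tail of the schedule.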
  • 122.  An integrated tool is needed to connect the Event Based risk (Epistemic) with the variance uncertainty (Aleatory) in the IMS  Risk Drivers must be modeled as well  Management Reserve modeling is needed for the unmitigated Epistemic risk  Schedule and Cost modeling is needed for the Aleatory risks created by duration and cost variances 122 Modeling Uncertainty and Risk 14. Risk
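A minimal sketch of the kind of integration described above, with aleatory duration spread on the baseline work plus one epistemic risk event that adds a delay when it occurs; all values are hypothetical, and a real implementation would live inside the IMS modeling tool:

```python
# Minimal integrated sketch: aleatory duration variance plus one epistemic
# (event-based) risk that adds a delay when it occurs. Values are hypothetical.
import numpy as np

rng = np.random.default_rng(9)
N = 20_000

base = rng.triangular(90, 100, 130, size=N)      # aleatory spread on the planned work
occurs = rng.random(N) < 0.25                    # epistemic risk event, 25% probability
delay = rng.triangular(10, 20, 45, size=N)       # impact if the event occurs
total = base + occurs * delay

print(f"P80 aleatory only:       {np.percentile(base, 80):.0f} days")
print(f"P80 with the risk event: {np.percentile(total, 80):.0f} days")
# The first number informs schedule margin; the difference informs the
# Management Reserve held for the unmitigated epistemic risk.
```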
  • 123.  The least complex elicitation is the uncertainty of an event – its presence or absence  The next level is when the event is resolved into more than two outcomes  Sometimes the outcome is a numerical quantity with a large (possibly infinite) number of possible values  For the last case we need a Probability Density Function (PDF) 123 Eliciting Probability Distributions 14. Risk
  • 124.  Eliciting this information is only one method of obtaining probabilities  Historical data, from a stable process that generated that data, can be used to develop new estimates  Reference Class Forecasting is the current basis of historical data used to forecast classes of project activities and their Aleatory variance 124 Eliciting Probability Distributions (Concluded) 14. Risk
  • 125.  Probabilities should be informative – Probabilities closer to 0.0 or 1.0 should be preferred to those closer to 0.5, as the more extreme probabilities provide greater certainty about the outcome of an event  Probabilities should authentically represent uncertainty – For events that are given an assessed probability of p, the relative frequency of occurrence of those events should approach p 125 Probabilities Must Have Desirable Properties 14. Risk
  • 126.  The process of expressing knowledge in terms of probabilities is not simple and is subject to repeatable types of errors  Representativeness heuristic – judging probability by how closely the available evidence resembles the target event  Availability heuristic – information that is easier to recall is given more weight in forming probability judgments 126 Heuristics and Biases in Forming Probability Judgments 14. Risk
  • 127. Risk Chains – Across The WBS 127 14. Risk
  • 128. Risk Management Processes for Program Management  An approach to programmatic and technical risk 14. Risk
  • 129. Risks in Risk Register connected to WBS elements provide cost impact analysis  Risk ID traceable to IMS for schedule impacts  WBS elements collect cost impact of risk  Risk handling strategies connected to IMP, IMS, WBS, SOW, and TPM measures 14. Risk
  • 130. Connecting Risk Retirement with the work activities in the IMS 130  “Buying down” risk is planned in the IMS.  MoE, MoP, and KPP defined in the work package for the critical measure – weight.  If we can’t verify we’ve succeeded, then the risk did not get reduced.  The risk may have gotten worse (Figure: planned risk waterfall for Risk CEV-037 – Loss of Critical Functions During Descent – from 31 Mar 05 through 1 Jul 11, stepping down through wind tunnel force and moment testing, analytical model development and correlation, focus splinter reviews, CEV Block 1 and Block 5 wind tunnel testing, flight application of spacecraft results, in-flight development tests, and a damaged TPS flight test; the weight risk is reduced from RED to YELLOW and finally confirmed GREEN – ready to fly.) 14. Risk
  • 131. Management Reserve Log (MRL) provides the integrity for all changes to the PMB  All changes authorized through the BCR process  All impacts recorded in BCR and Management Reserve impacts (ups and downs) recorded in the same meeting 14. Risk
  • 132.  Complex programs are characterized by uncertainty, non-linearity and recursiveness, and are best viewed as dynamic and evolving systems.  So why do we pretend they are predictable, definable and fixed – and why do we use linear lifecycle models to manage them? 132 Risk in Complex Programs† † Complexity in Defence Projects How Did We Get Here?, Concept Symposium 2010, Oscarsborg Norway. Mary McKinlay 14. Risk
  • 133. The Final Notion of Risk 133 The causes of risks clearly lie in our incomplete knowledge of the subject matter, thus if a project establishes all possible causes of risks they can be managed away. And of course that is simply not possible. This puts the focus on discovering and dealing with Epistemic Risks. Aleatory Risks can be easily modeled with Reference Class Forecasting using past performance 14. Risk
  • 134. Beware the Black Swan 134 14. Risk