From Navy Best Practices
Traps – thinking you have risks covered simply by following procedures
Alarms – assumptions that cause trouble
Consequences – what happens if nothing is done
Potential Risk Items by Area of Impact
(Areas of impact: Technical Performance, Supportability, Programmatics, Cost, Schedule)
Potential Risk Items:
• Personnel
• Availability
• Experience Levels
• Mix of Disciplines
• Requirements
• Definition
• Stability
• Complexity
• Resource Availability
• Facilities
• Hardware
• Personnel
• Funding Profile
• Communications
• Technology
• Availability
• Maturity Levels
• System Complexity
• Operating Environment
• Design
• Methods
• Complexity
• Software Tools
• Testing, Modeling
• Language
• Operational Interfaces
• Hardware
• Sensitivities to Other Risks
• Estimating Error
• Schedule
• Cost
• Number of Critical Path Items
Weak backgrounds in risk management
Mistaken concepts
Assessment is RM – skip planning & monitoring
Mitigation is the only handling strategy
All risk can be eliminated
Focus on performance, omit cost & schedule
Adjust to different project team groups
Sr. management
Working level engineers
Address issues each group is likely to face
Tailor RM activities to meet program needs – not one size fits all
Virtual
Can’t physically touch/feel to assess
History of latent defects
Many interfaces
Hardware-to-software
Operating system-to-operating code
Incompatible combinations
Update frequency
Multiple versions
Upward/downward compatibility
Risk Grouping (Project Attribute – Risk Issues)
Project Level
• Requirements – excessive, immature, unstable, unrealistic
• Lack of user involvement
• Underestimation of complexity or dynamic nature
Project Attributes
• Performance – errors, quality
• Unrealistic cost or schedule
Management
• Ineffective project management
Engineering
• Ineffective integration, assembly, test
• Unanticipated difficulties across user interface
Work Environment
• Immature design, process, technologies
• Inadequate configuration control
• Inappropriate methods, inaccurate metrics
• Poor training
Other
• Legal, contractual issues
• Obsolescence (incl. excessive duration)
• Difficulties with subcontracted items
• Unanticipated maintenance & support costs

Software Risk Items and Management Techniques
• Personnel shortfalls: staffing with top talent, tailoring processes to skill mix, training, peer reviews, team building
• Unrealistic schedules and budgets: design to cost, business case analysis, reuse, requirements descoping, incremental development, adding schedule and budget
• COTS; external components: qualification testing, benchmarking, prototyping, compatibility analysis, vendor analysis
• Requirements mismatch; gold plating: mission analysis, CONOPS formulation, user surveys, prototyping, early user's manuals
• User interface mismatch: prototyping, scenarios, user characterization (functionality, style, workload)
• Architecture, performance, quality: architecture trade-off analysis, simulations, benchmarking, modeling, prototyping
• Requirements changes: change thresholds, incremental development, information hiding
• Legacy software: restructuring, design recovery, wrappers, phase-out analysis
• Externally-performed tasks: reference checking, pre-award audits, award-fee contracts, competitive design or prototyping, team building
• Straining computer science capabilities: technical analysis, cost-benefit analysis, prototyping
Review Schedule & Networks
Review Cost Estimation Parameters
Perform Interviews
Revisit Lessons Learned
Develop a Risk taxonomy
Brainstorm and play “what if”
Re-sort the watch list (e.g., by source)
Use the planning or baseline schedule
Evaluate the activity network view
Look for nodes with:
High fan-in (many activities terminate at a single node)
High fan-out (many activities emanate from a single node)
No predecessors
Consider specific areas of concern that can lead to problems:
Personnel experience, availability
Requirement complexity, firmness
Scheduling and prediction of task and partition times
Hardware requirements, interfaces, constraints
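The fan-in / fan-out scan of the activity network can be sketched in a few lines of Python; the dependency pairs and the flagging threshold below are hypothetical, chosen only to illustrate the technique:

```python
from collections import defaultdict

# Hypothetical activity network: (predecessor, successor) dependency pairs.
edges = [
    ("requirements", "design"), ("design", "code"), ("design", "test_plan"),
    ("code", "integrate"), ("test_plan", "integrate"), ("buy_hw", "integrate"),
    ("integrate", "qual_test"), ("integrate", "user_docs"), ("integrate", "training"),
]

fan_in = defaultdict(int)   # count of activities terminating at a node
fan_out = defaultdict(int)  # count of activities emanating from a node
nodes = set()
for pred, succ in edges:
    fan_out[pred] += 1
    fan_in[succ] += 1
    nodes.update((pred, succ))

THRESHOLD = 3  # flag nodes worth a closer risk review (assumed cutoff)
high_fan_in = [n for n in nodes if fan_in[n] >= THRESHOLD]
high_fan_out = [n for n in nodes if fan_out[n] >= THRESHOLD]
no_predecessors = [n for n in nodes if fan_in[n] == 0]

print("High fan-in:", high_fan_in)            # e.g., 'integrate'
print("High fan-out:", high_fan_out)          # e.g., 'integrate'
print("No predecessors:", sorted(no_predecessors))
```

Nodes flagged this way ("integrate" here) are single points where many paths converge or diverge, so a slip there propagates widely through the schedule.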
Probability Factors (magnitude scale: 0.1 Low, 0.3 Moderate, 0.5 High, 0.7 Very High, 0.9 Extremely High)

Maturity Factor
• 0.1 (Low) – Technology exists and can be used "as is"
• 0.3 (Moderate) – Technology requires minor change before use (<25%)
• 0.5 (High) – Technology requires major change before use (<50%)
• 0.7 (Very High) – Technology requires significant design and engineering before use (<75%)
• 0.9 (Extremely High) – State of the art, some research done

Complexity Factor
• 0.1 – Simple relative to current environment
• 0.3 – Minor complexity relative to current environment
• 0.5 – Moderately complex relative to current environment
• 0.7 – Significantly complex relative to current environment
• 0.9 – Extremely complex relative to current environment

Dependency Factor
• 0.1 – Entirely within project control
• 0.3 – Depends on existing product supplied from outside organization
• 0.5 – Depends on supply and modification of existing product from outside organization
• 0.7 – Depends on new development from outside organization
• 0.9 – Depends on funding new development from outside organization

Stability Factor
• 0.1 – External factors will not make any changes
• 0.3 – External factors will make minor changes (<25%)
• 0.5 – External factors will make major changes (<50%)
• 0.7 – External factors will make significant changes (<75%)
• 0.9 – External factors will make constant changes
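The factor tables above yield four ordinal scores but do not prescribe how they combine into a single probability value; averaging them is one common convention, assumed in this minimal sketch:

```python
# Ordinal magnitudes from the probability factor tables (assumed labels).
SCALE = {"low": 0.1, "moderate": 0.3, "high": 0.5,
         "very high": 0.7, "extremely high": 0.9}

def probability_of_failure(maturity, complexity, dependency, stability):
    """Average the four ordinal factor scores into one probability value.
    Averaging is an assumed convention; the tables themselves do not
    say how the factors combine."""
    factors = [SCALE[maturity], SCALE[complexity],
               SCALE[dependency], SCALE[stability]]
    return sum(factors) / len(factors)

# Major technology change, significant dependency, minor external change:
p = probability_of_failure("high", "high", "high", "moderate")
print(round(p, 2))  # 0.45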
Consequence Factors (magnitude scale: 0.1 Low, 0.3 Minor, 0.5 Moderate, 0.7 Significant, 0.9 Catastrophic)

Technical Factor
• 0.1 – Small reduction in technical performance
• 0.3 – Small reduction in technical performance
• 0.5 – Some reduction in technical performance
• 0.7 – Significant degradation in technical performance
• 0.9 – Technical goal cannot be achieved

Cost Factor
• 0.1 – Budget estimates not exceeded, some transfer of money
• 0.3 – Cost estimates exceed budget by 1 to 5%
• 0.5 – Cost estimates increased by 5 to 20%
• 0.7 – Cost estimates increased by 20 to 50%
• 0.9 – Cost estimates increased in excess of 50%

Schedule Factor
• 0.1 – Negligible impact on program; slight development schedule change compensated by available schedule slack
• 0.3 – Minor slip in schedule (less than 1 month), some adjustment in milestones required
• 0.5 – Small slip in schedule
• 0.7 – Development schedule slip in excess of 3 months
• 0.9 – Large schedule slip that affects segment milestones or has possible effect on system milestones
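As with the probability factors, the consequence tables leave the roll-up unspecified; taking the worst (maximum) of the three factor scores is a conservative convention assumed in this sketch:

```python
# Ordinal magnitudes from the consequence factor tables (assumed labels).
CONSEQUENCE = {"low": 0.1, "minor": 0.3, "moderate": 0.5,
               "significant": 0.7, "catastrophic": 0.9}

def consequence_score(technical, cost, schedule):
    """Take the worst of the three consequence factors. Using max is an
    assumed conservative convention, not prescribed by the tables."""
    return max(CONSEQUENCE[technical], CONSEQUENCE[cost], CONSEQUENCE[schedule])

# Minor technical hit, 5-20% cost growth, minor schedule slip:
print(consequence_score("minor", "moderate", "minor"))  # 0.5
```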
Iso-Risk Contours
[Chart: probability (0.0–1.0) plotted against consequence (0.0–1.0), with iso-risk curves at R = .1, .2, .3, .4, .5, .6, .7, .8, .9, .95, and .98.]
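The slide does not state the contour equation, but curves that crowd toward 1.0 in the upper-right corner (R = .95, .98) are consistent with the common formulation R = P + C − P·C, assumed in this sketch:

```python
def risk_factor(p, c):
    """Iso-risk value. The P + C - P*C form is an assumed convention:
    its contours approach 1.0 as probability and consequence both
    approach 1.0, matching the .95/.98 curves on the chart."""
    return p + c - p * c

print(round(risk_factor(0.5, 0.5), 2))  # 0.75
print(round(risk_factor(0.9, 0.8), 2))  # 0.98
```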
Avoidance – de-scoping objectives
Assumption – latent defects
Control – user acceptance testing
Transfer – from software to firmware or hardware
[Chart: number of software risk items over time – new risks are opened and closed risks retired as the program proceeds.]
Different market conditions
Different best practices
Different likelihoods for similar issues
As always, tailor RM to program needs
How is risk impacted?
Commercial
Many small buyers
Many small suppliers
Market sets price
Free movement in/out market
Prices set by marginal costs
Once funding secured, usually stable
Capacity to supply adjusts to demand
DoD / NASA
Fewer large buyers
Typically few suppliers of a given item
Oligopoly pricing – biased to available budget
Barriers to entry
Prices proportional to total cost
May have unanticipated disruptions to funding
Moderate to large excess capacity
How is risk impacted?
Commercial
• Evolutionary upgrades of existing systems
• Heavy buyer involvement (as team member)
• Informal reviews
• Heavy user involvement
• Based on one or more industry stds
• Prototyping common
DoD / NASA
• Little reuse, many unique systems
• Formal development model – buyer oversees
• Very formal reviews
• Limited user involvement; buyer involved
• Use gov't and industry stds
• Limited prototyping
Category        Commercial      DoD / NASA      Comment
Cost            Highly Likely   Highly Likely   Whenever new development is required
Design          Possible        Likely          Degree of design enhancement required
Integration     Possible        Possible        Driven by complexity
Support         Likely          Likely          Commercial life cycles generally short
Manufacturing   Likely          Possible        Varies with production rate & resource availability
Technology      Possible        Likely          Increases as push state of the art
Management      Likely          Possible        Depends on program complexity, performance expectations
Political       Unlikely        Likely          External issue to program
A good tool for one organization may not be a good match for another
Tailor RM to the program needs
The tool should never dictate the process
Define process, then choose compatible tool
Be compatible with program culture
RM is more than using an RM tool
Tool must efficiently & effectively integrate into program
Resources required
Level of detail, complexity
Focus of tool – e.g., program phase
Provide sufficient configuration control
Accessible to all team members
Ability to accept anonymous comments
Support program needs
Reporting
Monitoring
Captures lessons learned
Fulfills contractual requirements
Balance costs/value
@Risk & Crystal Ball – licensed software
Monte Carlo simulation add-in for Excel
Select desired distribution function & define parameters
Provide data and generate a plausible distribution function
Provides statistics and graphical output
User provides risk analysis structure
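As a rough illustration of what these add-ins automate inside a spreadsheet, a minimal Monte Carlo cost roll-up can be sketched in plain Python; the cost elements and their triangular (low, likely, high) parameters below are hypothetical:

```python
import random
import statistics

random.seed(1)  # fixed seed so the run is repeatable

# Hypothetical cost elements as triangular (low, likely, high) distributions,
# standing in for the distribution functions a user would pick in the add-in.
elements = {
    "software":    (400, 500, 800),
    "hardware":    (200, 250, 300),
    "integration": (100, 150, 400),
}

def one_trial():
    # random.triangular takes (low, high, mode)
    return sum(random.triangular(lo, hi, mode) for lo, mode, hi in elements.values())

totals = sorted(one_trial() for _ in range(10_000))
mean = statistics.fmean(totals)
p80 = totals[int(0.8 * len(totals))]  # 80th-percentile budget level
print(f"mean = {mean:.0f}, 80th percentile = {p80:.0f}")
```

The sorted trial results give the same kind of statistics and S-curve data the add-ins chart: a mean estimate plus a confidence level (here, a budget covering ~80% of outcomes).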
Probability-Consequence Screening
Developed by the Air Force
Risk events assigned a probability & consequence for performance, schedule & cost
Position in consequence screening matrix determines risk score
User assigns Hi, Med, Low ranges
Generates reports and graphical output
www.pmcop.dau.mil
Excel-based model
Collects inputs in watch list format
Uses best practices (ordinal) breakout for probability & consequence
Orders events by Borda rank & assigns risk level
Generates action plan reports and graphical output
www.mitre.org
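The Borda ranking step can be sketched as follows. The risk events and their ordinal scores are hypothetical, and this simple variant (one point per outranked event on each criterion, summed over probability and consequence) reproduces Borda-count ordering, though the tool's exact scoring and tie-breaking may differ:

```python
# Hypothetical risk events: (probability score, consequence score).
events = {
    "requirements creep": (0.7, 0.5),
    "staff turnover":     (0.5, 0.3),
    "COTS immaturity":    (0.9, 0.7),
    "vendor slip":        (0.3, 0.9),
}

def borda_ranks(events):
    """Borda count: on each criterion an event earns one point per event
    it outranks; points are summed across criteria (higher = riskier)."""
    names = list(events)
    scores = {n: 0 for n in names}
    for criterion in range(2):  # 0 = probability, 1 = consequence
        for n in names:
            scores[n] += sum(events[n][criterion] > events[m][criterion]
                             for m in names)
    return sorted(scores.items(), key=lambda kv: -kv[1])

for name, score in borda_ranks(events):
    print(f"{score}: {name}")
```

Because it aggregates rank positions rather than raw products, the Borda approach avoids the false precision of multiplying ordinal scale values together.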
Access-based, licensed software
Can establish standard values for risk categorization
Manual or automatic risk prioritization
Complies with ISO, SEI CMMI & Government standards
Generates detailed, summary & metrics reports
Demo available: www.iceincusa.com/products_tools.htm
Knowledge-based system
Utilizes SEI & Navy Best Practices to collect data on past experiences
Measures technical risk rather than cost & schedule
Most applicable to design efforts
Can tailor categories, templates & questions
Generates status, next action & overdue action reports
www.bmpcoe.org
Knowledge & simulation-based tool
Assesses complexity of dependency relationships between project tasks
Measures risk in terms of schedule impact
Most applicable to design efforts
Ongoing development effort at MIT
Generates suggested sub-team groupings & probability curves for task duration ranges
www.dsmweb.org
Closed book, closed notes
You have 90 minutes for the exam.
Any questions?
Turn in Part II of your project according to the schedule discussed last week