RM-week5 - Lyle School of Engineering


Week 5 – Special Topics

Risk Management Best Practices

Traps, Alarms & Escapes

From Navy Best Practices

Traps – thinking the risks are covered simply by following procedures

Alarms – assumptions that cause trouble

Consequences – what happens if nothing is done

Risk Checklist

Areas of impact: Technical (Performance, Supportability) and Programmatics (Cost, Schedule)

Potential Risk Items:

• Personnel – Availability, Experience Levels, Mix of Disciplines

• Requirements – Definition, Stability, Complexity

• Resource Availability – Facilities, Hardware, Personnel, Funding Profile, Communications

• Technology – Availability, Maturity Levels

• System Complexity

• Operating Environment

• Design – Methods, Complexity, Software Tools, Testing, Modeling, Language, Operational Interfaces, Hardware

• Sensitivities to Other Risks

• Estimating Error – Schedule, Cost

• Number of Critical Path Items

Risk Management Training

Why RM Training is Needed

Weak backgrounds in risk management

Mistaken concepts:

Assessment alone is RM – planning & monitoring get skipped

Mitigation is the only handling strategy

All risk can be eliminated

Focus on performance only, omitting cost & schedule

Tailoring RM Training

Adjust to different project team groups

Sr. management

Working level engineers

Address issues each group is likely to face

Address tailoring RM activities to meet program needs – not one-size-fits-all

Software Risk Management

SW Risk as a Special Case

Virtual

Can’t physically touch/feel to assess

History of latent defects

Many interfaces

Hardware-to-software

Operating system-to-operating code

Incompatible combinations

Update frequency

Multiple versions

Upward/downward compatibility

Taxonomy of SW Risk

Risk Grouping / Risk Issue

Project Level:
• Requirements – excessive, immature, unstable, unrealistic
• Lack of user involvement
• Underestimation of complexity or dynamic nature

Project Attributes:
• Performance – errors, quality
• Unrealistic cost or schedule

Management:
• Ineffective project management

Engineering:
• Ineffective integration, assembly, test
• Unanticipated difficulties across the user interface

Work Environment:
• Immature design, process, technologies
• Inadequate configuration control
• Inappropriate methods, inaccurate metrics
• Poor training

Other:
• Legal, contractual issues
• Obsolescence (incl. excessive duration)
• Difficulties with subcontracted items
• Unanticipated maintenance & support costs

Boehm’s Top 10 Software Risks

Risk Item / Management Techniques

• Personnel shortfalls – staffing with top talent, tailoring processes to skill mix, training, peer reviews, team building

• Unrealistic schedules and budgets – design to cost, business case analysis, reuse, requirements descoping, incremental development, adding schedule and budget

• COTS; external components – qualification testing, benchmarking, prototyping, compatibility analysis, vendor analysis

• Requirements mismatch; gold plating – mission analysis, CONOPS formulation, user surveys, prototyping, early user's manuals

• User interface mismatch – prototyping, scenarios, user characterization (functionality, style, workload)

Boehm’s Top 10 Software Risks

Risk Item / Management Techniques

• Architecture, performance, quality – architecture trade-off analysis, simulations, benchmarking, modeling, prototyping

• Requirements changes – change thresholds, incremental development, information hiding

• Legacy software – restructuring, design recovery, wrappers, phase-out analysis

• Externally-performed tasks – reference checking, pre-award audits, award fee contracts, competitive design or prototyping, team building

• Straining computer science capabilities – technical analysis, cost benefit analysis, prototyping

Identification Strategies

Review Schedule & Networks

Review Cost Estimation Parameters

Perform Interviews

Revisit Lessons Learned

Develop a risk taxonomy

Brainstorm and play “what if”

Re-sort the watch list (e.g., by source)

Identification: Schedule Risks

Use the planning or baseline schedule

Evaluate the activity network view

Look for nodes with (see the sketch after this list):

High fan-in (many activities terminate at a single node)

High fan-out (many activities emanate from a single node)

No predecessors
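These patterns can be checked mechanically. A minimal sketch in Python, with a hypothetical activity network and an arbitrary fan threshold:

```python
# Hedged sketch: scan an activity network for the risk patterns above -
# high fan-in, high fan-out, and nodes with no predecessors.
# The edge list and the "high" threshold are hypothetical.

from collections import defaultdict

edges = [("A", "D"), ("B", "D"), ("C", "D"),   # D has high fan-in
         ("D", "E"), ("D", "F"), ("D", "G"),   # D has high fan-out
         ("H", "E")]

THRESHOLD = 3  # how many converging/diverging activities count as "high"

fan_in, fan_out = defaultdict(int), defaultdict(int)
nodes = set()
for src, dst in edges:
    fan_out[src] += 1
    fan_in[dst] += 1
    nodes.update((src, dst))

for n in sorted(nodes):
    if fan_in[n] >= THRESHOLD:
        print(f"{n}: high fan-in ({fan_in[n]} activities terminate here)")
    if fan_out[n] >= THRESHOLD:
        print(f"{n}: high fan-out ({fan_out[n]} activities emanate here)")
    if fan_in[n] == 0:
        print(f"{n}: no predecessors")
```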

Identification: Cost Risk Drivers

Consider specific areas of concern that can lead to problems:

Personnel experience, availability

Requirement complexity, firmness

Scheduling and prediction of task and partition times

Hardware requirements, interfaces, constraints

ISO Risk Probability Table

Factors: Maturity, Complexity, Dependency, Stability

0.1 – Low
• Maturity: technology exists and can be used "as is"
• Complexity: simple relative to current environment
• Dependency: entirely within project control
• Stability: external factors will not make any changes

0.3 – Moderate
• Maturity: technology requires minor change before use (<25%)
• Complexity: minor complexity relative to current environment
• Dependency: depends on existing product supplied from outside organization
• Stability: external factors will make minor changes (<25%)

0.5 – High
• Maturity: technology requires major change before use (<50%)
• Complexity: moderately complex relative to current environment
• Dependency: depends on supply and modification of existing product from outside organization
• Stability: external factors will make major changes (<50%)

0.7 – Very High
• Maturity: technology requires significant design and engineering before use (<75%)
• Complexity: significantly complex relative to current environment
• Dependency: depends on new development from outside organization
• Stability: external factors will make significant changes (<75%)

0.9 – Extremely High
• Maturity: state of the art, some research done
• Complexity: extremely complex relative to current environment
• Dependency: depends on finding development from outside organization
• Stability: external factors will make constant changes

ISO Risk Consequence Table

Magnitude | Technical Factor | Cost Factor | Schedule Factor

0.1 – Low
• Technical: minimal or no reduction in technical performance
• Cost: budget estimates not exceeded, some transfer of money
• Schedule: negligible impact on program; slight development schedule change compensated by available schedule slack

0.3 – Minor
• Technical: small reduction in technical performance
• Cost: cost estimates exceed budget by 1 to 5%
• Schedule: minor slip in schedule (less than 1 month), some adjustment in milestones required

0.5 – Moderate
• Technical: some reduction in technical performance
• Cost: cost estimates increased by 5 to 20%
• Schedule: small slip in schedule

0.7 – Significant
• Technical: significant degradation in technical performance
• Cost: cost estimates increased by 20 to 50%
• Schedule: development schedule slip in excess of 3 months

0.9 – Catastrophic
• Technical: technical goal cannot be achieved
• Cost: cost estimates increased in excess of 50%
• Schedule: large schedule slip that affects segment milestones or has possible effect on system milestones

ISO Risk Contour

[Chart: Iso-Risk Contours – Probability (0.0–1.0) on the vertical axis vs. Consequence (0.0–1.0) on the horizontal axis, with contour lines at R = 0.1 through 0.9, 0.95, and 0.98]
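The slides do not state the contour equation. One common formulation consistent with contours of this shape is the risk factor R = P + C − P·C, where P averages the probability factors and C the consequence factors from the tables above; the sketch below assumes that model.

```python
# Hedged sketch, assuming the common risk-factor model R = P + C - P*C
# (the course slides do not state the exact contour equation).

def probability(maturity, complexity, dependency, stability):
    """Average the four probability factors from the ISO table (0.1-0.9)."""
    factors = [maturity, complexity, dependency, stability]
    return sum(factors) / len(factors)

def consequence(technical, cost, schedule):
    """Average the three consequence factors from the ISO table (0.1-0.9)."""
    factors = [technical, cost, schedule]
    return sum(factors) / len(factors)

def risk_factor(p, c):
    """R = P + C - P*C; curves of constant R match the contour plot."""
    return p + c - p * c

# Example: a maturity-driven risk with moderate consequences
p = probability(maturity=0.7, complexity=0.5, dependency=0.3, stability=0.5)
c = consequence(technical=0.5, cost=0.3, schedule=0.5)
print(f"P = {p:.2f}, C = {c:.2f}, R = {risk_factor(p, c):.2f}")
```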

SW Risk Handling

Avoidance - de-scoping objectives

Assumption – latent defects

Control – user acceptance testing

Transfer – from software to firmware or hardware

SW Metrics

[Chart: Software Risk Items – new risks and closed risks plotted over time]
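The metric behind such a chart is simple: open risks at any point equal cumulative new risks minus cumulative closed risks. A minimal sketch with hypothetical weekly counts:

```python
# Hedged sketch of the open-risk metric: open = cumulative new - cumulative
# closed. The weekly counts are hypothetical illustration data.

from itertools import accumulate

new_per_week = [4, 3, 5, 2, 1, 0]      # risks identified each week
closed_per_week = [0, 1, 2, 3, 3, 4]   # risks retired each week

cum_new = list(accumulate(new_per_week))
cum_closed = list(accumulate(closed_per_week))

for week, (n, c) in enumerate(zip(cum_new, cum_closed), start=1):
    print(f"week {week}: new={n:2d}  closed={c:2d}  open={n - c:2d}")
```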

Commercial vs. DoD/NASA Perspective on Risk Management

Commercial vs. Gov’t Perspective

Different market conditions

Different best practices

Different likelihoods for similar issues

As always, tailor RM to program needs

Market Differences

How is risk impacted?

Commercial vs. DoD / NASA:

• Many small buyers vs. fewer large buyers

• Many small suppliers vs. typically few suppliers of a given item

• Market sets price vs. oligopoly pricing biased toward the available budget

• Free movement into/out of the market vs. barriers to entry

• Prices set by marginal costs vs. prices proportional to total cost

• Funding usually stable once secured vs. possible unanticipated disruptions to funding

• Capacity to supply adjusts to demand vs. moderate to large excess capacity

SW Development Best Practices

How is risk impacted?

Commercial vs. DoD / NASA:

• Evolutionary upgrades of existing systems vs. little reuse, many unique systems

• Heavy buyer involvement (as team member) vs. formal development model with buyer oversight

• Informal reviews vs. very formal reviews

• Heavy user involvement vs. limited user involvement (buyer involved)

• Based on one or more industry standards vs. government and industry standards

• Prototyping common vs. limited prototyping

Risk Category Likelihood

Category – Commercial / DoD/NASA – Comment

• Cost – Highly Likely / Highly Likely – whenever new development is required

• Design – Possible / Likely – degree of design enhancement required

• Integration – Possible / Possible – driven by complexity

• Support – Possible / Possible – commercial life cycles generally short

• Manufacturing – Likely / Likely – varies with production rate & resource availability

• Technology – Likely / Likely – increases as programs push the state of the art

• Management – Likely / Possible – depends on program complexity, performance expectations

• Political – Unlikely / Likely – external issue to the program

Overview of Risk Management Tools

Cautions in Tool Selection

A good tool for one organization may not be a good match for another

Tailor RM to the program needs

The tool should never dictate the process

Define process, then choose compatible tool

Be compatible with program culture

Effective Use of a Tool

RM is more than using a RM tool

Tool must efficiently & effectively integrate into program

Resources required

Level of detail, complexity

Focus of tool – e.g., program phase

RM Database Considerations

Provide sufficient configuration control

Accessible to all team members

Ability to accept anonymous comments

Support program needs

Reporting

Monitoring

Captures lessons learned

Fulfills contractual requirements

Balance costs/value

Tools Comparison

@Risk & Crystal Ball – licensed software

Monte Carlo simulation add-in for Excel (see the sketch below)

Select desired distribution function & define parameters

Or provide data and let the tool fit a plausible distribution function

Provides statistics and graphical output

User provides risk analysis structure
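A minimal sketch of what such an add-in does under the hood: sample each cost element from a chosen distribution, roll the samples up, and report statistics. The cost elements, distributions, and parameters below are hypothetical.

```python
# Hedged sketch of a Monte Carlo cost roll-up in the style of @Risk or
# Crystal Ball. The cost elements and their distributions are hypothetical.

import random

N = 10_000  # number of Monte Carlo trials

totals = []
for _ in range(N):
    design = random.triangular(80, 150, 100)  # low, high, most likely ($K)
    build = random.gauss(mu=200, sigma=25)    # normal ($K)
    test = random.uniform(40, 90)             # uniform ($K)
    totals.append(design + build + test)

totals.sort()
print(f"mean total cost:  {sum(totals) / N:.0f} $K")
print(f"50th percentile:  {totals[N // 2]:.0f} $K")
print(f"80th percentile:  {totals[int(N * 0.8)]:.0f} $K")
```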

Probability-Consequence Screening

Developed by the Air Force

Risk events assigned a probability & consequence for performance, schedule & cost

Position in consequence screening matrix determines risk score

User assigns Hi, Med, Low ranges

Generates reports and graphical output (www.pmcop.dau.mil); a screening sketch follows
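A minimal sketch of the screening idea: place each event by its probability and consequence, and let user-assigned ranges decide High/Medium/Low. The product scoring and the cut-points below are hypothetical simplifications of the tool's matrix.

```python
# Hedged sketch of probability-consequence screening. Scoring by P*C and
# the High/Medium/Low cut-points are hypothetical; the tool lets users
# assign their own ranges.

def screen(p, c):
    """Map a (probability, consequence) pair to a risk level."""
    score = p * c
    if score >= 0.28:
        return "High"
    if score >= 0.10:
        return "Medium"
    return "Low"

events = {
    "late subcontractor delivery": (0.7, 0.5),
    "immature algorithm": (0.3, 0.9),
    "staff turnover": (0.3, 0.3),
}
for name, (p, c) in events.items():
    print(f"{name}: {screen(p, c)}")
```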


Risk Matrix

Excel-based model

Collects inputs in watch list format

Uses a best-practices (ordinal) breakout for probability & consequence

Orders events by Borda rank & assigns risk level (see the sketch below)

Generates action plan reports and graphical output (www.mitre.org)
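A minimal sketch of Borda ranking in this setting: rank the events separately by probability and by consequence, then sum (N − rank) across the two criteria; higher totals rank as higher risk. Tie handling and MITRE's exact scheme are simplified, and the event data are hypothetical.

```python
# Hedged sketch of Borda ranking of risk events across two ordinal
# criteria (probability, consequence). Event data are hypothetical.

risks = {  # name: (probability, consequence) on an ordinal 1-5 scale
    "requirements creep": (4, 3),
    "vendor slip": (3, 5),
    "tool immaturity": (2, 2),
    "key staff loss": (5, 4),
}

N = len(risks)

def borda_counts(items):
    counts = {name: 0 for name in items}
    for criterion in (0, 1):  # 0 = probability, 1 = consequence
        ordered = sorted(items, key=lambda n: items[n][criterion], reverse=True)
        for rank, name in enumerate(ordered):  # rank 0 = riskiest
            counts[name] += N - rank
    return counts

for name, score in sorted(borda_counts(risks).items(), key=lambda kv: -kv[1]):
    print(f"{score:2d}  {name}")
```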


Risk Radar

Access-based, licensed software

Can establish standard values for risk categorization

Manual or automatic risk prioritization

Complies with ISO, SEI CMMI & Government standards

Generates detailed, summary & metrics reports

Demo available: www.iceincusa.com/products_tools.htm


TRIMS

Technical Risk ID & Mitigation

Knowledge-based system

Utilizes SEI & Navy Best Practices to collect data on past experiences

Measures technical risk rather than cost & schedule

Most applicable to design efforts

Can tailor categories, templates & questions

Generates status, next action & overdue action reports (www.bmpcoe.org)


DSM – Design Structure Matrix

Knowledge & simulation-based tool

Assesses complexity of dependency relationships between project tasks

Measures risk in terms of schedule impact

Most applicable to design efforts

Ongoing development effort at MIT

Generates suggested sub-team groupings & probability curves for task duration ranges (www.dsmweb.org); a minimal DSM sketch follows
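A minimal sketch of the underlying data structure: a binary matrix where entry (i, j) marks that task i needs input from task j; marks that refer to later tasks indicate feedback loops, the coupling that DSM analysis flags as schedule risk. The tasks and dependencies are hypothetical.

```python
# Hedged sketch of a binary design structure matrix (DSM). dsm[i][j] = 1
# means task i needs input from task j; a mark above the diagonal (j > i)
# is feedback, i.e., likely iteration and schedule risk. Data hypothetical.

tasks = ["requirements", "architecture", "detail design", "test planning"]

dsm = [
    # req  arch  detail  test
    [0, 0, 0, 0],  # requirements
    [1, 0, 1, 0],  # architecture also needs detail design input: feedback
    [0, 1, 0, 0],  # detail design
    [1, 0, 1, 0],  # test planning
]

for i, row in enumerate(dsm):
    for j, needs_input in enumerate(row):
        if needs_input and j > i:
            print(f"feedback: {tasks[i]} needs input from later task "
                  f"{tasks[j]}")
```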


Final Exam

Closed book, closed notes

You have 90 minutes for the exam.

Any questions?

Turn in Part II of your project according to the schedule discussed last week
