
Proposed Way Forward for SERC EM Task

Barry Boehm, USC-CSSE

30 January 2009

Proposed Way Forward

• Iterate coverage matrix

• Add Personnel Competency EM

• Revise, circulate survey; analyze results

– Follow up with experience interviews

• Revise EM evaluation template, use to evaluate EMs

– Perhaps using Coverage Matrix categories

• Also evaluate EMs vs. Systemic Analysis Database

• Develop OpCon(s) for using EM instrument(s)

• Determine preferred candidate EM combinations for OpCons

– Possible result: Artifact-monitoring; evidence taxonomies; specialties (teaming, weapons platforms)

• Prepare EM instruments for pilot use

– Possible result: Refined INCOSE LIs; Refined Macro Risk Tool


Iterate Coverage Matrix

• Originators review current ratings

• Expand rating scale (encoded in the sketch below)

– ** - Strong

– * - Adequate

– o - Partial

– (blank) - Missing

• Reorganize categories

– Try for a 5x5 to 7x7 range
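For concreteness, a minimal sketch of how the expanded rating scale and coverage matrix might be encoded. The EM names and categories below are illustrative placeholders, not the actual matrix contents:

# Illustrative encoding of the expanded coverage-matrix rating scale.
# EM names and categories are hypothetical placeholders.
RATING_SCALE = {
    "**": "Strong",
    "*": "Adequate",
    "o": "Partial",
    "": "Missing",  # a blank cell denotes missing coverage
}

# Coverage matrix: (EM, category) -> rating symbol.
coverage = {
    ("INCOSE Leading Indicators", "Requirements"): "**",
    ("Macro Risk Tool", "Requirements"): "*",
    ("Macro Risk Tool", "Personnel Competency"): "",
}

for (em, category), symbol in coverage.items():
    print(f"{em:25} {category:22} {RATING_SCALE[symbol]}")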


Candidate Personnel Competency EMs

• INCOSE Systems Engineering Certification Framework

• GSwERC Body of Knowledge, SysE portions

• Nidiffer source


GSwERC BoK - Requirements

SOFTWARE REQUIREMENTS

• Software requirements fundamentals

– Definition of software requirement

– Product and process requirements

– Functional and non-functional requirements

– Emergent properties

– Quantifiable requirements

– System requirements and software requirements

• Requirements process

– Process models

– Process actors

– Process support and management

– Process quality and improvement

• Requirements elicitation

– Requirements sources

– Elicitation techniques

• Requirements analysis


EM Usage OpCons

• Planning review vs. execution monitoring

• Life cycle: Greenfield, Brownfield, O&M

• Domain: Weapons platform, System of systems, Netcentric services

• EM type: Artifact-monitoring; evidence taxonomies; specialties (teaming, weapons platforms); see the sketch below
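As a sketch only: one way to represent an EM usage OpCon as a point in the dimension space above, mapped to candidate EM combinations. All entries are hypothetical; the real mappings would come from the EM evaluations:

from dataclasses import dataclass

@dataclass(frozen=True)
class OpCon:
    """One EM usage operational concept: a point in the dimensions above."""
    usage: str       # "planning review" | "execution monitoring"
    life_cycle: str  # "Greenfield" | "Brownfield" | "O&M"
    domain: str      # "Weapons platform" | "System of systems" | "Netcentric services"

# Hypothetical preferred EM combinations per OpCon; actual choices
# would follow from the EM evaluation results.
preferred_ems = {
    OpCon("execution monitoring", "Greenfield", "Weapons platform"):
        ["Artifact-monitoring EM", "Weapons-platform specialty EM"],
    OpCon("planning review", "Brownfield", "System of systems"):
        ["Evidence-taxonomy EM"],
}

print(preferred_ems[OpCon("planning review", "Brownfield", "System of systems")])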


Trilateral Working Group Addressing Early Warning Indicators

• US, UK, and Australian participants

• Mix of traditional and emerging indicators

• Current version a work in progress; seeking community feedback

• Organized around program goals and critical success factors

• Focused on readiness for milestone decision reviews


Software Acquisition Goals

• Goal 1. System and software objectives and constraints have been adequately defined and validated.

• Goal 2. The system and software acquisition strategies are appropriate and compatible.

• Goal 3. The success-critical stakeholders have committed adequate software capability to perform their software-related tasks.

• Goal 4. The software product plans and process plans are feasible and compatible, as shown by convincing evidence in a Feasibility Rationale.

• Goal 5. Software progress with respect to plans is satisfactory.


Critical Success Factors: Goal 1

Goal 1. System and software objectives and constraints have been adequately defined and validated.

• 1.1 System and software functionality and performance objectives have been defined and prioritized.

• 1.2 The system boundary, operational environment, and system and software interface objectives have been defined.

• 1.3 System and software flexibility and evolvability objectives have been defined and prioritized.

• 1.4 System and software environmental, resource, infrastructure, and policy constraints have been defined.

• 1.5 System and software objectives have been validated for overall achievability within the system and software constraints.

(These goals and CSFs are encoded in the sketch below.)
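To show how these statements might feed an instrument such as the Macro Risk Tool, a minimal sketch of the goal/CSF hierarchy as data. The structure is assumed for illustration, not taken from the tool itself:

# Hypothetical data layout for the goal -> CSF checklist hierarchy;
# each CSF would in turn carry the evidence questions rated in a
# tool such as the Macro Risk Tool.
GOALS = {
    "Goal 1": {
        "statement": "System and software objectives and constraints "
                     "have been adequately defined and validated.",
        "csfs": {
            "1.1": "System and software functionality and performance "
                   "objectives have been defined and prioritized.",
            "1.2": "The system boundary, operational environment, and system "
                   "and software interface objectives have been defined.",
            # ... 1.3 through 1.5 as listed above
        },
    },
}

for csf_id, statement in GOALS["Goal 1"]["csfs"].items():
    print(csf_id, statement)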


Proposed Way Forward

• Iterate coverage matrix

• Add Personnel Competency EM

• Revise, circulate survey; analyze results

– Follow up with experience interviews

• Revise EM evaluation template, use to evaluate EMs

– Perhaps using Coverage Matrix categories

• Also evaluate EMs vs. Systemic Analysis Database

• Develop OpCon(s) for using EM instrument(s)

• Determine preferred candidate EM combinations for OpCons

– Possible result: Artifact-monitoring; evidence taxonomies; specialties (teaming, weapons platforms)

• Prepare EM instruments for pilot use

– Possible result: Refined INCOSE LIs; Refined Macro Risk Tool


Macro Risk Tool Demo


Macro Risk Model Interface

[Screenshot of the USC Macro Risk Model spreadsheet (© USC-CSSE): for each critical success factor question, the tool records independent evidence (EV) ratings, a CSF Risk score (1-25), and a Rationale and Artifacts entry. The recoverable contents are reconstructed below.]

NOTE: Evidence ratings should be done independently, and should address the degree to which the evidence that has been provided supports a "yes" answer to each question.

Goal 1: System and software objectives and constraints have been adequately defined and validated.

Critical Success Factor 1: System and software functionality and performance objectives have been defined and prioritized. CSF Risk: 6

– 1(a) Are the user needs clearly defined and tied to the mission? (EV ratings: 3, 3)

– 1(b) Is the impact of the system on the user understood? (EV ratings: 3, 3)

– 1(c) Have all the risks that the software-intensive acquisition will not meet the user expectations been addressed? (EV ratings: 2, 2)

Critical Success Factor 2: The system boundary, operational environment, and system and software interface objectives have been defined. CSF Risk: 15

– 2(a) Are all types of interface and dependency covered? (EV ratings: 3, 3)

– 2(b) For each type, are all aspects covered? (EV ratings: 4, 4)

– 2(c) Are interfaces and dependencies well monitored and controlled? (EV ratings: 5, 5)

Critical Success Factor 3: System and software flexibility and evolvability objectives have been defined and prioritized. CSF Risk: 21

– 3(a) Has the system been conceived and described as "evolutionary"? (EV ratings: 5, 5)
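The slide does not show how the tool computes the CSF Risk column. As a hedged sketch only, one common scheme that would produce the 1-25 range is the product of 1-5 probability-of-loss and impact ratings; the function below illustrates that assumption and is not the tool's actual formula:

def csf_risk(probability: int, impact: int) -> int:
    """ASSUMED roll-up: 1-5 probability x 1-5 impact -> 1-25 CSF risk.

    The USC Macro Risk Tool's real formula is not shown on the slide;
    this is an illustrative stand-in that matches the 1-25 range.
    """
    if not (1 <= probability <= 5 and 1 <= impact <= 5):
        raise ValueError("ratings must be on a 1-5 scale")
    return probability * impact

print(csf_risk(3, 5))  # -> 15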
