Probability of Program Success
Component Standardization:
Status Brief
Ms. Jane Rathbun
Special Assistant, Program Management Improvement
OSD(AT&L)(A&T)(PSA)
May 5, 2010
Status
• PoPS Component Working Group has reached agreement on a common framework
• PARCA, in conjunction with PSA, SE, and ARA, will work to incorporate PoPS into their governance efforts.
PoPS Framework Components
As defined by the Navy version
• Program Health: The current state of an acquisition program’s
requirements, resources, planning and execution activities, and external
influencers, and how those factors are impacting the program’s ability to
deliver a capability within specific constraints.
• Factors: These are Program Health organizational categories. The four
Factors are: Program Requirements, Program Resources, Program
Planning and Execution, and External Influencers.
• Metrics: Major sub-categories that collectively define the scope of a
particular Factor. There are 18 Metrics in the Naval PoPS 2.0 Program
Health framework. Metrics are the basic building blocks of Naval PoPS.
• Criteria: Parameters (qualitative and quantitative) used to evaluate a
particular Metric. Each Criterion is associated with a unique identification
number to enable traceability between Naval PoPS documents and tools. (A sketch
of this hierarchy follows.)
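
To make the structure concrete: Criteria roll up into Metrics, Metrics into Factors, and the Factors together describe Program Health. The sketch below is a minimal illustration of that hierarchy, assuming a 0.0-1.0 score per Criterion and a simple mean-based roll-up; the criterion IDs, descriptions, scores, and roll-up rule are hypothetical, not taken from Naval PoPS 2.0.

# Minimal sketch of the Program Health hierarchy described above.
# The IDs, descriptions, scores, and roll-up rule are illustrative
# assumptions, not actual Naval PoPS 2.0 content.
from dataclasses import dataclass, field

@dataclass
class Criterion:
    criterion_id: str  # unique ID for traceability across PoPS documents and tools
    description: str
    score: float       # assumed 0.0-1.0 evaluation result

@dataclass
class Metric:
    name: str
    criteria: list[Criterion] = field(default_factory=list)

    def score(self) -> float:
        # Assumption: a Metric's score is the mean of its Criteria scores.
        return sum(c.score for c in self.criteria) / len(self.criteria)

@dataclass
class Factor:
    name: str
    metrics: list[Metric] = field(default_factory=list)

# Example: one Metric of the Program Requirements Factor.
parameter_status = Metric("Parameter Status", [
    Criterion("PR-1.1", "KPP/KSA threshold values on track", 0.8),      # hypothetical
    Criterion("PR-1.2", "Requirements traceable to ICD/CDD/CPD", 1.0),  # hypothetical
])
program_requirements = Factor("Program Requirements", [parameter_status])
print(f"{parameter_status.name}: {parameter_status.score():.2f}")  # Parameter Status: 0.90

The unique Criterion IDs are what make evaluation results traceable back to source documents and tools, which is why the framework calls them out explicitly.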
Proposed Enterprise Framework
Agreement on Enterprise Framework
What does it mean?
• The same framework, titles, and definitions will be used by all Military Departments
• The same sets of criteria will be aligned to and used for each metric
• The same metric weighting system by phase will be used by all components (see the sketch after this list)
• The criteria will change according to the agreed-upon number of phases to be evaluated
• A target Enterprise IOC date of third quarter 2011
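
A small worked example may make the phase-based weighting concrete. The sketch below assumes, purely for illustration, that each acquisition phase carries its own table of metric weights summing to 1.0; the phase names, weights, and scores are invented, since the enterprise weighting tables were still being worked out at the time of this brief.

# Illustrative metric weighting by phase; all names and numbers are hypothetical.
PHASE_WEIGHTS = {
    "Planning":  {"Parameter Status": 0.5, "Budget": 0.3, "Schedule": 0.2},
    "Execution": {"Parameter Status": 0.2, "Budget": 0.3, "Schedule": 0.5},
}

def program_health(phase: str, metric_scores: dict[str, float]) -> float:
    """Weighted roll-up of metric scores for one phase."""
    weights = PHASE_WEIGHTS[phase]
    return sum(w * metric_scores.get(name, 0.0) for name, w in weights.items())

scores = {"Parameter Status": 0.9, "Budget": 0.6, "Schedule": 0.7}
print(f"{program_health('Planning', scores):.2f}")   # 0.5*0.9 + 0.3*0.6 + 0.2*0.7 = 0.77
print(f"{program_health('Execution', scores):.2f}")  # 0.2*0.9 + 0.3*0.6 + 0.5*0.7 = 0.71

Note that the same program scores differently by phase, which is why the number of phases and their weight tables must be agreed on before enterprise scores can be compared across components.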
What has to occur on the PoPS side?
• The Working Group will align the criteria for the planning phase first; the criteria for the remaining phases will then be aligned
• Final agreement must be reached on the phases to be used at the enterprise level
• The weighting process must be worked out
• An enterprise PoPS governance and management process will need to be established to manage changes to the framework (factors, metrics, and criteria)
• How this will be incorporated into the other Components must be determined
• Implementation resources must be identified
Question to be answered: How will OSD use this tool?
How will we use it? Possible uses of
PoPS in the Defense Enterprise
• Goal: At the enterprise level, leverage (to the greatest extent possible) the program assessment tools already in use by the components and minimize additional workload on the program offices
• Possible Enterprise Uses
• Processes: DAES, Program Support Reviews, Performance Assessments
• Organizations: PARCA, SE, T&E
• Pending Legislation: HR 5013 proposes a new chapter in Part IV of title 10, “Performance Management of the Defense Acquisition System”
• “…all elements of the defense acquisition system are subject to regular performance assessments”
• “the SECDEF shall establish categories of metrics for the defense acquisition system, including, at a minimum, categories related to cost, quality, delivery, workforce and policy implementation…”
• Action: PARCA, PSA, SE, and ARA will form a working group to determine how best to leverage the Standard PoPS in governance efforts
Approach
• Phase One, Part Two A, MilDep Working Group: May 2010-September 2010
Present the way forward for a common PoPS to AT&L and Component leadership and obtain approval
Complete criteria alignment for the first phase, then apply that proof of concept to all other phases to align their criteria and weighting
Build requirements documentation and an implementation plan
Identify any additional resources needed for implementation
Build an implementation schedule
• Phase One, Part Two B, MilDep and OSD Working Group: Identification of Enterprise Uses & Governance, May 2010-September 2010
Identify enterprise reporting requirements that could be modified or replaced with a standard PoPS model
Share the model with components other than the MilDeps
Share the model with industry counterparts and other components for evaluation as the industry/government program health assessment tool (done through ICPM)
Components and OSD establish a governance and adjudication process for the PoPS framework and related reporting
Approach (cont.)
• Phase Two, Military Department Implementation: FY 2011
Components to make changes to their internal processes and PoPS models
OSD prepares to receive and utilize PoPS in the identified forums
First reporting goal: beginning of the third quarter of FY 2011
• Phase Three, All Component Implementation: FY 2011
All components to make changes to their internal processes and adopt the PoPS model
First reporting goal: beginning of the first quarter of 2012
Backup
Background & Intent
• NDIA/ICPM Sub-Team working on Key Performance Indicators identified a set of metrics that might serve both government and industry (March 2009 ICPM)
• As of 2008, all Military Departments are utilizing some variant of PoPS
(Probability of Program Success)—a program health assessment tool
• November 2008 memo from AT&L, Director PSA to the Military Deputies established a working group to determine a way forward on a common variant
• Can we get to a common variant of PoPS within Defense?
• What other program health/performance indicators are needed for a complete suite of
assessment tools—PoPS+?
• What enterprise-level information requirements could be replaced by a PoPS+?
• Could a common variant of PoPS serve as the baseline measure of program
health for both Government and Industry?
Factor/Metric Descriptions—Program Requirements
• Program Requirements: Capability requirements [defined in the Initial
Capabilities Document (ICD)/Capability Development Document
(CDD)/Capability Production Document (CPD)] that the program must
meet within approved cost and schedule constraints
– Parameter Status: Progress toward defining capability requirements
[ICD/CDD/CPD] and meeting those requirements through the achievement of
Key Performance Parameter (KPP)/Key System Attribute (KSA)/other attribute
threshold values. Also measures requirements traceability and the validity of the
threat assessment.
– Scope Evolution: Stability of performance parameters/other attributes/quantities
from the established baseline and the impact of requirements changes on total
program cost and schedule.
– CONOPS: Progress toward developing and scoping the Concept of Operations
(CONOPS), using it to inform program requirements, acquisition approaches,
and strategies, and the validity of the CONOPS over time.
Factor/Metric Descriptions—Program Resources
• Program Resources: Funding and manning that are allocated to the
program to accomplish planning and execution activities.
– Budget: Sufficiency of funding (amount and phasing) across the Future
Years Defense Program (FYDP) based on last approved budget controls
and degree of deviation from the current cost estimate.
– Manning: Stability and adequacy of Resource Sponsor and Program Office
staffing (availability, skills, experience, certification, and training).
Factor/Metric Descriptions—Program Planning/Execution
• Program Planning/Execution: Activities performed by the Program Office, contractors,
and government performers to fulfill program requirements and deliver expected,
affordable, and sustainable capability to the operating forces.
– Total Ownership Cost Estimating: Measures the adequacy of the elements required to produce sound cost
estimates: program description information, cost data, cost estimating process, cost estimate stability and
comparisons, and cost estimate measures. Also assesses how well acquisition, systems development, and
sustainment strategies are evolving in ways intended to mitigate Total Ownership Cost (TOC) growth.
– Schedule: Completeness and progress against the integrated master schedule/program master schedule; also
includes status of milestone documentation development. Status of procurement activities and achievement of
contracting milestones against the planned schedule.
– Industrial Base/Manufacturing/Production: Assesses market research activities, industrial base health, and an
understanding of industrial implications for cost, schedule, and technical risks. Also measures
manufacturing/production capabilities and execution.
– Test and Evaluation: Progress toward defining and executing the Test and Evaluation Strategy (TES) and the Test
and Evaluation Master Plan (TEMP). This includes the ability to evaluate the system's technical and operational
maturity and performance through testing, the adequacy of test resource requirements to accomplish the necessary
key test activities, the status of identified technological risks, system deficiencies, and the effectiveness, suitability, and
survivability of the system under development.
– Technical Maturity: Assessment of the maturing system and sub-systems design, as well as the technical maturity of
Critical Technology Elements (CTEs) in accordance with the approved Technology Development Strategy (TDS).
Evaluation of the supporting engineering processes, engineering documentation, and lessons learned to achieve an
Operationally Effective and Suitable System.
Factor/Metric Descriptions—Program Planning/Execution
• Program Planning/Execution (cont.)
– Technology Protection: Status and progress toward the safeguarding of DOD research, technology information,
and applied knowledge associated with the program. Functional disciplines include threat assessments and
intelligence/counterintelligence, Anti-Tamper, Supply Chain Risk Management, and physical and electronic
security across government and Defense Industrial Base partners. Evaluated by the reporting of program
protection strategy and plans, personnel (both internal and external to a program office), and resources.
– Software: Software management and engineering (including translation and allocation of system capabilities to
software, software code development, software-related risk management, etc.); applies to software activities by
government agencies and/or contractors that are integral to program deliverables. Evaluated in terms of software
size and stability, cost and schedule, organization, and quality.
– Sustainment: Progress toward defining and executing the sustainment strategy, and the resource adequacy
applied toward those life cycle sustainment activities. Sustainment is conducted as specified by an evolving Life
Cycle Sustainment Plan (LCSP) and attachments. The Independent Logistics Assessment (ILA) is the milestone
focus by which decision makers determine LCSP execution effectiveness and affordability.
– Government Program Office Performance: Progress toward defining and executing intra-government
requirements; responsiveness to deliverable submissions; delivery of facilities, funding, and Government
Furnished Equipment (GFE)/Government Furnished Information (GFI) in accordance with scheduled
requirements; Configuration Management/Configuration Control Board (CCB) and Risk Management Board (RMB)
effectiveness.
– Contractor Performance: Performance of major contractors and/or government performers as measured by the
Earned Value Management System (EVMS), Contractor Performance Assessment Reports (CPARs)/Informal
Performance Assessment Reports (IPARs), staffing adequacy, and work package completion. Also assesses
each company's financial health, financial systems, and manufacturing/production capabilities.
Factor/Metric Descriptions—External Influencers
• External Influencers: Issues or actions taken by parties outside the
purview of the Program Manager that may impact program
planning/execution activities and the achievement of program
requirements or objectives.
– Fit in Vision: Program alignment with current documented Office of the Secretary
of Defense (OSD) guidance and Service strategies.
– Program Advocacy: Support demonstrated by key stakeholders: Congressional;
Under Secretary of Defense Acquisition Technology & Logistics (USD AT&L) (or
equivalent); Assistant Secretary of Defense for Networks and Information
Integration (ASD NII); Cost Assessment and Program Evaluation Office (CAPE);
Director of Operational Test & Evaluation (DOT&E); USD (Comptroller);
Service/Component; Joint Staff/Combatant Commander (COCOM); Fleet Forces
Command (FFC)/Marine Corps Forces (MARFOR); International Partners; Other
Services.
– Interdependencies: Interface issues affecting interrelated programs; determines
whether dependent programs are on track to deliver the requisite capability or
quantity on schedule. (The full Factor/Metric taxonomy is consolidated in the sketch below.)
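
Taken together, the backup slides define four Factors and 18 Metrics. The mapping below simply transcribes those names into a single structure for quick reference; it adds nothing beyond the definitions above.

# The four Factors and their 18 Metrics, as defined in the preceding slides.
POPS_FRAMEWORK = {
    "Program Requirements": [
        "Parameter Status", "Scope Evolution", "CONOPS",
    ],
    "Program Resources": [
        "Budget", "Manning",
    ],
    "Program Planning/Execution": [
        "Total Ownership Cost Estimating", "Schedule",
        "Industrial Base/Manufacturing/Production", "Test and Evaluation",
        "Technical Maturity", "Technology Protection", "Software",
        "Sustainment", "Government Program Office Performance",
        "Contractor Performance",
    ],
    "External Influencers": [
        "Fit in Vision", "Program Advocacy", "Interdependencies",
    ],
}

assert sum(len(m) for m in POPS_FRAMEWORK.values()) == 18  # matches the 18 Metrics stated in the framework definitions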