Stevens-USC SERC:
A DoD University Affiliated Research Center (UARC)
Barry Boehm, USC
SERC-Internal Kickoff Tasks Workshop
January 29, 2009
Outline
• Nature of UARCs
• SERC Overview
– SERC organization and vision
– SERC research strategy
• Initial tasks
– SysE Effectiveness Measures Assessment
– SysE Methods, Processes, Tools Evaluation
• Workshop objectives and approach
What is a UARC?
1. University Affiliated Research Centers are “not-for-profit, private sector organizations affiliated with, or part of, universities or colleges that maintain essential research, development and engineering capabilities needed by sponsoring DoD components.”
2. They maintain long-term, strategic relationships with sponsoring DoD components in specific core areas and operate in the public interest, free from real or perceived conflicts of interest.
3. UARCs are financed through long-term, non-competitive contracts awarded by sponsoring DoD components for specific core work.
Several Existing UARCs
1. Johns Hopkins University APL; 4,300 people; annual funding $680M
2. UC Santa Cruz with NASA Ames – information technology, biotechnology, nanotechnology, computer science, aerospace operations, astrobiology, and fundamental biology. Has a Systems Teaching Institute with San Jose State University and UCSC to teach through hands-on experience on research projects.
3. Penn State University Applied Research Laboratory for the Navy, with a focus on undersea missions and related areas; strategic partner with NAVSEA and ONR; established 1945; has >1,000 faculty and staff
4. University of Washington APL – acoustic and oceanographic studies; established in 1943
5. UC Santa Barbara Institute for Collaborative Biotechnologies – Army Research Office; partnered with MIT and Caltech – focus on biologically-derived and biologically-inspired materials, sensors, and information processing …
6. University of Texas UARC, started in 1945, focuses on sonar, acoustics, software system research, satellite geodesy, active sonar, …; now has 600 people on staff
7. USC Institute for Creative Technologies – US Army; focus on virtual reality multimedia applications for training, C4ISR
SERC Organization
Lead organizations: Stevens Institute of Technology and University of Southern California
Members:
• Auburn University
• Air Force Institute of Technology
• Carnegie Mellon University
• Fraunhofer Center at UMD
• Massachusetts Institute of Technology
• Missouri University of Science and Technology (S&T)
• Naval Postgraduate School
• Pennsylvania State University
• Southern Methodist University
• Texas A&M University
• Texas Tech University
• University of Alabama in Huntsville
• University of California at San Diego
• University of Maryland
• University of Massachusetts
• University of Virginia
• Wayne State University
As the DoD Systems Engineering Research-University Affiliated Research Center, SERC will be
responsible for systems engineering research that supports the development, integration,
testing and sustainability of complex defense systems, enterprises and services. Its members
are located in 11 states, near many DoD facilities and all DAU campuses.
SERC Organization - II
Dr. Dinesh Verma
Executive Director
Dr. Art Pyster
Deputy Executive Director
Julie Norris (acting)
Director of Operations
Dr. Barry Boehm
Director of Research
Pool of more than 140 Senior Researchers and hundreds of
research faculty and graduate students from across members
Stevens’ School of Systems and Enterprises will host the SERC at Stevens’ Hoboken, NJ, campus. Stevens’ faculty
engagement will be complemented by a critical mass of systems engineering faculty at USC. A fundamental tenet
of SERC is its virtual nature – each of its 18 members will be a nexus of research activities. All research projects
will be staffed by the best available researchers and graduate students from across the members.
Rough Financial Model
• Minimum of $2M/year
– Can add more funded tasks within contract
• First year: two specified tasks
– 1. SysE Effectiveness Measurement: EM (USC lead; $500K)
– 2. SysE Methods, Processes, Tools Assessment: MPT (Stevens lead; $350K)
• Further tasks TBD
• Other sponsors can sole-source through UARC
– Procedures being worked out
SERC Vision and Perspective
Vision
DoD and IC systems achieving mission outcomes – enabled by
research leading to transformational SE methods, processes,
and tools.
Perspective
The SERC will be the primary engine for defense and intelligence
community SE basic research. In doing so, the SERC will:
1. Transform SE practice throughout the DoD and IC communities by creating innovative methods, processes, and tools that address critical challenges to meeting mission outcomes (what),
2. Become the catalyst for community growth among SE researchers by enabling collaboration among many SE research organizations (who),
3. Accelerate SE competency development through rapid transfer of its research to educators and practitioners (how).
Research must respond to challenges in many life cycles, activities, and attributes
[Diagram: the research challenge space]
• Life-cycle activities: Conceiving, Synthesizing, Testing, Operating & Supporting, Evolving & Adapting, Retiring
• Supervisory functions: SE Planning, SE Managing, SE Assessing, SE Controlling, …
• System types: “classic” systems (weapon platforms), systems of systems, network-centric services, enterprise systems
• Designing for: Reliability, Security, Maintainability, Supportability, Resilience, Scalability, Manufacturability, Adaptability, …
SERC Research Methodology
Example: Mission and Stakeholder Requirements and Expectations
• Research question: Can we develop a transformative, interactive, and graphical environment to bring stakeholders (warfighters and analysts) together with SEs to develop a graphical/visual conops in an extremely agile manner?
• Motivation: Every study of failed projects refers to inadequate requirements and inadequate understanding of the “real” problem, on large projects and small projects, defense or commercial.
• Approach: Understand and synthesize advances in multi-media technologies and interactive graphics; gaming technologies; real options theory.
Technical Task Order (TTO) Submittals
1. Prepare draft TTOs using the template that Doris Schultz will send you.
2. Proposed TTO funding can come either from core OSD/DoD or from other agencies.
3. The template is based on how the government has asked us to submit information to them.
4. Officially, the PMO proposes research to which SERC responds. They recognize, of course, that they need our help in identifying the proper research to perform and have asked for informal proposals.
5. The template asks what you want to do, why it is important, what previous research you are basing this work on, who will perform the work, what competencies it requires, what will be delivered, and how much it will cost.
SERC Research Initiation Strategy
FY09 Focus
[Diagram: starting from SE external factors/context and mission drivers, determine SE effectiveness and value measures and establish a baseline of SE MPTs; then iterate cycles of research to determine and validate modified MPTs that address specific SE issues and close MPT effectiveness gaps.]
Early focus on a solid baseline and quantifiable, observable measures to enable future demonstration of improvement
Outline
• Nature of UARCs
• SERC Overview
– SERC organization and vision
– SERC research strategy
• Initial tasks
– SysE Effectiveness Measures Assessment
– SysE Methods, Processes, Tools Evaluation
• Workshop objectives and approach
First Two Task Orders
1. “Assessing Systems Engineering Effectiveness in Major Defense Acquisition Programs (MDAPs)”
– Government lead: OUSD(AT&L)/SSE
– Barry Boehm (USC) task lead, with support from Stevens, Fraunhofer Center, University of Alabama in Huntsville
2. “Evaluation of Systems Engineering Methods, Processes, and Tools (MPT) on Department of Defense and Intelligence Community Programs”
– Government lead: DoD
– Mike Pennotti (Stevens) task lead, with support from USC, University of Alabama in Huntsville, Fraunhofer
Coordinated approach
• Synergetic contribution to Sponsor’s SysE effectiveness goals
• Coordinated management
– Common collaborators, battle rhythm with regular e-meetings, shared workshops
– Research Integrity Team
– Best practices + progress monitoring and improvement
• Coordinated Technical Approaches
– Definitions, evaluation criteria and methods
– Cross-feed/peer review of ongoing results
• Common context
– Domains of interest and levels of organization
Workshop Objectives and Approach
• Review and improve the EM and MPT task objectives
– Utility of results to DoD constituencies
– Identification of needs-capabilities gaps
– Identification of promising research directions
• Review and improve the EM and MPT task approaches
– Survey and evaluation approaches, criteria, and instruments
– Involvement of key stakeholders
• Contractors, program/portfolio managers, oversight organizations
– Coverage of DoD application domains
• Initial EM priority: Weapons platforms
• Initial MPT priority: Net-centric services
– Test and evaluation of results
• Capture participants’ relevant experience