A Prescriptive Adaptive Test Framework (PATFrame) for Unmanned and Autonomous Systems:

A Prescriptive Adaptive Test Framework (PATFrame) for Unmanned and Autonomous Systems:
A Collaboration Between MIT, USC, UT Arlington and Softstar Systems
Dr. Ricardo Valerdi
Massachusetts Institute of Technology
March 10, 2010
Outline
• The Challenge
• PATFrame team
• PATFrame objectives & scope
• PATFrame features
• Next steps
“Anything that gives us new knowledge gives us an opportunity to be more rational”
- Herb Simon
Sponsors
Transition Partners
PATFrame kickoff meeting
Fort Hood, TX - Aug 2009
The Challenge: Science Fiction to Reality
“You will be trying to apply international law written for the Second World War to Star Trek technology.”
Singer, P. W., Wired For War: The Robotics Revolution and Conflict in the 21st Century (Penguin, 2009)
PATFrame Objective
To provide a decision support tool encompassing a
prescriptive and adaptive framework for UAS SoS Testing
• PATFrame will be implemented using a software dashboard that will enable improved decision making for the UAS T&E community
• Focused on addressing BAA topics TTE-6 Prescribed System of Systems Environments and MA-6 Adaptive Architectural Frameworks
• Three-university team (MIT-USC-UTA) draws from experts in test & evaluation, decision theory, systems engineering, software architectures, robotics, and modeling
http://mit.edu/patframe
Prescriptive Adaptive Test Framework
[Figure: Prescriptive and adaptive dimensions of the framework, spanning the test strategy/test infrastructure and the system under test]
Time Scale for Testing Decisions
[Figure: PATFrame scope spans the full range of testing decision time scales]
• Test execution (real time): minutes
• Data collection, analysis & reprioritization: hours
• Test planning (in SoS environment): days
• Test development planning (i.e., design for testability): months
• Long-term decisions/investments: years
[Figure: Testing difficulty increases with the net-centricity of the environment (from none to SoS) and with the complexity of the system under test (from human-operated systems to SoS to AI). Regions shown: testing a system in a SoS environment (net-centric focus), testing a SoS (UAST focus), and testing a SoS in a SoS environment (the ultimate goal). PATFrame initial focus: testing an autonomous system in a SoS environment; use case: UAS in SoS (cf. DARPA Urban Grand Challenge).]
Prescribed System of Systems Environment
• Normative: Goal is to construct the theoretical best “SoS” test, expressed as a metric set with “best” levels
• Prescriptive: Goal is a synthetic framework for SoS testing at the single- and multi-program level, with metrics at state-of-the-practice levels; a “Successful SoS Test” = f(metricA, metricB, etc.)
• Descriptive: Goal is to capture the actual SoS test; actual SoS tests include metricA’, metricC, etc.
[Figure: Per-metric levels (MetricA, MetricB, MetricC, ..., MetricN) comparing the normative (best), descriptive (state of the practice), and descriptive (actual) levels against the MetricA limit, and showing a new test A’s potential contribution to the state of the practice]
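To make the prescriptive success function concrete, here is a minimal sketch of how “Successful SoS Test” = f(metricA, metricB, ...) could be evaluated against normative and state-of-the-practice levels. The metric names, weights, and reference levels below are hypothetical placeholders, not values from PATFrame.

```python
# Sketch of the prescriptive "Successful SoS Test" = f(metricA, metricB, ...) idea.
# Metric names, weights, and reference levels are hypothetical, for illustration only.

NORMATIVE = {"metricA": 1.00, "metricB": 1.00, "metricC": 1.00}          # theoretical "best" levels
STATE_OF_PRACTICE = {"metricA": 0.70, "metricB": 0.60, "metricC": 0.75}  # descriptive (SoP) levels
WEIGHTS = {"metricA": 0.5, "metricB": 0.3, "metricC": 0.2}               # relative importance

def success_score(observed: dict) -> float:
    """Weighted score of observed metric levels, normalized by the normative (best) levels."""
    return sum(WEIGHTS[m] * observed[m] / NORMATIVE[m] for m in WEIGHTS)

def contribution_to_practice(observed: dict) -> dict:
    """Per-metric improvement of an actual or proposed test over the state of the practice."""
    return {m: round(observed[m] - STATE_OF_PRACTICE[m], 3) for m in WEIGHTS}

if __name__ == "__main__":
    actual_test = {"metricA": 0.80, "metricB": 0.55, "metricC": 0.75}    # descriptive (actual)
    print(f"success score: {success_score(actual_test):.2f}")
    print("contribution to state of the practice:", contribution_to_practice(actual_test))
```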
Integrated Test Management
[Figure: PATFrame integrated test management architecture]
• PATFrame Decision Support Technologies: real options, system dynamics, leading indicators, risk models, parametric models, ...
• PATFrame Core Models: normative, prescriptive, descriptive, ontology
• Decision Support System
• Assessment of Effectiveness Testbed
• PATFrame Adaptive Architecture Framework: modeling language for adaptive test system architectures; modeling, analysis, simulation, and synthesis toolsuite
Real Options as Prescriptive and
Adaptive Framework for SoS Testing
• Use case:
  • Question: what to test? (i.e., what SoS test scenarios to prescribe?)
  • Inputs: models of the autonomous/net-centric system of systems, major uncertainties
  • Outputs: enablers and types of real options for responding to uncertainties as candidate test scenarios (i.e., identification of how the SoS can adapt)
[Figure: Identification of real options from a joint/SoS model (coupled dependency structure matrix), an ontology, uncertainties, and mission objectives. Example with Vehicle1 (Army), Vehicle2 (Navy), and Vehicle3 (Air Force); objective: maintain Vehicle1-Vehicle2 comm.; uncertainty: proximity of vehicles 1 and 2. Real options identified: (1) adjust comm. range using the flexible-range comm. systems on vehicles 1 and 2; (2) use Vehicle3 as a relay.]
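As a rough illustration of how candidate real options might be enumerated from a comm-link dependency structure matrix, the sketch below encodes pairwise reachability among the three vehicles and flags the two options from the example (adjust comm. range, use Vehicle3 as a relay) when the direct Vehicle1-Vehicle2 link is at risk. The DSM encoding and decision rules are assumptions made for illustration, not PATFrame's actual algorithm.

```python
# Sketch: enumerating candidate real options from a comm-link dependency structure matrix (DSM).
# The DSM encoding and the two option rules below are illustrative assumptions.

VEHICLES = ["Vehicle1", "Vehicle2", "Vehicle3"]  # Army, Navy, Air Force assets in the example

# dsm[i][j] = True if vehicle i can currently reach vehicle j within comm. range
dsm = {
    "Vehicle1": {"Vehicle2": False, "Vehicle3": True},  # uncertainty: Vehicle1-Vehicle2 proximity
    "Vehicle2": {"Vehicle1": False, "Vehicle3": True},
    "Vehicle3": {"Vehicle1": True, "Vehicle2": True},
}

FLEXIBLE_RANGE = {"Vehicle1", "Vehicle2"}  # vehicles equipped with flexible-range comm. systems

def real_options(src: str, dst: str) -> list[str]:
    """List candidate real options for maintaining src-dst comm. when the direct link is lost."""
    options = []
    if not dsm[src][dst]:
        if src in FLEXIBLE_RANGE and dst in FLEXIBLE_RANGE:
            options.append(f"adjust comm. range on {src} and {dst} (flexible-range radios)")
        for relay in VEHICLES:
            if relay not in (src, dst) and dsm[src][relay] and dsm[relay][dst]:
                options.append(f"use {relay} as a relay between {src} and {dst}")
    return options

if __name__ == "__main__":
    for option in real_options("Vehicle1", "Vehicle2"):
        print("real option:", option)
```

Each identified option then becomes a candidate test scenario, i.e., a way the SoS could adapt that the test program may want to exercise.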
Testing to Reduce SoS Risks vs. the Risks of Performing Tests on an SoS
SoS Risks
• What are the unique risks for UASs? For UASs operating in an SoS environment?
• How do you use testing to mitigate these risks?
• What metrics are you using to measure the level of risk?
Risks of Testing an SoS
• What are the unique programmatic risks that impact your ability to test UASs? To test UASs operating in an SoS environment?
• What methods do you use to mitigate these risks?
• What metrics are you using to measure the level of programmatic risk in testing?
Example PATFrame Tool Concept
• Question: When am I done testing?
• Technology: defect estimation model; trade quality for delivery schedule
• Inputs: defects discovered
• Outputs: defects remaining, cost to quit, cost to continue
When am I done testing?
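One simple way to operationalize the “when am I done testing?” question is to fit a defect-discovery curve to the defects found so far, estimate the defects remaining, and compare the cost of quitting now against the cost of continued testing. The exponential model form, the fitting approach, and the cost figures below are illustrative assumptions, not the actual PATFrame defect estimation model.

```python
import math

# Sketch of a defect estimation tool for "when am I done testing?".
# Model form D(t) = N * (1 - exp(-b * t)), the coarse grid-search fit, and the
# cost figures are illustrative assumptions.

def fit_exponential(cumulative: list[float]) -> tuple[float, float]:
    """Fit D(t) = N * (1 - exp(-b * t)) to weekly cumulative defect counts by grid search."""
    found_so_far = cumulative[-1]
    best_n, best_b, best_err = found_so_far, 0.1, float("inf")
    for n_total in (found_so_far * (1 + k / 100) for k in range(1, 201)):  # search up to 3x found
        for b in (i / 100 for i in range(1, 101)):                         # decay rate 0.01 to 1.0
            err = sum((n_total * (1 - math.exp(-b * (t + 1))) - d) ** 2
                      for t, d in enumerate(cumulative))
            if err < best_err:
                best_n, best_b, best_err = n_total, b, err
    return best_n, best_b

def cost_tradeoff(defects_remaining: float, cost_per_escaped_defect: float,
                  cost_per_test_week: float, weeks_to_continue: int) -> dict:
    """Compare the cost of quitting now vs. the cost of continuing to test."""
    return {
        "cost_to_quit": defects_remaining * cost_per_escaped_defect,
        "cost_to_continue": weeks_to_continue * cost_per_test_week,
    }

if __name__ == "__main__":
    weekly_cumulative_defects = [12, 21, 27, 31, 33, 34]   # example defect-discovery data
    n_total, b = fit_exponential(weekly_cumulative_defects)
    remaining = n_total - weekly_cumulative_defects[-1]
    print(f"estimated total defects: {n_total:.1f}, estimated remaining: {remaining:.1f}")
    print(cost_tradeoff(remaining, cost_per_escaped_defect=50_000,
                        cost_per_test_week=20_000, weeks_to_continue=4))
```

Under this reading, once the estimated cost to continue exceeds the cost to quit (taken here as the expected cost of the remaining defects escaping), testing can stop; that is the quality-for-schedule trade the tool concept describes.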
PATFrame
[Figure: PATFrame decision support architecture]
• Primary inputs to PATFrame: models of the system under test, model of the SoS environment, mission needs/goals, test requirements, available resources and time
• Core elements: knowledge base, ontology, analysis techniques, reasoning engine, risk & uncertainty analysis (leading indicators), process analysis (system dynamics)
• Test cycle: plan, analyze and design, execute, evaluate
• Primary outputs for test and evaluation: Which tests do I run and in what order? When am I done testing? How should my test strategy change? What realizable options are available?
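Read as code, the diagram amounts to a decision-support loop: the primary inputs feed a reasoning engine that plans, analyzes/designs, executes, and evaluates tests, then answers the output questions. The class and method names below are hypothetical placeholders for illustration, not the actual PATFrame dashboard API, and the planning and evaluation logic is deliberately trivial.

```python
from dataclasses import dataclass, field

# Hypothetical skeleton of the PATFrame decision-support loop shown in the diagram.
# Names, fields, and heuristics are placeholders, not the real dashboard API.

@dataclass
class TestContext:
    system_models: dict           # models of the system under test
    sos_environment: dict         # model of the SoS environment
    mission_goals: list[str]      # mission needs/goals
    test_requirements: list[str]  # test requirements
    resources: dict               # available resources and time

@dataclass
class ReasoningEngine:
    knowledge_base: dict = field(default_factory=dict)

    def plan(self, ctx: TestContext) -> list[str]:
        """Which tests do I run, and in what order? (placeholder: requirements in given order)"""
        return list(ctx.test_requirements)

    def evaluate(self, results: dict[str, bool]) -> dict:
        """When am I done testing? How should my test strategy change? (placeholder heuristics)"""
        return {
            "done": all(results.values()),
            "replan": [test for test, passed in results.items() if not passed],
        }

if __name__ == "__main__":
    ctx = TestContext({}, {}, ["maintain comms"], ["comm range test", "relay test"], {"weeks": 4})
    engine = ReasoningEngine()
    ordered = engine.plan(ctx)                                    # plan / analyze and design
    results = {test: (test != "relay test") for test in ordered}  # stand-in for the execute step
    print(engine.evaluate(results))                               # evaluate
```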