Quantitative Capability Delivery Increments:
A Novel Approach for Assessing DoD
Network Capability
Jimmie G. McEver, III, EBR
David T. Signori, EBR
Dan Gonzales, RAND
Craig M. Burris, JHU APL
Mike D. Smeltzer, EBR
Heather Schoenborn, OASD/NII
FACT Meeting
Vienna, VA
May 12, 2010
mcever@ebrinc.com
This Briefing
 Summarizes paper presented at Infotech@Aerospace conference
─ Inform community about availability of tools, methods and approaches
─ Get feedback from community researchers working similar problems
 Focuses on a flexible approach for applying the QCDI Demand
model to assess the adequacy of network capability
─ Means of identifying major shortfalls in supply & assessing alternatives
─ Methods that can be used to enable Network Mission Assurance analysis
 Assumes
─ Familiarity with the QCDI Demand Model, described in draft ICCRTS paper,
June 2010
 Describes
─ Objectives & supply construct
─ Methods for estimating capability supply
─ Overview of assessment approach
─ Illustrative applications
2
QCDI Supply Model
Objective & Basic Construct
 Objective: Develop methodology for estimating supply of net
centric capability
─ Suitable for comparison with demand at the unit level
─ Feasible to execute across all units, timeframes, and programs
─ Flexibility to balance quality of estimate with data availability & effort
 Basic supply construct:
─ Sets of users (units) are provided capability by programs
─ Programs provide capability via program components (e.g., devices,
service instances, etc.) with quantifiable performance characteristics
─ Program component performance can be estimated and aggregated at
varied levels of sophistication that can account for a range of factors
depending on the data available & quality of estimate needed
─ Unit supply is estimated by aggregating over relevant components in
each program providing capability, then over programs
3
Output Views for Steps in Methodology
[Figure: worked example of the output views produced at each step of the supply methodology, built from program, unit, and architecture data. Panels show: component fielding to units (numbers of each program component mapped to unit types such as MEB (USMC) and HBCT, BSTB, CAB, HHC CAB, RC CAB, AC CAB, RS, FAB, BSB, Stryker BCT, IBCT, ACR, and AV BDE (USA)); component fielding by demand type (Indirect, BLOS, LOS) for timeframes 2012, 2016, and 2020; component performance metrics per demand type (data rate, voice DR, protected DR; e.g., 600-1400 kbps data and 16 kbps voice); component supply; program supply; and total unit supply for a unit type and timeframe. Results feed the QCDI Demand Exploration Tool, supporting the analyst and decision maker.]
Program Supply Calculations for a Major Unit
Aggregate Data Rate Metric
Inputs, captured in a supply template from SME documentation: the program device deployment schedule, the program device mapping to the unit, the program device specifications, and the mapping to demand device type (LOS, BLOS, or wired).
Aggregate Device Data Rate (by demand device type):
ADR(unit, device) = #Devices(unit) × DataRate(device)
Aggregate Program Data Rate (by demand device type):
ADR(unit, program) = Σ ADR(unit, device), summed over device types in the program
Aggregate Supply for Unit:
ADR(unit) = Σ ADR(unit, program), summed over programs supporting the unit
Key challenge is accounting for interactions among elements
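The aggregation chain (device to program to unit) reduces to nested sums. A minimal sketch follows; the program names, device counts, and data rates are hypothetical illustrations, not QCDI data:

```python
# Illustrative sketch of the supply aggregation chain:
# device level -> program level -> unit level aggregate data rate (ADR).
# All program/device names and numbers below are hypothetical.

def adr_device(n_devices: int, data_rate_mbps: float) -> float:
    """Aggregate device data rate: ADR(unit, device) = #Devices x DataRate."""
    return n_devices * data_rate_mbps

def adr_program(devices: dict) -> float:
    """Aggregate program data rate: sum over device types in the program."""
    return sum(adr_device(n, rate) for n, rate in devices.values())

def adr_unit(programs: dict) -> float:
    """Aggregate unit supply: sum over programs supporting the unit."""
    return sum(adr_program(devs) for devs in programs.values())

# Hypothetical fielding for one unit: two programs, each with device types
# mapped to (count in unit, per-device data rate in Mbps).
unit_programs = {
    "radio_program":    {"1ch": (36, 0.6), "2ch": (375, 1.2), "4ch": (30, 2.4)},
    "backbone_program": {"relay": (4, 10.0)},
}

print(round(adr_unit(unit_programs), 1))  # 583.6
```

In practice each sum would be kept separate per demand device type (LOS, BLOS, wired), as the slide indicates.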
Incremental Maturity Levels for
Supply Capability Estimates
Factors accounted for, in order of increasing quality of estimate:
─ Level 1: Maximum nominal performance of program elements (data generally available from programs)
─ Level 2: Performance of program networks (data available from studies)
─ Level 3: Trans-program interfaces (extensions requiring richer data and other elements of analysis)
─ Level 4: Demand-loaded performance (extensions requiring richer data and other elements of analysis)
Additional assumptions or analysis required as device, program, and
demand interactions are progressively considered
6
Incremental Maturity Levels for
Supply Capability Estimates
Factors accounted for, and the data and analysis required, in order of increasing quality of estimate and increasing effort:
─ Maximum nominal performance of program elements: characteristics of devices or service instances; number of devices or service instances (generally available from programs)
─ Performance of program networks: program network design; nominal design-point loads (available from programs and studies)
─ Trans-program interfaces: cross-program demand use cases; performance at interfaces (extensions requiring richer data and specification of context)
─ Demand loads: integrated view of loads; allocation of loads to program networks (extensions requiring richer data and specification of context)
Illustrative Calculations for Levels 1 & 2
Level 1 Supply Estimate
 Assumptions
─ Radio terminals have transmit/receive capability of 0.6 Mbps per channel when communicating point to point
─ Notional program has 1-, 2-, and 4-channel radios with LOS capabilities
─ Fielding as indicated in the table below

                            1-Chan   2-Chan   4-Chan
Data rate per radio (Mbps)     0.6      1.2      2.4
Radios in unit                  36      375       30
Data rate to unit (Mbps)      21.6    450.0     72.0

Level 1 Estimated Supply: 543.6 Mbps

Level 2 Supply Estimate
 Assumptions
─ When employed in the field, radios are configured in subnets in which, on average, 30 nodes share 200 kbps of subnet capacity (≈ 0.007 Mbps/node)
─ 2- and 4-channel radios have half their channels configured for voice
─ Each channel is treated as a separate radio for subnet participation purposes

                            1-Chan   2-Chan   4-Chan
Data rate per radio (Mbps)   0.007    0.007    0.014
Radios in unit                  36      375       30
Data rate to unit (Mbps)      0.25     2.63     0.42

Level 2 Estimated Supply: 3.30 Mbps
8
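The two estimates reduce to simple arithmetic; a short sketch of the calculation (device names are shorthand for the table rows above, and the per-node subnet rate uses the slide's rounding of 200 kbps / 30 nodes ≈ 0.007 Mbps):

```python
# Level 1: nominal point-to-point capacity per channel (0.6 Mbps),
# summed over all radios fielded to the unit.
fielding = {"1ch": 36, "2ch": 375, "4ch": 30}   # radios in unit
channels = {"1ch": 1, "2ch": 2, "4ch": 4}

level1 = sum(n * channels[d] * 0.6 for d, n in fielding.items())

# Level 2: radios operate in subnets where ~30 nodes share 200 kbps,
# i.e. ~0.007 Mbps per participating channel; 2- and 4-channel radios
# devote half their channels to voice, leaving 1 and 2 data channels.
per_node = 0.007                                 # Mbps per subnet node
data_channels = {"1ch": 1, "2ch": 1, "4ch": 2}

level2 = sum(n * data_channels[d] * per_node for d, n in fielding.items())

print(f"Level 1: {level1:.1f} Mbps")   # Level 1: 543.6 Mbps
print(f"Level 2: {level2:.2f} Mbps")   # Level 2: 3.30 Mbps
```

The two-orders-of-magnitude gap between the estimates shows why the maturity level of a supply estimate must be stated alongside its value.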
Factors Accounted for at Levels 3 & 4
 Level 3: Interactions among programs
─ Constraints due to interacting programs; e.g., the impact of limitations in satellite capacity on the potential capacity of the terminal, or vice versa
─ Minimum capability of systems in the chain may dictate overall performance
 Level 4: Impact of demand itself on the supply network
─ The impact of demand reflected as a load on the supply networks and devices
– Often different from design loads assumed by program offices
─ The effects of other programs or users that are not explicitly part of the demand or supply being studied
– E.g., other users may be sharing satellite bandwidth
─ Other effects in which the nature and structure of demand affects the ability of the program to provision
– E.g., only a portion of demand can be satisfied by broadcast capability
Assessment Approach (1)
 Define the issue
─ Existing, programmed, or proposed systems of concern
─ Operation, mission, or forces potentially affected
 Identify the relevant QCDI dimensions
─ Functional domain(s)
─ Device type(s) and key demand metric(s)
─ Users and units to be supported
 Characterize the supply architecture(s) at issue in terms of additional dimensions of the QCDI demand framework; e.g.,
─ The types and numbers of devices provided
─ Relevant modes of operation
─ Use or configuration to support a typical unit
10
Assessment Approach (2)
 Estimate the aggregate supply provided by the mix of
programs to the relevant units
 Determine the appropriate demand from the QCDI model to
serve as a point of reference for the assessment
─ In some cases, it may be necessary to parse the demand of users
further to map portions of demand (e.g., from subsets of users) to the
systems and programs in the architecture.
 Compare aggregate supply with aggregate demand for each
of the metrics chosen
─ Drill down to greater resolution as necessary.
 Repeat to explore alternative solutions for filling gaps
─ Conduct sensitivity analysis.
11
Illustrative Applications Described in Paper
 Incremental tactical radio capability for a major ground unit
─ LOS data comms capability supplied to a specific class of users
by a radio program
─ LOS data comms capability supplied to OTM users in an HBCT by
a radio program
─ All data comms capability provided to an HBCT by a radio
program plus a program providing backbone capabilities
 Alternative Satellite communication capability for a maritime JTF
─ Base case: Satellite terminals providing access to protected and
unprotected SATCOM
─ Alternative 1: Addition of leased commercial SATCOM
─ Alternative 2: Utilization of broadcast capability
─ Alternative 3: Additional unprotected military SATCOM capacity
12
Maritime Situation
Problem Profile
─ Programs: Point-to-point military satellite capabilities with and without jamming resistance, commercial point-to-point satellite capabilities, and satellite broadcast capability
─ Devices: Mix of configurable multimode satellite terminals and antenna groups that vary with the type of ship supported
─ Device type: Indirect demand
─ Metrics: Typical data rate and protected communications data rate
─ User classes: Command Post
─ Unit structure: JTF comprised of 4 Carrier Strike Groups and 5 Expeditionary Strike Groups, each of which includes both large and small ships
─ Timeframe: 2016
Satellites: X-, K-, Ka-band WGS w/GBS; Ka-, Q-band AEHF (protected); C-, Ku-band commercial
Terminals: NMT terminal waveforms (EHF protected, Ka-band, X-band); CBSP terminal waveforms (C/Ku band)
[Figure: an Expeditionary Strike Group (ESG) comprises a guided missile cruiser (CG), a landing ship, dock (LSD), an amphibious transport, dock (LPD), two guided missile destroyers (DDG), and an amphibious assault aviation/dock ship (LHA/LHD). Operation configuration: JTF comprised of nine strike groups (CSG 1-4 and ESG 1-5).]
13
Maritime Assessment: Base Case
 Assumptions
─ Ships have one DOD SATCOM
terminal providing access to two
satellites
– One providing unprotected
communications
– One providing protected
communications (high robustness
and encryption)
─ Capability available to ship limited by
either terminal or SATCOM
capacities (whichever is smaller)
─ SATCOM capacities:
– Protected: 250 Mbps
– Unprotected: 214 Mbps
─ Terminal capacity: 517 Mbps
 Calculations
─ Protected supplied first
– SATCOM sufficient for 1/3 of demand
– Fully utilized
─ Remainder available for unprotected
– 267 Mbps terminal capacity remains
– Only 214 Mbps SATCOM capacity
available
                            Unprotected   Protected
Terminal Supply (Mbps)          267          250
Satellite Supply (Mbps)         214          250
Constrained Supply (Mbps)       214          250
Demand (Mbps)                  1000          750
Percent Demand Satisfied        21%          33%
14
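The base-case arithmetic can be sketched directly; the capacities and demands are those assumed above, and the allocation rule (protected first, then the terminal remainder for unprotected) follows the slide:

```python
# Base case: constrained supply = min(remaining terminal capacity, SATCOM
# capacity), allocating protected traffic first. Capacities in Mbps.
terminal = 517.0
satcom = {"protected": 250.0, "unprotected": 214.0}
demand = {"protected": 750.0, "unprotected": 1000.0}

supplied = {}
supplied["protected"] = min(terminal, satcom["protected"], demand["protected"])
terminal_left = terminal - supplied["protected"]          # 267 Mbps remains
supplied["unprotected"] = min(terminal_left, satcom["unprotected"])

for kind in ("protected", "unprotected"):
    pct = 100 * supplied[kind] / demand[kind]
    print(f"{kind}: {supplied[kind]:.0f} Mbps supplied, {pct:.0f}% of demand")
# protected: 250 Mbps supplied, 33% of demand
# unprotected: 214 Mbps supplied, 21% of demand
```

The min() over terminal and satellite capacities is exactly the Level 3 "minimum capability of systems in the chain" effect described earlier.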
Impact of Adding Commercial Leases, Broadcast,
and Additional Military SATCOM Capacity
 Assumptions
─ Terminals and leased commercial SATCOM added to JTF ships
─ The capacity of an additional military satellite is available to the JTF
─ Broadcast capability of military satellite utilized to deliver common-use information to JTF; broadcast receiver terminals added
 Calculations
─ Commercial terminals provide an additional 50 Mbps to the supply of unprotected comms
─ Broadcast capability uses 14 Mbps of military SATCOM capacity to satisfy 125 Mbps of demand
─ Additional military satellite increases available SATCOM supply by 214 Mbps

Unprotected supply (Mbps):
                            Base Case   Military    Commercial   Total       Total
                            Unprotect   Unprotect   Leases       Unprotect   Unprotect
                            Pt-Pt       Pt-Pt       Pt-Pt        Pt-Pt       w/BC
Terminal Supply (Mbps)         267         267         540          807        1257
Satellite Supply (Mbps)        214         414          50          464         478
Constrained Supply (Mbps)      214         267          50          317         442
Demand (Mbps)                 1000                                  875        1000
Percent Demand Satisfied       21%                                  36%         44%

Notes:
• Supply now terminal-constrained; full benefits of satellite investment not realized without terminal investment
• Effects such as this only picked up with portfolio-level analysis
15
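The alternatives can be sketched the same way. This is a simplified rendering of the slide's arithmetic (the split of satellite capacity between point-to-point and broadcast follows the stated 14 Mbps broadcast share):

```python
# Unprotected-comms alternatives, using the slide's numbers (Mbps).
# Broadcast satisfies 125 Mbps of demand using only 14 Mbps of SATCOM.
mil_satcom = 214.0 + 214.0 - 14.0   # base + added satellite - broadcast share
commercial = 50.0                    # constrained leased commercial supply
terminal_mil = 267.0                 # terminal capacity left after protected

ptp_supply = min(terminal_mil, mil_satcom) + commercial     # 267 + 50 = 317
ptp_demand = 1000.0 - 125.0          # 125 Mbps of demand offloaded to broadcast

total_supply = ptp_supply + 125.0    # broadcast-satisfied demand counted as met
print(f"pt-pt:  {100 * ptp_supply / ptp_demand:.0f}%")    # pt-pt:  36%
print(f"total:  {100 * total_supply / 1000.0:.0f}%")      # total:  44%
```

Note that min(terminal, satellite) is what makes the supply terminal-constrained here: adding 214 Mbps of satellite capacity moves point-to-point supply by nothing until terminals are also added.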
Next Steps
 This methodology is being evolved and matured along with the
demand model and tool, as well as the NMA theory, methodology,
and tools
─ Applied to a wide range of problems of varying complexity
─ Proven to be extremely adaptable to both problems and resources
 More research and engineering analysis is needed to achieve the
highest level in the estimation maturity model
─ To reflect the full impact of the underlying information architecture
 An explicit methodology is being developed to parse QCDI demand
estimates into requirements for the exchange of information
─ Among nodes that characterize a force conducting a mission
 Simple network design tools are being used to rapidly reflect the
impact of these requirements in the analysis
─ As loads on a supporting system architecture comprised of multiple
programs
 The results of this work will be presented in subsequent papers
─ After those already mentioned
16
Other Venues for Community Outreach
 InfoTech @ Aerospace (AIAA) (April 20-22, Atlanta)
─ Quantitative Capability Delivery Increments: An Approach for Assessing
DoD Network Capability Against Projected Needs
 ICCRTS (US DoD CCRP) (June 22-24, Santa Monica)
─ Quantitative Capability Delivery Increments: A Novel Approach for
Estimating Current and Future DoD Network Needs
 MORS Symposium (June 2010, Quantico)
─ Assessing Operational Risk Arising from Network Vulnerability
 IEEE Mission Assurance: Tools, Techniques, and
Methodologies (August 2010, Minneapolis)
─ Network Mission Assurance Overview
 MILCOM (Oct/Nov 2010, San Jose, CA)
─ Estimating Mission Risk from Network Vulnerabilities
17
Network Mission Assurance: Assessing the
Mission Impact of Network Capability and
Capability Gaps
18
Why a Mission Assurance Framework for End-to-End Network Analysis?
Need: Quantitative Mechanism to Link Performance of Planned
Network Architectures to Mission Outcomes
• What are the threats to the network and their quantitative
impact on the network?
– Enemy action
– Environmental
– User behavior
• What are the mission impacts of network attacks or failures?
– Not all network degradation impacts the mission
– However, network problems can impact critical tasks
• What are the benefits of various mitigation strategies/solutions?
– Understand the impact of options at the critical task level
– Present quantitative, repeatable, apples-to-apples comparisons
Solution: Leverage previous OSD/Service investment in a family of tools and
techniques developed for quantitative analysis of Net-Centric
Architectures/Programs
19
Summary of Previous Work Relating Mission
Needs to Network Performance
 Recent work in DoD has established general frameworks to relate
critical “enablers” to mission outcomes
─ Critical Infrastructure Protection activities have identified infrastructure
components needed to accomplish high-level missions
─ Recent OSD Network Mission Assurance efforts examined network support
to “Mission Essential Functions”
─ These initiatives typically examine vulnerabilities, mission implications and
mitigation strategies for common user networks via a single metric (e.g.
throughput)
 Approach developed in OASD/NII work to date:
─ Use selected CONPLAN, scenario, vignette, etc. to derive critical tasks, units
involved, environmental considerations, and threats
─ Examine each critical task separately in terms of type of C2 and network
support required for task performance
─ Use Joint and Service doctrine/TTPs/TACMEMO/etc. to determine nature of
task dependency on network support
─ Apply family of tools for end-to-end quantitative and repeatable results
20
NMA Tool Suite:
Components and Relationships
[Diagram: QCDI parsed demand, demand allocation rules, and mission selection and analysis (the NMA analytic context: units involved, selected tasks, network role) produce a Demand Matrix for the unit-to-unit network. Supporting systems/architectures are run through FlowNET to produce a corresponding Supply Matrix. Each matrix holds entries DR(i, j), the data rate from source unit Xi to destination unit Xj, per device, metric, and data type. The NMA Risk Tool compares supply with demand to form a Link Provisioning Matrix, which is aggregated into provisioning scores and mission/task risk.]
21
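The supply/demand comparison at the heart of the Risk Tool can be sketched as an elementwise matrix comparison. This is a minimal illustration with hypothetical values; the actual tool's scoring rules are not specified here:

```python
# Elementwise comparison of unit-to-unit supply and demand matrices into a
# link provisioning matrix, then an aggregated provisioning score.
# Entries are data rates DR(i, j) in Mbps; all values are hypothetical.
demand = [[0.0, 2.0, 1.0],
          [2.0, 0.0, 4.0],
          [1.0, 4.0, 0.0]]
supply = [[0.0, 2.0, 0.5],
          [2.0, 0.0, 1.0],
          [0.5, 1.0, 0.0]]

def provisioning(supply_row, demand_row):
    """Fraction of the demanded rate supplied on each link (capped at 1)."""
    return [min(s / d, 1.0) if d > 0 else 1.0
            for s, d in zip(supply_row, demand_row)]

matrix = [provisioning(s, d) for s, d in zip(supply, demand)]

# Aggregate over off-diagonal (unit-to-unit) links into a single score.
links = [v for i, row in enumerate(matrix) for j, v in enumerate(row) if i != j]
score = sum(links) / len(links)
print(round(score, 3))   # 0.583
```

In the NMA suite this comparison would be repeated per device, metric, and data type before aggregation into mission/task risk.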
Process for Determination of Task Risk
Due to Network Failure
Note: Process repeated for each mission phase, critical function, etc.
─ Mission/task analysis: map the mission to units and tasks/subtasks
─ Determine role of network in task success; the role sets the shape of the risk curve: sharing (exponential), collaboration (linear), synchronization (quick onset)
─ Determine extent of task network dependency; this sets the scale of the selected risk curve
[Figure: selected risk curve plotting task failure against network failure]
A specific Maritime Dynamic Targeting example and a draft document detailing execution of this process are available
22
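The three curve shapes can be illustrated with simple functional forms. The parameterizations below are hypothetical, chosen only to exhibit exponential, linear, and quick-onset behavior, not taken from the NMA tools:

```python
import math

# Task-failure likelihood as a function of the fraction f (0..1) of network
# capability lost, for the three network roles. Forms are illustrative only.
def sharing(f, k=5.0):
    """Exponential shape: little task risk until substantial network loss."""
    return (math.exp(k * f) - 1) / (math.exp(k) - 1)

def collaboration(f):
    """Linear shape: task risk grows in proportion to network loss."""
    return f

def synchronization(f, threshold=0.2):
    """Quick onset: task fails rapidly once a small loss threshold is passed."""
    return min(f / threshold, 1.0)

for f in (0.1, 0.5, 0.9):
    print(f, round(sharing(f), 2), collaboration(f), synchronization(f))
```

Scaling each curve to the task's degree of network dependency corresponds to the "select scale" step in the process above.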
Roles of the Network in Enabling Mission Activities: Enabling
Segments of Boyd's OODA Loop (Observe, Orient, Decide, Act)
Type: Information Sharing
─ Network role: Provide access to information
─ Requirement: Path existence
─ Example(s): Flat network, random matrix
Type: Collaboration
─ Network role: Provide means for interaction
─ Requirement: Relatively short paths
─ Example(s): Small world, semi-random w/preferential attachment
Type: Direction
─ Network role: Real-time interaction and information exchange
─ Requirement: Very short or direct paths
─ Example(s): Deterministic hierarchical, rooted tree (for local synchronization)
23
Network Decay From Node Removal
From: R. Albert and A.-L. Barabási, "Statistical Mechanics of Complex Networks,"
Reviews of Modern Physics, Vol. 74, January 2002
Both the WWW and the Internet have small-world network characteristics, but the
Internet is closer to a deterministic tree structure, and its decay shows some
exponential characteristics while the WWW decays linearly.
[Figure: the relative size S (a),(b) and average path length l (c),(d) of the
largest cluster in two communication networks when a fraction f of the nodes is
removed: (a),(c) Internet at the domain level, N=6,209, <k>=3.93; (b),(d) subset
of the World Wide Web (WWW) with N=325,729 and <k>=4.59. Squares: random node
removal; dots: preferential removal of the most connected nodes. After Albert,
Jeong, and Barabási (2000).]
However, other algorithms suggest that random and small-world networks may have
similar decay [risk] curve shapes with respect to S.
24
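The qualitative effect of random versus targeted node removal can be reproduced with a toy simulation. This uses a small Erdos-Renyi random graph in pure Python, not the Internet/WWW datasets from the figure, so only the qualitative behavior should be read into it:

```python
import random

# Track the relative size S of the largest connected component of a random
# graph as a fraction of nodes is removed, randomly vs. highest-degree-first.
def random_graph(n, p, seed=0):
    rng = random.Random(seed)
    adj = {i: set() for i in range(n)}
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p:
                adj[i].add(j); adj[j].add(i)
    return adj

def largest_component_fraction(adj, removed):
    """Largest component size among surviving nodes, over original size."""
    alive = set(adj) - removed
    best, seen = 0, set()
    for start in alive:
        if start in seen:
            continue
        stack, comp = [start], 0
        seen.add(start)
        while stack:
            u = stack.pop(); comp += 1
            for v in adj[u]:
                if v in alive and v not in seen:
                    seen.add(v); stack.append(v)
        best = max(best, comp)
    return best / len(adj)

adj = random_graph(200, 0.03)
k = 40                               # remove f = 0.2 of the nodes
by_degree = sorted(adj, key=lambda u: len(adj[u]), reverse=True)
targeted = set(by_degree[:k])
rand = set(random.Random(1).sample(sorted(adj), k))

s_random = largest_component_fraction(adj, rand)
s_targeted = largest_component_fraction(adj, targeted)
print("random removal   S =", round(s_random, 2))
print("targeted removal S =", round(s_targeted, 2))
# Targeted removal of the most connected nodes degrades S faster, mirroring
# the dots-vs-squares contrast in the Albert & Barabasi figure.
```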
Exemplar Results from NMA Analysis
(All Results Notional)
Task: ALL; Threat: None; Mitigation: None; Likelihood of task failure: 0%
─ Task link needs (assumed met in base case): Man CO-Cmd links for HUMINT collection; Man CO-Spt unit links for FWD resupply; Man CO-Man CO links for COIN combat
Task: Destroy Enemy LRFs; Threat: Jamming; Mitigation: None; Likelihood of task failure: 0%
─ Jamming insufficient to reduce link capability below demand levels
Task: Secure Bridge Sites; Threat: Jamming; Mitigation: None; Likelihood of task failure: 25%
─ Peak loads push demand for some links (e.g., between maneuver companies and support units) above available supply
Task: Secure Bridge Sites; Threat: Jamming; Mitigation: Fiber; Likelihood of task failure: 25%
─ No mitigation: fiber only adds link capability between stationary units (e.g., HHC units and support units)
Task: Defeat OBJ EAGLE Enemy; Threat: Jamming; Mitigation: None; Likelihood of task failure: 30%
─ Jamming reduces capacity of wireless links connecting Man COs with support units and Bn-Bde-DIV HQs
Task: Defeat OBJ EAGLE Enemy; Threat: Jamming; Mitigation: Fiber; Likelihood of task failure: 20%
─ Some mitigation: fiber enables offload of some demand between Bn-Bde-DIV level HQs and support units from tactical wireless networks
Back-up Slides
QCDI Summary
 Versions 1 and 2 of the demand model are complete and data
are available to the NC community: qcdi.rand.org (password
required)
 Model added to OSD/CAPE M&S tools registry:
https://jds.cape.osd.mil/Default.aspx (CAC required)
 Applied in approximately 20 past and current major
studies/analysis efforts across DoD
 Additional detail and assistance with application of model
and data available through the NC CPM SE&I Team
 Points of Contact:
─ Jimmie G. McEver, EBR, mcever@ebrinc.com
─ Craig M. Burris, JHU/APL, craig.burris@jhuapl.edu
─ NC CPM POC: heather.schoenborn@osd.mil
27
Nature of the Problem
 Ongoing DOD transformation to Joint net-enabled operations
promises improved force agility
─ Key to dealing with the uncertainty associated with a wide range of
changing threats, missions and operations.
 This strategy poses significant challenges for decision makers
and analysts planning portfolios comprising the Joint
network
─ Identify capability gaps and determine mission implications
─ Overcome curse of dimensionality & challenge of forecasting
 Traditional methods based on information exchange
requirements have significant limitations
─ Resource intensive and time consuming
─ Often based on the past experience of SMEs
28
Related NII Initiatives
 Quantitative Capability Delivery Increments (QCDI)
─ Estimates demand for net centric capability
─ Assesses the supply provided by programs and systems
─ Facilitates understanding of the degree to which systems are capable
of satisfying demand
 Network Mission Assurance
─ Estimates the mission risk associated with various levels of network
capability support
─ Aims at achieving a repeatable methodology with a family of
supporting tools that can be adapted to a wide range of problems
─ Models demand, supply and mission impact at low level of resolution
that complements traditional methods and tools
29
The QCDI Demand Model:
Objective and Guidelines
 Objective: Easy-to-use demand model that provides a
quantitative representation of NC needs across the entire
DOD
─ Serve as quantitative baseline for NC CPM
─ Illuminate investments likely to have greatest impact
 Key Guidelines
─ Focus on steps to a fully interoperable Joint network as reflected in Net
Centric CDI
─ Base on specific needs of various classes of users that comprise units
─ Identify relatively small set of widely applicable metrics for 2012, 2016,
2020 CDI increments
─ Estimate values representing an 80% solution, to serve as starting point
for more detailed analysis
─ Facilitate assessment of supply provided by programs
30
Role of Demand Devices in
QCDI Demand Model
 Key Premises
─ Users employ devices of different types to access Joint network
capabilities
─ Aggregate demand for network capability driven by trends in
communication devices used to access the joint network
 Device Types Represented in QCDI Demand
─ Direct Beyond-Line-of-Sight (BLOS): User demand directly supported
through a BLOS wireless device (generally direct use of a low data rate
SATCOM terminal)
─ Direct Line-of-Sight (LOS): User demand directly supported through
use of line-of-sight (LOS) wireless device
─ Indirect: User demand not directly supported by a wireless receiver or
transmission device. This demand is aggregated with demand from
other users before transport outside of local networks by either LOS or
BLOS capacity
31
Key Elements of QCDI Demand Model
Decision
Maker
Analyst
8
Exploration Tool
Req Docs
Studies
1
Core
Terrestrial
A
B
2012
2016
2020
Typical Req. DR (Mbps)
0.29
2.27
2.234
Protected Comm. DR (Mbps)
0.1
Voice DR (Mbps)
Edge
IT
C
1
1.4
2
1.12
1.191
1.3
0.8
0
2.37
Comm. Set-up time (min) (max)
0
0.2
0.51
Data End-to-End Delay (sec) (max)
1
1.6
0.1
Voice End-to-End Delay (sec) (max)
0
1.6
0.5
0.582
Service Discovery Requests (Req/Hr)
0.68
2.39
0.617
0.18
0.861
0.23
0.3
2
0.456
0.859
1.69
F
0.07
Service Discovery Response Time (sec)
1.867
2.4
0.9
0.96
Cross-Domain Transfer Time (sec)
IA
G
0.2
Validation Time (min)
2.774
IAM Mgt Time (min)
0.91
2
2
0.2
3
1
0.169
0.88
Pedigree production rate (%)
0.3
1
3.69
DAR compromise time (days)
0.36
0.27
1
Compliant COMSEC Tier
0.1
1
1.5
Intrusion Detection Time (min)
0.2
1.64
0.4
CND Response Time (min)
0.5
2
2.571
3.5
Interoperability Depth Hghr Network Tiers
NM
0
0.5
2.3
0.427
DNS (Req/Hr)
E
1
0
0.306
Auth. Serv (Req/Hr)
Email (Req/Hr)
Search (Req/Hr)
0.05
0.7
Response Time (seconds)
0.2
0.4
0.6
Time to Refresh contextual SA (sec)
2.45
0.63
0
Managed QOS Handling (%)
2.3
1.6
1.1
2.7
Connection Resilience (%)
1.48
0.1
End User Device RF Spectrum Eff (bps/Hz)
0.4
0.4
1
RF Spectrum Reallocation Time (sec)
0.1
0
0.1
User Values
Unit Values
…
Sfc
Local
…
NM
2012 Wired Dsmtd
Grnd Sfc
MobileLocal
Wrkr …
NMInfra
2012 BLOS Dsmtd
Grnd Sfc
MobileLocal
Wrkr …
Dsmtd
NMInfra
2012 IT
LOS
Grnd Mobile Wrkr
Infra
Typical Req. DR (Mbps)
0.29
Protected Comm. DR (Mbps)
0.1
Voice DR (Mbps)
3
ES
ES
ES
IA
IA
IA
NM
NM
NM
0.1
1
2
0.5
0.8
0
1
0
0.582
2.6
0.306
0.617
0.18
0.861
0.07
2.3
0.427
0.2
2.774
0.91
0.3
0.36
0.1
0.2
0.5
3.5
0.2
2.45
2.3
1.48
0.4
0.1
2.27
0.9
0.8
1
0
1
2.27
0
0.9
0.582
1
2.6
2
0.306
1.191
0.617
0.18
0.861
0.2
0.07
1.6
2.3
0
1.6
0.5
0.427
0
0.2
0.68
2.774
0.23
0.3
2
0.91
0.3
0.36
0.1
0.2
0.5
0.5
2.4
0.96
2
3.5
3
0.2
0.169
2.45
1
2.3
0.27
1.48
1
0.4
1.64
0.1
0
1
2
0.5
2
0.05
0.4
0.63
1.6
2
1.191
0
0.2
1.6
1.6
0.5
0
0.68
0.23
0.3
2
0.5
2.4
0.96
2
3
0.169
1
0.27
1
1.64
2
0.05
0.4
0.63
1.6
0.1
0.4
0
2.234
0
0
1.4
0.2
2.234
1.6
0
0.5
1.4
0
1.12
0.68
1.3
0.23
0.3
2
2.37
0.5
0.5
2.4
0.96
0.51
0.1
1
0
2
2.39
3
1
0.456
0.859
1.69
0.27
1.867
0.169
1
1.64
2
0.9
2
0.2
0.05
1
0.4
0.88
0.63
3.69
1.6
1
0.1
1.5
0.4
0.4
0
2.571
0.7
0.6
0
1.1
2.7
1.12
1.3
2.37
0.51
0.1
0.5
1
0
2.39
0.456
0.859
1.69
1.867
0.9
2
0.2
1
0.88
3.69
1
1.5
0.4
2.571
0.7
0.6
0
1.1
2.7
1
0.1
0.29
1.3
0.1
2.37
1
0.51
0.1
0.29
0.5
0.1
1
1
0
2
2.39
0.5
0.456
0.859
1.69
0.8
1.867
0
0.9
0.582
2
0
1
2.6
0.2
0.306
1
0.617
0.18
0.861
3.69
1
0.07
1.5
2.3
0.4
0.427
2.571
0.2
0.7
2.774
0.6
0.91
0
0.3
1.1
0.36
2.7
0.1
1
0.2
0.1
0.9
1
1.12
0.88
2.27
0.1
1.4
2
1.191
1.6
0.29
0.5
3.5
0.2
2.45
2.3
2
0.5
0.8
0
1
0
0.582
2.6
0.306
0.617
0.18
0.861
0.07
2.3
0.427
0.2
2.774
0.91
0.3
0.36
0.1
0.2
0.5
3.5
0.2
2.45
2.3
1.48
0.4
0.1
2.27
0.9
0.8
1
0
1
2.27
0
0.9
0.582
1
2.6
2
0.306
1.191
0.617
0.18
0.861
0.2
0.07
1.6
2.3
0.427
0
1.6
0.5
0
0.2
0.68
2.774
0.23
0.3
2
0.91
0.3
0.36
0.1
0.2
0.5
0.5
2.4
0.96
2
3.5
3
0.2
0.169
2.45
1
2.3
0.27
1.48
1
0.4
1.64
0.1
2
0.05
0.4
0.63
1.6
Connection Resilience (%)
1.48
0.1
1.48
0.1
End User Device RF Spectrum Eff (bps/Hz)
0.4
0.4
1
0.4
0.4
RF Spectrum Reallocation Time (sec)
0.1
0
0.1
0.1
0
2
1.191
0
0.2
1.6
1.6
0.5
0
0.68
0.23
0.3
2
0.5
2.4
0.96
2
3
0.169
2.27
2.234
0.1
0.9
0
1
2
Voice Packet Delivery Ratio (%)
0.29
time (min) (max)
ProtectedComm.
Comm.Set-up
DR (Mbps)
0.1
End-to-End Delay (sec) (max)
Voice DRData
(Mbps)
1.6
0.5
Voice
Availability
(%) End-to-End Delay (sec) (max)
3
4
1
0.27
1
1.6
0.5
1.3
2.39
0.1
0.5
1
0.456
0.859
1.69
0
0.5 2.37
0
1.867
0.68
2.4 0.51
2.39
0.9
0.96 0.1
0.456
0.859
1.69
0.2
1.867
0.88
0.9
3.69
2
1
2
0
0.5
0.582
2.6
0.306
0.617
0.18
0.861
0.4271.6
0.07
0.91 0
0.3 0.68
0.5
2.3
0.427
0.360.23
0.96
0.617
0.18
0.861
0.07
0.91
2.6
0.07 0
2.3 0.2
1.6
0.2
2.774
0.427
2.4
0.1 0.3
0.2 2
2
0.5 0.5
0.169
3
2 0.5
3 1
0.169 0
1 2.39
0.270.456
1 0.859
1.641.69
2 1.867
3.5 2.4
1
0.2 0.96
0.27
0.4 2
2.45 2
1
0.63 0.2
0.2
2.3 3
1.64
1.6 1
0.5
1.480.169
3.5
0.4
RF
Spectrum
Reallocation Time (sec) 0.36
DAR compromise
time
(days)
Response
Time
(seconds)
(seconds)
0.2
0.1 0.27
Time
to Refresh
contextual SA
SA (sec)
(sec)
Compliant
COMSEC
Tier contextual
0.1
2.45
1
0.63
Handling
(%)
(%)
Intrusion Managed
Detection QOS
Time Handling
(min)
0.2
2.3
1.64
Connection
Resilience
(%)
(%)
CND Response
TimeResilience
(min)
0.5
1.48
2
0.1
End User
Device
Spectrum
Eff (bps/Hz)
Interoperability
Depth
HghrRF
Network
Tiers
3.5
0.4
0.4
RF
Spectrum
Reallocation Time (sec)
Response
Time
(seconds)
0.2
0.1
Time to Refresh contextual SA (sec)
1.6
0.3
DAR compromise
time
(days)
Response
Time
(seconds)
(seconds)
Service Discovery
Response
Time
(sec)
0.23
0.3
2
0.1
0.63
Depth
PedigreeInteroperability
production rateDepth
(%) Hghr Network Tiers
DNS (Req/Hr)
2.3
0.2 1.6
2.7740.5
0.36
0.4
IA
NM
1.6
0.1
0.4
0
0
NM
0
0.51
0.23 1.4
0.3 1.12
2 1.3
0.2
0.05
1
0.68 0
1.6
0.306
Time
to
Refresh
contextual SA
SA (sec)
(sec)
Cross-Domain
Transfer
Time
(sec)
Compliant
COMSEC
Tier contextual
0.4
0.63
0.5
0.2
0.617 1
0.18 2
0.861
1.191
0.91
2
0.05
0.1
0.3060.9
2.774
1.64
2.37
1
Endrate
User
Spectrum
Eff (bps/Hz)
Depth
Depth
HghrRF
Network
Tiers
PedigreeInteroperability
production
(%)Device
0.3
2
0.51
1.3
Connection
Resilience
(%)
(%)
IAM Mgt CND
Time Response
(min)
TimeResilience
(min)
1
1.64
2.37
0
1.4
Managed
Handling
(%)
(%)
ValidationIntrusion
Time (min)
Detection QOS
Time Handling
(min)
1
0.27
2.234
1.12
0 2.234
0
DAR compromise
time
(days)
Service
Discovery
Response
Time
(sec)
Auth. Serv
(Req/Hr)
Cross-Domain
Transfer
Time (sec)
Compliant
COMSEC
Tier
Email (Req/Hr)
ValidationIntrusion
Time (min)
Detection Time (min)
Search (Req/Hr)
Mgt CND
Time Response
(min)
Time (min)
File DlvryIAM
(Req/Hr)
0
1
Mgt Time
(min)
File DlvryIAM
(Req/Hr)
Service Discovery
Requests
(Req/Hr)
Pedigree production rate (%)
DNS (Req/Hr)
Chat Requests
(Req/Hr)
IA
NM
2.6 2.27
0.1
0.8
IA
ES
0.2
1
1.6
ES
2
0
0.9
2
File (%)
Dlvry(min)
(Req/Hr) (Req/Hr)
Packet Delivery
ServiceRatio
Discovery
Requests
DNS
(Req/Hr)
Comm. Set-up
time
(min)
(max)
Chat Requests
(Req/Hr)
0.5
2.4
0.96
2.27
1.191
0
Service
Discovery
Data End-to-End
Delay
(sec)
(max) Response Time (sec)1
Auth. Serv
(Req/Hr)
Cross-Domain
Transfer Time (sec)
Email (Req/Hr)
Voice End-to-End
Delay (sec) (max)
0
Validation
Time (min)
Search
(Req/Hr)
Amt. Assured
Data
Storage (GB)
0.582
0.169
1
0
ES
IT
0
0.582
0.8
End-to-End
Delay
(sec) (max)
Auth. Serv
(Req/Hr)
Voice DRData
(Mbps)
Email (Req/Hr)
Voice
Delay (sec) (max)
Availability
(%) End-to-End
Search
(Req/Hr)
Voice Packet
Ratio
(%)
Amt. Delivery
Assured
Data
Storage
(GB)
0.68
0.23
0.3
2
1.4
1.12
1.191
0.8
2
0.5
time (min)(Req/Hr)
(max)
Chat
Requests
ProtectedComm.
Comm.Set-up
DR (Mbps)
0
1
2012 2016 2020
Voice Packet
(%)
Amt. Delivery
Assured Ratio
Data Storage
(GB)
Packet
(%) (min)
ServiceRatio
Discovery
Requests (Req/Hr) 0.29
Typical Req.
DRDelivery
(Mbps)
IT
1.6
2
0.5
Packet
Typical Req.
DRDelivery
(Mbps) Ratio (%) (min)
0
0.2
1
2012 2016 2020
Availability (%)
2
1.191
1
0.05 0.9
0.5
0.2
1
1.5
1
0.4
0.88
2.571
3.69
0.7
1
0.6
1.5
0
0.4
1.1
2.571
2.7
2
0.1 0.88
0.05
0.4 3.69
0.7
1
0.1
0
1
0.6
1.5
0
1.6
0.4
1.1
0.1
2.571
2.7
0.05
0.4
0.7
1
0.4
0
0.6
0.1
0.4
2.45
0.63
0
Managed QOS Handling (%)
2.3
1.6
1.1
Connection Resilience (%)
1.48
0.1
2.7
End User Device RF Spectrum Eff (bps/Hz)
0.4
0.4
1
RF Spectrum Reallocation Time (sec)
0.1
0
0.1
7
2
User Mapping to TO&E
[Table residue: user counts mapped to the TO&E units of an HBCT (BSTB, CAB, RS, FAB, BSB); the numeric table did not survive extraction.]
[Table residue: mapping of user classes to MEB units — GCE (Reg HQ, Bn, Tank Co., LAR Co., AAV Co., ARTY Bn, Engineer Co.), ACE (MAG, MACG, MWSG), and LCE (CLR (Fwd Spt Reg), CLR (Direct Spt Reg), Engr Spt BN) — with unit counts and totals; the column headers and counts did not survive extraction.]
[Figure residue: charts of QCDI demand metric values from the TO&E database for ESG, IBCT, and HBCT units, broken out by timeframe (2012, 2016, 2020) and by user class (ground, air mobility, tactical air, local air, unmanned systems, smart agent, static sensor, and ES/IA/NM infrastructure users); only scattered labels and values survive extraction.]
Assumptions – Rationale
Estimation Methodology
[Figure residue: diagram of the estimation methodology — SMEs, the JNO architecture, the Metrics Framework, the User Framework, and CDIs feed the assumptions and their rationale — together with a table of demand values by unit for the 2012, 2016, and 2020 timeframes; the numbers did not survive extraction.]
32
QCDI Metrics by Tier II Capability
Information Transport
• Typical Req. Data Rate (Mbps)
• Protected Comm. DR (Mbps)
• Voice DR (Mbps)
• Availability (%)
• Voice Packet Delivery Ratio (%)
• Packet Delivery Ratio (%) (min)
• Comm. Set-up time (min) (max)
• Data End-to-End Delay (sec) (max)
• Voice End-to-End Delay (sec) (max)
• Upload (%)
• External Traffic (%)
Enterprise Services
• Amt. Assured Data Storage (GB)
• Service Discovery Requests (Req/Hr)
• Chat Requests (Req/Hr)
• Auth. Serv (Req/Hr)
• Email (Req/Hr)
• Search (Req/Hr)
• File Dlvry (Req/Hr)
• DNS (Req/Hr)
• Service Discovery Response Time (sec)
Information Assurance
• Cross-Domain Transfer Time (sec)
• Validation Time (min)
• Authorization Management Time (min)
• Pedigree production rate (%)
• DAR compromise time (days)
• Compliant COMSEC Tier
• Incident Detection Time (min)
• Incident Response Time (min)
Network Management
• Interoperability Depth - Higher Network Tiers
• Response Time (sec)
• Time to Refresh contextual SA (sec)
• Priority Information Delivery Mgt (%)
• Connection Resilience (%)
• End User Device RF Spectrum Eff (bps/Hz)
• RF Spectrum Reallocation Time (sec)
33
QCDI User Areas and User Classes
[Table residue: user classes by user area. The areas are Core, Terrestrial/Ground, Airborne, Maritime, Intermediate, and Tactical Edge; the user classes include Dsmtd Ground, Surface Mbl, Local Wrkr, CP High, CP Low, USS High, USS Low, UAS High, UAS Low, and the air classes LO Air, Mobility Air, TAC Air, C2 Air, and ISR Air. The exact class-to-area assignments did not survive extraction.]
All areas have Commander, Static Sensor, and ES, IA and NM Infrastructure Users
User demand aggregated to unit-level estimates: comparisons made at unit – not individual user – level
34
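The unit-level roll-up noted above can be sketched for a single demand metric: per-user-class demand is weighted by the number of users of each class in the unit and summed, so supply/demand comparisons happen at the unit, not the individual user, level. The user-class names come from the slide, but all counts and per-user values below are illustrative assumptions:

```python
# Hypothetical sketch of unit-level demand aggregation for one metric.
# Per-user-class demand values are weighted by the number of users of
# each class in the unit's TO&E and summed. All figures are illustrative.

def unit_demand(toe_counts, per_user_demand):
    """Aggregate one metric's demand over a unit's user classes."""
    return sum(n * per_user_demand[cls] for cls, n in toe_counts.items())

# Notional email requests per hour for one unit.
toe_counts = {"Local Wrkr": 60, "CP High": 4, "Dsmtd Ground": 120}
per_user_demand = {"Local Wrkr": 2.0, "CP High": 8.0, "Dsmtd Ground": 0.5}

print(unit_demand(toe_counts, per_user_demand))  # 60*2 + 4*8 + 120*0.5 = 212.0
```

The same weighted sum would be repeated per metric and timeframe to build the unit demand tables.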
Scaling of Analysis and Data Requirements with Estimation Maturity
[Figure residue: diagram of the analysis and data each estimation maturity level requires (Maximum, Nominal, Trans-Program, Demand-Loaded). The lower levels need data generally available from programs (numbers of device/service instances, nominal design-point loads, performance at device interfaces); the higher levels are extensions requiring richer data and other elements of analysis, available from studies (characteristics of program interfaces, integrated network design, a cross-program view of loads, and allocation of demand use-case loads to program networks), at increased effort.]
Additional assumptions required as device, program, and demand interactions are progressively considered
35
Incremental Maturity Levels for Supply Capability Estimates
[Figure residue: table of the factors accounted for at each maturity level — the Maximum and Nominal levels account for the performance of program elements and of program networks (data generally available from programs, or from studies); the Trans-Program and Demand-Loaded levels add program interfaces and demand loads (extensions requiring richer data and other elements of analysis). Quality of estimate increases with maturity level.]
*This color coding scheme is used throughout subsequent examples
Methodological Issues Explored
 Spectrum of data needs and analytical sophistication
─ Relationship between effort and quality
 Use cases consistent with the demand model for higher levels of maturity
─ Resolution at the user vs. echelon level
 Potential role of simulations
─ Employing PET or the PET database to different degrees
Time and resources dictated emphasis on levels 1 & 2
Output Views for Steps in Methodology
Supply for Unit Type and Time Frame
[Figure residue: schematic of the supply-estimation steps. Input comes from the supply template for an individual program (example: the WIN-T program). A deployment schedule worksheet maps devices (D1 … Dn) to timeframes (2012, 2016, 2020) and maps the number of devices to a unit (e.g., MEB, HBCT, BSTB, CAB). Devices are mapped to demand device types (LOS, BLOS, Wired); supply values (program element performance) are tabulated per metric (M1 … Mn) and program (P1 … Pn) and rolled up into total, on-the-move, and external supply for the unit.]
Output Views for Steps in Methodology
Supply v Demand for Unit Type and Time Frame
[Figure residue: the same schematic, shown with an illustrative trace for the JTRS data rate and extended with a comparison of supply and demand: for each metric (M1, M2, M3, …), supply, demand, and the supply-to-demand (S/D) ratio are tabulated by transport mode (Wired, BLOS, LOS), with totals and an On-The-Move view.]
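The comparison step above reduces, per metric and transport mode, to a supply-to-demand (S/D) ratio, with ratios below 1 flagging shortfalls. A minimal sketch of that step; the metric name and all values are illustrative, not program data:

```python
# Hypothetical sketch of the supply/demand comparison step: for each
# (metric, transport mode) pair, compute the supply-to-demand ratio.
# Keys and numbers below are illustrative only.

def sd_ratios(supply, demand):
    """Return S/D ratio per (metric, mode); ratio < 1 flags a shortfall."""
    ratios = {}
    for key, s in supply.items():
        d = demand.get(key)
        if d:  # skip metrics with no (or zero) demand
            ratios[key] = s / d
    return ratios

supply = {("Typical Req. DR (Mbps)", "LOS"): 1.6,
          ("Typical Req. DR (Mbps)", "BLOS"): 0.4}
demand = {("Typical Req. DR (Mbps)", "LOS"): 2.0,
          ("Typical Req. DR (Mbps)", "BLOS"): 0.5}

for key, r in sd_ratios(supply, demand).items():
    print(key, round(r, 2))  # both modes come out at 0.8, i.e. a shortfall
```

The same table would carry separate rows for the Wired, BLOS, LOS, total, and On-The-Move views shown in the slide.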
Output Views for Steps in Methodology
[Figure residue: the schematic recast in component terms. Component supply comes from program templates; a deployment schedule worksheet maps components (C1 … Cn) to timeframes (2012, 2016, 2020) and component counts to units. Components are mapped to demand component types (LOS, BLOS, Wired); component performance metrics and component fielding by demand type combine into program supply, and program supplies are aggregated (total, OTM, external) into unit supply.]
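The component-to-unit roll-up described above (component counts × component performance → program supply → unit supply) can be sketched as follows. The additive aggregation rule and all counts and data rates are illustrative assumptions; WIN-T and JTRS appear only as example program names from the slides:

```python
# Hypothetical sketch of unit-supply aggregation: multiply the number of
# fielded components by each component's per-unit performance, sum within
# a program, then sum across programs. A plain sum is the simplest
# aggregation rule; the briefing allows more sophisticated ones.

def program_supply(fielding, performance):
    """Supply one program delivers to a unit: count x per-component value."""
    return sum(n * performance[comp] for comp, n in fielding.items())

def unit_supply(programs):
    """Aggregate supply over all programs fielding components to the unit."""
    return sum(program_supply(f, p) for f, p in programs)

# Illustrative data rate (Mbps) supplied to one unit in one timeframe.
win_t = ({"C1": 4, "C2": 2}, {"C1": 0.5, "C2": 1.0})   # (fielding, perf)
jtrs  = ({"C1": 10},         {"C1": 0.1})

print(unit_supply([win_t, jtrs]))  # 4*0.5 + 2*1.0 + 10*0.1 = 5.0
```

Higher maturity levels would replace the plain sum with rules that account for program interfaces and demand loading.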
[Figure residue: closing diagram relating the QCDI Demand Model, the QCDI Demand Exploration Tool, the analyst, and the decision maker.]