Training Needs Analysis for Team and Collective Training

The investigation which is the subject of this Report was initiated by the Research
Programme Leader, Human Capability.
© BAE Systems 2011. All Rights Reserved. The authors of this report have asserted their
moral rights under the Copyright, Designs and Patents Act, 1988, to be identified as the
authors of this work.
Reference ............................................HFIDTCPIII_T13_01
Version ................................................................................. 2
Date ................................................................. 16 April 2011
© BAE Systems 2011. All Rights Reserved. Issued by BAE Systems on behalf of the HFI DTC consortium. The HFI DTC consortium consists of BAE
Systems, Cranfield University, Lockheed Martin, MBDA, SEA, Southampton University and the University of Birmingham.
Authors
Jonathan Pike
Cranfield University
Dr John Huddlestone
Cranfield University
Contents
1 Executive Summary ................................................................................... 1
2 Introduction ................................................................................................ 3
2.1 Background ......................................................................................................................... 3
2.2 Scope of Team/Collective Training Needs Analysis ........................................................... 3
2.3 Training Needs Analysis ..................................................................................................... 4
2.4 Methodological Approach.................................................................................................... 8
3 The Team/Collective Training Model .......................................................... 9
3.1 Introduction.......................................................................................................................... 9
3.2 Definitions............................................................................................................................ 9
3.3 Development of a Teamwork Taxonomy .......................................................................... 10
3.3.1 Review of Extant Teamwork Models ...................................................................... 10
3.3.2 Synthesis of a Teamwork Taxonomy...................................................................... 15
3.3.2.1 Results ......................................................................................................... 15
3.3.2.1.1 Initial Coding .............................................................................. 15
3.3.2.1.1.1 Leadership ............................................................................... 15
3.3.2.1.1.2 Situational Awareness ............................................................. 17
3.3.2.1.1.3 Task and Resource Allocation ................................................. 17
3.3.2.2 The Teamwork Taxonomy ........................................................................... 17
3.4 Development of a Team/Collective Performance Model .................................................. 19
3.4.1 Review of Team Effectiveness Models ................................................................... 19
3.4.2 Environmental Task Demands................................................................................ 23
3.4.3 The Team/Collective Performance Model .............................................................. 24
3.4.3.1 Task Environment ........................................................................................ 25
3.4.3.2 Performance Outcomes ............................................................................... 26
3.4.3.3 Team Processes .......................................................................................... 26
3.4.3.4 Team Properties .......................................................................................... 27
3.4.3.5 Team Member Characteristics ..................................................................... 27
3.5 Development of a Team/Collective Training Model .......................................................... 27
4 Review of Team Analysis Methods Applicable to TCTNA ........................ 30
4.1 Introduction........................................................................................................................ 30
4.2 Evaluation of Methods ....................................................................................................... 30
4.2.1 Hierarchical Task Analysis for Teams .................................................................... 30
4.2.2 Team Cognitive Task Analysis ............................................................................... 31
4.2.3 Team Task Analysis................................................................................................ 31
4.2.4 Task and Training Requirements Analysis Methodology ....................................... 32
4.2.5 Work Domain Analysis ............................................................................................ 32
4.2.6 Mission Essential Competencies ............................................................................ 33
4.2.7 Models for Analysis of Team Training (MATT) ....................................................... 33
4.3 Evaluation.......................................................................................................................... 35
5 The TCTNA Method ................................................................................. 36
5.1 Introduction........................................................................................................................ 36
5.2 The Team/Collective Training Model ................................................................................ 36
5.3 The Triangle Model of TNA ............................................................................................... 37
6 Constraints Analysis ................................................................................. 41
6.1 Introduction........................................................................................................................ 41
6.2 Constraints Tables ............................................................................................................ 41
7 Team/Collective Task Analysis................................................................. 43
7.1 Introduction........................................................................................................................ 43
7.2 External Task Context Description ................................................................................... 43
7.2.1 Generic Task Scenarios ......................................................................................... 43
7.2.2 Team Context Diagrams ......................................................................................... 45
7.2.2.1 Team Context Diagram Notation ................................................................. 45
7.2.2.2 Interaction Table Construction ..................................................................... 46
7.2.2.3 Example External Context Description ........................................................ 47
7.2.3 Environment Description Table Construction ......................................................... 50
7.3 Internal Task Context Description ..................................................................................... 53
7.3.1 Organisational Structure ......................................................................................... 53
7.3.2 Role Definitions ....................................................................................................... 54
7.3.3 Internal Team Context Diagram.............................................................................. 56
7.3.4 Team Communication Structure ............................................................................. 58
7.3.4.1 Communication Diagram ............................................................................. 58
7.3.4.2 Communications Description ....................................................................... 61
7.3.4.3 Communications Matrix ............................................................................... 61
7.4 Task Analysis .................................................................................................................... 62
7.4.1 Hierarchical Task Analysis for Team & Collective Training - HTA(TCT) ................ 62
7.4.1.1 HTA(TCT) Diagram Notation and Format ................................................... 62
7.4.1.2 Example HTA(TCT) Diagram ...................................................................... 64
7.4.1.3 Task Sequence Diagrams............................................................................ 66
7.4.1.4 HTA(TCT) Task Description Tables............................................................. 67
7.4.1.5 Task Role Matrix .......................................................................................... 72
7.5 Capture Environmental Task Demands ............................................................................ 73
7.6 Teamwork Analysis ........................................................................................................... 74
7.6.1 Teamwork Process Priorities .................................................................................. 74
7.6.2 Teamwork Interaction Analysis by Role ................................................................. 74
7.7 Training Gap Analysis ....................................................................................................... 75
7.7.1 The Risk Management Approach ........................................................................... 76
7.7.2 The Training Priorities Table ................................................................................... 76
7.8 Team/Collective OPS and TO Development .................................................................... 80
7.8.1 Linking Tasks and Training Objectives to Mission Task Lists ................................ 80
8 Training Overlay Analysis ......................................................................... 83
8.1 Introduction........................................................................................................................ 83
8.2 Instructional Theory ........................................................................................................... 83
8.2.1 Clark’s Content-Performance Matrix (Clark, 2008) ................................................ 83
8.2.2 Part-task training ..................................................................................................... 85
8.2.3 Reigeluth’s Simplifying Conditions Method (Reigeluth, 1999) ................................ 85
8.2.4 Team Training Methods .......................................................................................... 86
8.2.5 The Relationship between Tasks and the Operational Environment ..................... 87
8.3 Instructional Method Selection .......................................................................................... 88
8.3.1 Selecting Methods for Initial Instruction .................................................................. 88
8.3.2 Selecting Methods for Practice and Assessment ................................................... 90
8.3.2.1 Identification of Part-Task Training Requirements ...................................... 90
8.3.2.2 Identification of Simplifying Conditions Method Requirements.................... 91
8.3.2.3 Assessment and Feedback Methods........................................................... 91
8.3.2.4 Practice and Assessment Methods Table ................................................... 92
8.3.3 Training Scenario Specification .............................................................................. 94
8.4 Instructional Task Identification ......................................................................................... 97
8.4.1 Instructor Task Table .............................................................................................. 97
8.4.2 Training Overlay Requirement Specification .......................................................... 98
9 Training Environment Analysis ............................................................... 100
9.1 Introduction...................................................................................................................... 100
9.2 Training Environment Specification ................................................................................ 100
9.2.1 Training Environment Rationalisation ................................................................... 100
9.2.2 Fidelity Analysis .................................................................................................... 102
9.2.2.1 Specifying Fidelity Requirements .............................................................. 102
9.2.2.1.1 System Fidelity Requirements ................................................. 103
9.2.2.1.2 Resource Fidelity Requirements ............................................. 104
9.2.2.1.3 Human Element Fidelity Requirements ................................... 104
9.2.2.1.4 Manned Systems Fidelity Requirements ................................. 105
9.2.2.1.5 Physical Environment Elements .............................................. 106
9.2.2.1.6 Recording Element Fidelity Specifications .............................. 106
9.3 Training Environment Option Identification ..................................................................... 108
9.4 Training Environment Option Definition .......................................................................... 110
9.5 Training Environment Option Evaluation ........................................................................ 113
10 Training Options Evaluation ................................................................... 114
10.1 Introduction...................................................................................................................... 114
10.2 Estimation of Costs ......................................................................................................... 114
10.3 Estimating Effectiveness ................................................................................................. 114
10.4 JSP 822 Guidance .......................................................................................................... 114
11 Conclusions and Recommendations ...................................................... 115
11.1 Conclusions ..................................................................................................................... 115
11.2 Recommendations .......................................................................................................... 116
12 References ............................................................................................. 117
Appendix A Teamwork Models ................................................................. 120
A.1 Team Process Model (Annett, 2000) .............................................................................. 120
A.2 Team Coordination Dimensions (Bowers et al, 1993) .................................................... 121
A.3 The Models for Analysis of Team Training Taxonomy (Dstl, 2006) ................................ 122
A.4 Salas “Big Five” Model of Teamwork (Salas et al, 2005) ................................................ 124
A.5 Teamwork Behaviours (Rousseau et al, 2006) ............................................................... 125
List of Tables
Table 1 Teamwork Models used in Teamwork Analysis .......................................................................... 12
Table 2 Integrative Teamwork Models...................................................................................................... 13
Table 3 Initial Teamwork Categories ........................................................................................................ 16
Table 4 TCTNA Teamwork Taxonomy ................................................................................................... 18
Table 5 Work Domain Analysis Abstraction Hierarchy (Naikar and Sanderson, 1999)........................... 33
Table 6 MATT Process Stages (Dstl, 2006) .............................................................................................. 34
Table 7 TNA Triangle Stage Sub-Components ......................................................................................... 38
Table 8 Example Constraints Table Format .............................................................................................. 42
Table 9 Generic Scenario Table Format .................................................................................................... 44
Table 10 Completed Generic Scenario Table ........................................................................................... 44
Table 11 Interaction Table Format ............................................................................................................ 47
Table 12 Example Tornado F3 Pair External Context Interaction Table .................................................. 49
Table 13 Environment Description Table Format ..................................................................................... 50
Table 14 Example Tornado F3 Pair Environment Description Table ....................................................... 52
Table 15 Example Role Definition for the F3 Pair Lead WSO ................................................................. 55
Table 16 Example Role Definition for the F3 Pair Wingman WSO ........................................................ 55
Table 17 Example Tornado F3 Pair Interface Interaction Table ............................................................... 57
Table 18 Tornado F3 Pair Environment Description Table Entries .......................................................... 57
Table 19 Example System Matrix ............................................................................................................. 58
Table 20 Example Tornado F3 Pair Communications Matrix................................................................... 60
Table 21 HTA(TCT) Diagram Element Symbols and Descriptions.......................................................... 63
Table 22 Syntax for plans and examples of use ........................................................................................ 64
Table 23 Task Sequence Diagram Notation .............................................................................................. 66
Table 24 HTA(TCT) Task Description Table Structure............................................................................ 68
Table 25 Task Description Table for F3 Pair Task 1.2.5 Meld Radar....................................................... 70
Table 26 Task Description Table for F3 Pair Task 1.2 Detect Bandit BVR ............................................. 71
Table 27 Task and Role Matrix Example .................................................................................................. 72
Table 28 Example Environmental Task Demands Table .......................................................................... 73
Table 29 Example Teamwork Process Priority Table ............................................................................... 74
Table 30 Teamwork Interaction Table ...................................................................................................... 75
Table 31 Example Training Priorities Table ............................................................................................. 78
Table 32 Factors Affecting the Likelihood of an Error and Indicators of High Severity
Consequences ............................................................................................................................................ 79
Table 33 Training Content Performance Matrix with Application Level Elaboration Strategies,
adapted from Clark (2008)......................................................................................................................... 84
Table 34 Suggested Content for Initial Instruction on Teamwork ............................................................ 89
Table 35 Example Initial Instruction Table for F3 Pairs Training ............................................................ 90
Table 36 Example Practise and Assessment Methods Table..................................................................... 93
Table 37 Example Training Scenario Table .............................................................................................. 95
Table 38 Environment Description Table Entries for Task Scenario Requirements................................. 96
Table 39 Example Instructor Task Table Format and Entries ................................................................... 98
Table 40 Environment Description Table Training Overlay Requirements Entries ................................. 99
Table 41 System Fidelity Requirements .................................................................................................. 103
Table 42 Resource Fidelity Requirements ............................................................................................... 104
Table 43 Human Fidelity Requirements .................................................................................................. 105
Table 44 Manned Systems Fidelity Requirements .................................................................................. 105
Table 45 Physical Environment Fidelity Requirements .......................................................................... 106
Table 46 Tornado F3 Pair Environment Description Table .................................................................... 107
Table 47 Example Training Environment Option Description Table ..................................................... 111
Table 48 Example Training Environment Option Properties Table Entries............................................ 112
Table 49 Training Environment Options Comparisons Table ................................................................. 113
Table 50 Definitions of Team Coordination Dimensions (Bowers et al, 1993) ........................................ 121
Table 51 MATT Teamwork Behaviours (Dstl, 2006) ............................................................................. 122
Table 52 MATT Team Member Attitudes and Characteristics (Dstl, 2006) ........................................... 123
Table 53 MATT Teamwork Knowledge Requirements (Dstl, 2006) ...................................................... 123
Table 54 Definitions of the Core Components and Coordinating Mechanisms of the “Big Five”
Model of Teamwork (adapted from Salas et al, 2005) ............................................................................ 124
Table 55 Teamwork Behaviours (Rousseau et al, 2006) ......................................................................... 125
List of Figures
Figure 1 The Systems Approach to Training ............................................................................................... 4
Figure 2 MoD TNA Process Diagram (JSP822, 2007) ............................................................................... 7
Figure 3 TCTNA Development Sequence ................................................................................................... 8
Figure 4 Patterns of Team Interactions adapted from Tesluk et al (1997) ................................................ 10
Figure 5 Input-Process-Output (IPO) paradigm for analysis of group interaction as a mediator of
performance outcomes (Hackman and Morris, 1975) ............................................................................... 19
Figure 6 Model of Team Effectiveness adapted from Tannenbaum, Beard and Salas (1992) .................. 20
Figure 7 The Command Team Effectiveness Model with Basic Components and Feedback Loops
(NATO, 2005)............................................................................................................................................ 21
Figure 8 Information Transduction Model of Group Activity on the Task Environment (Roby,
1968) .......................................................................................................................................................... 22
Figure 9 Properties of Naturalistic Environments, Environmental Stressors and Environmental
Task Demands............................................................................................................................................ 24
Figure 10 The Team/Collective Performance Model ................................................................................ 25
Figure 11 Team Training Cycle, adapted from Tannenbaum et al (1998) ................................................ 28
Figure 12 The Team/Collective Training Model ....................................................................................... 29
Figure 13 Example HTA (T) Chart............................................................................................................ 31
Figure 14 The Team/Collective Training Model ....................................................................................... 36
Figure 15 The Triangle Model of TNA ..................................................................................................... 37
Figure 16 Team Context Diagram Notation .............................................................................................. 45
Figure 17 Example Tornado F3 Pair External Environment TCD ............................................................ 48
Figure 18 Example Tornado F3 Pair Organisational Chart 1 .................................................................... 54
Figure 19 Example Tornado F3 Pair Interfaces TCD ................................................................................ 56
Figure 20 Example Tornado F3 pair Communications Diagram and Textual Description ....................... 60
Figure 21 HTA Diagram Format ............................................................................................................... 63
Figure 22 Example HTA for the F3 Pair Task Destroy Bandit beyond Visual Range .............................. 65
Figure 23 Task Sequence Diagram Format ............................................................................................... 67
Figure 24 Task Sequence Diagram for the F3 Pair Task 1 Destroy Bandit BVR ..................................... 67
Figure 25 Task Sequence Diagram for the F3 Pair Sub-Task 1.2 Detect Bandit BVR ............................. 67
Figure 26 General mapping of Tasks to METs ......................................................................................... 81
Figure 27 Mapping of Example ATC Role Group Summary Tasks to Military Tasks in the
MTL(M) ..................................................................................................................................................... 82
Figure 28 Part Task Training Sequence (adapted from Wickens and Hollands, 2000) ............................ 85
Figure 29 Simplifying Conditions Method Training Sequence (Adapted from Reigeluth, 1999) ............ 86
Figure 30 Mapping of TOs to the Operational Environment .................................................................... 88
Figure 31 Information Inputs to Generic Scenario Specifications ............................................................ 94
Figure 32 Mapping of TOs to Training Environments ............................................................................ 101
Figure 33 JOUST Networked Flight Simulation System ........................................................................ 110
Figure 34 Team Process Model (adapted from Annett, 2000) ................................................................ 120
List of Acronyms
ACM         Air Combat Manoeuvring
ATC         Air Traffic Control
BATUS       British Army Training Unit Suffield
BVR         Beyond Visual Range
Cap JTES    Capability Joint Training Experimentation and Simulation
CAST        Command and Staff Trainer
CATT        Combined Arms Tactical Trainer
CDM         Critical Decision Method
CF          Competency Framework
CONOPS      Concept of Operations
CONEMP      Concept of Employment
CONUSE      Concept of Use
CTEF        Command Team Effectiveness
CWA         Cognitive Work Analysis
DSAT        Defence Systems Approach to Training
Dstl        Defence Science and Technology Laboratory
ECG         Electrocardiogram
EDT         Environment Description Table
FC          Fighter Control
HF          Human Factors
HFI DTC     Human Factors Integration Defence Technology Centre
HTA         Hierarchical Task Analysis
HTA(T)      Hierarchical Task Analysis (Teams)
HTA(TCT)    Hierarchical Task Analysis for Team and Collective Training
IPO         Input Process Output
JTIDS       Joint Tactical Information Distribution System
JSP         Joint Service Publication
KSAs        Knowledge, Skills and Attitudes
MATT        Models for Analysis of Team Training
MCC         Maritime Component Command
MECs        Mission Essential Competencies
MSHATF      Medium Support Helicopter Aircrew Training Facility
MTL(M)      Military Task List (Maritime)
MTDS        Mission Training through Distributed Simulation
MoD         Ministry of Defence
NATO        North Atlantic Treaty Organisation
OTA         Operational Task Analysis
OTI         Operation Task Inventory
PPE         Post Project Evaluation
RAF         Royal Air Force
RN          Royal Navy
RWR         Radar Warning Receiver
RHWR        Radar Homing & Warning Receiver
SA          Situation Awareness
SAT         Systems Approach to Training
SCM         Simplifying Conditions Method
SME         Subject Matter Expert
TCD         Team Context Diagram
TCTNA       Team/Collective Training Needs Analysis
TES         Tactical Engagement Simulation
TO          Training Objective
TNA         Training Needs Analysis
TTA         Team Task Analysis
TTRAM       Task and Training Requirements Analysis Methodology
UHF         Ultra High Frequency
US          United States
USAF        United States Air Force
VHF         Very High Frequency
WDA         Work Domain Analysis
WSO         Weapons Systems Operator
1 Executive Summary
The research project described in this report was devised in response to a Royal Navy
request for the Human Factors Integration Defence Technology Centre (HFI DTC) to
provide guidance on the conduct of Training Needs Analysis (TNA) for Collective
Training to support the TNA process being conducted for the Queen Elizabeth (QE) Class
Aircraft Carriers. TNA is the systematic process of analysing training tasks and
identifying suitable training option(s); it has been used in connection with acquisition
projects for many years, but has traditionally focussed on individual training.
The inherent complexity and scale of collective training puts it beyond the analytical
reach of the techniques normally employed for individual training. Because the three
Services define team and collective training differently, the method developed is
referred to as Team/Collective TNA (TCTNA) to make clear that it is applicable to both
team and collective training, however they may be defined. The TCTNA guidance
provided in this document is designed to extend and amplify the extant guidance on TNA
provided in JSP 822, not to replace it.
In order to develop a TNA methodology, it was necessary to develop a model of
collective training. This necessitated the development of an underlying model of team
performance and supporting teamwork taxonomy. A review of extant teamwork and team
performance models was conducted; as no suitable extant model was identified, a model
of team performance and a supporting teamwork taxonomy suitable for TCTNA were
synthesised.
A review of extant human factors methods devised to facilitate the analysis of team
training was conducted. Of the relatively small number of methods in existence, none was
found to provide all of the analytical components required to support TCTNA,
confirming that a new method had to be developed.
The TCTNA method that has been devised is structured around an adaptation of the TNA
Triangle model devised in a previous phase of HFI DTC research (HFI DTC, 2009). It is
composed of five components: Constraints Analysis, Team/Collective Task Analysis,
Training Overlay Analysis, Training Environment Analysis and Training Option(s)
Selection.
Constraints Analysis provides a mechanism for the recording of all key constraints and
analysis of their consequences in terms of limitations on potential training solutions.
Team/Collective Task Analysis exploits adaptations of software design representational
techniques to develop visual representations of the nature of the task environment. An
extension of Hierarchical Task Analysis for Teams (developed under a previous Ministry
of Defence (MoD) contract) is then employed as the core task analysis method. The
teamwork taxonomy developed as part of the research is used to guide the analysis of the
teamwork component of the task. Finally, training priorities are established using a risk
management approach.
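The risk management approach to establishing training priorities can be sketched as a simple likelihood-impact banding, where the risk considered is that of inadequate performance of each task. The 1-5 scales, the multiplicative score and the band thresholds below are illustrative assumptions, not part of the TCTNA method definition.

```python
# Illustrative risk-based prioritisation of tasks for training, assuming a
# conventional likelihood x impact risk matrix. Scales and banding thresholds
# are assumptions for illustration only.

def training_priority(likelihood: int, impact: int) -> str:
    """Band a task's training priority from the risk of inadequate performance.

    likelihood: chance of inadequate performance without training (1-5)
    impact:     consequence of inadequate performance for the mission (1-5)
    """
    score = likelihood * impact
    if score >= 15:
        return "high"
    if score >= 8:
        return "medium"
    return "low"

# Hypothetical task ratings.
print(training_priority(5, 4))  # frequent failure, severe consequence
print(training_priority(2, 2))  # rare failure, minor consequence
```

The banding simply orders the task list so that analytical and resource effort is concentrated on the tasks whose inadequate performance carries the greatest risk.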
During Training Overlay Analysis appropriate instructional methods are determined and
generic scenarios are developed which are used to inform subsequent training
environment specification. Instructor functions are specified as are the requirements for
data capture from the instructional environment.
During Training Environment Analysis the training environment requirements are first
rationalised and then a fidelity analysis is conducted for each environment required.
Novel fidelity analysis templates are provided for specifying each of the manned systems,
human elements, resources, systems and physical environment elements that are required
within each training environment. Training environment options are then identified and
characterised and technical suitability is assessed.
Outline guidance on Training Option Selection is provided, with reference being made to
the JSP 822 direction on seeking advice from MoD and industry for the costing and
comparison of complex training systems.
This research has delivered a methodology for the conduct of TCTNA. Worked examples
and templates are provided for the components of each stage of the methodology. It is
anticipated that this guidance will be used by military and commercial TNA specialists
conducting TCTNA.
2 Introduction
2.1 Background
The requirement for this work originated in a request from the Royal Navy (RN) for the development of
a methodological approach for the conduct of Training Needs Analysis (TNA) which
could be applied in a set of TNAs to be conducted for collective training for the Queen
Elizabeth Class Aircraft Carriers.
The principal differences between individual training and team and collective training are
associated with both scale and complexity. The issues include:
• Complexity of the task
• Complexity of the context in which the task is conducted
• Complexity of the start state of the training audience
• Complexity of exercise planning
• Complexity of the instructional task
• Complexity of evaluation
• Scale of resource requirements
• Costs of training
A TNA method for Team and Collective Tasks must be theoretically capable of covering
any type of Collective Task from the smallest and simplest (two people working together)
to large complex tasks that span multiple teams and organisations and involve integration
of effort between them (such as ground-air integration in warfare).
The experience of RN TNA specialists was that, whilst the overarching
approach to TNA mandated in Joint Service Publication (JSP) 822 was logical and
applicable to collective training, there was an absence of appropriate guidance on
techniques to deal with the complexities of the collective training problem. The purpose
of this report is to fill this methodological gap.
2.2 Scope of Team/Collective Training Needs Analysis
The term “collective training” is often used differently by the three services. The RN uses
the term to characterise training for individual units and multiple units or force elements.
The Royal Air Force (RAF) considers collective training to be that which encompasses
multiple aircraft types, with training for a single aircraft type being referred to as team
training. The Army uses a six-level description to characterise the progression of training
from individual level up to Battle group and beyond, referring to these as collective
training levels 1-6. The North Atlantic Treaty Organisation (NATO) definition of
collective training cited in NATO (2004) describes it as training which:
“…involves 2 or more ‘teams’, where each team fulfils different ‘roles’, training in an
environment defined by a common set of collective training objectives (CTOs)” (p6.2)
Given that the requirement for this work was to develop a methodological approach
to support TNA for both the training of individual teams and collective training,
and that there is the potential for confusion about the applicability of the method if it is
simply labelled Collective TNA, the method described is referred to in this document as
Team/Collective TNA (TCTNA) and is applicable to all levels of training above
individual training.
2.3 Training Needs Analysis
Military training in the UK has been developed and managed following the principles of
the Systems Approach to Training (SAT) with some success for many decades. The SAT
approach in its simplest form is illustrated in Figure 1. The first stage is the analysis and
definition of the training requirement. Training design follows on and embraces both the
high level training strategy and the detailed design of training materials. Training
delivery then takes place and is followed by evaluation. The results of evaluation are then
fed back so that adjustments can be made to the analysis, design and delivery to ensure
that optimal training is delivered and any weaknesses are corrected.
Historically, the three services each had their own implementation of SAT and a set of
associated documents defining the approach. This was rationalised with the publication
of the Defence Systems Approach to Training (DSAT) Quality Standard in 2003.
Figure 1 The Systems Approach to Training
Notwithstanding the success of the application of SAT to the development and
management of training within the RN, the Army and the RAF, for many years the
development of training associated with the acquisition of new systems fell outside the
scope of the SAT process of each individual Service. The TNA process was developed to
provide guidance on the application of the principles of SAT to training developed within
the acquisition process. The scope of TNA is illustrated by the red overlay in Figure 1. It
embraces the analysis phase and sufficient high level design to facilitate the identification
of a recommended training solution. In 1996 the Department of Internal Audit endorsed a
common approach to TNA across the three services, achieved by the publication of JSP
502 – TNA for Acquisition Projects. The output of the TNA is a document set that forms
the inputs for subsequent stages of the instructional design process - such as specification
of training equipment, development of lesson content and training material and training
implementation, assessment and evaluation strategies.
Historically, SAT and subsequently DSAT have only applied to individual training.
Significantly, with the integration of DSAT and JSP 502 into JSP 822 in 2007, the need to
consider collective training requirements is specifically mentioned within the guidance on
TNA (JSP 822).
The current MoD TNA process is shown diagrammatically in Figure 2. TNA is conducted
in three phases:
Phase 1 - Scoping Study. The Scoping Study defines how the TNA is to be conducted
and managed and identifies the constraints, assumptions and risks associated with the
project. It also includes a target audience description which characterises who will need
training, the annual throughput and the input standard.
Phase 2 – TNA Development. The TNA development constitutes the core of the
analytical activity of the TNA and yields four key deliverables. These are:
1. Operational/Business Task Analysis (OTA). This deliverable reports the outcome
of a task analysis to establish the performance conditions and standards for all
affected job holders. It includes an Operational Task Inventory (OTI) with
associated performance conditions and standards statements. It is recommended
that the OTI is rationalised using a process such as Difficulty Importance
Frequency (DIF) analysis. Notably, the guidance highlights the requirement to
identify interfaces between individuals and teams, including co-ordination,
communication and backup activities, which form a vital part of sub-team and
command team training.
2. Training Gap Analysis. The purpose of this deliverable is to determine the
additional training required to meet the gap between the performance level
required as defined by the OTA and the existing performance level of individuals.
Unit and collective training requirements must be specifically addressed. A
fidelity analysis is also required to identify the “key cues and stimuli that support
the requirement to train” (Sect 2.30). The conduct of the fidelity analysis can
continue into the next phase but has to be completed before method and media
selection takes place.
3. Training Options Analysis. The purpose of this deliverable is to make a
recommendation as to the cost-effective training solution(s) to meet the training
requirement identified in the previous phases of the TNA. This includes a
description of the training methods and media (or combinations thereof) that will
partially or fully meet the training requirement along with an estimate of the
relative training effectiveness of each costed training media option.
4. The Final Report. The final report draws together all the key elements from the
previous phases of analysis. The output from this deliverable is an endorsed
training solution, draft Operational Performance Statement/Competency
Framework, implementation plan and an evaluation strategy. If the final report is
accepted the next stage of the process is the development of the training course
and any required training devices. These may be partly or wholly contracted out.
If the final report is not accepted then some or all of the analysis process will have
to be repeated. The branching path in Figure 2 from the Final Report reflects these
options.
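The DIF rationalisation recommended for the Operational Task Inventory under Deliverable 1 can be sketched as a simple rule-based screen. The three-point scales, the decision rules and the example tasks below are illustrative assumptions for the sketch, not the JSP 822 definitions.

```python
# Illustrative DIF (Difficulty, Importance, Frequency) rationalisation of an
# Operational Task Inventory. Scales and decision rules are assumptions for
# illustration only; an actual TNA would apply the rules agreed for the project.

HIGH, MED, LOW = 3, 2, 1

def dif_decision(difficulty: int, importance: int, frequency: int) -> str:
    """Return an illustrative training recommendation for one task."""
    if importance == HIGH and difficulty == HIGH:
        # Hard, critical tasks warrant formal training whatever their frequency;
        # infrequent ones also need periodic refresher training.
        return "train to standard" if frequency == HIGH else "train and provide refresher"
    if importance == LOW and difficulty == LOW:
        # Easy, non-critical tasks can be left to on-job experience.
        return "no formal training"
    if frequency == HIGH:
        # Frequent tasks of moderate difficulty/importance: regular practice
        # sustains performance after initial training.
        return "train initially, sustain on the job"
    return "train to standard"

# Hypothetical inventory entries rated (difficulty, importance, frequency).
inventory = {
    "Coordinate deck movements": (HIGH, HIGH, MED),
    "Complete routine log entry": (LOW, LOW, HIGH),
    "Relay contact report":      (MED, HIGH, HIGH),
}

for task, (d, i, f) in inventory.items():
    print(f"{task}: {dif_decision(d, i, f)}")
```

In practice the DIF ratings are elicited from subject matter experts and the rule set is agreed with the training authority before the inventory is screened.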
Phase 3 – Post Project Evaluation. The Post Project Evaluation (PPE) should be
conducted once training delivery and evaluation have commenced, and reviews the
effectiveness of the TNA process for the project. Historically, relatively few projects
have had a PPE.
The guidance for the conduct of the Phase 1 Scoping Study is comprehensive and does
not appear to require any adaptation for team/collective training. Similarly, the Phase 3
PPE guidance is robust and applicable to any TNA. However, whilst the Phase 2 TNA
Development guidance mentions the requirement to address team and collective training
and highlights the requirement to capture communication, coordination and backup
activities, there is no detailed guidance on how to apply the overall approach to this
complex domain. Therefore, the requirement identified for this study was to develop an
approach for Phase 2 TNA Development that addressed the complexities of team and
collective training, and provided guidance in sufficient detail for the approach to be
implemented.
[Figure 2 presents the MoD TNA process as a flow diagram, with responsibilities indicated for each stage. The project sponsor contacts the Lead Training Authority, and the MOD TNA steering group oversees the Phase 1 TNA Scoping Study. Phase 2 TNA Development, conducted by the MOD or a contractor, comprises the Operational/Business Task Analysis (Deliverable 1, producing the OPS/CF Statement), the Training Gap Analysis (Deliverable 2, producing the Training Objectives), the Training Options Analysis (Deliverable 3, producing the Recommended Training Solution) and the TNA Final Report (Deliverable 4). Two decision points follow the Final Report: can training design begin, and can the training equipment contract be awarded; if not, earlier analysis is revisited. Training Design & Development and Training Equipment Design & Development, together with other SAT elements owned by the relevant agencies, then lead through the Ready for Training Date (RFTD) to Training Delivery and Training Evaluation, with the Training Authority conducting the Phase 3 TNA Post Project Evaluation.]
Figure 2 MoD TNA Process Diagram (JSP822, 2007)
2.4 Methodological Approach
The TCTNA method was developed in four stages as illustrated in Figure 3. Firstly, a
model of team/collective training was developed in order to provide a framework for
analysis of team/collective tasks from a training perspective. A review of extant HF
methods applicable to TCTNA was then conducted to determine if there was an extant
method that would satisfy the analytical requirements of the team/collective training
model or, if not, which extant methods would inform the development of a new model. A
theoretical TCTNA model was then developed. Finally, the theoretical model was
instantiated into a practical model with detailed implementation guidance.
Figure 3 TCTNA Development Sequence
3 The Team/Collective Training Model
3.1 Introduction
Successful TNA is founded upon a clear, robust analysis of the task to be performed, a
considered evaluation of appropriate training strategies and instructor roles and a clear
specification of appropriate training environments. Given the inherent complexities of
team-collective tasks, the teamwork which underpins them, and the training methods and
resources that are required to deliver training for them, it is necessary to develop a model
of team training to guide the analysis. This has to be underpinned by appropriate models
of teamwork and team-collective task performance.
After first setting out some key definitions used in this report, a teamwork taxonomy is
developed which is then integrated into a model of team/collective performance. Finally,
a team training model is developed by the integration of a model of instructional
functions and supporting system requirements.
3.2 Definitions
The teamwork literature suffers from the use of a multiplicity of definitions for
commonly used terms such as team and teamwork. Furthermore, the definitions
themselves often manifest a confusion of team member attributes, interactions, processes,
functions, strategies, behaviours and products. For example, Nieva et al (1978) defined a
team as:
“two or more interdependent individuals performing coordinated tasks towards
the achievement of specific task goals”
whereas Salas et al (1992) defined a team as:
“two or more individuals, who have specific roles, perform interdependent tasks,
are adaptable and share a common goal”
In the Salas et al. (1992) definition, a team is defined as being adaptable and sharing a
common goal. In a poorly performing team, team members may not necessarily be
adaptable (or want to be adaptable) and may not necessarily share a common goal –
which is why inadequate conflict resolution is quoted as a factor in poor teamwork
(Baker, Day and Salas, 2006). For the sake of clarity, the following definitions are used in
this report:
Team: a number of persons constituting a work group, assembled together for the
purpose of joint action.
Teamwork: the interactions and processes that occur between team members in response
to environmental demands to help achieve the generation of team-collective task
products.
Teamwork Interaction: a single instance of one team member causing an effect on
another team member; actions that connect team members.
Teamwork Process: a systematic series of coupled team member interactions directed to
some end.
Teamwork Competencies: the Knowledge, Skills and Attitudes (KSAs) that directly
support teamwork.
Collective Task: a task involving (most usually requiring) the performance of more than
one individual in a work group; the performance of joint action between team members
towards a shared goal, generating a task product with specified value measures which may
be evaluated in performance assessment. Joint action may be parallel (not dependent) or
highly interdependent.
3.3 Development of a Teamwork Taxonomy
In order to meet the requirement for the provision of a teamwork taxonomy to be used to
underpin a teamwork performance model, a number of contemporary teamwork models
were reviewed to identify if a suitable taxonomy already existed. As no single model was
considered adequate, a new model was synthesised from the models reviewed.
3.3.1 Review of Extant Teamwork Models
In a recent study, Salas, Sims and Burke (2005) identified 138 teamwork models that had
been published in the previous 20 years. A full review of the extant teamwork models
was untenable in the timeframe of this study, so a more pragmatic approach was
adopted. A survey of
the team training literature revealed that four teamwork models were used in the
published methods related to the analysis of team training (Tesluk, Mathieu, Zaccaro &
Marks, 1997; Annett, 2000; Bowers et al, 1993; DSTL, 2006). Furthermore, there had
been two recent, meta-analytic studies of teamwork models (Salas et al, 2005; Rousseau,
Aube and Savoie, 2006) which yielded integrative models of teamwork. Therefore, the
initial review was focussed on these six models.
The model put forward by Tesluk, Mathieu and Zaccaro (1997) was used by Arthur,
Edwards, Bell, Villado and Bennett (2005) in their Team Task Analysis approach to
identify tasks that were team-based. The model provides descriptions of generic patterns
of workflow or teamwork interactions within teams, as shown in Figure 4.
Figure 4 Patterns of Team Interactions adapted from Tesluk et al (1997)
Whilst these patterns are of interest in that they facilitate characterisation of the dynamics
of a team task in terms of interaction pattern, the model is of limited value from a TNA
perspective, as an understanding of the nature of the interactions and the processes that
they support is required in order to devise a training strategy and determine a suitable
training environment.
The remaining models identified provide detailed breakdowns of teamwork. These are
summarised in Table 1 and Table 2. The full models are provided in Appendix A.
Table 1 Teamwork Models used in Teamwork Analysis

Annett (1997)
Behavioural Processes
• Communication (send, receive, acknowledge)
• Coordination (collaborate, synchronise, discuss)
Cognitive Processes
• People, Team Plan & World models
Affective Processes
• Morale
• Cohesiveness

Bowers et al (1993)
• Communication (seek and pass – relevant and timely)
• Situational Awareness (identify source / nature of problems, perception of location, detect situations requiring action)
• Decision Making (gather info required, identifying solutions to problems, evaluating consequences of alternatives, select best alternative)
• Mission analysis (monitoring, allocating & coordination of people and resources, prioritising tasks, setting goals, developing plans, creating contingency plans)
• Leadership (directing others, monitoring and assessing performance, motivating, communication of mission requirements)
• Adaptability (alter course of action, maintain behaviour under pressure, adapt to internal or external changes)
• Assertiveness (willingness to make decisions, demonstrating initiative, maintain own position until convinced otherwise)
• Total Coordination (need for interaction and coordination; directing, coordinating team activities)

DSTL (2006)
Communication Behaviours
• Information Exchange (send, receive, discuss)
• Communication Skills (use of formats & conventions)
Co-ordination Behaviours
• Procedural Co-ordination (integration and synchronisation in accordance with procedures)
• Collaboration (organising team resources, activities and actions – tasks shared and completed on time)
• Situation Assessment (develop common understanding of the situation)
Adaptive Behaviours
• Decision Making (assessment of situation, choice of course of action through discussion and argument)
Back-up Behaviours
• Performance Monitoring and Feedback (monitoring performance of teammates, providing advice, giving and receiving feedback)
• Mutual Support (providing assistance to team members)
Attitudes
• Mutual Trust
• Shared Vision
• Team Orientation
• Collective Efficacy
Knowledge
• Shared Task Models
• Accurate Problem Models
• Team Role Interaction Patterns
• Team Member Characteristics
• Task Sequencing
• Cue-Strategy Associations
• Teamwork skills
• Team Mission, Objectives and Resources
• Boundary Spanning Roles
Table 2 Integrative Teamwork Models

Salas Big Five (2005)
Core Components:
• Team leadership (direct & coordinate activities, assess team performance, assign tasks, develop team KSAs, motivate team, plan, organise, establish positive atmosphere)
• Mutual Performance Monitoring (monitor performance using appropriate strategy, develop common understandings of environment)
• Backup behaviour (anticipate other team members' needs through knowledge of responsibilities, shift workload to achieve balance)
• Adaptability (alter course of action in response to conditions, use backup behaviour & reallocation of resources to adjust strategies)
• Team orientation (take others' behaviour into account, belief in importance of team over individual goals)
Coordinating Mechanisms:
• Shared mental models (relationships between tasks and how team will interact)
• Mutual trust (shared belief team members will perform roles and protect interests of team mates)
• Closed-loop communication (exchange of information)

Rousseau (2006)
Preparation of work accomplishment:
• Team Mission Analysis (identification of main tasks, environmental conditions and team resources available)
• Goal Specification
• Planning (development of alternative courses of action)
Work Assessment Behaviours:
• Performance monitoring
• Systems monitoring (tracking team resources and state of environment)
Task-related collaboration behaviours:
• Coordination (integrating activities with respect to time)
• Cooperation (working together)
• Information Exchange
Team Adjustment Behaviours:
• Backing up Behaviours (providing task-related help)
• Intra-team Coaching (feedback on task performance and behaviour)
• Collaborative Problem Solving (gathering information, identifying alternatives, select best solution, decision making, implementing solution)
• Team Practice Innovation (invent and implement new ways of doing)
Management of Team Maintenance:
• Psychological Support (assistance to team mates)
• Integrative Conflict Management (resolution of conflicts over tasks, processes and interpersonal issues)
Annett (1997) proposed his teamwork model, shown in the first column of Table 1, as the
theoretical framework for teamwork upon which his Hierarchical Task Analysis for
Teams (HTA(T)) method was based. The cognitive process components are described as
being the knowledge or beliefs that team members hold of the problem (world model),
what other team members are doing or capable of doing (people model) and the team
plan. Arguably they might be better described as knowledge components rather than
cognitive processes. Annett (1997) observes that affective components such as morale
and cohesiveness, whilst regarded as important by many, are problematic as it is
unknown if they are a cause of team behaviour, but suggests they should be incorporated
for completeness. The application of this model in HTA(T) manifests as the capture of
descriptions of teamwork requirements for goals and sub-goals in terms of the
communication and co-ordination requirements.
Bowers, Morgan, Salas and Prince (1993) devised a teamwork model which they used as
the basis for a questionnaire to assess coordination demands in flight tasks in order to
inform the further development of aircrew co-ordination training. The eight dimensions
of their model are shown in the second column of Table 1. This model has many more
processes than the Annett model but does not consider affective or cognitive elements.
The Dstl (2006) model, shown in the third column of Table 1, was designed to underpin
the Models for Analysis of Team Training (MATT) approach to the development of team
training. It is notable in that it provides a much more extensive list of underpinning
knowledge and affective elements than the other models, although the definitions of the
affective elements appear to overlap.
The Salas et al (2005) model, shown in the first column of Table 2, is structured
differently from all the other models in that it is split into five core components of
teamwork and three coordinating mechanisms. This model was developed on the basis of the
analysis of twenty selected teamwork models.
The Rousseau et al (2006) model, shown in the second column of Table 2, was developed
on the basis of the analysis of twenty-nine teamwork models. A notable difference
between this model and the Salas et al model is that leadership is specifically excluded
from the list of behaviours considered, although a detailed reason for its exclusion is not
offered. It is also the only model that includes conflict management as a teamwork
behaviour.
When all five models are compared, there are only two categories of behaviour,
communication and coordination, which are common to all. Furthermore, in a number of
instances behaviours with the same name are defined in different ways. For example,
Rousseau et al (2006) define mission analysis as being concerned with the identification
of tasks, environmental conditions and the team resources available for utilisation in
undertaking the task, whereas Bowers et al (1993) include the development of plans and
the allocation of people and resources to task. In addition, the similar sets of behaviours
in different models are given different labels. For example, Rousseau et al (2006) use the
term intra-team coaching as the label for the provision of feedback and distinguish it from
backup behaviour, whereas it is explicitly labelled as feedback in the backup behaviours
category in the Dstl (2006) model. Even the category of coordination, which is
common to all of the models, is defined differently in every model.
Given the lack of standardisation of terms, the differences of scope of each model, and
the fact that all of the models contained unique categories, there was no one model that
was an obvious candidate to be used to underpin the analysis of teamwork for TCTNA.
Therefore it was considered that a synthesis of the models was required.
3.3.2 Synthesis of a Teamwork Taxonomy
The six teamwork models reviewed in the previous section were used to synthesise a
teamwork taxonomy using a grounded theory approach.
A grounded theory approach was adopted as described by Strauss and Corbin (1990)
using the implementation approach described by Huddlestone and Harris (2003). This
comprised three main stages of analysis: open coding, axial coding and selective
coding. Open coding involved breaking down the descriptions of the categories in each
model using a line-by-line analysis technique (Strauss and
Corbin, 1990), and then categorising them using a constant comparison technique as
described by Partington (2000). This was followed by axial coding, where connections
between the categories were identified to form higher order categories. Then, selective
coding was applied to identify the overarching category. Finally, to triangulate the
analysis, a second coder used the hierarchical categories derived to re-categorise the data
and any anomalies were discussed and resolved.
3.3.2.1 Results
3.3.2.1.1 Initial Coding
The results of the initial application of open, axial and selective coding are shown in
Table 3. Open coding initially yielded thirty-two categories, including the decomposition
of leadership and collaborative problem solving into subcomponents. During axial
coding, leadership and collaborative problem solving were identified as categories
containing multiple elements, with teamwork KSAs, teamwork supporting processes,
teamwork processes and team attributes identified as higher-order organising categories.
The unifying theme
identified by selective coding was labelled as teamwork, which captured the notion of
cooperation as an overarching descriptor of participation in teamwork.
Triangulation of the data analysis identified four key areas of difference concerning
leadership, situational awareness, task allocation and resource allocation. These are
considered in turn.
3.3.2.1.1.1 Leadership
Two issues were identified with leadership as a category. Firstly, there was overlap with
other categories (for example, performance monitoring occurs as a separate element as
well as a sub-component of leadership). Secondly, there was the question of whether
leadership is best considered as a role or a teamwork function. In the military context, all
teams have appointed leaders who have specific responsibilities for task achievement,
maintenance and development of the team and the development of individuals.
Table 3 Initial Teamwork Categories

Teamwork (overarching category)

Teamwork Knowledge Skills and Attitudes (KSAs)
• Mental models
• Communication skills

Teamwork Supporting Processes
• Communication
• Performance monitoring

Teamwork Processes
• Leadership
  • Goal specification
  • Planning
  • Task prioritization
  • Task assignment
  • Control
  • Coordination
  • Monitoring Team Performance
  • Performance assessment
  • Motivating Team
  • Creating positive atmosphere
• Situational awareness
• Backup behaviour
• Task coordination
• Workload management
• Information coordination
• Resource coordination
• Collaborative problem solving
  • Gathering required information
  • Identifying potential solutions
  • Evaluating alternative solutions
  • Forming consensus on best alternative
• Collaborative planning
• Conflict management
• Task allocation
• Resource allocation

Team Attributes
• Adaptability
• Cohesion
• Team orientation
• Mutual trust
Adair (1997) identifies eight leadership functions: defining the task, planning, briefing,
controlling, evaluating, motivating, organising and providing an example. These
functions are identified as part of his action centred leadership model, which is
commonly taught during leadership training in the British Armed Forces. The view was
taken that whilst leadership is a role with responsibilities for ensuring that effective
teamwork is taking place in order to secure task delivery, the role is greater than simply
being a teamwork process. Therefore, leadership was removed from the teamwork
process list.
3.3.2.1.1.2 Situational Awareness
Situational awareness was challenged as a category in the light of the most recent HFI
DTC research into the concept of Distributed Situational Awareness (DSA) published in
Salmon, Stanton, Walker and Jenkins (2009). Salmon et al. advance DSA as an alternative
view to shared situational awareness (SA) and posit that DSA is a system-held construct.
They suggest that different individuals with different roles will form different mental
models of the situation even if they are presented with the same information, the
differences being attributable to their roles. They suggest that what is important is the
alignment of their perceptions of the situation where their actions and activities are
related. In their DSA model they assert that team attributes such as cohesion and team
processes such as communication contribute to the achievement of appropriate DSA but
it is not solely a teamwork process. Therefore, situational awareness was discounted as a
teamwork process.
3.3.2.1.1.3 Task and Resource Allocation
The issue concerning both task allocation and resource allocation was whether they were
best considered as processes or interactions. The view advanced was that, based on
observation of simple instances of teamwork, they should be characterised as
interactions. For example, where two pilots are operating an aircraft, the handling
pilot may direct the non-handling pilot to take over responsibility for talking
on the radio. This was considered to be different to the more extended process of
workload management which may result in a revision of task allocations instantiated
through a number of task allocation interactions. Similarly, resource allocation such as an
infantry section commander directing a member of the section to take six grenades could
be considered to be a simple interaction, different in nature to the larger process of
resource coordination. Task and resource allocation were therefore re-categorised as
teamwork interactions.
3.3.2.2 The Teamwork Taxonomy
The final version of the taxonomy, with definitions for each element, is shown in Table 4.
Table 4 TCTNA Teamwork Taxonomy
Team Attributes

Adaptability: The ability of the team to react to changing circumstances.

Cohesion: The degree to which the team actively engages in teamwork behaviour and coordinates its actions.

Teamwork Processes

Task coordination: Ensuring that task elements are synchronised as required.

Workload management: Evaluating the balance of workload across the team and changing task allocations if appropriate to achieve balance.

Information coordination: Ensuring that information is passed from the most appropriate source and sent to appropriate destinations, answering the questions "who has got what we need?" and "who needs what we have got?".

Resource coordination: Ensuring that resources are deployed appropriately across the team.

Collaborative problem solving: The collaborative process of gathering required information, identifying potential solutions, evaluating alternative solutions and forming a consensus view (decision) on the best alternative.

Collaborative planning: The collaborative development of plans and contingency plans by means of gathering required information, identifying potential solutions, evaluating alternative solutions, and forming a consensus about the best alternative.

Conflict management: Resolving conflicts concerning tasks and interpersonal issues.

Teamwork Supporting Processes

Communication: Communicating with team members by any means.

Performance monitoring: Monitoring the performance of other team members and assessing whether they need assistance or feedback.

Backup behaviour: Providing advice or assistance with a task to another team member.

Resource allocation: Providing a resource to another team member.

Task allocation: Directing another team member to undertake a task.

Teamwork Knowledge, Skills and Attitudes (KSAs)

Mental models: Mental models of the task environment, the team plan for conducting the task, team interaction patterns, team member strengths and weaknesses, and how interactions with entities outside the team are managed.

Communication skills: Using agreed conventions where appropriate and putting views forward assertively in a group context.

Team orientation: A positive view of the value of the team approach and active participation in teamwork activities.

Mutual trust: The belief that other team members will fulfil their roles in the interests of the team and task completion.
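Where the taxonomy is to be reused in analysis tooling, for example to drive ratings exercises of the kind described later in this report, it can be held as a simple machine-readable structure. The sketch below is purely illustrative and is not part of the TCTNA method; the category and element names are taken directly from Table 4.

```python
# Illustrative only: the TCTNA teamwork taxonomy of Table 4 held as a
# dictionary so that analysis tooling can iterate over its elements.
TEAMWORK_TAXONOMY = {
    "Team Attributes": ["Adaptability", "Cohesion"],
    "Teamwork Processes": [
        "Task coordination", "Workload management", "Information coordination",
        "Resource coordination", "Collaborative problem solving",
        "Collaborative planning", "Conflict management",
    ],
    "Teamwork Supporting Processes": [
        "Communication", "Performance monitoring", "Backup behaviour",
        "Resource allocation", "Task allocation",
    ],
    "Teamwork KSAs": [
        "Mental models", "Communication skills", "Team orientation", "Mutual trust",
    ],
}

def category_of(element: str) -> str:
    """Return the taxonomy category to which a teamwork element belongs."""
    for category, elements in TEAMWORK_TAXONOMY.items():
        if element in elements:
            return category
    raise KeyError(f"{element!r} is not in the taxonomy")
```

A structure of this kind makes it straightforward, for instance, to generate a rating sheet containing one row per taxonomy element.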
3.4 Development of a Team/Collective Performance Model
Following a review of extant team performance models and a consideration of how the
environment places demands on the team during task performance, a Team/Collective
Performance Model is synthesised.
3.4.1 Review of Team Effectiveness Models
Traditionally, team effectiveness models have focussed on inputs, processes, and outputs
and are referred to as Input Process Output (IPO) models. Figure 5 illustrates a typical,
traditional IPO model (Hackman and Morris, 1975).
Figure 5 Input-Process-Output (IPO) paradigm for analysis of group interaction as
a mediator of performance outcomes (Hackman and Morris, 1975)
This model shows that both individual and team factors influence team
performance. It also identifies the nature of the task and environmental factors as inputs,
including stresses imposed by the environment. One issue identified by
Hackman and Morris (1975) is that there are other types of output apart from
performance outcomes, represented in the dashed box in Figure 5. Some of these “Other
outcomes” are in fact input variables, such as group cohesiveness. Outputs can become
inputs to the group interaction process; this is no particular surprise to anyone who has
worked in a team and had their relationships with other team members change as a result.
The lack of an explicit representation of this feedback would seem to be a limitation of the model.
Later researchers extended the IPO paradigm for team effectiveness by including further
sub-categorisations within the input, process and output categories and by capturing
some notion of feedback between inputs and outputs. Figure 6 below shows a later model of
team effectiveness, from Tannenbaum, Beard and Salas (1992), which was considered in
a recent NATO study (NATO, 2005) to be the most appropriate to command team
effectiveness of the models that they reviewed.
Figure 6 Model of Team Effectiveness adapted from Tannenbaum, Beard and Salas
(1992)
Whilst this model captures feedback to some degree, it has a number of surprising
features:
• Team changes and individual changes are not shown as fed back into the individual and team characteristics.
• Individual characteristics are shown as acting only indirectly, through team characteristics, on team processes and team performance.
• Organisational and situational characteristics are shown as influencing the whole of the input, throughput and output process, but explicit connections to the individual elements are not made.
• The work assignment and communications elements in work structure appear to be organisational factors.
• Team norms (in work structure) would appear to be a team characteristic.
• The throughput component only captures teamwork processes, not taskwork processes.
One of the latest team effectiveness models is the Command Team Effectiveness (CTEF)
Model developed by NATO (2005) shown in Figure 7.
Figure 7 The Command Team Effectiveness Model with Basic Components and
Feedback Loops (NATO, 2005)
This model captures both team and task related processes and shows task and team
outcomes being fed back to the input conditions. It also illustrates the possibility of
organisational learning if an after action review is conducted. The nature of the mission
and task are captured as inputs but surprisingly the task environment is not mentioned
explicitly.
Marks, Mathieu and Zaccaro (2001) deal more explicitly with the notion of individual
and team characteristics being both inputs to and products of team processes. They use
the term “emergent states” to characterise the cognitive, motivational and affective states
of teams, which they suggest are dynamic in nature and vary as a function of team
context, inputs, processes and outputs. On first inspection, the notion of emergent states
appears to have some utility from a training perspective, since it focuses attention on
team performance as being a function of experience. This is pertinent since training
events are designed to deliver experiences from which the team learn and modify their
behaviour as required. From this perspective, capturing team emergent states in a team
performance model has merit as it would be a component that an instructional team
should be monitoring. However, there are certain aspects of the construct which are
problematic. The principal issue is that it can be argued that cognitive, motivational and
affective states are held at an individual level not at a team level. In the same way that
Salmon et al (2009) argue that situational awareness is a function of differing elements of
situational awareness being held by individuals and system elements, team properties
may be construed as being a function of the state of the individuals that make up the
team, each of whom may well be reacting differently at a given instant in time based on
the specific experience they are having and their own KSAs and prior experiences. On
this basis, it can be argued that it is more useful to apply the concept of emergent states at
the individual level within a team performance model. This has the advantage of
focussing attention on how the strengths and weaknesses of individuals are contributing
to team performance in a dynamic way.
One of the few models that captures the notion of the team responding to cues from the
environment and taking actions to affect the environment is that of Roby (1968) shown in
Figure 8. Whilst the terminology used in the model is relatively unfamiliar, it is in
essence presenting a straightforward information processing model cast in group terms.
Figure 8 Information Transduction Model of Group Activity on the Task
Environment (Roby, 1968)
Based on the analysis of the models reviewed, the following were considered to be the
key elements that a team performance model should capture:
• The nature of the environment in which the task is performed, including the demands that it places on the team
• The nature of the task
• The nature of the team in terms of its characteristics and organisation
• The characteristics of the individuals in the team and their emergent states
• Task and other outcomes
• The connections between all of the elements, including feedback loops
As none of the models reviewed captured all of these elements completely, a new model
had to be developed.
3.4.2 Environmental Task Demands
One of the input factors identified that affects team performance is the stress which the
environment places on the team. From a training standpoint, these elements of the
environment must be accurately captured in order to be able to identify the generic
properties of credible scenarios and so specify training environments that are capable of
delivering these scenarios.
From the literature, there are two relevant models that amplify this concept. Orasanu
(1993) characterises naturalistic environments as having the properties shown in the left-hand column of Figure 9. Cannon-Bowers and Salas (1998), based on research
experience with the United States (US) Navy, identified a list of stressors shown in the
middle column of Figure 9. The lines between the two lists show the mapping between
the items in the two lists. One of the notable features of the list of properties of
naturalistic environments is that there are multiple items that are strongly related, such as
dynamic environments, uncertain environments and shifting goals. By comparison, the
items in the environmental stressors list appear to be discrete, self-contained entities.
The discrete, self-contained nature of the environmental stressors list makes it a strong
candidate to be used within the TCTNA approach as a framework for capturing data
about the environment. However, the requirement for two amendments to the list was
identified. Firstly, the auditory overload/interference category is explained by Cannon-Bowers and Salas (1998) as capturing the problem experienced by US Navy warfare
teams who had multiple radio channels, each used by different functional networks of
people at the same time, fed into their headsets, with different channels fed into each ear
(this is also the case for RN warfare teams). Whilst this remains a feature in
contemporary operations rooms, advances in technology have led to an equivalent
problem occurring in the visual domain. Mission support systems now provide
chatrooms dedicated to different networks of operators and these chatrooms now take
much of the load that the radio channels took previously. Consequently, warfare
operators now have to scan multiple chat screens as well as listen to multiple radio
channels. Visual overload is therefore identified as an additional category that is required.
In a similar vein, the category of high workload/information load merits dividing into
high workload and high information load as a consequence of advances in information
systems technology. The concept of information overload is of particular concern in the
context of operational information management and exploitation (HFI DTC, 2005; HFI
DTC, 2007). Therefore this merits consideration as a separate entity from high workload
in general. The extended list is shown in the right-hand column in Figure 9. The term,
environmental task demands, is used to distinguish the list from the environmental
stressors list.
Figure 9 Properties of Naturalistic Environments, Environmental Stressors and
Environmental Task Demands
3.4.3 The Team/Collective Performance Model
The Team/Collective Performance Model shown in Figure 10 is designed to capture all of
the essential elements identified in the previous section. It shows how team processes,
influenced by team properties and team member characteristics, enable the team to
deliver performance outcomes in order to achieve required changes in the state of the
environment and, ultimately it is hoped, goal achievement. The performance outcomes
can also influence team properties, team member characteristics and the team processes.
These elements and interactions are described in detail in the following sections,
illustrated using the example of a medical team deployed in a field hospital to give
concrete examples of the constructs used.
Figure 10 The Team/Collective Performance Model
3.4.3.1 Task Environment
The task environment is composed of the physical environment, human elements,
systems, manned systems and resources. Also captured are the environmental task
demands that place stress on the team. In the field hospital example the physical
environment would include the tents that they work within. Environmental characteristics
such as extreme temperature would also fall into this category. Human elements are all
the people outside of the team that the team interact with. In the field hospital case this
category would include patients and personnel at field dressing stations that they
communicate with. Systems are all the elements that have interfaces that the team use
and would include medical systems such as Electrocardiograms (ECGs) and ventilators as
well as such items as communication systems. Manned systems are crewed platforms
external to the team that the team interacts with. For a field hospital this might include field
ambulances and support helicopters providing casualty evacuation. Resources are all the
other items that the team use including equipment, such as hospital trolleys and forceps,
and consumables such as dressings, drugs and water. Environmental task demands are
the factors that in some way stress the team. In the field hospital case these might include
high workload due to large numbers of casualties and performance pressure and time
pressure caused by a critically ill patient requiring urgent, life-saving treatment.
3.4.3.2 Performance Outcomes
Team Processes generate “Task Products” and “Other Outcomes”, both of which
constitute a modification to the task environment. In the case of a field hospital example
the principal task product is successfully treated casualties - this constitutes the
achievement of the task goal.
Other Outcomes are ancillary modifications to the task environment which are
concomitant with task performance (though not necessarily goal achievement), i.e. the
changes in the environment that result from the task being performed but are not directly
goal related. These might include resources used such as bandages, dressings, syringes,
units of blood etc, and human elements being affected such as untreated casualties
worsening in condition. Other outcomes also include effects on individuals and the team
as a whole. These might include team organisation having to be changed because of a
team member being injured, team members becoming fatigued, knowledge gained by
individuals from experiencing a new situation. Therefore, performance outcomes feed
back to team properties and team member characteristics, as well as the environment.
3.4.3.3 Team Processes
Team processes are the team’s response to the environmental inputs and are composed of
both teamwork and taskwork elements. Their purpose is to generate appropriate task
outcomes to achieve the required goal. They also have the side effect of generating other
outcomes. The connection between team processes and task products is shown as two-way, as the team may adjust its process in the light of success or failure to generate the
required outcomes.
Environmental task demands have a critical influence on team processes. For example, if
a resource such as an item of equipment is limited in availability then the team will have
to come up with an allocation or sharing mechanism for that resource. Another example
would be team workload pressure: if the medical team is working under very high
demands from the task environment, with large numbers of casualties to be treated, it
will need to manage task allocation carefully in a way that enables the team to be
effective in that situation.
The conduct of team processes will be influenced by the characteristics of the team
members (their KSAs) and the properties of the team both in terms of the organisational
factors of the team (structure, roles etc) and attributes such as cohesion and adaptability.
3.4.3.4 Team Properties
Team properties include both organisational aspects, such as organisational structure,
roles and role allocation, team size, and team attributes such as cohesion, adaptability and
morale. These are affected by the conduct of team processes and the outcomes of the
processes as well as by the characteristics of the individuals in the team. In the field
hospital there may need to be an adjustment to role allocation to handle particularly
demanding casualty levels or to deal with a casualty who has come into contact with a
chemical agent. Replacement of a team member with another who does not have the same
degree of team orientation as his predecessor may affect team cohesion and morale.
Successful treatment of large numbers of casualties who arrived in a short space of time
may boost team cohesion and morale.
3.4.3.5 Team Member Characteristics
Team member characteristics include their teamwork and taskwork KSAs and their
emergent states. Emergent states reflect the dynamic nature of individual performance
capabilities, influenced by the environment, the experience of carrying out the team
processes including teamwork interactions, and the properties of the team. A senior
surgeon coaching a junior surgeon may result in the junior surgeon extending his
knowledge, skills and self-confidence. Similarly, working in a highly cohesive
nursing team may engender greater team orientation in a newly trained nurse in the team.
On the other hand, seeing severely injured young soldiers who have been victims of
Improvised Explosive Devices may have a severe emotional impact on a team member,
reducing their effectiveness in their task.
3.5 Development of a Team/Collective Training Model
The development of a team/collective training model necessitates the superimposing of
instructional and supporting functions, and the systems and resources required to
facilitate them, onto the team/collective performance model. The basic instructional
functions that are required for team training practice are illustrated in the team learning
cycle model advanced by Tannenbaum, Smith-Jentsch and Behson (1998) shown in
Figure 11.
Figure 11 Team Training Cycle, adapted from Tannenbaum et al (1998)
The set of instructional functions is completed by the addition of initial instruction. The
supporting functions that need to be added to this model concern the configuration,
control, monitoring and adaptation of the practice environment. All of these functions
will typically require supporting resources and systems. For example, an instructor
observing a tank squadron will probably not follow it on foot across the exercise area;
using a four-wheel-drive vehicle would be somewhat more convenient. Observing the
same tank squadron exercising in a synthetic training environment would require some
means of accessing the virtual environment in which they were exercising to make
equivalent observations. These requirements are illustrated in Figure 12 as aspects of the
training overlay.
Figure 12 The Team/Collective Training Model
Figure 12 shows the instructional functions applying to team processes, team properties,
team member characteristics and performance outcomes. Environment management
functions are connected to the task environment.
4 Review of Team Analysis Methods Applicable to
TCTNA
4.1 Introduction
A review of analytical methods aimed at team training analysis was conducted to
determine if there was an extant method that was suitably comprehensive to be used for
TCTNA, or if not, which extant methods were suitable to be used as components of the
TCTNA approach.
4.2 Evaluation of Methods
A previous HFI DTC review of Human Factors methods, published by Stanton, Salmon,
Walker, Baber and Jenkins (2005), identified four methods applicable to TCTNA:
• Hierarchical Task Analysis for Teams
• Team Cognitive Task Analysis
• Team Task Analysis
• Task and Training Requirements Analysis Methodology
A further literature review identified three more applicable methods:
• Work Domain Analysis
• Mission Essential Competencies
• Models for Analysis of Team Training
These seven methods are reviewed in turn.
4.2.1 Hierarchical Task Analysis for Teams
Hierarchical Task Analysis for Teams (HTA(T)), was developed by Annett, Cunningham
and Mathias-Jones (2000) under a MoD research contract, of which one of the aims was
to devise a procedure for identifying team skills. It is a simple adaptation of the original
HTA method devised by Annett, Duncan, Stammers and Gray (1971). The HTA chart
notation, shown in Figure 13, is extended to contain a list of the actors involved in the
goal or sub-goal. Each sub-goal shown in the chart also has a supporting table which
details the goal, the plan, the evaluation measures and a description of the teamwork
required to enact the plan. Teamwork is characterised as comprising communication and
coordination.
Figure 13 Example HTA (T) Chart
Whilst the model of teamwork is limited, the tabular format supporting the HTA chart
could be amended easily to contain other pertinent information. HTA has been described
by Kirwan and Ainsworth (1992) as the “best known task analysis technique” (p 396).
The representation of instructional scalers (maps of training objective hierarchies) as part
of all MoD training documentation is based on the HTA chart format (without the plans)
and so will be familiar to training needs analysts. As such it is a strong candidate for
inclusion in any TCTNA method for the analysis of tasks and identification of teamwork
components.
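The tabular record that supports each HTA(T) sub-goal can be sketched as a simple record type. The field names below paraphrase the description above (goal, actors, plan, evaluation measures, teamwork) and the example values are hypothetical; this is an illustration of the data captured, not part of Annett et al's notation.

```python
from dataclasses import dataclass, field

# Illustrative sketch of the table supporting each HTA(T) sub-goal: the
# goal, the actors involved, the plan, the evaluation measures, and a
# description of the teamwork required to enact the plan.
@dataclass
class SubGoalRecord:
    goal: str
    actors: list[str]
    plan: str
    evaluation_measures: list[str]
    teamwork: str  # communication and coordination required
    subgoals: list["SubGoalRecord"] = field(default_factory=list)

# Hypothetical naval warfare sub-goal, purely for illustration.
picture = SubGoalRecord(
    goal="Maintain air picture",
    actors=["AWO", "Picture Compiler"],
    plan="Do 1 continuously; do 2 whenever a new track appears",
    evaluation_measures=["New tracks identified within 30 s"],
    teamwork="Compiler reports new tracks to AWO; AWO directs priorities",
)
```

Because the record nests (`subgoals`), the same type can represent the full hierarchy of an HTA(T) chart, with the tabular detail attached at each node.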
4.2.2 Team Cognitive Task Analysis
Team Cognitive Task Analysis (TCTA; Klein, 2000) is an adaptation of the Critical
Decision Method (CDM; Klein and Armstrong, 2005). The essence of the method is the
conduct of a set of semi-structured interviews with all members of a team, using a
predefined set of questions (probes) following observation of a task. A timeline of the
task observed is developed noting the critical decision points. The interviews develop a
view of the decision making requirements, cues used, reasons for difficulty, possible
errors, and strategies for effective decision making. It is suggested by the originators that
HTA is conducted prior to use of the technique to develop an understanding of the task
being analysed. Stanton et al (2005) characterise it as a method that is resource intensive
and requires considerable training to use. It has potential for analysing complex, decision
intensive tasks, but is unlikely to be of value for the main component of task analysis.
4.2.3 Team Task Analysis
Team Task Analysis (TTA; Burke, 2005) has been under development since the early
1990s as a method for identifying both the teamwork and taskwork components of a task.
The procedure for conducting Team Task Analysis involves firstly analysing the
taskwork requirements of the task, although no analysis methodology is specified, and
secondly conducting a coordination analysis. Coordination analysis involves selecting a
teamwork taxonomy (none is specified) and getting Subject Matter Experts (SMEs) to
rate each item on the taxonomy on a Likert scale from 0 to 10 indicating the degree to which
the item is required for each task that has been identified. Whilst the notion of rating
teamwork processes in terms of their relative importance in a task appears to have merit,
the absence of detailed guidance or a teamwork taxonomy makes TTA a weak candidate
for inclusion as a component of TCTNA.
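The coordination-analysis step can be illustrated with a short sketch: SMEs rate each taxonomy item on a 0-10 scale for a given task and the ratings are then aggregated per item. The aggregation by simple mean is an assumption made here for illustration; Burke (2005) does not prescribe one.

```python
from statistics import mean

# Illustrative sketch of TTA coordination analysis: several SMEs rate how
# strongly each teamwork item is required for a task (0 = not at all,
# 10 = essential), and the ratings are averaged per item.
def coordination_profile(ratings: dict[str, list[int]]) -> dict[str, float]:
    for item, scores in ratings.items():
        if any(not 0 <= s <= 10 for s in scores):
            raise ValueError(f"ratings for {item!r} must be on a 0-10 scale")
    return {item: mean(scores) for item, scores in ratings.items()}

# Hypothetical ratings from three SMEs for a single task.
profile = coordination_profile({
    "Task coordination": [8, 9, 7],
    "Backup behaviour": [3, 4, 2],
})
```

The resulting profile gives the relative importance of each teamwork item for the task, which is the output the coordination analysis is intended to produce.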
4.2.4 Task and Training Requirements Analysis Methodology
Task and Training Requirements Analysis Methodology (TTRAM) (Swezey, Owens,
Bergondy, and Salas, 1998) was developed as a method to identify potential application
areas for the use of networked simulations to supplement live training. For each task
identified in the task analysis, a skill decay analysis and a practice effectiveness analysis
are conducted. The skill decay index calculation is based on ratings of task difficulty,
degree of prior learning and frequency of task performance. The practice effectiveness
index calculation is based on ratings of amount of task practice in the extant system,
frequency of task practice and quality of task practice. These indices are used to identify
training gaps. Further ratings of task skill requirements, task criticality level and
teamwork level are established to guide the selection of training solutions. Unfortunately
the guidance on identifying putative solutions is scant. Overall there appears to be little of
value in this method for TCTNA.
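The shape of the index calculations can be illustrated as follows. The 1-5 rating scales, the equal-weight averaging, the reverse-scoring of prior learning and frequency, and the gap threshold are all assumptions made for illustration; Swezey et al (1998) define their own scales and combination rules.

```python
from statistics import mean

# Illustrative TTRAM-style indices (assumed 1-5 rating scales, equal weights).
# Skill decay: harder tasks, less prior learning and less frequent performance
# imply faster decay, so prior learning and frequency are reverse-scored.
def skill_decay_index(difficulty: int, prior_learning: int, frequency: int) -> float:
    return mean([difficulty, 6 - prior_learning, 6 - frequency])

# Practice effectiveness: amount, frequency and quality of task practice in
# the extant training system.
def practice_effectiveness_index(amount: int, frequency: int, quality: int) -> float:
    return mean([amount, frequency, quality])

def training_gap(decay: float, effectiveness: float, threshold: float = 1.0) -> bool:
    """Flag a training gap when skill decay outstrips practice effectiveness."""
    return decay - effectiveness > threshold
```

A task rated as difficult, rarely performed and poorly practised would yield a high decay index and a low effectiveness index, flagging it as a candidate for supplementary (e.g. networked simulation) training.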
4.2.5 Work Domain Analysis
The Australian Defence Science and Technology Organisation has applied the Work
Domain Analysis (WDA) component of Cognitive Work Analysis (CWA) to TNA
(Naikar and Sanderson, 1999). At the heart of the WDA process is the development of an
abstraction hierarchy, the composition of which is shown in the first column of Table 5.
The first three levels of the abstraction hierarchy (functional purposes, properties and
values and purpose related functions) provide a technology agnostic description of the
requirements for the system. The lowest two levels (physical functions and physical
form) describe the physical implementation of the system as designed. There are many-to-many mappings between each level of the hierarchy.
The novel element of the approach adopted by Naikar and Sanderson (1999) is the
mapping of the abstraction hierarchy elements to training needs and functional
specifications. The mappings are shown in the second and third columns of Table 5. The
mapping to training objectives appears to make sense in terms of training an individual in
the use and exploitation of a system, although it may prove challenging to integrate the
different views on performance into a coherent set of training objectives. The most
significant aspect of the approach is the mapping to functional specifications as it
highlights the requirement for the ability to collect performance data in the training
environment, the need to specify effective scenarios and the need to correctly identify the
physical and functional attributes of the system in use and the environment. Furthermore
they make the point that the specifications of physical and functional fidelity should focus
on the functionally relevant aspects of fidelity.
This approach does not address instructional methods or the related instructional and
supporting functions, although potentially the abstraction hierarchy approach could be
adopted to identify the instructor functions and then map them into implementation for a
given training environment.
Table 5 Work Domain Analysis Abstraction Hierarchy (Naikar and Sanderson,
1999)
Functional Structure: Functional Purposes – Why a domain exists or the reason for its design
Training Needs: Training Objectives – Purpose for training workers to fulfil the functional purposes of a work domain
Functional Specifications: Design Objectives – The training system must be designed to satisfy the training objectives of the work domain

Functional Structure: Properties and Values – Criteria for ensuring that purpose-related functions satisfy system objectives
Training Needs: Measures of Performance – Criteria for evaluating trainee performance or the effectiveness of training programmes
Functional Specifications: Data Collection – The training system must be capable of collecting data related to the measures of performance

Functional Structure: Purpose-related Functions – Functions that must be executed and coordinated
Training Needs: Basic Training Functions – Functions that workers must be competent in executing and coordinating
Functional Specifications: Scenario Generation – The training system must be capable of generating scenarios for practising basic training functions

Functional Structure: Physical Functions – Functionality afforded by physical devices in the work domain and significant environmental conditions
Training Needs: Physical Functionality – Workers must be trained to exploit the functionality of physical devices and operate under various environmental conditions
Functional Specifications: Physical Functionality – Training systems must simulate the functionality of physical devices and significant environmental conditions

Functional Structure: Physical Form – Physical devices of the work domain and significant environmental features
Training Needs: Physical Context – Workers must be trained to recognise functionally relevant properties of physical devices and significant environmental features
Functional Specifications: Physical Attributes – The training system must recreate functionally relevant properties of physical devices and significant features of the environment
4.2.6 Mission Essential Competencies
The Mission Essential Competencies (MEC) approach to analysing training tasks was
developed by the United States Air Force (USAF) research laboratories in order to
develop training interventions to enhance warfighter readiness. A series of SME
workshops are held to firstly identify the competencies that are required to perform
mission tasks effectively and then to identify the experiences that are considered
necessary to exercise those competencies. This is probably the most significant aspect of
the method in that it focuses on the required information to generate credible training
scenarios. Whilst the method is focussed on the enhancement of existing training, then
emphasis on scenario specification is particularly noteworthy.
4.2.7 Models for Analysis of Team Training (MATT)
MATT (Dstl, 2006) was developed with the objective of providing the Royal Navy (RN)
with an approach to collective training that complemented the SAT process for individual
training. Consequently, it attempts to address the analysis, design and evaluation of
33
HFIDTCPIII_T13_01
Version 2/ 16 April 2011
training and is therefore the most comprehensive of the methods reviewed. The major
stages are shown in Table 6.
Table 6 MATT Process Stages (Dstl, 2006)

Team Task Analysis
  Stage One: Establish principal features of the team's task and main measurement requirements
  Stage Two: Identify Team Goals, Team Tasks, and Supporting Team Tasks
  Stage Three: Identify Team Processes and Team Errors
  Stage Four: Identify Team Knowledge and Attitude Requirements
  Stage Five: Identify priorities for Team Training

Team Training Design
  Stage One: Establish principles of Team Training
  Stage Two: Specification of Team Training Media

Team Training Assessment
  Stage One: In-Training Performance Assessment
  Stage Two: Post-Training Performance Assessment
The task analysis method is based on HTA(T), followed by the identification of likely
errors and their consequences. Teamwork requirements are evaluated by rating the
importance of the main categories of teamwork identified in the teamwork taxonomy as
high, medium or low. A table of suggested knowledge and attitude requirements is
provided. Team training priorities are determined by considering difficulty in learning
tasks, ease of forgetting, how long after training the task will be performed, task
frequency and task familiarity (previous experience). It is suggested that the output of the
analysis would lead to a description of the principal features of the team task, a set of
matrices and a prioritised list of training activities. Whilst the guidance provided is
reasonably lengthy, it is surprising that there is no explicit guidance on how the
suggested outputs are derived from the recommended analysis steps or how these outputs
should be used subsequently.
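The prioritisation factors listed above can be captured in a simple scoring sketch. MATT does not prescribe a scoring formula; the 1–3 scales, the equal weighting and the example task names below are illustrative assumptions only:

```python
from dataclasses import dataclass

@dataclass
class TaskRating:
    """Ratings against the MATT prioritisation factors (scales assumed)."""
    task: str
    difficulty: int    # difficulty in learning the task (1 = low, 3 = high)
    forgetting: int    # ease of forgetting (1 = low, 3 = high)
    delay: int         # delay before the task is performed after training (1 = short, 3 = long)
    frequency: int     # task frequency (1 = rare, 3 = frequent)
    familiarity: int   # previous experience (1 = familiar, 3 = unfamiliar)

    def priority(self) -> int:
        # Equal weighting assumed: higher total = higher training priority.
        return (self.difficulty + self.forgetting + self.delay
                + self.frequency + self.familiarity)

# Hypothetical tasks from the F3 pairs domain, rated for illustration only.
ratings = [
    TaskRating("Radar meld", 3, 2, 1, 3, 2),
    TaskRating("Visual intercept handover", 2, 1, 1, 2, 1),
]
for r in sorted(ratings, key=TaskRating.priority, reverse=True):
    print(r.task, r.priority())
```

Any real application would need the rating scales and weightings to be agreed with SMEs rather than assumed as here.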
Although the analysis is comprehensive, the difficulty arises in taking the data
forward into the training design phase, as the guidance is limited to a list of training
principles and some suggested environment options for different categories of training
task. Teams are categorised as requiring training at stages equivalent to Fitts' (1964)
cognitive, associative and autonomous stages, the significance being that novice and
intermediate training is argued to require lower-fidelity training environments.
Notwithstanding the validity or otherwise of this assertion, no guidance is provided
on fidelity analysis or training environment specification.
Detailed guidance on the development of assessment instruments is provided, appropriate
to the detailed course design stage.
In summary, the approach is strong from the perspective of the task analysis element,
although no diagrammatic techniques are used, so it is sometimes difficult to
visualise the relationships between data collected at different stages of the analysis.
However, it is weak in its guidance on training methods, instructor tasks and the
specification of instructional environments.
4.3 Evaluation
The review of the available methods shows that no single existing method is sufficient
in scope to satisfy the requirements of TCTNA. HTA(T) looks to be a promising method
for conducting the core of the task analysis, and the principles of the WDA abstraction
hierarchy look to have application in the identification of instructor functions and in the
approach to specifying training environments and training scenarios. A TCTNA method
was therefore developed; see Section 5.
5 The TCTNA Method
5.1 Introduction
The TCTNA Method described in the remainder of this document is designed to
supplement the guidance provided in JSP822, not replace it. Detailed guidance is
provided on how to conduct the task analysis and the subsequent option generation and
selection. The scoping study and post-project evaluation phases are not considered, as the
extant guidance is considered to be sufficient. However, a note of emphasis regarding the
scoping study is merited: given the complexities of team and collective tasks, access to
documentation about the task (both doctrine and system data) and to SMEs is imperative
and must be considered in detail.
5.2 The Team/Collective Training Model
Figure 14 The Team/Collective Training Model
The Team/Collective Training Model in Figure 14 provides the framework for data
collection within the TCTNA method. Each component of the model is addressed during
the analysis process.
5.3 The Triangle Model of TNA
The TCTNA method is structured around the Triangle Model of TNA shown in Figure
15, which is an extension of that developed previously by the HFI DTC (2009).
Figure 15 The Triangle Model of TNA
Constraints on the final training solution have to be captured at an early stage to ensure
that analytical effort is not wasted on exploring training solutions that are untenable. For
example, a lack of available submarines might preclude live training for anti-submarine
warfare, leaving a synthetic training solution as the only feasible option. Constraints
analysis (described first) is an ongoing process throughout the analysis.
The Team/Collective Task Analysis component combines operational task analysis and
training gap analysis from the extant TNA model, along with their outputs. At the start of
this analysis a number of models are constructed that capture the nature of the environment
in which the team operates, as well as the organisation of the team and its communications
structure. Modelling techniques derived from software engineering are exploited to
provide useful visualisations. This ensures that the analyst develops a clear understanding
of the context of the task as well as providing valuable information for the subsequent
specification of training environments. The core of the task analysis is conducted using
an extension of HTA(T). The teamwork elements of the task are characterised based on
the teamwork taxonomy developed to underpin the Team/Collective Training Model. The
identification of training priorities is conducted using a risk management approach.
Training Overlay Analysis consists of identifying appropriate methods for facilitating the
required training, along with generic scenarios that the training environment must
ultimately support if effective training is to be delivered. Borrowing the concept of the
abstraction hierarchy from Work Domain Analysis, the instructor functions required to
set up and deliver training are identified. In team and collective training there are
frequently many instructional staff involved, with a variety of roles. It is essential that
these roles are identified and that consideration is given to what facilities are required to
support them (such as capturing performance data and conducting After Action Review).
Training Environment Analysis focuses on the specification of the required training
environments. This includes a fidelity analysis, which cannot be conducted until the
training method has been identified; for example, a part-task training environment will
have very different fidelity requirements from an environment for full mission training.
The analysis also considers the identification of the interfaces that instructors would
require to control training devices such as simulators, and of other tools that they may
require to fulfil their role, such as tools to support the capture of data about student
performance during exercises.
The sub-components of each stage of the TCTNA method are shown in Table 7 along
with a list of the supporting templates that are provided.
Table 7 TNA Triangle Stage Sub-Components

Constraints Analysis
  Constraints Analysis — Constraints Table

Team/Collective Task Analysis
  External Task Context Description — Generic Scenario Table; External Team Context Diagram; Interaction Table; Environmental Description Table; Environmental Task Demands Table
  Internal Task Context Description — Organisational Chart; Role Definition Table; Internal Team Context Diagram; Interaction Table; System Matrix; Communications Diagram; Communications Matrix
  Hierarchical Task Analysis for Team/Collective Training (HTA(TCT)) — HTA(TCT) Diagram; Task Sequence Diagram; HTA(TCT) Task Description Tables; Task Role Matrix
  Teamwork Analysis — Teamwork Process Priority Table; Teamwork Interaction Table
  Training Gap Analysis — Training Priorities Table
  Team/Collective OPS and TO Development — (no supporting template listed)

Training Overlay Analysis
  Instructional Methods Selection — Practise and Assessment Methods Table
  Training Scenario Specification — Training Objective Generic Scenario Table; Environment Description Table
  Instructional Task Identification — Instructor Task Table
  Training Overlay Requirement Specification — Environment Description Table

Training Environment Analysis
  Training Environment Specification — (no supporting template listed)
  Training Environment Rationalisation — (no supporting template listed)
  Fidelity Analysis — Environment Object Specification Tables
  Training Environment Option Identification — (no supporting template listed)
  Training Environment Option Definition — Training Environment Option Description Table; Training Environment Option Properties Table
  Training Environment Option Evaluation — Training Environment Options Comparison Table
A Tornado F3 Pairs training example is used to illustrate various elements of the analysis
phases. The background to this example is shown in the box below.
Tornado F3 Pairs Training Example
The use of a coordinated pair of mutually supporting aircraft has been central to fighter tactics
since WW1. The example used as a case study running through the three analysis phases is
based on the requirement to provide enhanced training in pairs tactics for Tornado F3 pilots and
Weapons Systems Operators (WSOs) in training. The detail is derived from research data from
a transfer of training trial conducted to establish the effectiveness of a networked desktop
computer system for training pairs tactics. This example has been chosen because the
pairs task places extreme demands on the teamwork skills of the crews.
The context is a pair of fighters patrolling an area of airspace searching for bandit fighters or
bombers. The pair consists of a lead aircraft and a wingman aircraft. The lead aircraft controls
the intercept. The search is conducted by the WSOs using the air to air radars. The search
space is divided between them to improve the efficiency of the search. A ground-based or
airborne Fighter Controller, if present, can also give vectors to a bandit aircraft. If bandits are
detected and the pair have a tactical advantage an intercept ensues. As the WSOs are looking at
different sectors of airspace, the WSO who has detected the bandit has to give directions to the
other WSO so that he can get the same radar picture. This is referred to as the radar meld. The
aim of the pair is initially to destroy the bandits Beyond Visual Range (BVR) with radar guided
missiles. If there is a single bandit, the leader will engage it whilst the wingman flies in support
ready to take a back-up shot if required. If there are two bandits, then both aircraft will engage
a bandit after agreeing who is going to attack which one. The WSOs direct the radar based
intercept. If the BVR intercept is unsuccessful but the pair still have a tactical advantage a
visual intercept ensues. The WSO “talks the pilot’s eyes” onto the bandit location during the
merge into the visual. Once the pilot can see the bandit he takes over the intercept aiming to
shoot the bandit down with heat-seeking missiles. During the visual intercept the WSOs
provide an extra pair of eyes to watch for threats at visual range.
6 Constraints Analysis
6.1 Introduction
Constraints analysis is a simple but critical component of TCTNA. Accurate
identification of the constraints on the choice of training method and environment
prevents nugatory analytical effort in exploring training options which are not viable.
Typical categories of constraint include, but are not limited to:
a. Safety – Safety can be a significant constraint on the choice of training
environment. A typical example would be limitations on live firing. Invariably
weapons effects have to be simulated.
b. Cost – The cost of using training assets (a Tornado F3 costs between £10k and
£40k per hour to operate, depending on what is included in the cost model) and of
consumables (missiles can cost from tens to hundreds of thousands of pounds
each, depending on type).
c. Training audience availability – There may be limits on how long a training
audience is available for training and when that availability falls.
d. Resource availability – Limitations on the availability of training areas,
equipment required for training, daily and seasonal weather conditions (for
example, the British Army Training Unit Suffield training area in Canada is
unavailable during the winter months as it is frozen over) can all limit training
options. It may also be difficult to present credible threats if the weapons
platforms required are not within the current Order of Battle.
e. Policy – There may be policy constraints on how training is conducted such as the
qualifications required for instructors to be able to conduct certain types of
training and the requirement for simulation to be explored as an option.
6.2 Constraints Tables
It is recommended that at the start of the TCTNA process a constraints table is
constructed and that it is maintained throughout the analysis. It should capture both the
constraints and their consequences. An example format is shown in Table 8.
Table 8 Example Constraints Table Format

Constraint: Live firing cannot take place against manned targets
Consequence: Instrumented ranges are required for weapons effects to be simulated in live flying.

Constraint: Instrumented ranges are not available for F3 Pairs training
Consequence: Missile success has to be determined by instructors' evaluations of launch parameters and intercept geometry.

Constraint: Limited aircraft hours are available for training
Consequence: Additional training time would have to be provided using simulation, or training hours would have to be cut from elsewhere.
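Because the constraints table is maintained throughout the analysis, it can be useful to hold it as structured records rather than static text. A minimal sketch using the Table 8 entries; the completeness check at the end is an assumption about how an analyst might use such a record, not part of the TCTNA method:

```python
# The Table 8 constraints held as structured records so the table can be
# maintained and re-checked as the analysis proceeds.
constraints = [
    {"constraint": "Live firing cannot take place against manned targets",
     "consequence": "Instrumented ranges are required for weapons effects "
                    "to be simulated in live flying."},
    {"constraint": "Instrumented ranges are not available for F3 Pairs training",
     "consequence": "Missile success has to be determined by instructors' "
                    "evaluations of launch parameters and intercept geometry."},
    {"constraint": "Limited aircraft hours are available for training",
     "consequence": "Additional training time would have to be provided using "
                    "simulation, or training hours cut from elsewhere."},
]

# Every constraint should carry a consequence; an empty consequence flags a
# constraint whose impact on the training options has not yet been analysed.
unresolved = [c["constraint"] for c in constraints if not c["consequence"].strip()]
print(f"{len(constraints)} constraints recorded, {len(unresolved)} unresolved")
```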
7 Team/Collective Task Analysis
7.1 Introduction
The Team/Collective Task Analysis (TCTA) stage is composed of a sequence of six
activities:
1. External Task Context Description
2. Internal Task Context Description
3. Task Analysis
4. Teamwork Analysis
5. Training Gap Analysis
6. Development of Team/Collective Operational Performance Statements and
Training Objectives.
This sequence allows for a progressively detailed understanding of the task to be
developed. The products at each stage are designed to have value in the subsequent stages
of analysis and be of utility for subsequent training design.
7.2 External Task Context Description
The first analytical step is to capture the key elements in the environment within which
the team operates, how the team interacts with those elements and why, i.e. what effect
the team is aiming to deliver. The recommended approach is to capture narrative
descriptions of the requirements for team performance, in the form of generic scenario
descriptions, and to develop supporting context diagrams.
7.2.1 Generic Task Scenarios
Generic task scenarios provide a narrative description of the environment within which
the team is required to operate and the team’s purpose in terms of the effect that it is
required to deliver. Source documents for identifying scenarios would include Concept
of Operations (CONOPS), Concept of Use (CONUSE), Concept of Employment
(CONEMP) and doctrine. At the early stages of the acquisition cycle for new systems
these may not be fully matured, or in some circumstances may not have been produced.
Given that any system or organisation is put in place to deliver a required capability,
SMEs from the Capability Area in MoD should be able to assist in defining scenarios,
particularly as a similar approach is used to determine capability requirements in the first
place. The generic scenario table structure shown in Table 9 provides a format for
representing scenario information which can be extended as required. Table 10 shows a
completed scenario table.
Table 9 Generic Scenario Table Format

Scenario Reference: Title (and reference number if required)
Effect Required: Description of the output required from the team
Timing: When the scenario takes place – may be relative to a specified preceding event
Location & Environment: Geographical area (if relevant) and description of the physical environment in which the task is undertaken. This should cover both the immediate environment in which the team is operating and the larger environment in which the effect of the team is delivered (if different).
Enemy Forces: Enemy force disposition and capabilities
Friendly Forces: Friendly force type, numbers, locations
Neutral Elements: Neutral forces and civilian population elements present of significance to the scenario
Initial Conditions: The start state for the scenario
Events: Potential events and event sequences that could occur
Table 10 Completed Generic Scenario Table

Scenario Reference: Combat Air Patrol forward of deployed ground forces
Effect Required: Combat identification of unknown aircraft entering the area of responsibility and interception of enemy aircraft to maintain air superiority over the friendly forces' area of operations.
Timing: Day/night
Location & Environment: Over any geographical area where ground or littoral forces may be operating.
Enemy Forces: Fighter aircraft with similar performance and weapons capabilities to the Tornado F3. Bomber aircraft with similar performance and weapons capabilities to the Tornado GR1.
Friendly Forces: AWACS providing fighter control for the area of the CAP
Neutral Elements: None
Initial Conditions: No enemy aircraft reported in the vicinity. Intelligence brief for the mission suggests enemy bomber packages with fighter support may attempt incursion into the area of responsibility.
Events: Fighter aircraft fly into the CAP area but turn away when illuminated by fighter radar. Bomber aircraft fly into the CAP area but turn away when illuminated by fighter radar. Fighter aircraft fly into the CAP area and take aggressive action when illuminated by fighter radar.
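The generic scenario table lends itself to a simple record type. A sketch, assuming Python as the recording tool: the field names mirror the Table 9 rows and the values are abridged from Table 10:

```python
from dataclasses import dataclass, field

@dataclass
class GenericScenario:
    """One row set of the generic scenario table (Table 9 fields)."""
    reference: str
    effect_required: str
    timing: str
    location_environment: str
    enemy_forces: str
    friendly_forces: str
    neutral_elements: str
    initial_conditions: str
    events: list[str] = field(default_factory=list)

# Abridged from Table 10.
cap_scenario = GenericScenario(
    reference="Combat Air Patrol forward of deployed ground forces",
    effect_required="Combat identification of unknown aircraft entering the "
                    "area of responsibility and interception of enemy aircraft.",
    timing="Day/night",
    location_environment="Over any geographical area where ground or littoral "
                         "forces may be operating.",
    enemy_forces="Fighters comparable to the Tornado F3; bombers comparable "
                 "to the Tornado GR1.",
    friendly_forces="AWACS providing fighter control for the area of the CAP.",
    neutral_elements="None",
    initial_conditions="No enemy aircraft reported in the vicinity.",
    events=[
        "Fighters enter the CAP area but turn away when illuminated by fighter radar.",
        "Bombers enter the CAP area but turn away when illuminated by fighter radar.",
        "Fighters enter the CAP area and take aggressive action when illuminated.",
    ],
)
print(cap_scenario.reference, "-", len(cap_scenario.events), "events")
```

Holding scenarios as records rather than free text makes it straightforward to extend the format with additional fields, as the report suggests may be required.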
7.2.2 Team Context Diagrams
Context diagrams in the form of system context diagrams have been used as a
representational method in Structured Analysis (a form of analysis in software
engineering) since the 1960s. A context diagram is used to represent agents or actors
external to a system that can interact with the system. System context diagrams show
interacting systems and environments, and generally do not show internal structure of
systems. The particular notation that is suggested is derived from that used in a real-time
software design method called Structured Development for Real-Time Systems (Ward
and Mellor, 1985), and the product is referred to as a Team Context Diagram (TCD).
Detail about the interactions shown in the TCD is compiled into a related Interaction
Table, which records the nature or content of each interaction and how it is mediated
(such as the passing of orders by signal).
The significance of the context diagram and its associated tables is that they provide:

• a high-level definition of the conditions within which the team carries out its task, which informs the construction of the conditions statements for training objectives and the specification of training environments

• the inputs and outputs to the team, which inform subsequent fidelity analysis
7.2.2.1 Team Context Diagram Notation
The notation for drawing a TCD is shown in Figure 16. The TCD shows two types of
information of interest. These are the elements in the environment that the team interacts
with and the nature of the interactions themselves.
Figure 16 Team Context Diagram Notation
The TCD is constructed using the following components:

• Team Circle – The central circle is labelled with the name of the team or collective unit that is under analysis.

• Environment Element Boxes – The rectangular boxes are labelled with the names of the elements in the environment that the team interacts with. A single box can be used to represent a set of elements of the same type, provided the interactions between the team and every element in the set are the same. A diverse range of element types can be represented in the diagram including, but not limited to:
  o Superior organisation/team
  o Subordinate team(s)
  o Peer organisations/teams (entities directly under the control of the superior organisation, such as other companies in a battle group)
  o Coalition forces
  o Enemy
  o Civilian elements (Non-Government Organisations, Civilian Authorities, Media)
  o Neutral forces
  o Systems the team operates (such as sensors and weapons)
  o Physical environment (such as terrain, weather etc.)

• Data Stores – Data stores show where a team and an environmental element (such as a sub-unit) can access a shared data area, such as an intranet page or filestore, to post or retrieve information.

• Interaction Arrows – The arrows on the diagram show whether the interactions are one or two way. An arrow can represent multiple interactions (this avoids cluttering the diagram with multiple arrows between elements). Multiple interaction arrows are used where the team can interact both directly and indirectly through a data store.

• Interaction Labels – Having reduced the complexity of the diagram by using a single arrow to represent one or more interactions between the team and a given environment element, it is necessary to have a cross-reference from the interaction arrow to the Interaction Table, where more detail is recorded. A textual description could be used, but this would only be effective for a single, one-way interaction. It is recommended that numbers are used for simplicity.
7.2.2.2 Interaction Table Construction
The Interaction Table captures the detail of the various interactions shown in the TCD in
terms of the content/nature of the interaction and the mode of interaction. The entries in
the table are referenced by the interaction labels used in the TCD. The format of an
Interaction Table is shown in Table 11.
Table 11 Interaction Table Format

No | From      | To        | Content/Nature      | Mode
1  | Team      | Element 1 | Content description | Mode description
   | Element 1 | Team      | Content description | Mode description
2  | Team      | Element 2 | Content description | Mode description
3  | Team      | Element 3 | Information type    | Data store name
4  | Team      | Element 4 | Content description | Mode description
   | Team      | Element 4 | Content description | Mode description
   | Element 4 | Team      | Content description | Mode description
   | Element 4 | Team      | Content description | Mode description
5  | Team      | Element 4 | Information type    | Data store name
   | Element 4 | Team      | Information type    | Data store name
The columns contain the following types of information:

• From/To – These columns capture the direction of the interaction being described, noting that there will be separate entries for each direction on a two-way interaction arrow (as shown for interaction 1), and potentially more than one entry for the same direction if multiple interactions are captured by an interaction arrow (as shown for interaction 4).

• Content/Nature – The heading content/nature reflects the fact that the interaction may be either communication, including the passing of an information product (such as sending an order or requesting information), or some form of action or effect (such as firing a missile or the effects of weather).

• Mode – The mode of interaction is a description of how the interaction is mediated. For communication it will be the communication channel (such as voice by secure radio, or email). In the case of indirect communication through a data store, the data store should be identified, such as a web page where documents are posted (as shown for interaction 5).
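The cross-referencing convention above can be expressed as a small data-structure sketch. This is illustrative only: the label set and entries are the placeholders from Table 11, and the consistency check is an assumption about how an analyst might validate a diagram against its table, not part of the notation itself:

```python
from collections import defaultdict

# Interaction table keyed by the numeric labels drawn on the TCD arrows.
# Each entry is (from, to, content/nature, mode), as in Table 11.
tcd_labels = {1, 2, 3, 4, 5}      # labels appearing on the diagram
interactions = defaultdict(list)  # label -> list of table entries

interactions[1] += [
    ("Team", "Element 1", "Content description", "Mode description"),
    # Two-way arrow: one entry per direction.
    ("Element 1", "Team", "Content description", "Mode description"),
]
interactions[4] += [
    # One arrow carrying multiple interactions in the same direction.
    ("Team", "Element 4", "Content description", "Mode description"),
    ("Team", "Element 4", "Content description", "Mode description"),
]
interactions[5] += [
    # Indirect interaction via a data store.
    ("Team", "Element 4", "Information type", "Data store name"),
]

# Consistency check: every label on the diagram needs at least one table entry.
missing = sorted(tcd_labels - interactions.keys())
print("Labels with no table entry:", missing)
```

Here labels 2 and 3 are reported as missing, mirroring the way an incomplete Interaction Table would be caught during review.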
7.2.2.3 Example External Context Description
Figure 17 shows a TCD and for the Tornado F3 Pairs Air Combat training example. The
TCD in Figure 17 shows all of the elements external to the aircraft that interact with the
pair and has therefore been labelled the external environment TCD. Weather and terrain
have been included in the diagram, as the pair has to take these into account during the
intercepts: close proximity to the ground affects the geometry of intercepts and visibility
affects target detection and influences tactics. The bandit missiles aids have been shown
separately from the bandit aircraft because the missiles act independently to a greater or
lesser degree once released. In most if not all training situations where weapons are
47
HFIDTCPIII_T13_01
Version 2/ 16 April 2011
involved, weapons effects have to be considered and separate analysis of weapons is
therefore required to specify a training environment.
Figure 17 Example Tornado F3 Pair External Environment TCD
Table 12 is the corresponding Interaction Table and shows a representative set of
interactions associated with the interaction arrows in the external context TCD. These
illustrate the use of one-way and two-way interactions in single and multiple form, and
interaction via data stores. Interaction 2 shows communication of information via the
Joint Tactical Information Distribution System (JTIDS). JTIDS is a complex, secure
communications system and arguably it is something of a stretch to represent it simply as
a data store. However, it broadcasts messages and there is no explicit reply from
receiving systems in so far as one user does not know if another user has received the
data from their system or has chosen to view it. The issue from the training perspective is
that there is the potential for erroneous action if another user has not received or viewed
information that is posted without requiring a receipt.
Table 12 Example Tornado F3 Pair External Context Interaction Table

No | From            | To              | Content/Nature                                                       | Mode
1  | F3 Pair         | FC              | Request for target information                                       | Radio
   | FC              | F3 Pair         | Target information                                                   | Radio
2  | FC              | F3 Pair         | Tactical picture                                                     | JTIDS
   | F3 Pair         | FC              | Target assignment                                                    | JTIDS
3  | Weather         | F3 Pair         | Visibility – affecting sighting of bandit aircraft                   | Cloud, haze and sun position
4  | F3 Pair         | Bandit Fighters | Engage with radar guided missiles                                    | SkyFlash missiles
   | F3 Pair         | Bandit Fighters | Engage with heat seeking missiles                                    | Sidewinder missiles
   | Bandit Fighters | F3 Pair         | Evade                                                                | Air Combat Manoeuvring (ACM)
   | Bandit Fighters | F3 Pair         | Evade radar guided missiles when detected on radar warning receiver  | ACM and use of chaff
   | Bandit Fighters | F3 Pair         | Evade heat seeking missiles when seen                                | ACM and use of flares
   | Bandit Fighters | F3 Pair         | Manoeuvre into attacking position                                    | ACM
5  | F3 Pair         | Bandit Missiles | Evade and deploy defensive aids                                      | ACM and use of chaff and flares
   | Bandit Missiles | F3 Pair         | Pursue F3 and destroy                                                | Manoeuvre and warhead detonation in range
6  | F3 Pair         | Bandit Bombers  | Engage with radar guided missiles beyond visual range                | SkyFlash missiles
   | F3 Pair         | Bandit Bombers  | Engage with heat seeking missiles                                    | Sidewinder missiles
   | Bandit Bombers  | F3 Pair         | Evade                                                                | ACM
   | Bandit Bombers  | F3 Pair         | Evade radar guided missiles when detected on radar warning receiver  | ACM and use of chaff
   | Bandit Bombers  | F3 Pair         | Evade heat seeking missiles when seen                                | ACM and use of flares
   | Bandit Bombers  | F3 Pair         | Evade when RWR indicates radar illumination by fighters              | Manoeuvre aircraft
7  | Terrain         | F3 Pair         | Bounds intercept geometry, causes radar clutter                      | Proximity and topology
7.2.3 Environment Description Table Construction
Even at this early stage of analysis it is possible to capture information about the
functionality of elements in the environment that will inform the specification of training
environments. For example, the capability required of an element such as a sub-unit will
determine whether the element needs to participate in training itself or whether it can be
represented by a role player or a computer-generated entity.

Further information about the nature of the actions of the environmental elements is
compiled into an Environment Description Table, a suggested format for which is shown
in Table 13. The Environment Description Table will be populated with a variety of data
during the whole of the analysis process, the aggregation of which will provide the
necessary detail to define training environments. The entries recommended at this stage
are:

• Element Description – The element description should provide a brief overview of the role of the element in the environment; in other words, "what does it do?"

• Action/information required to generate interaction to the team – A description is required of how the element generates the interaction, in terms of what information must be available to it and what knowledge is required.

• Action on receipt of interaction from the team – A description is required of how the element should react to the interaction from the team, providing an understanding of the purpose of the interaction.
Table 13 Environment Description Table Format

Element: Element 1 | Element 2 | Element 3 | Element 4
Element Description: Description of element 1 | Description of element 2 | Description of element 3 | Description of element 4
Outputs to the Team: Description for interaction 1 | None | None | Description for interactions 4 and 5
Inputs received from the team: Description for interaction 1 | Description for interaction 2 | Description for interaction 3 | Description for interactions 4 and 5
The information about the element and how it interacts with the team is critical from a
training perspective both in terms of comprehending the team task, but also informing
how an element would be represented in the training environment, including the
resources it would require. For example, a warfare team training in a synthetic
environment may need to be fed sightings of enemy units attacking the ship from a bridge
watchkeeper. Since the bridge does not exist (there is no window for a watchkeeper to
look out of), whoever is role-playing the watchkeeper needs a representation of the
environment (such as a top down display showing the range and bearing of an enemy
unit) in order to provide the information.
Given that there are likely to be numerous entries in the environment description table, a
spreadsheet will probably prove a more appropriate tool for recording the data than a text
document table.
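Following the spreadsheet suggestion, the Table 13 structure can be exported as CSV for maintenance in a spreadsheet. A sketch; the file name is illustrative and only two abridged columns (taken from Table 14) are shown:

```python
import csv

# Row labels follow the Table 13 format; columns are environment elements.
rows = ["Element Description", "Outputs to the Team",
        "Inputs received from the team"]
table = {
    "Fighter Controller": {
        "Element Description": "Interprets an air defence radar display to "
                               "provide target location and vectors.",
        "Outputs to the Team": "Target location and vectors by voice; "
                               "tactical picture updates via JTIDS.",
        "Inputs received from the team": "Requests for target information; "
                                         "target assignment via JTIDS.",
    },
    "Weather": {
        "Element Description": "Clouds, haze, precipitation, sun position "
                               "and light levels.",
        "Outputs to the Team": "Visibility of bandits affected.",
        "Inputs received from the team": "N/A",
    },
}

# Write one column per element, one row per entry type (file name illustrative).
with open("environment_description.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["Entry"] + list(table))
    for row in rows:
        writer.writerow([row] + [table[el][row] for el in table])
```

The resulting file opens directly in any spreadsheet tool, with further elements added simply as extra columns.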
Table 14 shows Environment Description Table entries that correspond to the elements in
the external TCD (Figure 17). Not all elements have an entry in every cell: in this case, it
is not meaningful to describe what information the weather or terrain needs in order to
interact with the team. Standard voice procedure and codewords are used in the
communication between the Fighter Controller and the team; this is recorded in the
Fighter Controller column.
Table 14 Example Tornado F3 Pair Environment Description Table

Fighter Controller
  Element Description: The FC interprets an air defence radar display to provide target location and vectors to the target.
  Outputs to the Team: Target location and vectors to the target by voice (note: standard voice procedure and codewords used); updates to tactical picture via JTIDS.
  Inputs received from the team: Requests for target information; target assignment information via JTIDS.

Bandit Fighters
  Element Description: Bandit fighters have equivalent performance to the F3 and are armed with radar guided and heat seeking missiles and have radar warning receivers.
  Outputs to the Team: Offensive and defensive manoeuvre, heat seeking and radar guided missile shots, deployment of defensive aids.
  Inputs received from the team: Offensive and defensive manoeuvre, heat seeking and radar guided missile shots, deployment of chaff and flares.

Bandit Missiles
  Element Description: Radar guided and heat seeking missiles.
  Outputs to the Team: Pursuit of pairs aircraft based on radar and heat signatures and aircraft manoeuvre, detonation of warheads within range.
  Inputs received from the team: Deployment of defensive aids.

Bandit Bombers
  Element Description: Bandit bombers have similar performance to the F3 and are armed with heat seeking missiles and have radar warning receivers.
  Outputs to the Team: Defensive manoeuvre BVR, offensive and defensive manoeuvre in visual range, missile shots, deployment of defensive aids.
  Inputs received from the team: Offensive and defensive manoeuvre, heat seeking and radar guided missile shots, deployment of chaff and flares.

Weather
  Element Description: Clouds, haze, precipitation, sun position and light levels.
  Outputs to the Team: Visibility of bandits affected.
  Inputs received from the team: N/A

Terrain
  Element Description: Ground topology.
  Outputs to the Team: Affects intercept geometry and can cause radar clutter.
  Inputs received from the team: N/A
7.3 Internal Task Context Description
Having determined the external context of the team in terms of the entities it interacts
with and the nature of the interactions, the next logical step is to determine its
internal structure and the role that each element of that structure plays. If the team
occupies a dedicated space or spaces, it may be useful to obtain a seating plan for the
team roles. One would expect to find organisational structure charts already in
existence, either because the team already exists, because its composition would need to
have been identified to design the spaces it is to operate in (in the case of a team
manning equipment), or because its composition would need to have been defined for
manning purposes.
Similarly one would expect role information to be readily available.
However, in complex acquisitions such as the QE Class Carrier, concepts evolve over time
and the currency of documents may need to be checked. Experience shows that changes may
be made in one area of structure and team organisation that have impacts elsewhere, and
those impacts may not have been rippled through. Examples include changes in manning
levels not being reflected in seating and workstation allocation in spaces, and
mismatches in the allocation of roles. The TNA process can often highlight such anomalies
to appropriate SMEs.
7.3.1 Organisational Structure
A first step in understanding how a team or collective organisation works is to determine
its organisational structure. Where a large team is broken down into sub-teams, these
sub-teams have to be identified. Where team membership may exist in a number of forms,
such as manning states (e.g. Harbour Watches, Cruising Watches, Action Stations), each
manning state will need to be mapped out.
Figure 18 shows an organisational chart for the Tornado F3 Pair. Whilst it appears trivial
to construct as there are only four individuals involved, complexities can arise because of
the number of potential roles and how those roles can be re-allocated. The formation
leader may be either the pilot or WSO of the lead crew (usually determined by
experience) and has overall command responsibility for the pair. However, during the
search for bandit aircraft and in a subsequent beyond visual range engagement of the
bandits, the formation will be led tactically by the lead WSO. If the engagement
develops into a visual fight, then the lead pilot takes over tactical control. However, if
the wingman aircraft has the tactically superior position to the lead aircraft when it comes
to engaging the bandit beyond visual range, the wingman WSO will take over the tactical
leadership of the formation with tactical control passing to the wingman pilot for the
visual phase of the engagement.
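The reallocation rule just described can be stated compactly. The sketch below is our own paraphrase of that rule for illustration only; the function name and string labels are invented, not doctrine or part of the report's notation:

```python
def tactical_leader(phase, wingman_superior):
    """Who leads the formation tactically, per the rule described above.

    phase: 'bvr' (beyond visual range search/engagement) or 'visual'
    wingman_superior: True if the wingman aircraft holds the tactically
    superior position against the bandit at the start of the intercept.
    """
    # The crew with the superior position takes the tactical lead...
    crew = "wingman" if wingman_superior else "lead"
    # ...with the WSO leading BVR and the pilot leading the visual fight.
    seat = "WSO" if phase == "bvr" else "pilot"
    return f"{crew} {seat}"
```

Note that overall command responsibility remains with the formation leader throughout; only the tactical lead moves between crews and seats.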
Figure 18 Example Tornado F3 Pair Organisational Chart
7.3.2 Role Definitions
Role definitions for each sub-team or team member should be populated. Useful headings
include:

• Team Name – e.g. Flight Deck Management Team 1a
• Team Role Name – e.g. Flight Deck Officer
• Role Rank – Lt/Cdr / Lt
• Location (if relevant)
• Role Description
• Interfaces
• Role Allocation
Table 15 and Table 16 show example role definitions for the Lead and Wingman WSOs
respectively. These roles are complementary, and this becomes apparent in the role
descriptions. As these roles can also be dynamically reallocated during an intercept,
this is captured by adding an extra field concerning role allocation. This information
can inform the subsequent analysis of the team task.
Table 15 Example Role Definition for the F3 Pair Lead WSO

Team: F3 Pair
Role: Lead WSO
Rank: Flt Lt / Sqn Ldr
Location: WSO Cockpit
Role Description:
• Managing the radar search for bandits, including allocation of radar search areas
• Liaising with Fighter Controller
• Performing radar meld with wingman WSO once bandits identified
• Tracking bandits once identified
• Directing the beyond visual range engagement
• Guiding pilots' eyes to bandits during visual merge
• Conducting visual search for bandits and bandit missiles during visual engagement
Interfaces: Radar, JTIDS, RHWR, Radio, Intercom, Weapons System, Visual Scene
Role allocation: Usually held by the WSO in the lead crew but can be allocated to the wingman WSO by the lead WSO at the start of an intercept if the wingman has the tactically superior position in formation against the bandit
Table 16 Example Role Definition for the F3 Pair Wingman WSO

Team: F3 Pair
Role: Wingman WSO
Rank: Flt Lt / Sqn Ldr
Location: WSO Cockpit
Role Description:
• Conducting radar search for bandits in search area directed by Lead WSO
• Performing radar meld with Lead WSO once bandits identified
• Conducting radar search for other bandits whilst Lead WSO tracks bandits identified
• Directing the beyond visual range engagement
• Guiding pilots' eyes to bandits during visual merge
• Conducting visual search for bandits and bandit missiles during visual engagement
Interfaces: Radar, JTIDS, RHWR, Radio, Intercom, Weapons System, Visual Scene
Role allocation: Usually held by the WSO in the wingman crew but can be taken by the lead WSO if the wingman has the tactically superior position in formation against the bandit
7.3.3 Internal Team Context Diagram
Drawing a TCD can be a useful way of capturing the essential features of the immediate or
internal operating environment that the team operates in, particularly the systems that
the team operates.
Figure 19 shows an Interfaces TCD for the Tornado F3 Pair. Note that this diagram shows
the interfaces for each crew in the pair; there is no need to duplicate the diagram for
each aircraft as they are the same. There are many more systems on the Tornado F3 than
are shown on the diagram. Those shown have been selected based on their relevance to the
training of pairs tactics. Items such as electrical systems, hydraulic systems and
ejector seats have been omitted as they are not relevant to tactics training, whereas
fuel gauges have been included as low fuel states are a cue to disengage from an
intercept and return to base.
Figure 19 Example Tornado F3 Pair Interfaces TCD
Table 17 is the corresponding Systems Interaction Table and shows a sample of the
interaction descriptions. The challenge at this stage in the analysis is deciding on the
level of detail to record. Interactions with systems tend to be quite complex, as both
the data outputs and the input interactions are often complex. Ultimately a lot of detail
may be needed, particularly if a synthetic training environment has to be specified. It
is suggested that a relatively high level description is used at this stage, as there is
no point in doing extensive analysis until the decision has been made that there is a
training requirement and training objectives have been established. If, during further
stages of analysis, it is found that more detail is required, the table can always be
amplified. It is likely that much of the detail will appear as amplification in the mode
column.
Table 17 Example Tornado F3 Pair Interface Interaction Table

No | From    | To      | Content/Nature             | Mode
2  | F3 Pair | Radar   | Control of radar scan area | WSO radar hand controller
   | Radar   | F3 Pair | Radar picture              | WSO and pilot radar displays
   | F3 Pair | Radio   | Transmit selection         | Pilot and WSO press to transmit buttons on joystick and nav radar hand controller
   |         |         | Outgoing voice comms       | Helmet microphones
5  | Radio   | F3 Pair | Incoming voice comms       | Pilot and WSO headsets
Table 18 shows the corresponding entries in the Environment Description Table (noting
that this is not a new table; these are additional columns in the table constructed for
the previous TCD). The same issue about level of detail applies to this table, the
guidance being to capture sufficient detail for the level of analysis being conducted. It
is useful to include a reference to the appropriate technical document(s) should further
detail be required. Precise descriptions of cockpit fields of view, for example, may
become relevant if simulation is required as a training solution.
Table 18 Tornado F3 Pair Environment Description Table Entries

Radio
  Element Description: VHF and UHF radio with control panel, transmit buttons on pilot stick and WSO radar hand controller. (Technical Reference)
  Outputs to team: Voice comms.
  Inputs from team: Channel settings, press to transmit, outgoing voice comms.

Radar
  Element Description: Foxhunter multi-mode radar controlled through control panel, WSO hand controller and pilot hand controller, with front and rear cockpit displays.
  Outputs to team: Radar picture and current settings.
  Inputs from team: Radar configuration settings.

Visual Scene
  Element Description: Tandem seating limits WSO field of view forwards through the canopy. Head up display central in pilot field of view. Terrain and weather can be seen. Sighting of bandit aircraft, missiles and defensive aids depends on range and visibility. (Technical Reference)
  Outputs to team: View of terrain, weather, bandits, bandit missiles and flares, other aircraft in pair.
  Inputs from team: N/A
In the Tornado example, use of and access to the various systems is distributed between
the pilot and the WSO in each aircraft. Some systems are accessible to both (radios, RHWR
displays) whilst others are seat specific (e.g. flying controls for the pilot). With
larger teams this distribution gets more complex. At this stage it can be useful to
compile a matrix of who has access to which system to aid understanding of team
functioning. A system matrix is one way to do this, as shown in Table 19.
Table 19 Example System Matrix

System          | Pilot | WSO
Flying Controls |   X   |
RHWR            |   X   |  X
Radio           |   X   |  X
Head up Display |   X   |
JTIDS           |       |  X
……
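A matrix of this kind can also be derived mechanically from a mapping of each role to the systems it can access. A sketch using the Tornado example's roles and systems; the function itself is illustrative, not part of the TNA method:

```python
# Map each role to the systems it can access (from the Tornado F3 example).
access = {
    "Pilot": {"Flying Controls", "RHWR", "Radio", "Head up Display"},
    "WSO": {"RHWR", "Radio", "JTIDS"},
}

def system_matrix(access):
    """Return {system: {role: 'X' or ''}} covering every system any role uses."""
    systems = sorted(set().union(*access.values()))
    return {s: {role: ("X" if s in access[role] else "") for role in access}
            for s in systems}

matrix = system_matrix(access)
```

Generating the matrix from the role-to-system mapping keeps it consistent as roles or equipment fits change during the analysis.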
7.3.4 Team Communication Structure
It is useful at this stage to identify how communications can be supported within the
team and with external agencies outside the boundary of the team. There are a number of
ways that this information can be portrayed. The suggested methods are a Communications
Diagram, a Communications Matrix and a Textual Description.
7.3.4.1 Communication Diagram
The notation for a Communications Diagram is illustrated in Figure 20, which shows a
Communications Diagram for the Tornado F3 pair. The use of a diagram is particularly
powerful when sub-teams or team members are in different locations (including split from
their sub-team or team and located within another sub-team).

The notation is the same as that for a TCD with the following additions:

• Nested boxes are used to show sub-teams and team members within a sub-team
• The arrows represent lines of communication and can represent multiple modes (not shown in the example, but for a single-seat fighter the pilots could communicate using radio and wing waggling and these two could be represented with a single arrow)
• An arrow terminating at a sub-team box indicates all members of that sub-team can be communicated with by the mode(s) represented by that arrow (e.g. arrows 9 and 10 for wing waggling)
• An arrow originating from a sub-team box indicates that all members of the sub-team can communicate with the recipient using the communications mode(s) represented by the arrow (e.g. arrow 6 for radio communication)
• A dashed box indicates an element outside the team boundary (e.g. the Fighter Controller)
The arrows on the diagram could be labelled with the communications mode but even on
a small diagram such as Figure 20, this could make the diagram too cluttered to read. The
arrows have therefore been numbered so that the detail can be recorded in an associated
Communications Description or Communications Matrix.
Communications Description

The crews (Pilot and WSO) communicate internally using intercom (7, 8).

Each crew member can communicate with both members of the other crew and the fighter
controller using the radio (1, 4, 6) (radio transmissions are heard internally over the
intercom by the other crew member).

The WSOs can also communicate with each other and the Fighter Controller using JTIDS
(2, 3, 5).

Pilots can communicate with the other crew visually by wing waggling (9, 10).
Figure 20 Example Tornado F3 pair Communications Diagram and Textual Description
Table 20 Example Tornado F3 Pair Communications Matrix

                   | Communications Channel
Role               | Visual  | Intercom (leader) | Intercom (wingman) | Radio | JTIDS
Lead Pilot         | X       | X                 |                    | X     |
Lead WSO           | Rx only | X                 |                    | X     | X
Wingman Pilot      | X       |                   | X                  | X     |
Wingman WSO        | Rx only |                   | X                  | X     | X
Fighter Controller |         |                   |                    | X     | X
Ultimately, the communications diagram can be extended to form an Internal Context
Diagram for the Team by including other, non-communications interactions. However, this
necessitates the determination of:

• which internal communications connections are actually used and the content of the interactions on these connections
• the source (sub-team or team member) of interactions with external environment elements
• the recipients (sub-team or team member) of interactions from external elements

Whilst some of this information may be available at this stage, and therefore the
Internal Context Diagram could be started, it is likely that much of the information will
be determined during the subsequent task analysis stage and its construction is probably
best left until then.
7.3.4.2 Communications Description
The communications description provides a simple way of presenting the information
about the communication modes related to the arrows in the Communications Diagram.
The numbers in brackets cross-refer to the numbers on the diagram. An example of such
a description is shown alongside the Communications Diagram in Figure 20. For
relatively small Communications Diagrams this is probably the most efficient way of
presenting this information.
7.3.4.3 Communications Matrix
A Communications Matrix provides a tabular representation which captures the modes of
communication that are possible, both internally between team members and between team
members and external elements.

Table 20 shows an example Communications Matrix for the Tornado F3 pair that corresponds
to the Communications Diagram shown in Figure 20. The orientation of the matrix (whether
the channels are shown as rows or columns) is simply a matter of convenience based on the
number of channels and the number of people/roles that are communicating. A
Communications Matrix is of particular value when the communications network is so large
or complex that a Communications Description would become unwieldy.
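For larger networks, the matrix can be generated directly from the numbered links of the Communications Diagram rather than filled in by hand. A sketch using the F3 pair's channels and roles; the data structure and function are our own illustration, not the report's tooling:

```python
# Numbered links from a communications diagram: (channel, set of roles on
# that channel). Entries follow the F3 pair example for illustration.
links = [
    ("Intercom (leader)", {"Lead Pilot", "Lead WSO"}),
    ("Intercom (wingman)", {"Wingman Pilot", "Wingman WSO"}),
    ("Radio", {"Lead Pilot", "Lead WSO", "Wingman Pilot",
               "Wingman WSO", "Fighter Controller"}),
    ("JTIDS", {"Lead WSO", "Wingman WSO", "Fighter Controller"}),
]

def comms_matrix(links):
    """Build {role: {channel: bool}} from the link list."""
    roles = sorted(set().union(*(members for _, members in links)))
    return {role: {channel: role in members for channel, members in links}
            for role in roles}

matrix = comms_matrix(links)
```

Deriving the matrix this way also makes it easy to cross-check the diagram and matrix against each other, as the method recommends.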
7.4 Task Analysis
Having characterised the context of the team/collective task, the next stage is to conduct a
task analysis.
7.4.1 Hierarchical Task Analysis for Team & Collective Training - HTA(TCT)
The Hierarchical Task Analysis for Team and Collective Training (HTA(TCT)) approach is
adapted from the approach developed by Annett (2000) under a previous MOD research
contract. Tasks are hierarchically decomposed and represented in an HTA diagram.
Supporting detail is recorded in a task description table. Task sequence diagrams are
also produced to provide a temporal view of task execution. Finally, a Task Role Matrix
is constructed to provide a quick reference to the participants in each task and
sub-task.

Data sources for the HTA(TCT) would typically include documentation (such as CONOPS,
CONUSE, CONEMP and role group summaries, if available) and discussions with SMEs.
Observation of the task may be possible if the TNA is being conducted to update extant
training; for new systems this will not be possible.
One of the more challenging questions when conducting any form of HTA is “at what point
do you stop?” The technically accurate answer is “when it is fit for purpose”. In
practical terms, for Team and Collective training the decomposition can stop when
sufficient information about how the team interacts has been gleaned. Typically this will
be when an individual task has been reached, or when a multi-actor task is sufficiently
simple that further decomposition is not needed.
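This stopping rule can be expressed over a simple task tree. The class and field names below are our own illustrative choices, not part of the HTA(TCT) notation, and the `simple_enough` judgement is deliberately left to the analyst:

```python
from dataclasses import dataclass, field

@dataclass
class Task:
    number: str            # e.g. "1.2.5", matching the HTA diagram
    description: str
    participants: list     # sub-teams/team members conducting the task
    subtasks: list = field(default_factory=list)

def needs_decomposition(task, simple_enough=lambda t: False):
    """Apply the stopping rule: stop at an individual task, or at a
    multi-actor task the analyst judges simple enough."""
    if len(task.participants) <= 1:
        return False       # individual task: stop
    return not simple_enough(task)

# Illustrative tasks drawn from the F3 pair example.
meld = Task("1.2.5", "Meld radar", ["L-WSO", "W-WSO"])
check_fuel = Task("1.1", "Check for bingo fuel", ["L-P"])
```

Encoding the hierarchy this way also gives the Task Role Matrix for free, since each node already records its participants.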
7.4.1.1 HTA(TCT) Diagram Notation and Format
The format used for the HTA(TCT) diagrams is shown in Figure 21.
Figure 21 HTA Diagram Format
The HTA(TCT) diagram presents two sorts of information. The bulk of the diagram is made
up of a hierarchical presentation of the overarching task and the sub-tasks that must be
completed in order to achieve it. The format will be familiar, as it is essentially the
same as that used for an organisational chart or instructional scalar. In addition to the
task hierarchy, plans are shown which describe the sequencing for each set of sub-tasks.
The notation used for constructing the diagrams (adapted from Stanton, 2006) is detailed
in Table 21. Rounded boxes are used for the plans so that they are easily distinguished
from sub-tasks. Similarly, diagonal lines are used to link plans to the node where they
apply to avoid confusion with task and sub-task connectors.
Table 21 HTA(TCT) Diagram Element Symbols and Descriptions

Tasks and sub-tasks
  • Task/sub-task number
  • Sub-teams and team members conducting the task
  • Task description

Task and sub-task connectors
  • These connect tasks to their sub-tasks

Plans
  • Plan + number of the task/sub-task that it refers to
  • Sequence in which the sub-tasks are executed

Plan connectors
  • These connect plans to the node where the sub-tasks connect to the task which they are amplifying
The use of separate boxes to show plans is recommended simply on the grounds of
clarity. The plan information could be shown in the task box but the box could become
rather large. That said, if charting software is used for drawing the HTA it may not have
the necessary facilities for showing the plans separately, in which case the plan
information can be located in the associated task box.
Plans can be quite complex and, as a consequence, a textual description could be quite
large and unwieldy on the HTA diagram. A suggested syntax for expressing plans more
tersely (adapted from Stanton, 2006) is shown in Table 22. The choice between using the
suggested syntax, or text or a mixture of both for describing a plan lies with the analyst.
However, the guiding principle must be that of clarity – a triumph of syntactical dexterity
over clarity serves no useful purpose to the reader of the HTA.
Table 22 Syntax for plans and examples of use

Text                  | Symbol(s) | Example of use and text equivalent
then                  | >         | 1 > 2 (Do 1 then 2)
and                   | +         | 1 + 2 (Do 1 and 2 together)
or                    | /         | 1 / 2 (Do 1 or 2)
Any of                | :         | 1 : 2 : 3 (Do 1, or 2, or 3)
If condition X then 2 | X? >      | Thirsty? > 2 (If thirsty then do 2)
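Where plans are held as strings in this syntax, the two most common operators can be unpacked mechanically. A minimal sketch handling only '>' and '+'; the '/' and ':' operators and conditions are omitted for brevity, and the function is an illustration rather than part of the method:

```python
def parse_plan(plan):
    """Split a plan such as '1.2.1 > 1.2.2 + 1.2.3 > 1.2.5' into sequential
    stages; '+' groups sub-tasks done together within a stage.
    Only '>' and '+' are handled; '/', ':' and X? conditions are not."""
    return [[step.strip() for step in stage.split("+")]
            for stage in plan.split(">")]

# Plan for sub-task 1.2 from the F3 pair example, in the Table 22 syntax.
stages = parse_plan("1.2.1 > 1.2.2 + 1.2.3 + 1.2.4 > 1.2.5")
```

Even this limited parse is enough to drive a first-cut Task Sequence Diagram, since it yields the stage ordering and the parallel groups directly.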
7.4.1.2 Example HTA (TCT) Diagram
An example of an HTA diagram for the Tornado F3 pairs task “Destroy bandit beyond
visual range” is shown in Figure 22. This shows the goal as being sub-divided into five
sub-tasks, numbered 1.1 to 1.5. With the exception of sub-task 1.1 (Check for bingo
fuel), these sub-tasks are decomposed to another level. Sub-task 1.4 (Engage Bandit
BVR) has two further levels of decomposition.
One of the challenges in constructing an HTA diagram is deciding how to lay it out and
how much information to put on one page. Even with only three levels of decomposition, it
can be seen from Figure 22 that the HTA diagram becomes quite large and that it is not
always possible to show all items at the same level of decomposition on the same line.
The decomposition for task 1.4 has been shown on the bottom half of the page to avoid
having to split the diagram across more than one page. Figure 22 is pretty much at the
limit in terms of the amount of information that can be shown on a single page and still
be readable. Where a diagram needs to be split across multiple pages, appropriate
sub-tasks should be selected to be decomposed on other pages. In this example, sub-task
1.4 would be a candidate for decomposing on another page, as it is as complex as all the
other sub-tasks put together and is the only one that has two levels of decomposition.
Plan 1 and plan 1.4.3 illustrate the use of a mix of plan syntax and free text to describe
the plans. This choice was made as, in each case, it was difficult to capture the
complexity of the plan by use of the syntax alone and a free text description would have
taken up too much room.
Abbreviations for the names of the sub-team and team member roles have been used to
conserve space in the task boxes. These are expanded in the legend at the bottom of the
diagram.
Figure 22 Example HTA for the F3 Pair Task Destroy Bandit beyond Visual Range
7.4.1.3 Task Sequence Diagrams
HTA diagrams are a particularly effective means of representing the breakdown of a task
into sub-tasks, as they capture both multiple levels of decomposition and detail about
task sequencing. However, they have one limitation: it is difficult to get a visual
appreciation of the relative sequencing of tasks. This is of particular significance from
the perspective of workload, as sub-teams or team members may be involved in multiple
concurrent tasks. To address this issue, the use of a supplementary representation of
sub-task sequencing, not usually constructed during HTA, is recommended. The structure
and notation recommended are adapted from those used for PERT charts (often used to show
task sequencing for project management purposes). The diagrams have been termed Task
Sequence Diagrams. This choice of technique was in part based on the observation that
SMEs, when asked to draw a diagram showing the stages in a task, will often draw this
style of diagram. Constructing such a diagram may therefore be helpful in eliciting
information from SMEs for the purposes of constructing the HTA diagram.
The notation is shown in Table 23. The notation for tasks and sub-tasks is the same as
that used for HTA diagrams for consistency so that they can be copied from one diagram
type to the other.
Table 23 Task Sequence Diagram Notation

Terminators
  • These are placed at the start and end of the sequence and contain a description of the initiating and terminating conditions for the task

Tasks and sub-tasks
  • Task/sub-task number
  • Sub-teams and team members conducting the task
  • Task description

Continuation arrows
  • These horizontal arrows show how a task extends over time, the arrow head being aligned with the end of the last task in the sequence that the activity occurs in parallel with

Task connectors
  • These are used for linking tasks in sequence
The format of a Task Sequence diagram is shown in Figure 23. This diagram shows how
three tasks are sequenced. Sub-task 1.1 has a horizontal arrow extending to the right of it
indicating that its execution continues until task 1.3 terminates. Sub-tasks 1.2 and 1.3 are
shown as being conducted in sequence.
Figure 23 Task Sequence Diagram Format
Figure 24 shows a Task Sequence Diagram for the F3 Pair Task 1 “Destroy bandit BVR”.
This captures the sequencing of the high-level sub-tasks 1.1 to 1.5, showing very clearly
that sub-task 1.1 is conducted in parallel with firstly sub-task 1.2 and then sub-tasks
1.3 and 1.4, which are themselves conducted simultaneously. Further diagrams can be drawn
to show successive levels of detail, as illustrated by Figure 25 which shows the
sequencing decomposition for sub-task 1.2 Detect Bandit BVR.
Figure 24 Task Sequence Diagram for the F3 Pair Task 1 Destroy Bandit BVR
Figure 25 Task Sequence Diagram for the F3 Pair Sub-Task 1.2 Detect Bandit BVR
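Once a Task Sequence Diagram has been sketched, the workload concern noted earlier (a sub-team or team member involved in several concurrent tasks) can be checked mechanically. In the sketch below the interval values are notional stage indices standing in for the diagram's left-to-right layout, not timings from the example:

```python
# Each entry: task number, participants, (start_stage, end_stage).
# Stage indices are notional, chosen only to illustrate the overlap check.
tasks = [
    ("1.1", {"L-P", "W-P"}, (0, 4)),          # continuous monitoring task
    ("1.2", {"L-WSO", "W-WSO"}, (0, 2)),
    ("1.3", {"L-WSO"}, (2, 3)),
    ("1.4", {"L-WSO", "W-WSO", "L-P", "W-P"}, (2, 3)),
]

def concurrent_tasks(tasks, member):
    """Return pairs of overlapping tasks that both involve the member."""
    mine = [(n, span) for n, who, span in tasks if member in who]
    return [(a, b) for i, (a, (s1, e1)) in enumerate(mine)
            for b, (s2, e2) in mine[i + 1:]
            if s1 < e2 and s2 < e1]

overlaps = concurrent_tasks(tasks, "L-WSO")
```

Running the check per team member gives a quick, if coarse, indication of where concurrent demands fall, which is exactly the visual judgement the Task Sequence Diagram supports.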
7.4.1.4 HTA(TCT) Task Description Tables
Task Description Tables capture the full range of detail about the tasks and sub-tasks
that will be used in subsequent stages of the TNA process. A suggested template for HTA
task tables, with a description of the contents of each field, is shown in Table 24. This
is an adaptation and extension of the approach used by Annett (2000). A task table should
be completed for each sub-task in the HTA hierarchy.
Table 24 HTA(TCT) Task Description Table Structure

Task: Number and task statement
Purpose: Brief description of why the task is required
Initiating condition: When the activity should be started
Terminating condition: When the activity should be terminated
Sub-tasks and participants: List of the sub-tasks and the sub-teams and/or team members that execute them
Plan: Narrative description of the sequencing of the sub-tasks involved
Teamwork Description: Description of the teamwork requirements to fulfil the task. Communications modes should be included and cross-checked with the communications diagram and matrix/description
Inputs (Content/Nature, Source & Mode):
  External – Inputs from elements outside of the team (should tie up with inputs shown in the context diagram(s)). The mode of the input should tie up with the mode listed in the interaction table(s)
  Internal – Inputs received from other sub-teams or team members. Communications should match a link shown in the communications diagram. The mode of the input communications should match a mode shown in the communications matrix/description
Products (Content/Nature, Destination & Mode):
  External – Outputs to elements outside of the team (should tie up with outputs shown in the context diagram(s)). The mode of the output should tie up with the mode listed in the interaction table(s)
  Internal – Outputs to other sub-teams or team members. Communications should match a link shown in the communications diagram. The mode of any output communications should match a mode shown in the communications matrix/description
Other Outcomes: Description of the side effects of the activity, e.g. use of resources, attrition
Critical Errors and Consequences: A description of the critical errors that could be made and their consequences
Assessment measures and data requirements: A description of the criteria for evaluating the task (process and product) and the data required to make the assessment
The structure of the table is designed to provide a logical breakdown of the task or
sub-task, with related items located together. Specifically:

• The task number and task statement cross-refer to the HTA Diagram.
• The statement of the task's purpose and initiating and terminating conditions put the task into context.
• The listing of sub-tasks and participants, the plan and the teamwork description provide an overview of the mechanics of the task and how it is executed. Including the communications modes for teamwork ensures that required modes that have to be supported in training are not overlooked. If the task is not decomposed any further, the fields for sub-tasks and the plan will be left blank. If the task is performed by an individual then the teamwork field will be left blank.
• The fields for inputs, outputs and other outcomes capture the detail of which inputs and cues the team is responding to and how the responses are mediated. These should be cross-checked against the context descriptions. Other outcomes are included as they may impact on how the task and other tasks are achieved. Issues such as consumption of fuel and munitions and weapons effects are also significant from the perspective of training cost and safety, and may therefore influence the training option selected. Not all fields will have entries.
• Critical errors and consequences are included for two reasons. Firstly, this information will be required at the gap analysis stage to inform the decision about whether the task needs to be trained. Secondly, consideration of how a task can go wrong and the consequences of such an event can illuminate subsequent discussion of how performance of the task can be evaluated and what measures are appropriate. Potential consequences of errors in the conduct of a task could include:
  o the loss of positive effect that would have stemmed from successful task completion
  o loss of capability
  o mission failure
  o injury to personnel in the completion of the task
  o damage to essential equipment
  o collateral damage or injury to third parties
  o a breach of law or military regulations (such as Rules of Engagement being improperly applied).
• Assessment measures and data requirements are put last as they are best considered once a complete understanding has been developed of the process by which the task is performed and the nature of the inputs and outputs of the task.
On first inspection, completing a task table for each sub-task may appear an onerous and
intimidating task. However, the table structure has been designed not only as an output
of this stage of analysis, but also to provide an analytical framework to use during
information gathering. The table can be used both to guide analysis of documents which
describe the task being analysed, and to frame discussions with SMEs. Completing the
tables as the HTA diagram is developed ensures that information collected is not lost or
forgotten, and should expedite the analytical process. Difficulty in completing fields in
the form is a strong indicator that the task is not completely understood and needs to be
revisited in the analysis.
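That diagnostic (unfillable fields flag an incompletely understood task) is easy to automate if each table is held as a record. A sketch, assuming the field names below, which are our own abbreviations of Table 24's headings; the example values are condensed from the Meld Radar table:

```python
# Fields that should always be completable. Sub-tasks, plan and teamwork
# may legitimately be blank (per the template notes), so they are excluded.
REQUIRED = ["task", "purpose", "initiating_condition", "terminating_condition",
            "critical_errors", "assessment_measures"]

def missing_fields(table):
    """Return required fields that are empty: a cue to revisit the analysis."""
    return [f for f in REQUIRED if not table.get(f, "").strip()]

# Condensed, illustrative entries based on the Meld Radar example.
meld_table = {
    "task": "1.2.5 Meld Radar",
    "purpose": "Both WSOs need the same radar picture",
    "initiating_condition": "Bandit located",
    "terminating_condition": "Melded radar pictures",
    "critical_errors": "Slow meld gives bandit time to break lock",
    "assessment_measures": "",
}
gaps = missing_fields(meld_table)
```

A non-empty result lists exactly the fields to raise with SMEs at the next discussion.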
Table 25 shows a Task Description Table for the F3 Pair task Meld Radar, which has not
been decomposed further. Consequently, no sub-tasks are listed and the plan field is left
blank. Table 26 shows the Task Description Table for the Detect Bandit BVR task from
which the Meld Radar task was decomposed. A point to note is that information from the
lower-level table (such as assessment measures) can be aggregated up into the table for
the higher-level task.
Table 25 Task Description Table for F3 Pair Task 1.2.5 Meld Radar

Task: 1.2.5 Meld Radar
Purpose: Both WSOs need to have the same radar picture and confirm that they have both identified the same aircraft as the bandit in order to plan and execute the intercept effectively
Initiating condition: Bandit located
Terminating condition: Melded radar pictures
Sub-tasks and participants: L-WSO, W-WSO
Plan: N/A
Teamwork Description: The WSO who has located the bandit has to communicate the bandit location to the other WSO, and they then have to ensure that they are both identifying the same aircraft as the bandit.
Inputs (Content/Nature, Source & Mode):
  External – Target manoeuvre – Bandit Fighters; Radar pictures – radar; JTIDS display – JTIDS
  Internal – Communication between WSOs
Products (Content/Nature, Destination & Mode):
  External – Melded radar pictures – radar
  Internal –
Other Outcomes: Time taken is time available to the bandit to break radar lock and manoeuvre
Critical Errors and Consequences: Slow radar meld gives the bandit more time to break radar lock and evade or move into an offensive position (loss of tactical advantage by the F3 pair)
Assessment measures and data requirements: Time taken to meld radar pictures – requires recording of radar pictures and recording of WSO comms during meld. Standard: radar meld achieved accurately and sufficiently quickly as not to lose tactical advantage.
70
HFIDTCPIII_T13_01
Version 2/ 16 April 2011
Table 26 Task Description Table for F3 Pair Task 1.2 Detect Bandit BVR

Task: 1.2 Detect Bandit BVR
Purpose: The bandit has to be located in the patrol area in order to intercept it
Initiating condition: On arrival in the search area
Terminating condition: When the bandit has been located, or if the formation becomes defensive, or if bingo fuel (fuel required to return to base) is reached
Sub-tasks and participants:
  1.2.1 L-WSO Allocate radar search bands
  1.2.2 L-WSO, L-P, W-P Manoeuvre formation
  1.2.3 L-WSO, W-WSO Liaise with Fighter Controller (FC)
  1.2.4 L-WSO, W-WSO Locate bandit
  1.2.5 L-WSO, W-WSO Meld radar
Plan: After radar search bands have been allocated (1.2.1) the formation is manoeuvred (1.2.2) whilst liaison with the FC (1.2.3) and location of the bandit (1.2.4) take place. Once the bandit is located the radar meld (1.2.5) takes place.
Teamwork Description: The main aspect of teamwork is communication between the WSOs as they search the airspace for the bandit. The radar meld is the critical phase of communication once the bandit is detected.
Inputs (content/nature, source and mode):
  External: Target location from the FC via radio/JTIDS; vectors to target from the FC via radio; target location via radar display; target manoeuvre
  Internal: N/A
Products (content/nature, destination and mode):
  External: Request for target information to FC via radio; target location to FC via JTIDS/radio
  Internal: Melded radar pictures
Other Outcomes: Fuel is consumed during the search and aircraft location changes. Time taken for the radar meld is time available to the bandit to break radar lock, evade or move into an offensive position
Critical Errors and Consequences: Slow radar meld gives bandit more time to break radar lock and evade or move into an offensive position (loss of tactical advantage by F3 pair)
Assessment measures and data requirements: Efficiency and effectiveness of search – requires recording of manoeuvre of F3 pair and bandit and radar pictures. Standard: Systematic search conducted in accordance with SOPs. Time taken to meld radar pictures – requires recording of radar pictures, recording of WSO comms during meld. Standard: Radar meld achieved accurately and sufficiently quickly as not to lose tactical advantage.
7.4.1.5 Task Role Matrix
The Task Role Matrix provides a simple look-up table of which roles are involved in which tasks and sub-tasks. This can be useful for analysing teamwork and can inform training overlay analysis.
An example Task Role Matrix is shown in Table 27. Indenting sub-tasks and sub-sub-tasks in the task listing aids clarity. Identifying which sub-team an individual is in is also useful, as interactions across sub-team boundaries become more obvious. In this instance, the table clearly shows that the Leading WSO is involved in all of the tasks listed in the table fragment and that many of these involve the Wingman Crew.
Table 27 Task and Role Matrix Example

Roles participating: Lead Crew (L-P, L-WSO); Wingman Crew (W-P, W-WSO)

Task List                                    | L-P | L-WSO | W-P | W-WSO
1.1 Check for Bingo Fuel                     |  X  |   X   |  X  |   X
1.2 Detect Bandit BVR                        |  X  |   X   |  X  |   X
  1.2.1 Allocate radar search bands          |     |   X   |     |
  1.2.2 Manoeuvre formation                  |  X  |   X   |  X  |
  1.2.3 Liaise with Fighter Controller       |     |   X   |     |   X
  1.2.4 Locate bandit                        |     |   X   |     |   X
  1.2.5 Meld radar                           |     |   X   |     |   X
1.3 Check if Defensive                       |  X  |   X   |  X  |   X
  1.3.1 Check wpns and defensive aids states |  X  |   X   |  X  |   X
  1.3.2 Evaluate tactical status             |  X  |   X   |  X  |   X
1.4 Engage Bandit BVR                        |  X  |   X   |  X  |   X
  1.4.1 Allocate lead                        |     |   X   |     |
  1.4.2 Agree tactics                        |     |   X   |     |
  1.4.3 Intercept bandit                     |     |       |     |
    1.4.3.1 Give vectors to pilot            |  X  |   X   |  X  |   X
Etc….
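A matrix of this kind can be generated mechanically from the participant lists already captured in the Task Description Tables. The sketch below assumes a hypothetical task-to-roles mapping (only three rows are shown); the role abbreviations follow the report.

```python
ROLES = ["L-P", "L-WSO", "W-P", "W-WSO"]

# Hypothetical extract of task-to-participant data, as recorded in the
# "Sub-tasks and participants" fields of the Task Description Tables.
task_roles = {
    "1.2.3 Liaise with Fighter Controller": {"L-WSO", "W-WSO"},
    "1.2.4 Locate bandit": {"L-WSO", "W-WSO"},
    "1.2.5 Meld radar": {"L-WSO", "W-WSO"},
}

def matrix_row(task):
    # One row of the Task Role Matrix: an "X" under each participating role.
    return ["X" if role in task_roles[task] else "" for role in ROLES]

for task in task_roles:
    print(f"{task:<40}", matrix_row(task))
```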
72
HFIDTCPIII_T13_01
Version 2/ 16 April 2011
7.5 Capture Environmental Task Demands
Capturing an overview of the environmental task demands provides useful background to
the analysis at this stage as it helps conceptualise the task. The completion of an
Environmental Task Demands table provides a convenient format for recording key
information. An example Environmental Task Demands Table is shown at Table 28.
Table 28 Example Environmental Task Demands Table

Scenario Reference: Name/Reference number of the scenario for which the demands are being described.

Environmental Demand | Significance Rating (H/M/L) | Description of how demand occurs
Threat | H | Bandit fighters are armed with radar-guided and heat-seeking missiles and radar, posing a lethal threat both BVR and in visual range. Bandit bombers are equipped with heat-seeking missiles and pose a lethal threat in visual range
Performance pressure | H | Performance pressure is high because any error may result in the lethal threat posed by the bandits being realised
Time pressure | H | Air combat intercepts happen in a short time frame given that the closing speed of the fighters is in the order of 1000 knots. From detection of a target at 100 nm to the completion of a visual fight may be a matter of minutes
High workload | H | The crews have to operate complex systems whilst making tactical judgements during fast-moving intercepts and can reach the limit of their capacity during an intercept
Multiple information sources | M | Information sources include: radar, RHWR, JTIDS weapons systems display, aircraft systems displays, flight instruments, intercom, radio and the visual scene
High information load | M | All the sources listed above may be providing necessary information at the same time
Incomplete, conflicting information | M | If the radars are configured differently in each aircraft and the scan patterns between aircraft are uncoordinated, the overall view of the search space may be incomplete
Rapidly changing, evolving scenarios | H | Air combat is a highly dynamic, fast-moving environment where fighter status can change from offensive to defensive in a matter of seconds
Requirement for team coordination | H | Team co-ordination within crews and between crews is fundamental to the execution of intercepts
Adverse physical conditions | M | Air combat manoeuvring can involve turning G-forces of 7-9 G, making it challenging to operate weapons and aircraft systems
Auditory overload/interference | M | Auditory sources include the intercom, radio, RHWR warnings and aircraft systems warnings
Visual overload | M | Radar, RHWR, JTIDS weapons systems display, aircraft systems displays and the visual scene
73
HFIDTCPIII_T13_01
Version 2/ 16 April 2011
7.6 Teamwork Analysis
During the TCHTA the basic mechanics of the teamwork involved in the task have been
determined. This stage of the analysis captures the higher order teamwork process
priorities for the team/collective tasks and consolidates the descriptions of teamwork
interactions by role.
7.6.1 Teamwork Process Priorities
The Teamwork Process Priority Table, as shown in Table 29, facilitates the capture of the critical teamwork processes. Judgement is required as to the level of task (TO) at which process priority tables are constructed. The guidance is to complete the tables at as high a level as possible, as their main purpose within the TCTNA is to inform the definition of scenarios. They will also be of use during training design to inform assessment priorities for teamwork processes.
Table 29 Example Teamwork Process Priority Table

TO: BVR intercept of multiple bandits

Teamwork Process | Priority (H/M/L) | Task conditions/environmental demands that create the requirement for the process
Task coordination | H | Pairs tactics are conducted in a highly dynamic environment and co-ordination is fundamental to success
Workload management | L |
Information coordination | H | Radar meld and multiple target discrimination is fundamental to a successful intercept. Multiple aggressive fighters provide the greatest challenge
Resource coordination | M |
Collaborative problem solving | H | Determining the most appropriate intercept tactics is a key success factor
Collaborative planning | H | The plan has to be determined by both crews in the shortest possible time
Conflict management | L |
7.6.2 Teamwork Interaction Analysis by Role
Teamwork interactions are the low-level observable behaviours of teamwork that occur between two team members. It therefore makes most sense to capture the priorities for interactions by role for situations where training is occurring at team level. When training is aimed at a higher collective level, collecting this level of detail would be merited if collective-level operation has a particular impact on members of a particular team or sub-team. The Teamwork Interaction Table, as shown in Table 30, provides a means of recording the appropriate information. Supporting KSAs and taskwork can be captured as appropriate. The example in Table 30 is for the Wingman WSO. Recording the supporting KSAs that are required facilitates a cross-check that individual training provides the necessary underpinning KSAs. The Resource Allocation line has been left blank, as resources cannot be passed between individuals in a tandem, two-seat aircraft cockpit. It is more meaningful simply to think of passing responsibility for using available resources (task allocation), such as use of radios or defensive aids.
Table 30 Teamwork Interaction Table

Role: Wingman WSO

Teamwork Interactions | Actions | Supporting KSA / Taskwork
Performance Monitoring | Monitoring lead aircraft tactical advantage during BVR engagements | Pairs tactics
Task Allocation | Giving evasive action directions to pilot during visual engagements | Air combat manoeuvring, fighter tactics, threat assessment
Resource Allocation | |
Communication | Reporting radar contacts to formation leader. Communicating radar picture to Lead WSO during radar meld | Standard voice procedure
Backup Behaviours | Employing defensive aids during visual combat | Defensive aids employment parameters
7.7 Training Gap Analysis
Having developed a complete description of the tasks that the team or collective
organisation has to perform, the next step is to identify which tasks potentially need to be
trained and what their relative priorities are for training. If resources or time are limited
then it is possible that only the highest priority tasks will be trained. The “do nothing”
option has to be considered, in other words the acceptability of providing no training has
to be evaluated. If it is determined that doing nothing is an acceptable course of action,
then the remaining stages of analysis are not required. A positive decision to train
necessitates the specification of training objectives and the subsequent analysis of the
training overlay and the training environment.
In the individual training context, the term gap analysis is used because the issue can be described as determining the difference, or gap, between an individual’s current capability and the capability required to carry out the task. Deficiencies in legacy training may also widen the gap between the input standard and the required output standard, and it is this gap that determines what must be trained. Training priorities may be determined by analysis of the difficulty, importance and frequency of the tasks concerned. Tasks of high difficulty and importance would be high-priority candidates for training, with tasks of low difficulty and importance being of the lowest priority. The frequency of conduct of the task is also relevant, in that high-frequency tasks may have some potential for being trained on the job. This is meaningful in the individual training context as there is the potential for mentoring and instruction by a more experienced operator in the job context.
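One way to picture the difficulty-importance-frequency weighting described above is as a simple scoring rule. The numeric weighting below is purely illustrative, an assumption made for the sketch rather than a scheme prescribed here:

```python
LEVELS = {"L": 1, "M": 2, "H": 3}

def training_priority(difficulty, importance, frequency):
    # Sum difficulty and importance; discount high-frequency tasks slightly
    # because they may be trainable on the job. The weights are illustrative
    # assumptions, not a prescribed scheme.
    score = LEVELS[difficulty] + LEVELS[importance]
    if frequency == "H":
        score -= 1  # candidate for on-the-job training
    return score

# A task of high difficulty and importance, rarely performed,
# scores highest and so is the strongest candidate for training.
print(training_priority("H", "H", "L"))
```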
The collective training context is more complex. Typically a team or collective organisation will have a mix of people at different levels in the organisation with a variety of backgrounds and experience levels. This is further complicated by the fact that people are often posted in and out of the organisation at different times, so the overall experience level of the group may be constantly changing. Defining an input standard is therefore problematic. Similarly, training on the job is more problematic as a concept, since there is no equivalent of the more experienced operator to mentor and train the team itself.
7.7.1 The Risk Management Approach
The approach recommended for identifying priorities for team and collective training is to use a risk management approach predicated on the “do nothing” option. In other words, the risk associated with not providing training for a given task is estimated, based on the likelihood of errors being made and the severity of the consequences of those errors. Tasks with a high level of risk associated with their not being trained would be high-priority tasks for training. From a risk management perspective, the decision as to whether or not a task requires training is essentially a statement as to whether or not training is required as mitigation for the perceived risk.
7.7.2 The Training Priorities Table
To apply the risk management approach it is necessary to construct a table containing the
following entries for each task:
• Critical errors
• Likelihood of each error
• Consequences of each error
• Severity of each error
• Risk associated with each error
• The decision as to whether training is required in mitigation
A suggested format for such a table, described here as a Training Priorities Table, is shown in Table 31. The information concerning the critical errors associated with each task is already available in the Task Description Tables produced during Training Task Analysis and simply needs to be copied across. Guidance for rating the likelihood and severity of errors is shown in Table 32, in the form of factors affecting the likelihood of an error and indicators of high-severity errors. These ratings, along with the assessment of the overall level of risk and recommendations about the requirement for training as mitigation, are matters of subjective judgement and require SME input.
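A sketch of how the table's ratings might be combined mechanically is given below. The averaging rule is an assumption chosen because it is consistent with the example ratings in Table 31; in practice, as noted above, the overall risk rating and the training decision are matters of SME judgement.

```python
LEVELS = {"L": 1, "M": 2, "H": 3}
NAMES = {1: "L", 2: "M", 3: "H"}

def risk(likelihood, severity):
    # Average the two H/M/L ratings, rounding down. This simple rule is an
    # illustrative assumption, not a formula prescribed by the method.
    return NAMES[(LEVELS[likelihood] + LEVELS[severity]) // 2]

def training_required(risk_rating, threshold="M"):
    # Recommend training as mitigation when the risk meets the threshold.
    return "Y" if LEVELS[risk_rating] >= LEVELS[threshold] else "N"

# Ineffective search against a bandit fighter: likelihood L, severity H.
print(risk("L", "H"), training_required(risk("L", "H")))
```

With these assumptions, likelihood L and severity H combine to risk M, which crosses the threshold and yields a "Y" recommendation.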
Table 31 Example Training Priorities Table

Task | Critical Error | Likelihood (H/M/L) | Consequences | Severity (H/M/L) | Risk (H/M/L) | Training required as mitigation (Y/N)
1.0 Destroy Bandit Fighter BVR | Ineffective search | L | Bandit not detected and able to attack | H | M | Y
1.0 Destroy Bandit Fighter BVR | Slow radar meld | M | Bandit has additional time to evade or take the offensive | M | M | Y
2.0 Destroy Bandit Fighter in Visual Range | Poor use of pairs tactics | H | Bandit able to evade and take offensive | H | H | Y
3.0 Destroy Bandit Bomber BVR | Ineffective search | L | Bandit not detected and able to continue to target | M | L | N
3.0 Destroy Bandit Bomber BVR | Slow radar meld | M | Bandit has additional time to evade | M | M | Y
4.0 Destroy Bandit Bomber in Visual Range | Poor use of pairs tactics | H | Bandit able to evade and take offensive | H | H | Y
Table 32 Factors Affecting the Likelihood of an Error and Indicators of High Severity Consequences

Factors Affecting the Likelihood of an Error
• Task complexity (the number of steps within the task).
• Degree of teamwork required (the required degree of communication between team members in task completion is a reasonable proxy for this).
• Judgement of task success – whether the task can be judged to have been completed successfully by team members (if this is not the case, there is no possibility of team performance self-correction).
• Task recoverability/reversibility – there are examples of tasks which are ‘difficult’ (such as threading a needle); however, because lack of success in task execution can be easily judged and the task reattempted, the likelihood of an uncorrected error occurring is reduced.
• Time pressure (task delay tolerance) – operating under time pressure increases the likelihood of error.
• Stress and fatigue – task performance expected under conditions of stress or physical fatigue is more prone to error.
• High task loading – occurs if the task is performed concurrently with other tasks.
• Information availability – if information to perform the task may be incomplete or of uncertain quality.
• Time after training – time between training and task execution.

Indicators of High Severity Consequences
• Mission failure
• Loss of capability
• Injury to personnel
• Damage to essential equipment
• Collateral damage or injury to third parties
• Breach of law or military regulations (such as ROE being improperly applied)
• Significant financial loss
7.8 Team/Collective OPS and TO Development
Operational Performance Standards (OPSs) and Training Objectives (TOs) have
essentially the same format of performance, conditions and standards. The difference
between the list of OPSs and the list of TOs is that the OPS list will contain all tasks
whereas the TO list will only contain TOs for those tasks that are to be trained.
Because of the way that the analysis has been structured and recorded, all of the data is
already available to construct the list and much of it can be simply copied from the
appropriate tables. The performance statements and standards can be copied from the
Task Description Tables directly. One judgement that has to be made is the level of
decomposition of the objectives that is presented. The HTA process yields a detailed
decomposition of the tasks which could be used to populate an instructional scalar that
covers both training objectives and enabling objectives. It may be appropriate to limit the
OPS list and the corresponding list of TOs to the top level of the decomposition. The
detail is not wasted, as it provides the training designers with the enabling objectives and
saves further analysis.
The conditions for the tasks are in fact defined by the system context descriptions – the
description of each environmental element in the Environment Description Table. Noting
that not all elements will necessarily apply to all TOs and OPS statements, an expedient
way of dealing with the complexity of the environmental conditions is to copy the
element descriptions into a numbered list and refer out to the list.
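The assembly of a TO statement from fields already captured, with conditions referred out to the numbered list, might be sketched as follows. The helper name and condition descriptions are illustrative; the standard is taken from Table 25.

```python
# Numbered conditions list built from the Environment Description Table
# (the two entries here are illustrative summaries).
conditions_list = {
    1: "Threat: bandit fighters armed with radar-guided and heat-seeking missiles",
    2: "Time pressure: closing speeds in the order of 1000 knots",
}

# Fields copied from a Task Description Table (standard from Table 25).
task_table = {
    "task": "1.2.5 Meld radar",
    "standard": ("Radar meld achieved accurately and sufficiently quickly "
                 "as not to lose tactical advantage"),
}

def build_to(table, condition_refs):
    # Assemble a performance/conditions/standards statement, referring out
    # to the numbered conditions list rather than copying each description.
    refs = ", ".join(str(n) for n in condition_refs)
    return (f"Performance: {table['task']}. Conditions: see environmental "
            f"conditions {refs}. Standard: {table['standard']}.")

print(build_to(task_table, [1, 2]))
```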
7.8.1 Linking Tasks and Training Objectives to Mission Task Lists
There is a current aspiration to generate a coherent structure to link collective task
performance and the associated training objectives to the tasks listed in the Mission Task
Lists (MTLs) so that the contributors to Operational Performance can be identified and
training costs justified. It is also of significance from the perspective of change
management, facilitating the identification of changes that are required to training as a
consequence of changes to the MTLs.
Figure 26 General mapping of Tasks to METs
The issue which this presents is that many of the tasks and related training objectives may well link to many tasks listed in the MTL. This is illustrated diagrammatically in Figure 26. The consequence is that many checks have to be made to identify all the linkages. To put the magnitude of this issue into context, in each of the QE Class Role Group summaries prepared by the Aircraft Carrier Alliance there are in the order of 110 tasks identified for Air Traffic Control (ATC) roles, and approximately 310 tasks listed in the MTL(Maritime) (MTL(M)). Potentially, in excess of 30,000 cross-checks would have to be made to determine how the tasks link to the MTL(M). Careful inspection of the MTL(M) shows that there are five tasks which the ATC tasks support (see Figure 27), reducing the number of cross-checks required to 550.
In information systems engineering this problem is resolved with a linking object (in database design this is a “join table”), which avoids the duplication of information. In the ATC example, the linking object which connects all ATC collective tasks is the function that is delivered through their synergy – in this case the ATC Service itself. Multiple ATC collective tasks (launch, recovery, handovers, overflight etc.) support the function “Operate ATC Service”. Likewise, “Operate ATC Service” contributes to multiple military tasks. Figure 27 expresses this idea graphically.
Figure 27 Mapping of Example ATC Role Group Summary Tasks to Military Tasks
in the MTL(M)
This approach avoids having to cut and paste all the MTL(M) references into each of the ATC task descriptions and training objectives. All that is required is to identify which function is supported by the collective task, and then which METs that function (in this case “Operate ATC Service”) supports.
Therefore, the recommended approach is to identify the top-level functions under which the tasks and training objectives can be grouped, and map these to the MTLs.
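The linking-object idea can be sketched directly in code. The task names follow the ATC example; the MET identifiers are placeholders, not entries from the actual MTL(M).

```python
# Each collective task points at the single function it supports (the
# linking object), and each function lists the METs it contributes to.
task_to_function = {
    "Launch": "Operate ATC Service",
    "Recovery": "Operate ATC Service",
    "Handover": "Operate ATC Service",
    "Overflight": "Operate ATC Service",
}
function_to_mets = {
    "Operate ATC Service": ["MET-A", "MET-B", "MET-C", "MET-D", "MET-E"],
}

def mets_for_task(task):
    # Two look-ups replace a pairwise check of every task against every MET,
    # so a change to the function-to-MET mapping is made in one place only.
    return function_to_mets[task_to_function[task]]

print(mets_for_task("Launch"))
```

As in a database join table, adding a new collective task requires only one new entry pointing at the function, not a fresh set of MET references.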
8 Training Overlay Analysis
8.1 Introduction
The Instructional Overlay is the totality of all instructional methods and functions and
associated hardware, software and communication channels which deliver those
instructional functions. For example the instructor station in a flight simulator constitutes
part of the instructional overlay as this is the device that generates alterations to the task
environment (e.g. the hardware and software that enables engine fires to be simulated and
weather to be selected). The purpose of training overlay analysis is to determine
appropriate instructional methods for the tasks to be trained, the instructional functions
that have to be supported to deliver the methods identified and the specification of how
these instructional functions have to be supported.
In individual training the selection of instructional methods and the identification of
instructor tasks is relatively simple, often with a single instructor employing a range of
methods with a group of students. It becomes more complicated when individual training
has to take place in a group context as more instructors and more resources may be
required. For example, training platoon commanders usually requires students on the
course to act as platoon members whilst one individual is practising the platoon
commander role. Such training requires a training area, people to role-play enemy and
multiple instructors to run the training.
Team and collective training takes this complexity to another level. Taking the example of Fleet Synthetic Training, which is run by the US Navy for training Carrier Battle Groups and in which RN warfare teams and battle staffs participate, the training audience is approximately 2,000 strong and the instructional and support staff number in the order of 250. Of these, 50 are instructors engaged in observing the teams at work on board ship or in warfare team simulators. Another instructional team manages the running of the scenario. They in turn are supported by a team of role players who interact with the training audience and manipulate elements in the simulated environment (such as enemy vessels and aircraft) to cause the injects to happen.
The approach described in this chapter is designed to provide a logical method for both
determining appropriate instructional strategy and identifying instructor tasks.
8.2 Instructional Theory
This section provides a short overview of instructional theory that can easily be applied to
Team and Collective Training.
8.2.1 Clark’s Content-Performance Matrix (Clark, 2008)
There are a number of taxonomies available that guide the classification of training content, but not all provide related guidance on suitable elaboration strategies. Clark’s Content-Performance Matrix both guides the classification of content and provides related guidance on elaboration strategies (Clark, 2008). The matrix is reproduced at Table 33.
Table 33 Training Content Performance Matrix with Application Level Elaboration Strategies, adapted from Clark (2008)

Content Type | Remember Performance | Apply Performance | Elaboration Strategy for Apply
Facts | Remember the facts | N/A | N/A
Concepts | Remember the definition | Classify new examples | Provide the definition; provide examples; provide counter-examples; provide practice examples for classification
Process | Remember the stages | Solve the problem; make inferences | Explain key concepts of the process; explain the stages of the process; provide practice exercises requiring application of process knowledge; assessment by observation of performance
Procedure | Remember the steps | Perform the procedure | Clear statement of the steps of the procedure, with illustrations; a demonstration of the procedure; hands-on practice, using the actual equipment, with feedback; assessment by observation of performance
Principle | Remember the guidelines | Perform the task; solve problems | State guidelines or show examples of guidelines being implemented; present non-examples; practice application of guidelines in realistic circumstances, with feedback; assessment by observation of performance
What makes the matrix particularly useful is that elaboration strategies are offered for
both remembering content and applying it. The table shows a high level synopsis of the
recommended elaboration strategies for application, as team training is focused on skills
application. Team training is typically concerned with teaching the application of
procedural skills and the application of principles (such as doctrine). The point of note is
that the recommended elaboration strategies require the teaching of the underpinning
knowledge before practice takes place.
8.2.2 Part-task training
Figure 28 Part Task Training Sequence (adapted from Wickens and Hollands, 2000)
Part-task training is a familiar approach used widely in individual training. A typical part-task training sequence, adapted from Wickens and Hollands (2000), is shown in Figure 28, which illustrates subtask A being trained, followed by subtask B, and then the whole task (A and B together) being trained. The key criterion for the utility of part-task training is that it is only successful if the subtasks are loosely coupled. If there is a high degree of interconnection between the subtasks then part-task training is unlikely to be suitable.
The part-task approach is potentially of use in team and collective training where difficult
subtasks exist that are not highly connected to other subtasks and could therefore be
practiced separately. Such tasks can be identified from the TCHTA diagram and Tasks
Description Tables. The difficulty of the task is reflected in the critical errors and
consequences fields and the description of the teamwork requirements, and the degree of
connection to other tasks can be evaluated based on the complexity of the plan that
describes its relative sequencing (also shown in the Task Sequence Diagram) and the
inputs from other tasks.
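A rough way to screen sub-tasks for part-task training is to count their links to sibling sub-tasks. The dependency data and the one-link threshold below are illustrative assumptions; in practice the links come from the plans and input/output fields of the Task Description Tables.

```python
# Illustrative map of each sub-task to the sibling sub-tasks it takes
# inputs from (task names follow the F3 pair example).
dependencies = {
    "1.2.1 Allocate radar search bands": set(),
    "1.2.4 Locate bandit": {"1.2.1 Allocate radar search bands"},
    "1.2.5 Meld radar": {"1.2.4 Locate bandit"},
}

def coupling(task, deps):
    # Coupling approximated as incoming plus outgoing links to siblings.
    incoming = len(deps[task])
    outgoing = sum(task in needs for needs in deps.values())
    return incoming + outgoing

def part_task_candidates(deps, max_links=1):
    # Loosely coupled sub-tasks (few links) are candidates for separate practice.
    return [t for t in deps if coupling(t, deps) <= max_links]

print(part_task_candidates(dependencies))
```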
8.2.3 Reigeluth’s Simplifying Conditions Method (Reigeluth, 1999)
The essence of Reigeluth’s Simplifying Conditions Method (SCM) is that, when training
complex tasks, the whole task should always be taught and practiced, but in increasing
levels of difficulty (Reigeluth, 1999). The initial scenario would be as simple as possible
(such as driving a car in a wide open space). The diversity and complexity of scenario
elements would gradually be increased leading to scenarios representative of the most
complex situations (such as driving a car through a busy town at rush hour). This type of
training sequence is illustrated in Figure 29.
Figure 29 Simplifying Conditions Method Training Sequence (Adapted from
Reigeluth, 1999)
Whilst the detailed application of SCM is most pertinent to the detailed training design
phase, it has relevance at a high level during TNA as there may be a need to specify a
simplified environment for the early stages of task practice.
8.2.4 Team Training Methods
Salas and Priest (2005) identify three basic methods for the delivery of team training:
1. Information based.
2. Demonstration based (e.g. video).
3. Practice-based (e.g. guided practice).
These three methods are consistent with the approaches recommended by Clark (2008) for the elaboration sequences for teaching principles, procedures and processes, with information and demonstration forming the initial instruction to be followed by practice.
Salas and Priest (2005) also identify the following key principles of training:
• Team training should lead to teamwork skill development (e.g. leadership, adaptability).
• Both process and outcome measures are needed for teamwork diagnosis.
• Simulations should allow team members to experience some alternative course of action.
• Response strategies, provided during training, should be linked to cues in the environment and from other team members.
• Team members should be provided sufficient opportunities to interact with novel environments in order to develop adaptive mechanisms.
• Team training should be more than a “feel good” intervention.
• Guided practice is essential.
• Training should establish a mechanism to foster teamwork.
• Both teamwork and taskwork competencies are needed for effective team functioning.
From the TCTNA perspective, these principles underline the importance of providing
representative practice environments in which the training audience are presented with
credible scenarios which stress their capability to conduct both the taskwork and
teamwork elements of their tasks. Furthermore, effective coaching and feedback
(including After Action Review) have to be provided.
The implications for this stage of analysis are that it is essential to identify the following:
• The critical features of generic scenarios that are required to stress the teamwork and taskwork components of the tasks to be trained.
• The data that have to be gathered in order to inform assessment of the processes and products of team activity.
• The instructional tasks that have to be fulfilled to plan, deliver and evaluate the training.
8.2.5 The Relationship between Tasks and the Operational Environment
There is a significant relationship between TOs and the operational environment that has
to be understood from the training perspective. Different tasks require inputs from and
provide outputs to different elements in the environment, as captured in the context
descriptions at the start of the Team Task Analysis Phase. The corresponding TOs will
therefore require representations of these sets of environmental elements. This is
illustrated in Figure 30 which shows the mapping of tasks to the operational environment.
87
HFIDTCPIII_T13_01
Version 2/ 16 April 2011
Figure 30 Mapping of TOs to the Operational Environment
TOs 1 to 4 shown in the diagram each require different but overlapping sets of elements in the operational environment. TOs 1.4 and 3.1 also have overlapping requirements, but for much smaller subsets of environmental elements. Accurate identification of part-task training requirements and of opportunities for simplified conditions ensures that overly complex training environments are not specified.
8.3 Instructional Method Selection
Team tasks can typically be considered to involve carrying out procedures and processes
and applying principles to solve problems in both planning and the execution of plans.
The general instructional sequence of providing initial instruction, followed by
opportunities to practice the required skills in a realistic environment with feedback on
performance, and culminating in assessment still holds true. Generally practice and
instruction will take place in the same environment and are closely linked. It therefore
makes sense to consider the methods for initial instruction and the methods for practice
and assessment as two separate components.
8.3.1 Selecting Methods for Initial Instruction
From a task perspective, the general elaboration strategies for teaching processes,
principles and procedures require the teaching of the appropriate knowledge components
underpinning them. If this is not undertaken and training consists of practice followed by
assessment, trial and error could prove to be an expensive way of developing an
understanding of these underpinning knowledge elements. For example, classroom-based
training on standard operating procedures could provide an invaluable opportunity for the
team to develop a shared mental model of how the team as a whole should be operating
and how the functions of sub-teams need to be co-ordinated. Similarly, instruction on the
application of doctrine, including such matters as the application of Rules of Engagement
in the context of a particular theatre of operations, provides the team with an opportunity
to develop shared mental models of how problems should be addressed and is likely to be
time well spent.
The initial instruction phase provides a key opportunity for addressing the teamwork
aspects of the tasks to be trained. Discussion of the priorities for teamwork interactions
and processes in the light of environmental task demands has the potential to provide the
team with sound mental models of how the team should function. Table 34 provides
suggestions for the content that should be covered in the initial instruction related to
teamwork objectives.
Table 34 Suggested Content for Initial Instruction on Teamwork

Level 1 (Team member KSAs): The organisational structure and relationships of roles; the need for shared mental models of the task and of how the team operates and communicates.
Level 2 (Teamwork Component Interactions (TCIs)): Priorities by role for: communication, task coordination, performance monitoring, workload management and backup behaviours.
Level 3 (Teamwork Processes): The priorities for: information coordination, resource allocation, resource coordination, task allocation, collaborative problem solving, collaborative planning and conflict management.
Level 4 (Team Properties): The significance of adaptability and cohesion in the conduct of the team's tasks in the light of the environmental task demands.
Having identified the topics that require initial instruction, appropriate methods need to
be identified. Classroom-based lectures and discussions are likely candidate methods.
Others such as demonstrations may be possible depending on the nature of the task being
trained.
A suggested output from this phase of analysis is an Initial Instruction Table. An example
Initial Instruction Table is shown at Table 35.
Table 35 Example Initial Instruction Table for F3 Pairs Training

Training Objective: 1.0 Destroy Bandit Fighter BVR
Topics for Initial Instruction: Pairs tactics for engaging radar-equipped bandits BVR; principles for coordinating a pair's engagement
Method: Classroom lecture and discussion with illustrations of good and bad examples of the application of tactics

Training Objective: 2.0 Destroy Bandit Bomber BVR
Topic for Initial Instruction: Pairs tactics for engaging non-radar-equipped bandits BVR
Method: As for 1.0

Training Objective: 3.0 Destroy Bandit Aircraft in Visual Range
Topic for Initial Instruction: Pairs tactics for visual attacks
Method: As for 1.0

Training Objective: 4.0 Identify key teamwork KSAs for the conduct of pairs intercepts
Topic for Initial Instruction: Critical teamwork behaviours in pairs intercepts
Method: Classroom lecture and discussion with illustrations of good and bad examples of the application of teamwork principles
8.3.2 Selecting Methods for Practice and Assessment
In general, the most likely method for the practice of team and collective tasks is practice of the task in a representative environment, with assessment based on observation of the team carrying out the tasks and a formal debrief at the end of appropriate phases of activity. However, there is a need to identify sub-tasks which require part-task training and tasks or sub-tasks that would benefit from practice in simplified conditions.
8.3.2.1 Identification of Part-Task Training Requirements
Consideration needs to be given to whether or not it is necessary or appropriate to train
some subtasks separately. Candidate subtasks for separate training would include:
a. Sub-tasks which do not require the whole team. If there are sub-tasks that only
require a sub-team it may make sense to train those tasks separately as a precursor
to whole team practice. The Task Role matrix provides a simple lookup table for
identifying subtasks which fall into this category.
b. Sub-tasks which have different environmental requirements. If there are sub-task components which require only a subset of the environmental elements, it may make sense to train these tasks separately from the perspective of efficient training resource utilisation. Such subtasks can be identified from the Input and Output fields of the Task Description Tables.
c. Sub-tasks which are difficult and require additional practice. Tasks which are considered to be particularly difficult may well benefit from additional practice if it is considered that sufficient practice cannot be achieved during practice of the whole task. Such subtasks can be identified from the Typical Errors and Consequences fields of the Task Description Tables.
Candidate subtasks for separate training may well fall into more than one of these
categories. An example would be the training of a Battle Staff to apply the Combat
Estimate. The development of a Combat Estimate is based around answering seven key
questions that have to be addressed when formulating an operation plan. During the
execution phase of an operation the Battle Staff have to simultaneously control the
execution of the plan, adjust the plan in response to events during execution (which may
require reapplication of the Combat Estimate process) and apply the Combat Estimate
process to further phases of the operation. During execution of the plan, inputs are required from the subunits executing it, which constitutes a more complex environment than that required for developing the plan. The development of the combat estimate is complex and is a task which merits additional practice.
8.3.2.2 Identification of Simplifying Conditions Method Requirements
Consideration should be given to identifying where there is benefit to be gained from
practicing tasks in simplified conditions prior to practicing the tasks in the full
environment. This is particularly applicable to complex tasks and tasks where there are
significant environmental task demands. In such cases there is likely to be a high degree
of teamwork required and there may need to be training effort focussed on the teamwork
aspects of the task.
An example from the F3 pairs training task would be the practice of intercepts against
bandit fighters in visual range. This task requires operation of all the aircraft systems and
air combat manoeuvring which can result in the crews experiencing sustained G-Forces
of up to 7-G or more. It is particularly challenging when conducted at low-level because
of the inevitable risk of collision with the ground or sea. Practicing the task in simplified
conditions without the physical stressors, the element of danger and the need to deal with other aircraft systems could allow the students to focus on the key issues of applying the appropriate tactics and the teamwork interactions and processes necessary to carry out the intercept effectively.
8.3.2.3 Assessment and Feedback Methods
Given that the general approach for team and collective training is to facilitate practice of
the required teamwork and taskwork skills in a representative environment, assessment is
essentially based on observation of team performance. The detail for assessment has
already been captured in the Task Description Table for each task identified in the
TCHTA.
There are a number of options for the provision of feedback, which include:
• Coaching of individuals during task performance
• Coaching of sub-teams during task performance
• Hot debriefs at the conclusion of an exercise component or phase
• After Action Review of whole team performance at the end of an exercise phase.
8.3.2.4 Practice and Assessment Methods Table
The Practice and Assessment Methods Table provides a simple record of the methods
selected for each training objective, identifying whether the whole task will be trained in
a representative environment alone or if a simplified environment will be required. Any
subtasks that require part-task training are also identified.
Probably the most challenging aspect of completing this table is the estimation of the
time that is required for training. This is a significant value because it will impact on the
cost of training. There are no convenient formulae or even heuristics to guide this
estimation. The best guides are the duration of similar existing training and the estimates of SME instructors. Any constraints on course duration and on resource availability (such as available aircraft training hours) have to be taken into consideration.
Table 36 Example Practice and Assessment Methods Table

Training Objective: 2.0 Destroy bandit fighter in visual range

Practice method 1: Simplified conditions practice of intercepts
Assessment method: Observation of task performance; evaluation of recorded data of missile launch parameters
Feedback method: Hot debrief after each intercept in a training sortie; formal AAR with all exercise players post sortie
Time estimate: 5 hours

Practice method 2: Practice of whole task in representative environment
Assessment method: Observation of task performance; evaluation of recorded data of missile launch parameters
Feedback method: Hot debrief after each intercept in a training sortie; formal AAR with all exercise players post sortie
Time estimate: 7 hours
8.3.3 Training Scenario Specification
A critical part of the training overlay is the nature of the scenarios that are delivered in
which the training audience practices their tasks. Noting the guidance on team training in
section 8.2.4, it is imperative that the scenarios are representative and credible.
Furthermore, they should reflect the environmental task demands identified for the task
during the Team/Collective Task Analysis. Whilst the development of detailed scenarios
is the domain of training design and subsequent preparation for specific training events, it
is necessary to capture the key features of required scenarios in a generic way as these
inform the specification of the training environments at the next stage of analysis. Any
potential training environment must be capable of delivering appropriate scenarios or it
will serve no useful purpose for training. That said, it may well be both inefficient and
inappropriate to attempt to provide the full operational environment for training each task
(hence the consideration of simplified conditions environments).
There are five sources of information from the task analysis phase that need to be
considered when developing training scenarios, illustrated in Figure 31.
Figure 31 Information Inputs to Generic Scenario Specifications
The generic scenario specifications provide the baseline information for the development
of the scenario. The task descriptions amplify this as they capture the inputs that have to
be provided and the outputs that have to be delivered. However, it is the environmental
task demand information, in conjunction with the information about the types of error
that are likely, that provide the characterisation of what makes the task difficult. The
scenario should contain elements that place the appropriate demands on the team. In the
case of F3 Pairs training, factors such as visibility, difference in altitude between the
bandits and the fighters (a bandit above the fighters has a tactical advantage), and the
aggressiveness of the bandits are significant elements which ideally should be
controllable. It is also necessary to consider the teamwork elements that are priorities as
this may influence the scenario requirements. Communication, performance monitoring, backup behaviour, coordination and task allocation are significant for a pair's intercept. Appropriate bandit tactics can generate the requirement for backup of the leader by the wingman.
It may be that the generic scenarios identified at the start of the analysis process can each
be amplified to ensure there is sufficient detail to inform the description of the training
environment (with detail such as what control of the elements is required and how their
numbers and actions may vary). Recording which training objectives are covered by
which scenario can provide a useful cross reference of coverage of training objectives. It
may also be appropriate to develop scenarios for aggregations of training objectives or
even individual objectives if they are likely to require different training environments,
such as live firing.
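The cross-reference of training objective coverage by scenario suggested above can be sketched as a simple set computation; the scenario and TO identifiers below are purely illustrative.

```python
# Illustrative sketch: which TOs does each scenario cover, and which TOs have
# no scenario at all? All identifiers are invented for the example.
scenario_coverage = {
    "Scenario 1": {"TO 1.0", "TO 2.0", "TO 3.0"},
    "Scenario 2": {"TO 3.0", "TO 4.0"},
}
all_tos = {"TO 1.0", "TO 2.0", "TO 3.0", "TO 4.0", "TO 5.0"}

# TOs covered by at least one scenario.
covered = set().union(*scenario_coverage.values())

# TOs needing a new or amended scenario (e.g. live firing).
uncovered = sorted(all_tos - covered)
print(uncovered)  # ['TO 5.0']
```

Recording coverage this way makes gaps explicit before the training environments are specified at the next stage of analysis.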
It is suggested that scenario specification is carried out in two stages. The first stage is to
develop a narrative description of the properties of the required training scenario. A
suggested format for recording these is the Training Scenario Table shown in Table 37.
Table 37 Example Training Scenario Table

Training Scenario Description 1
Training Objectives: 2.0 Destroy Bandit Fighter in Visual Range ……
A single bandit fighter should enter into visual range of the fighter pair and take appropriate aggressive defensive action to counter the attack, using air combat manoeuvre and flares against heat seeking missiles. If the Tornado pair loses tactical advantage then the bandit should counter-attack, taking heat seeking missile shots. Visual engagements should be conducted at both medium and low level, and should occur in degraded visibility as well as high visibility conditions. Bandit tactical advantage from initial positioning and altitude, and the aggressiveness of bandit tactics, should be controllable.
This then needs to be mapped into the Environment Description Table as shown in Table
38. This mapping facilitates the specification of the training environments.
Table 38 Environment Description Table Entries for Task Scenario Requirements

Training Scenario 1 (Simplified conditions practice)
Fighter Controller: Not required
Bandit Fighters: Bandit fighters take evasive action using air combat manoeuvring and flares if defensive, and counter-attack taking heat seeking missile shots if offensive. Bandit initial tactical advantage and aggressiveness of bandit tactics should be variable
Bandit Missiles: Heat seeking missiles
Bandit Bombers: Not required
Aircraft: Effects of G not required. Operation of aircraft systems other than weapons systems and flight controls and displays not required
Weather: Clouds, haze, precipitation, sun position and light levels
Terrain: Ground topology

Training Scenario 1 (Full task practice)
Fighter Controller: Not required
Bandit Fighters: Bandit fighters take evasive action using air combat manoeuvring and flares if defensive, and counter-attack taking heat seeking missile shots if offensive. Bandit initial tactical advantage and aggressiveness of bandit tactics should be variable
Bandit Missiles: Heat seeking missiles
Bandit Bombers: Not required
Aircraft: All systems and effects of G required
Weather: Clouds, haze, precipitation, sun position and light levels
Terrain: Ground topology
8.4 Instructional Task Identification
At this stage in the analysis it is necessary to identify the generic instructor tasks that will
have to be fulfilled in order to deliver the training according to the methods identified.
How these are actually instantiated will depend on the actual environments that are
selected. Consideration also has to be given as to what elements in the instructional
environment have to be configured and controlled by the instructors and the requirements
for data capture during scenario execution.
8.4.1 Instructor Task Table
It is suggested that an Instructor Task Table is developed to provide a description of the
instructor tasks and identify requirements for the training environment to support the
training overlay. The exact nature of the task will be dependent on the nature of the task
being trained and the task environment. The following are suggested as generic headings
under which the table entries can be developed:
• Deliver Initial Instruction
• Deliver Practice
  o Prepare Scenario
  o Deliver Scenario
    - Configure the scenario
    - Control Scenario
    - Change Scenario
• Assess practice
  o Monitor and assess individual and team performance
  o Monitor and assess performance outputs
  o Provide feedback during task performance
  o Debrief practice
As the exact nature of each of these tasks will vary according to the training objective
being taught it is suggested that the description of the tasks be broken down by training
objective. An example of an Instructor Task Table and illustrative entries is shown in
Table 39.
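The generic headings and their breakdown by training objective can be held as a nested structure from which candidate table rows are generated. The sketch below is hypothetical: the 2.0/2.2/2.2.1 style numbering follows the entries shown in Table 39, but the numbers assigned to the remaining headings are assumptions.

```python
# Illustrative sketch: generic instructor-task headings as a nested structure.
# Numbering beyond the entries shown in Table 39 is assumed, not from the report.
instructor_tasks = {
    "1.0 Deliver Initial Instruction": {},
    "2.0 Deliver Practice": {
        "2.1 Prepare Scenario": {},
        "2.2 Deliver Scenario": {
            "2.2.1 Configure the scenario": {},
            "2.2.2 Control Scenario": {},
            "2.2.3 Change Scenario": {},
        },
    },
    "3.0 Assess Practice": {
        "3.1 Monitor and assess individual and team performance": {},
        "3.2 Monitor and assess performance outputs": {},
        "3.3 Provide feedback during task performance": {},
        "3.4 Debrief practice": {},
    },
}

def flatten(tasks, prefix=()):
    """Yield every heading path in the hierarchy as a tuple, parents first."""
    for name, subtasks in tasks.items():
        path = prefix + (name,)
        yield path
        yield from flatten(subtasks, path)

# One candidate Instructor Task Table row per heading, at every level.
rows = [" / ".join(path) for path in flatten(instructor_tasks)]
print(len(rows))  # 12
```

Iterating the same structure once per training objective gives the per-TO breakdown the text recommends.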
Table 39 Example Instructor Task Table Format and Entries

Instructor Task: 2.0 Deliver Practice / 2.2 Deliver Scenario / 2.2.1 Configure the scenario
Description: Simplified conditions practice of TO 2.0. Set the weather conditions and the geographical area of practice. Define bandit profiles.

Instructor Task: 3.0 Assess Practice / 3.2 Monitor and assess performance outputs
Description: Simplified conditions practice of TO 2.0. Monitoring of aircraft manoeuvre during intercepts. Visualisation of the manoeuvre of the F3 pair in 2-D and 3-D should be available for debrief. Monitoring and recording of intercom chat and radio transmission from both crews required. Monitoring and recording of visual scene (out of window).

Instructor Task: 3.4 Debrief practice
Description: Simplified conditions practice of TO 2.0. Use visualisation of the manoeuvre of the F3 pair in 2-D and 3-D for debrief.
8.4.2 Training Overlay Requirement Specification
Having identified the instructor tasks, the descriptions can be examined to identify the
requirements for the training environment. Requirements identified, such as defining the bandit profiles and the monitoring of aircraft manoeuvre in the examples given in Table 39, can be mapped directly into the Environment Description Table, as illustrated in Table 40.
This completes the analysis of the training overlay.
Table 40 Environment Description Table Training Overlay Requirements Entries

Training Objective 2.0 Destroy Bandit Fighter in Visual Range (Simplified conditions practice, overlay requirements)
Fighter Controller: Not required
Bandit Fighters: Bandit fighters fly a predefined profile and counter-attack with a degree of aggressiveness to match student performance. The instructor running the intercept should be able to change the profiles and bandit tactics as required during the sortie. Replay of bandit manoeuvre, missile launches and use of defensive aids required for debrief
Bandit Missiles: Heat seeking missile trajectory and effects required for debrief
Bandit Bombers: Not required
Aircraft: Monitoring of aircraft manoeuvre during intercepts. Visualisation of the manoeuvre of the F3 pair in 2-D and 3-D should be available for debrief. Monitoring and recording of intercom chat and radio transmission from both crews required. Monitoring and recording of visual scene (out of window) required
Weather: Clouds, haze, precipitation, sun position and light levels should be capable of being preset by the instructor
Terrain: Sea or land selectable by the instructor
9 Training Environment Analysis
9.1 Introduction
Training environment analysis is conducted in three stages. The first stage is the
specification of the required environments. The second stage is the identification of
environment options and the third stage is the detailed definition of putative
environments.
9.2 Training Environment Specification
Training environment specification is achieved by first rationalising the number of
training environments required and then conducting a fidelity analysis.
9.2.1 Training Environment Rationalisation
The first stage of training environment specification is to rationalise the number of
training environments that need to be specified. Training environment requirements for
each training objective have already been identified in the training overlay analysis
phase. The aim of this phase is to identify the optimal set of training environments.
Selection of the optimal environments is illustrated in Figure 32 a and b. Figure 32a shows that there is a large degree of overlap between TOs 1, 2 and 3; it therefore seems sensible to specify Training Environment 1 to provide whole-task training for these TOs, which will also cater for some aspects of TO 4. Training Environment 2 provides all
the environmental elements required for TO 4. In the F3 pairs example this could be
equivalent to providing a training environment for all aspects of Pairs intercepts except
for the full training in the use of the cannon at close range. Live firing with the cannon is
an annual training requirement and is conducted on a range using towed targets. This
would be analogous to Training Environment 2. TOs 1.4 and 3.1 are provided with
Training Environment 3 which is a part-task training environment. Training Environment
4 shown shaded in Figure 32 represents the provision of a simplified conditions training
environment for TOs 1-3. The “hole” in the middle of Training Environment 4 represents
the set of elements common to TOs 1-3 that are omitted in the simplified conditions.
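A first-pass rationalisation of this kind can be sketched by grouping TOs whose required element sets overlap strongly. The 0.5 Jaccard similarity threshold and all of the data below are illustrative assumptions, not values from the analysis.

```python
# Illustrative sketch: group TOs into shared training environments when their
# required element sets overlap strongly. Threshold and data are assumptions.
def jaccard(a, b):
    """Similarity of two sets: size of intersection over size of union."""
    return len(a & b) / len(a | b)

to_elements = {
    "TO 1": {"bandits", "weather", "terrain", "radar"},
    "TO 2": {"bandits", "weather", "terrain"},
    "TO 3": {"bandits", "weather", "terrain", "missiles"},
    "TO 4": {"cannon", "towed target", "range"},
}

groups = []  # each group of TOs shares one candidate training environment
for to, elems in to_elements.items():
    for group in groups:
        # Join a group only if the TO overlaps strongly with all its members.
        if all(jaccard(elems, to_elements[m]) >= 0.5 for m in group):
            group.append(to)
            break
    else:
        groups.append([to])

print(groups)  # [['TO 1', 'TO 2', 'TO 3'], ['TO 4']]
```

A greedy grouping of this kind is only a starting point; the final choice of environments still rests on the part-task and simplified-conditions judgements described above.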
Figure 32 Mapping of TOs to Training Environments (a. Whole Task and Part-task Training Environments; b. Simplified Conditions Training Environment)
9.2.2 Fidelity Analysis
Having identified the required training environments, the next stage is to determine the fidelity requirements for each environment. Fidelity refers to the degree of
correspondence between the training environment and the operational environment. To
take a simple example, consider a training environment that requires the inclusion of a
German Shepherd attack dog. If the representation of the attack dog is a one-foot tall,
pink, fluffy dog which, when prodded with a sharp stick, walks sideways towards you,
vigorously wagging its tail and then attempts to lick your hand, it would probably not be
perceived to be a particularly credible attack dog. Furthermore, it would be unlikely to
be of any particular value in training someone in how to treat such an animal. What the
example does illustrate is that the central concerns for fidelity are the physical form and
function of elements in the environment. Technically, these aspects are referred to as
physical and functional fidelity.
The challenge for fidelity specification is determining “how much” fidelity is required.
To push the attack dog example a little further, consider the requirement for training the
reactions on detecting an attack dog. From a physical fidelity point of view, one-foot tall,
pink and fluffy is probably not going to do the trick. Full size, accurate appearance, a
loud bark, and plenty of teeth are probably nearer the mark. From a functional fidelity
perspective, how it eats its dinner from a bowl and the ability to mark its territory when
encountering a lamp post are not likely to be of relevance. However, the distance at
which it can detect someone by scent and the effect of wind direction on the range at
which it can detect scent probably are, as is the speed at which it can run and under what
conditions it attacks.
A practical issue that needs to be considered is that SME input will be required.
However, there will be more than one opinion on the subject. Obtaining a consensus view
is advised. In the case of providing a simplified conditions environment for Tornado F3
pairs training, there were three main sets of views about the fidelity of controls and
displays. Some considered generic controls and displays were acceptable, some thought
that displays had to be exact but controls could be generic, and others thought that both
controls and displays had to be exact replicas. The final outcome was a sound British
compromise. Replica throttle, stick and radar hand controllers were procured. For the
systems displays, touch screens were used with faithful representations of the key
displays.
9.2.2.1 Specifying Fidelity Requirements
In order to fully specify fidelity requirements for a given training environment, each
element in the environment has to be considered in turn. The Environment Description
Table contains all of the elements that have to be described in fidelity terms. Fidelity
specification therefore amounts to making appropriate entries in the table. In the
Team/Collective Training Model (Figure 14) the Task Environment is shown as being
composed of five different categories of elements:
a. physical environment elements
b. manned system elements
c. systems elements
d. human elements
e. resource elements
Different data are required for each of these types of elements. The definitions and entries
for these five types of elements are considered in turn.
9.2.2.1.1 System Fidelity Requirements
For the purposes of fidelity specification, a system is anything with a mix of controls,
displays and output devices that the training audience has to interact with, such as
warfare systems, radios, and weapons systems. Descriptions of physical and functional
fidelity are required which consider the attributes in Table 41.
Table 41 System Fidelity Requirements

Physical Fidelity Requirements
Size: Does the item need to be full size or is a smaller representation acceptable?
Location: Do the controls and displays have to be correctly spatially located with reference to the operator's position, or if not, what is acceptable?
Appearance: Do the colour and texture matter? What are the critical appearance attributes?
Controls: Are all the controls required and, if not, which are?
Feel: Does the feel of the controls have to be replicated exactly?
Weight: If the system is portable, does it have to be a representative weight and balance?
Motion: What motion cues does it have to provide?
Sound: What sounds have to be produced and to what degree of fidelity?

Functional Fidelity Requirements
Format: Does the format of displays have to be replicated exactly?
Content: Can any display content be omitted?
Response: Does system response have to be replicated exactly or, if not, what elements can be omitted and what tolerance on system response is acceptable?
Appearance to other system elements: If the system interacts with other entities in the environment, what attributes must it have (e.g. an aircraft has a radar signature and heat signature)?
The appearance of the system to other elements in the environment is included to ensure
that data for all interactions is captured. It is included in the functional fidelity section as
it may be a dynamic attribute, for example an aircraft heat signature will vary according
to thrust or reheat selection.
9.2.2.1.2 Resource Fidelity Requirements
Resources are all the elements in the environment that are not human or manned systems.
These include logistics elements and equipment. The attributes in Table 42 need to be
considered.
Table 42 Resource Fidelity Requirements

Physical Fidelity Requirements
Appearance: Do the colour and texture matter? What are the critical appearance attributes?
Feel: If the item can be touched, does the feel of the item have to be replicated exactly?
Weight: If the item is portable, does it have to be a representative weight and balance?
Sound: What sounds have to be produced and to what degree of fidelity?

Functional Fidelity Requirements
Behaviour: What aspects of behaviour have to be produced to generate interactions with the team and to respond to interactions from the team?
Interaction information requirements: What information is required to generate interactions with the team or respond to team interactions?
9.2.2.1.3 Human Element Fidelity Requirements
Fidelity requirements for the human elements in the environment are of particular
importance. In order to be able to determine if a role player can be used, it is necessary to
capture the required knowledge and skills needed to generate interactions with the team
and to respond to team inputs. In simple situations, such as passing radio traffic, an
untrained operator could read from a script. However, in the case of a carrier strike group
exercising in a synthetic environment under the control of a Maritime Component
Command (MCC), the interactions are more complex and would require senior staff with
command experience to play the role of a staff member of the MCC. It is also necessary
to know what environmental information they require to generate inputs and responses
(for example the Fighter controller in the F3 Pairs training example needs to have a
dynamic, 2-D view of the tracks of both the pair and the bandits with altitude
information). The attributes in Table 43 need to be considered.
Table 43 Human Fidelity Requirements

Physical Fidelity Requirements
Appearance: What aspects of physical appearance and dress are significant?
Language: What language/dialect should the person speak if they interact by voice with the training audience?

Functional Fidelity Requirements
Behaviour: What aspects of behaviour have to be produced to generate interactions with the team and to respond to interactions from the team?
Interaction information requirements: What information is required to generate interactions with the team or respond to team interactions?
Knowledge and skills: What knowledge and skills are required to produce the required behaviour given the information and systems provided?
9.2.2.1.4 Manned Systems Fidelity Requirements
Manned systems, such as bandit fighters, are a hybrid between resources and human
elements. They appear dynamically in the environment but respond based on human
decision making, knowledge and skills. If the element is an enemy vehicle, it is necessary to determine how realistic it should be in appearance (or in whatever other way it is perceived) and how its behaviour is determined. Table 44 illustrates the attributes that need to be
considered.
Table 44 Manned Systems Fidelity Requirements

Physical Fidelity Requirements
Appearance: Do the colour and texture matter? What are the critical appearance attributes?
Sound: What sounds have to be produced and to what degree of fidelity?

Functional Fidelity Requirements
Behaviour: What aspects of behaviour have to be produced to generate interactions with the team and to respond to interactions from the team?
Interaction information requirements: What information is required to generate interactions with the team or respond to team interactions?
Knowledge and skills: What knowledge and skills are required to produce the required behaviour given the information provided?
Appearance to other system elements: If the system interacts with other entities in the environment, what attributes must it have (e.g. an aircraft has a radar signature and heat signature)?
9.2.2.1.5 Physical Environment Elements
Static features of the environment such as terrain require only a physical fidelity
description whereas dynamic elements (such as rain or waves) require functional
specifications as well. Table 45 illustrates the attributes that need to be considered.
Table 45 Physical Environment Fidelity Requirements

Physical Fidelity Requirements
• Appearance: Do the colour and texture matter? What are the critical appearance attributes?
• Feel: If the item can be touched, does the feel of the item have to be replicated exactly?
• Sound: What sounds have to be produced and to what degree of fidelity?

Functional Fidelity Requirements
• Behaviour: What aspects of behaviour have to be produced to generate interactions with the team and to respond to interactions from the team?
• Interaction information requirements: Information required to generate interactions with the team or respond to team interactions.
9.2.2.1.6 Recording Element Fidelity Specifications
Element specifications are recorded in the Environment Description Table. This has the
advantage of keeping the Environment Description Table as the single master document
that provides a comprehensive description of the required environment set. There will be
a set of entries for each of the training environments described. This is illustrated in Table 46.
Table 46 Tornado F3 Pair Environment Description Table

Training Environment 1: Simplified Conditions Environment for whole task training of all objectives

Fidelity Requirements

Fighter Controller (Human):
On receipt of a target information request, provide target location, height, heading and speed, and vector to target. Requires a tactical display showing the positions and speeds of the F3 and bandit aircraft. Knowledge of tactics and control procedures for air intercepts; ability to calculate appropriate vectors given a plan view equivalent to a radar display.

Bandit Fighters (Manned System):
Representative appearance in visual range. Accurate radar signature and heat signature. Speed, acceleration, climbing and turning capability to be fully representative. To attack: radar/visual indication of F3 positions and manoeuvre, knowledge of tactics. To evade missile attack: radar warning receiver indications of F3 radar lock, knowledge of tactics including use of defensive aids.

Bandit Missiles (Resource):
Pursue F3 Pair aircraft. Accurate speed and turning performance. Radar guided missiles: radar signatures of F3 and chaff, range to target. Heat seeking missiles: heat signature of F3 and flares, range to target.

Radar Homing and Warning Receiver (System):
Size and location of display does not need to be exact but must be easily viewed. Screen colour should be accurate. Audio warnings should be accurately replicated. Detection ranges should be exactly modelled. Visual indications should be exact.
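For analyses that will feed a contractual specification, it can help to hold the Environment Description Table in machine-readable form. The sketch below is one possible representation, not part of the TCTNA method itself: a nested mapping from environment to element to fidelity requirement statements, with entries abridged from Table 46 and the helper function name invented for illustration.

```python
# Sketch of an Environment Description Table as a nested mapping:
# environment -> element -> list of fidelity requirement statements.
# Entries are abridged from Table 46; the structure itself is illustrative.
env_description_table = {
    "Simplified Conditions Environment": {
        "Fighter Controller (Human)": [
            "Provide target location, height, heading, speed and vector on request",
            "Tactical display of F3 and bandit positions and speeds",
        ],
        "Bandit Missiles (Resource)": [
            "Pursue F3 Pair aircraft",
            "Accurate speed and turning performance",
        ],
    },
}

def requirements_for(env: str, element: str) -> list[str]:
    """Extract one element's requirements, e.g. to seed an
    Environment Option Properties Table for that element."""
    return env_description_table.get(env, {}).get(element, [])
```

Keeping the table in one place like this preserves its role as the single master description of the required environment set while letting each phase of analysis append entries.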
9.3 Training Environment Option Identification
Generally, the environment options for practising a team task will be either the use of
the live environment in some form or the use of simulation. If weapons effects are
involved there will invariably be a requirement for the simulation of weapons effects,
with separate provision for live firing if relevant.
The general considerations for the use of simulation include:

1. Safety – Simulation will be the most appropriate option where there is an unreasonable level of danger involved in using the live environment; one would not fire live weapons at human or manned targets, or practise dealing with an aircraft engine fire by setting an engine on fire in flight.

2. Cost – Simulation also offers the opportunity for significant cost savings in equipment maintenance, consumables and other costs associated with training delivery. A Tornado F3 costs tens of thousands of pounds per hour to operate, whereas the overall cost of a simulator would be in the order of hundreds of pounds per hour.

3. Time – The use of simulation can also deliver considerable savings in time. For example, conducting an hour of air intercept training covering 5 intercepts takes about 3 hours for the sortie, taking into account aircraft checks, transit time to and from the training area, and the time to reposition aircraft after each intercept. The equivalent training could be achieved in a simulator in about 40 minutes.

4. Resource availability – Simulation has the potential to address many resource constraint issues. With ever-increasing environmental pressures, access to training areas is becoming more problematic; in a simulator, the only constraint on geographical area is the availability of a suitable terrain database. Environmental conditions such as weather and darkness cannot be controlled in the live environment, whereas many conditions can be selected at will in a simulator. Resource availability also includes factors such as the ability to accurately represent threats from platforms other than those in the current inventory.

5. Instructional features – Simulators provide significant advantages in terms of the instructional features that they offer. For example, the ability to freeze the simulator to allow coaching if performance is poor, followed by resumption of the exercise after coaching, is a powerful feature, as is the ability to reset the simulator to a previous point in time to allow a repeat of the exact scenario experienced. In addition, considerably more performance data can be collected automatically in a simulator.
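The cost and time figures quoted above can be combined into a rough cost-per-intercept comparison. A minimal sketch follows; the hourly rates used (£20,000 live, £300 simulator) are assumed round numbers consistent with the "tens of thousands" and "hundreds of pounds" figures in the text, not actual costings.

```python
# Illustrative cost-per-intercept comparison using the figures quoted in the
# text. The hourly rates are assumptions for illustration only.
INTERCEPTS = 5

live_hours, live_rate = 3.0, 20_000   # sortie time for 5 intercepts; assumed £/hr
sim_hours, sim_rate = 40 / 60, 300    # simulator time for 5 intercepts; assumed £/hr

live_cost_per_intercept = live_hours * live_rate / INTERCEPTS
sim_cost_per_intercept = sim_hours * sim_rate / INTERCEPTS

print(f"Live:      £{live_cost_per_intercept:,.0f} per intercept")
print(f"Simulator: £{sim_cost_per_intercept:,.0f} per intercept")
```

Even with these placeholder rates the two-orders-of-magnitude gap the text describes is apparent, which is why cost alone often dominates the environment option decision.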
There are a variety of different types of simulation which could be considered. The
major categories include:
a. Virtual – Virtual simulation can be defined as “real people operating
simulated equipment in a virtual environment” and is exemplified by
vehicle and ops room simulations.
b. Constructive – Constructive simulation can be defined as “real people
exercising military decisions on the basis of information constructed by
a computer system”. A common type of constructive simulation is the
classic wargame, typified by the wargame supporting the Command and
Staff Trainer (CAST) at the Land Warfare Centre.
c. Live Simulation – Live simulation can be defined as “real people
operating real equipment with simulated effects in a live environment”
and is typified by the use of instrumented flying ranges and the Tactical
Engagement Simulation (TES) systems used at BATUS.
d. Embedded simulation – Embedded simulation is the incorporating of
simulation capability into operational equipment. Simulation modes
built into warfare systems would fall into this category.
e. Networked simulation – Networked simulation is the networking
together of multiple simulators. Examples include the Combined Arms
Tactical Trainer (CATT) and the Medium Support Helicopter Aircrew
Training Facility (MSHATF), both of which have multiple vehicle
simulators connected together on one site.
f. Distributed Simulation – Distributed simulation refers to the networking of
simulators and simulator networks across different sites. Examples
include the connection of the Cooke Warfare Team Trainers in
Portsmouth to US Navy simulation systems in Norfolk, Virginia, and
the connection of the Mission Training by Distributed Simulation
(MTDS) system at RAF Waddington to equivalent USAF systems in
Mesa, Arizona.
g. Synthetic Wrap – Synthetic wrap is an ingenious combination of the
use of live and virtual simulation to provide an extended battlespace for
training. This enables operational pictures to be populated with
elements that are outside the geographical area being used for training.
The challenge is transitioning an element traversing the constructive
space into the live space.
h. Augmented Reality – Augmented reality refers to the technique of
integrating synthetic elements into the live environment. A typical
example might include the insertion of synthetic target images and
weapons effects into a weapons display.
Identifying suitable options can be informed by the capabilities of extant systems but
often advice will be required on what constitutes the contemporary “art of the
possible”. Capability Joint Training Experimentation and Simulation (Cap JTES) in
MoD Main Building should always be consulted for advice, particularly as there is a
requirement for all new simulations to have the capability to be connected to extant
systems to provide greater capability for distributed training in the collective and
joint domains.
9.4 Training Environment Option Definition
Once an environment option has been identified it is necessary to provide a formal
description of it. It is suggested that this can be achieved with the construction of two
tables: an Environment Option Description Table and an Environment Option
Properties Table.
The Training Environment Option Description Table provides an overview of the
option in terms of a narrative description of the option, a list of the resources
required and details of how the instructional and support functions are implemented,
including training requirements. An example for the JOUST networked simulation
system, shown in Figure 33, is given in Table 47. A table would be created for each
option for the environment being considered.
Figure 33 JOUST Networked Flight Simulation System
Table 47 Example Training Environment Option Description Table

Simplified Conditions Environment for Pairs Intercept Practice

Option Name: JOUST

Description: JOUST is a networked, desktop flight simulation system that has the capacity to train two student crews and can be sited in a single room. Bandit aircraft can be pre-programmed virtual entities in the environment, or two manned consoles can be used. The performance data for the Tornado aircraft is based on manufacturer’s data, and that for the bandit aircraft on intelligence data. The student workstations have representative controls and displays, and an intercom system is provided which has a press-to-transmit facility to replicate radio use for communication between aircraft. Missile fly-outs and effects are accurately modelled, as are the effects of chaff and flares. The bandit workstations are single seat and have generic controls and displays.

An instructor console is provided which has facilities for configuring all the parameters required for the exercises, including geographical area, weather conditions and bandit profiles (if required). It provides a 3-D display of the intercepts and shows aircraft tracks and missile fly-outs. It can be rotated to give a plan display. This can be used both for monitoring student actions and for replay of the intercepts for debrief purposes.

Resource list: JOUST networked system in one room

Instructional/Support Role: Primary Instructor (1 required)
Instructional/Support Functions: Deliver Practice; Prepare Scenario; Deliver Scenario; Configure the Scenario; Control Scenario; Change Scenario; Assess Practice; Monitor and assess individual and team performance; Monitor and assess performance outputs; Provide feedback during task performance; Debrief practice
Training Requirements: Operation of the JOUST system, to include scenario construction, system configuration, control of scenarios and use of replay facilities

Instructional/Support Role: Assistant Instructor (2 required)
Instructional/Support Functions: Operate Bandit Aircraft
Training Requirements: Operation of the generic Bandit station
Having developed the overall description of the option, the next stage is to capture in
detail how it meets the requirements for the environment. This entails describing
how each element of the environment is represented and how the instructional
overlay requirements are met. The requirements for each element are described by
the corresponding column for that element in the Environment Description Table. It
is suggested that the appropriate entries in the table are extracted and used to
populate the first column of an Environment Option Properties Table for that
element. Subsequent columns can then be used to record the detail for each option
being considered for the training environment in question. Example table entries are
shown in Table 48.
Table 48 Example Training Environment Option Properties Table Entries

Environment: Simplified Conditions Environment for Pairs Intercept Practice
Environment Element: Bandit Missiles

Requirement: Heat seeking and radar guided missiles – effects based on launch parameters, kill probability and pairs manoeuvre.
JOUST: Radar guided and heat seeking missile fly-outs and effects shown in simulation on the instructor display.
Live Flying: Weapons effects simulation not available.

Requirement: Evaluation of missile effectiveness for debrief.
JOUST: Missile effectiveness modelled in simulation.
Live Flying: Shot effectiveness of Bandit missiles evaluated by the instructor’s inspection of video recordings of cockpit displays and estimates of launch parameters and kill probability.

Requirement: Video replay of missile effects for debrief.
JOUST: Video replay facility provided on the instructor station, which can be viewed by the students.
Live Flying: Video of cockpit displays replayed using the mission debrief facility.
9.5 Training Environment Option Evaluation
Having developed the Environment Option Properties Tables, comparison of the
technical options becomes a relatively simple matter of comparing column entries on their
technical merits. The technical evaluation of alternative options for each environment can
be shown quite simply using a standard traffic light colour coding scheme in an
Environment Options Comparisons Table as illustrated in Table 49. This allows key
weaknesses in each option to be quickly identified and gives a visual impression of the
degree to which the option satisfies the requirement.
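A traffic-light comparison of this kind is also easy to mechanise for larger option sets. The sketch below is illustrative only: the ratings assigned to JOUST and Live Flying are invented placeholders, not assessments from this report, and the worst-rating summary is just one way of surfacing key weaknesses at a glance.

```python
# Sketch of an Environment Options Comparisons Table with traffic-light
# ratings per requirement per option. The ratings here are invented
# placeholders; in practice each is a subjective technical judgement.
RED, AMBER, GREEN = "RED", "AMBER", "GREEN"

requirements = [
    "Missile effects based on launch parameters",
    "Evaluation of missile effectiveness for debrief",
    "Video replay of missile effects for debrief",
]
ratings = {  # option -> one rating per requirement (illustrative values)
    "JOUST": [GREEN, GREEN, GREEN],
    "Live Flying": [RED, AMBER, GREEN],
}

def weakest(option: str) -> str:
    """Return the worst rating for an option, i.e. its key weakness."""
    order = {RED: 0, AMBER: 1, GREEN: 2}
    return min(ratings[option], key=order.get)

for option, row in ratings.items():
    print(option, row, "worst:", weakest(option))
```

Sorting or filtering options by their worst rating reproduces the visual impression the colour-coded table gives: a single red cell flags an option that fails a requirement outright.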
Table 49 Training Environment Options Comparisons Table

Environment: Simplified Conditions Environment for Pairs Intercept Practice
Environment Element: Bandit Missiles
Options (traffic-light rated, with a Comments column): JOUST; Live Flying

Requirements:
• Heat seeking and radar guided missiles – effects based on launch parameters, kill probability and pairs manoeuvre
• Evaluation of missile effectiveness for debrief
• Video replay of missile effects for debrief
10 Training Options Evaluation
10.1 Introduction
The evaluation of training options is ultimately a subjective judgement of the balance of
time, cost and perceived efficacy of the alternative options, given the likely training
throughput. From one perspective this is likely to be simplified for team and collective
training as there are likely to be fewer options to consider given that there will be
relatively few training environment options.
10.2 Estimation of Costs
In principle the cost associated with a given option can be categorised as:
• Setup costs, including training environment acquisition and training development costs.
• Initial and ongoing instructor training costs.
• Maintenance costs of the training environments.
• Costs associated with running a given instance of the training course/event.
• Costs associated with the number of individuals attending the training course/event.
• Costs to integrate the solution into existing training environments (such as networking a simulator into existing simulator networks).
The cost model may be somewhat different if the training solution is to be acquired as a
training service.
Given that it is likely that a synthetic training solution will be required for some or all of
the training, advice from MoD and probably industry will be required to determine costs
in any meaningful way.
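For comparing candidate options, the categories above can be folded into a simple whole-life cost sketch. The function below follows the category list; the function name, the evaluation period and every figure in the example call are placeholder assumptions for illustration, not real costings.

```python
# Rough whole-life cost sketch for a candidate training option, following the
# cost categories listed above. All numbers are placeholders, not costings.
def option_cost(setup, instructor_training, annual_maintenance,
                cost_per_course, cost_per_student,
                courses_per_year, students_per_course, years):
    """Total cost of an option over 'years' of operation."""
    recurring = years * (annual_maintenance
                         + courses_per_year * cost_per_course
                         + courses_per_year * students_per_course * cost_per_student)
    return setup + instructor_training + recurring

# Example: 10 courses/year of 8 students over a 10-year period.
total = option_cost(setup=2_000_000, instructor_training=50_000,
                    annual_maintenance=100_000, cost_per_course=5_000,
                    cost_per_student=500, courses_per_year=10,
                    students_per_course=8, years=10)
```

Separating setup from recurring costs in this way makes the sensitivity to training throughput explicit, which matters when comparing a high-setup/low-recurring synthetic option against a low-setup/high-recurring live one.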
10.3 Estimating Effectiveness
Estimating the likely effectiveness of a training option is a difficult, subjective task,
although the efficacy of equivalent solutions, should any exist, can be a useful guide. The
detailed comparison of the technical capabilities of alternative environment options
afforded by the training environment analysis goes some way towards taking the
guesswork out of that aspect of effectiveness assessment.
10.4 JSP 822 Guidance
The guidance in JSP 822 for the comparison of training options should always be
followed.
11 Conclusions and Recommendations
11.1 Conclusions
This research was conducted to close the methodological gap that had been identified in
the conduct of TNA for team and collective training. The resultant TCTNA approach has
been developed following a four stage process of developing a team/collective training
model, reviewing team analysis methods applicable to TCTNA, developing a theoretical
TCTNA model and instantiating the theoretical model and developing guidance.
The Team/Collective Training Model has been designed to underpin the TCTNA
approach by providing a map of the components of team/collective performance and
training that should be considered during the analytical process. At the core is the Team
Performance Model, supported by a teamwork taxonomy, which shows how team
processes act on the inputs from the environment to produces outcomes aimed at
achieving a desired effect in the environment. Novel features of the model include the
characterisation of the environment itself (expressed in terms of the different categories
of element that make up the environment and the demands which the dynamic nature of
the environment makes on task performance), the mapping of instructional processes onto
the Team Performance Model, and the explicit identification of the requirement for
resources and systems to support the instructional processes.
The review of team analysis methods confirmed that there was no extant methodological
approach that was sufficient in itself for the conduct of TCTNA in its entirety, although
HTA(T) could serve as the foundation for task analysis and concepts from the CDA,
MATT and MEC approaches could be applied at different stages of analysis.
The theoretical model of TCTNA, based on the TNA Triangle Model devised in a
previous phase of HFI DTC research (HFI DTC, 2009), has five components, the
instantiations of which contain a variety of new techniques and adaptations of existing
techniques to cater for the greater complexity of team and collective training.
Team/Collective Task Analysis can be mapped onto the OTA and Gap analysis phases of
conventional TNA.
Constraints Analysis, Training Overlay Analysis, Training
Environment Analysis, and Training Option Selection map onto the Training Options
Analysis phase of conventional TNA. Constraints analysis is conducted in parallel with
the other phases of analysis and captures both the constraints that exist and their
implications for the choice of training solutions. Team/Collective Task Analysis uses
adaptations of software design methods to capture key information about the task
environment, and an extension of HTA(T) as a method for analysing team/collective
tasks from both a task and teamwork perspective. Training Overlay Analysis involves not
only a consideration of the methods to be used for training, as would be conducted for
individual TNA, but also a detailed analysis of the instructional and support roles
required to deliver training. Training Environment Analysis includes templates for the
specification of the different types of elements in the environment identified in the Team
Performance Model and an approach for comparing the suitability of the features of
alternative training environments. Another unique feature of the TCTNA method is the
construction of the Environment Description Table which is populated with data during
each phase of analysis and provides a detailed data set which can be used to inform the
contractual specification of training environments where required. Limited guidance is
also provided on training option selection. Templates and worked examples have been
provided to illustrate the application of each of the techniques advocated in the main
analytical stages.
The development of the TCTNA approach has benefited from the active participation of
Service TNA specialists, particularly the TNA staff at the Fleet Human Resources
Training Support Group. The opportunity exists to further refine the techniques
developed by eliciting their feedback on the experience of applying the TCTNA method
to the QE Class Carrier Collective TNAs currently under way.
The greatest opportunity for exploitation of this work within MoD lies in the inclusion of
the TCTNA approach within JSP 822. To this end, the authors of this report have been
invited to participate in the ongoing revision of JSP 822 and this report will be used to
inform that work.
The TCTNA method also has potential for application to Joint training, although some
adaptation of the method may be required; this is a potential area for further research.
11.2 Recommendations
It is recommended that:
a. Feedback is sought from TNA specialists currently applying the TCTNA
method so that the techniques advocated can be refined and revised as
necessary.
b. The TCTNA method is incorporated into JSP 822.
c. The application of the TCTNA method to Joint training is investigated.
12 References
Adair, J. (1997) Leadership Skills, Chartered Institute of Personnel and Development, London.
Annett, J., Cunningham, D. & Mathias-Jones, P (2000) A method for measuring Team
Skills, Ergonomics, Vol 43 No 8 (pp1076-1094).
Annett, J., Duncan, K. D., Stammers, R. B. and Gray, M. J. (1971) Task Analysis
(London:HMSO).
Annett, J. (1997) Analysing team skills, in R. Flin, E. Salas, M. Strub and L. Martin
(Eds), Decision Making Under Stress: Emerging Themes and Applications, Ashgate,
Aldershot.
Arthur, W., Edwards, B.D., Bell, S.T., Villado, A.J. & Bennett, W. (2005) Team Task Analysis:
Identifying Tasks and Jobs that are Team Based. Human Factors, Vol 47, Issue 3.
Baker, D.P., Day, R. & Salas, E. (2006) Teamwork as an Essential Component of High-Reliability
Organizations. Health Services Research, Vol 41, Issue 4, August 2006, 1576-1598.
Bowers, C.A., Morgan, B.B., Salas, E. & Prince, C. (1993) Assessment of Coordination
Demand for Aircrew Coordination Training, Military Psychology 5(2) pp 95-112.
Burke, S. (2005) Team Task Analysis, in Stanton, N. Hedge, A., Brookhuis, K, Salas, E.,
Hendrick, H. (Eds) Handbook of Human Factors and Ergonomics Methods, CRC Press,
London.
Cannon-Bowers, J.A. & Salas, E. (1998) Decision Making Under Stress: Implications for
Individual and Team Training, American Psychological Association, Washington DC.
Clark, R. C. (2008) Developing Technical Training 3rd Edn, Pfeiffer, San Francisco.
DSTL (2006) Analysis of Team Training – A Methodology. DSTL Report 200606.
DSAT QS (2003) Defence Systems Approach to Training Quality Standard.
Fitts, P.M. (1964) Perceptual-Motor Skill Learning in Melton, A.W. (Ed) Categories of
Human Learning (pp243-285), Academic Press, New York.
Hackman, J. R., & Morris, C. G. (1975) Group tasks, group interaction process, and
group performance effectiveness: A review and proposed integration. In L. Berkowitz
(Ed.), Advances in Experimental Social Psychology Vol. 8, Academic Press, New York.
(1-55).
HFI DTC (2005) Operational Information Management Skills Knowledge and Attitudes
Requirements HFI DTC Report No HFIDTC/WP.2.1.1/1.
HFI DTC (2007) Information Exploitation Competencies HFI DTC Report No
HFIDTC/2.1.1/3.
HFI DTC (2008) A Critique of Media Selection Models – analysis of applicability of
extant models to current UK Military Training HFI DTC report No
HFIDTC/2/WP12.1.1/1.
HFI DTC (2009) Training Needs Analysis – The Application of an Information
Processing-based Approach HFI DTC Report No HFIDTC/2/WP12.1.1/2
Huddlestone, J. and Harris, D. (2003) Air Combat Student Performance Modelling Using
Grounded Theory Techniques, Proceedings of the Interservice/Industry Training,
Simulation and Education Conference, Orlando, Dec 2003.
JSP 502 Training Needs Analysis for Acquisition Projects.
JSP 822 (2007) Part 5 Chapter 3 Defence Training Support Manual 3 Training Needs
Analysis.
Klein, G. (2000) Cognitive Task Analysis of Teams in Schraagen, J.M., Chipman, S.F. &
Shalin, V.L. (Eds) Cognitive Task Analysis, Lawrence Erlbaum Associates.
Klein, G. & Armstrong, A.A. (2005) Critical Decision Method in Stanton, N.A., Hedge,
A., Brookhuis, K., Salas, E., Hendrick, H. (Eds) Handbook of Human Factors and
Ergonomics Methods, CRC Press, London. (35-1 – 35-8)
Marks, M.A., Mathieu, J.E. & Zaccaro, S.J. (2001) A temporally based framework and
taxonomy of team processes, Academy of Management Review, Vol 26, No 3 (pp356-376).
Naikar, N, & Sanderson, P. (1999) Work Domain Analysis for Training System
Definition and Acquisition in The International Journal of Aviation Psychology, 9(3)
(271-290).
NATO (2004) Evaluation of Collective Training in a Distributed Simulation Exercise
Proceedings of the NATO Research and Technology Organisation Human Factors and
Medicine Panel Symposium, Genoa Oct 2003.
NATO (2005) Military Command Team Effectiveness: Model and Instrument for
Assessment and Improvement NATO Research and Technology Organisation Technical
Report TR-HFM-087.
Nieva, V.F., Fleishman, E.A. & Rieck, A. (1978). Team dimensions: Their identity, their
measurement, and their relationships. Final Tech. Report, Contract DAHI9-78-C-0001.
Advanced Research Resources Organisation, Washington, DC.
Orasanu, J. M. (1993). Decision making in the cockpit. In: Wiener, E. L., Kanki, B. G.
and Helmreich, R. L. (Eds). Cockpit Resource Management. London: Academic Press
Limited.
Partington, D. (2002) Essential Skills for Management Research, Sage, London.
Reigeluth, C.M. (1999) The Elaboration Theory in Reigeluth, C.M. (Ed) Instructional-Design
Theories and Models Vol II, Lawrence Erlbaum Associates, New Jersey.
Roby, T. (1968) Small Group Performance. Rand McNally & Company, Chicago.
Rousseau, V., Aube, C. & Savoie, A. (2006) Teamwork behaviours: A review and
integration of frameworks, Small Group Research, Vol 37, No 5 (540-570).
Salas, E., Sims, D. E., & Burke, C. S. (2005). Is there a “big five” in teamwork? Small
Group Research, 36, 555-599.
Salas, E., Dickinson, T.L., Converse, S., & Tannenbaum, S.I. (1992). Toward an
understanding of team performance and training. In R.W. Swezey & E. Salas (Eds.),
Teams: their training and performance Ablex, Norwood, NJ. (3–29).
Salas, E. & Priest, H. (2005) Team Training in Stanton, N., Hedge, A., Brookhuis, K.,
Salas, E. and Hendrick, H. (Eds) Handbook of Human Factors and Ergonomics Methods,
CRC Press, Florida. (44-1 – 44-7)
Salmon, P.M., Stanton, N.A., Walker, G.H., Jenkins, D.P. (2009) Distributed Situational
Awareness. Ashgate, Aldershot.
Stanton, N. (2006) Hierarchical Task Analysis: Developments, Applications and
Extensions, Applied Ergonomics, Volume 37, Issue 1, January 2006 (pp55-79), Elsevier.
Stanton, N.A., Salmon, P.M., Walker, G.H., Baber, C., Jenkins, D. (2005) Human
Factors Methods: A Practical Guide for Engineering and Design, Ashgate, Aldershot.
Strauss, A., Corbin, J. (1990) Basics of Qualitative Research: Grounded Theory
Procedures and Techniques, Sage, London.
Swezey, R. W., Owens, J. M., Bergondy, M. L., & Salas, E. (1998). Task and training
requirements analysis methodology (TTRAM): An analytic methodology for identifying
potential training uses of simulator networks in teamwork-intensive task environments.
Ergonomics, 41, 1678-1697.
Tannenbaum, S. I., Beard, R. L., & Salas, E. (1992). Team building and its influence on
team effectiveness: An examination of conceptual and empirical developments. In K.
Kelley (Ed.), Issue, theory, and research in industrial/organizational psychology (pp.
117-153). Amsterdam: Elsevier.
Tannenbaum, S.I., Smith-Jentsch, K.A., Behson, S.J. (1998) Training Team Leaders to
Facilitate Learning and Performance in Cannon Bowers, J.A. & Salas, E.(Eds) Decision
Making Under Stress: Implications for Individual and Team Training, American
Psychological Association, Washington DC.
Tesluk. P., Mathieu, J. E., Zaccaro. S. J., & Marks, M. (1997). Task and aggregation
issues in the analysis and assessment of team performance. In M. T. Brannick, C. Prince,
& E. Salas (Eds.), Team performance assessment and measurement: Theory, methods,
and applications (pp. 197-224). Mahwah. NJ: Erlbaum.
Ward, P.T. & Mellor, S.J. (1985) Structured Development for Real-Time Systems
Volume 2: Essential Modelling techniques, Yourdon Press, New Jersey.
Wickens, C.D. & Hollands, J.G. Engineering Psychology and Human Performance 3rd Edn,
Prentice Hall, New Jersey.
Appendix A
Teamwork Models
A.1 Team Process Model (Annett, 2000)
Figure 34 Team Process Model (adapted from Annett, 2000)
A.2 Team Coordination Dimensions (Bowers et al, 1993)
Table 50 Definitions of Team Coordination Dimensions (Bowers et al, 1993)

• Communication: Includes sending, receiving, and acknowledging information among crew members.
• Situational Awareness (SA): Refers to identifying the source and nature of problems, maintaining an accurate perception of the aircraft’s location relative to the external environment, and detecting situations that require action.
• Decision Making (DM): Includes identifying possible solutions to problems, evaluating the consequences of each alternative, selecting the best alternative, and gathering information needed prior to arriving at a decision.
• Mission Analysis (MA): Includes monitoring, allocating, and coordinating the resources of the crew and aircraft; prioritizing tasks; setting goals and developing plans to accomplish the goals; creating contingency plans.
• Leadership: Refers to directing activities of others, monitoring and assessing the performance of crew members, motivating members, and communicating mission requirements.
• Adaptability: Refers to the ability to alter one’s course of action as necessary, maintain constructive behaviour under pressure, and adapt to internal or external changes.
• Assertiveness: Refers to the willingness to make decisions, demonstrating initiative, and maintaining one’s position until convinced otherwise by facts.
• Total Coordination: Refers to the overall need for interaction and coordination among crew members.
A.3 The Models for Analysis of Team Training Taxonomy (Dstl, 2006)
Table 51 MATT Teamwork Behaviours (Dstl, 2006)

1. Communication Behaviours
• 1a Information Exchange: Seeking and passing information to all relevant team members at appropriate times in relation to their task needs.
• 1b Communication Skills: Making use of standardised formats and conventions to transmit information.

2. Co-ordination Behaviours
• 2a Procedural Co-ordination: The integration and synchronisation of team interactions in the completion of laid down procedures.
• 2b Collaboration: The process of organising team resources, activities and actions to ensure that tasks are mutually shared and completed in time.
• 2c Leadership and Task Management: Directing and co-ordinating the activities of the team.

3. Adaptive Behaviours
• 3a Situation Assessment: Development of a common understanding of the situation.
• 3b Decision Making: Mutual involvement in the assessment of a situation and choice of a course of action through discussion and argument.

4. Back-up Behaviours
• 4a Performance Monitoring and Feedback: Monitoring the performance of team mates, providing constructive advice and giving and receiving feedback.
• 4b Mutual Support: Providing assistance to other team members who need it.
Table 52 MATT Team Member Attitudes and Characteristics (Dstl, 2006)
Attitude
Characteristics
1. Mutual Trust
Team members respect each other, listen to each other’s
proposals and views and encourage proactive behaviour.
2. Shared Vision
Team members agree on the direction, goals and mission of the
team and have a mutual belief in the importance of the team.
3. Team Orientation
Team members believe that the team approach is likely to be
more successful than acting as an individual.
4. Collective Efficacy
Team members hold positive common perceptions of group
achievements and potential.
Table 53 MATT Teamwork Knowledge Requirements (Dstl, 2006)
Type of knowledge / Definition / Descriptor
Shared Task Models
Shared models of the situation and appropriate strategies for coping
with task demands. Descriptor: Hold shared interpretation of
situations and how to deal with them.
Cue-Strategy Associations
Association of environmental data with appropriate task strategies
requiring co-ordination. Descriptor: Know how and when to change
co-ordination strategies.
Team Mission, Objectives and Resources
Goals held in common and the team members and facilities for their
achievement. Descriptor: Common understanding of mission and
team resources required to achieve objectives.
Accurate Problem Models
Correct understanding of the team’s problems and the strategies by
which members will cope with and solve them.
Team Member Characteristics
Task relevant competencies, preferences, tendencies, strengths and
weaknesses of team mates.
Boundary Spanning Roles
Knowledge of how a team manages its interactions with non-team
members and other units.
Task Sequencing
Integrating task inputs according to team and task demands.
Descriptor: Knowing how to organise tasks in sequences depending
on priorities.
Team Role Interaction Patterns
Knowing how the team communicates and arrives at decisions.
Teamwork Skills
Understanding the skills and behaviours required for successful
performance. Descriptor: Understanding what needs to be done in
order for the team to perform effectively.
A.4 Salas “Big Five” Model of Teamwork (Salas et al, 2005)
Table 54 Definitions of the Core Components and Coordinating Mechanisms of the
“Big Five” Model of Teamwork (adapted from Salas et al, 2005)
Teamwork
Definition
Core Components
Team leadership
Ability to direct and coordinate the activities of other team members,
assess team performance, assign tasks, develop team knowledge,
skills, and abilities, motivate team members, plan and organize, and
establish a positive atmosphere.
Mutual
Performance
Monitoring
The ability to develop common understandings of the team
environment and apply appropriate task strategies to accurately
monitor team mate performance.
Backup behaviour
Ability to anticipate other team members’ needs through accurate
knowledge about their responsibilities. This includes the ability to
shift workload among members to achieve balance during high
periods of workload or pressure.
Adaptability
Ability to adjust strategies based on information gathered from the
environment through the use of backup behaviour and reallocation of
intra-team resources. Altering a course of action or team repertoire in
response to changing conditions (internal or external).
Team orientation
Propensity to take others’ behaviour into account during group
interaction and the belief in the importance of team goals over
individual members’ goals.
Coordinating Mechanisms
Shared mental
models
An organizing knowledge structure of the relationships among the
task the team is engaged in and how the team members will interact.
Mutual trust
The shared belief that team members will perform their roles and
protect the interests of their teammates.
Closed loop
communication
The exchange of information between a sender and a receiver
irrespective of the medium.
A.5 Teamwork Behaviours (Rousseau et al, 2006)
Table 55 Teamwork Behaviours (Rousseau et al, 2006)
Preparation of work accomplishment
Team Mission
Analysis
Collective interpretation and evaluation of the team’s purpose, including
identification of its main tasks and the operative environmental conditions
and team resources available for carrying out the mission
Goal
Specification
Identification of the level of performance that team members have to achieve
Planning
Development of alternative courses of action for task accomplishment
Work Assessment Behaviours
Performance
monitoring
Keeping track of fellow team members’ work
Systems
monitoring
Tracking team resources and the state of the external environment.
Task-related collaboration behaviours
Coordination
Integrating team members’ activities to ensure task accomplishment within
established temporal constraints
Cooperation
Working together during task execution.
Information
exchange
Exchanging information by whatever means
Team Adjustment Behaviours
Backing up
behaviours
Provision of tangible task-related help when a team member is failing to
reach the goals as defined by his or her role
Intra-team
coaching
Provision of feedback and confronting members who break norms
Collaborative
problem solving
Collectively finding and implementing a solution that brings actual
conditions closer to the desired conditions. This involves gathering
and integrating information related to the problem, identifying
alternatives, selecting the best solution, and implementing it; it
includes decision making
Team practice
innovation
Team members’ activities designed to invent and implement new and
improved ways of doing their tasks
Management of Team Maintenance
Psychological
support
The voluntary assistance that team members provide to reinforce the sense
of well-being of their teammates
Integrative
conflict
management
Resolution of conflicts over tasks, processes and interpersonal issues
– End of Document –