31 May 2014
Trip Report
Department of Defense
Human Factors Engineering Technical Advisory Group
(DOD HFE TAG) Meeting #68 – Aberdeen Proving Ground, MD
19-22 May 2014
The 68th meeting of the DoD HFE TAG was held in Aberdeen Proving Ground, MD and hosted by
the US Army. The meeting was chaired by Dr. John Warner, US Army G-1 MANPRINT
Directorate (john.d.warner38.cov@mail.mil). The theme of the meeting was Collaboration among
Agencies and HSI Domains to Maximize Performance. Approximately 100 people attended the
TAG meeting, representing the Office of the Secretary of Defense (OSD), Army, Navy, Air Force,
NASA, FAA, Coast Guard, Department of Homeland Security, several human factors-related
technical societies and industry associations. Additional personnel representing government,
industry and academia attended the meeting as invited speakers. Selected briefings from TAG-68
will be available on the DoD HFE TAG website: http://www.hfetag.com/.
Six items are attached:
• DoD HFE TAG Background, attachment (1)
• TAG-68 Theme, attachment (2)
• Program Summary, attachment (3)
• DoD HFE TAG Operating Board, attachment (4)
• TAG attendees, attachment (5) (when available)
• DoD HFE TAG Policies, attachment (6)
MONDAY, 19 May 2014 PLENARY SESSION PRESENTATIONS
Welcome:
COL Gregory McClinton, Commander, APG Garrison – Opening Remarks (missed presentation)
Opening Remarks:
Dr. Patrick Mason, ASD(R&E) HPTB, TAG Proponent. Dr. Mason discussed “readiness
levels.” Technology Readiness Levels are widely accepted within DOD Systems Acquisition as
measures of technology maturity. There are several other “readiness levels” under development,
including:
• Manufacturing Readiness Levels (MRL)
• Design Readiness Levels (DRL)
• System Readiness Levels (SRL)
• Integration Readiness Levels (IRL)
• Human Effects Readiness Levels (HERL)
• Human Readiness Levels (HRL)
Human Readiness Levels could be used within the systems acquisition process to assist decision-makers in matching human readiness to acquisition phases and to other readiness levels. (missed presentation)
Human Systems Integration: Challenges and Opportunities
Dr. Mica Endsley, Chief Scientist of the Air Force. (missed presentation)

Naval S&T in HFE
Dr. Tom Killion, Director, Office of Technology (ONR O3T). (missed presentation)

HSI in Coast Guard Acquisitions: The PEO’s Perspective
RADM Joe Vojvodich, Program Executive Officer (CG-93). (missed presentation)
S&T Innovation through Convergence: Collaboration to Maximize Performance
CDR Joseph Cohn (OSD HPTB). US federal S&T funding is dropping, and industry funding has also been decreasing since 2000. The USA currently ranks fourth in S&T funding as a percentage of GNP (currently 3%). To maximize effectiveness, government and industry should collaborate on science and technology investment. One area of focus is the “Natural Human Machine Interface.” Traditionally, human research and machine research were stove-piped; today, however, government is engaged with NDIA and conducting collaborative workshops. To succeed, we need convergence across disciplines.
NASA’s Technology Roadmaps and Investment Plan
Ms. Faith Chandler, Director of Strategic Integration for the NASA Office of the
Chief Technologist. Ms. Chandler (faith.t.chandler@nasa.gov) stressed opportunities to partner
with NASA. The Office of the Chief Technologist provides direction and priorities for the NASA technology portfolio and leads technology transfer. In 2010, NASA began to approach investment strategy differently, within a constrained budget. There are now 15 technology areas, with investments in five-year, ten-year, and longer timeframes. Roadmaps are updated every four to five years. NASA is currently looking to work with experts who know the state of the art in these technology areas.
FAA Human Factors Research and Engineering
Dr. Paul Krois, Director, Human Factors Division, NextGen ANG-C1 (Paul.krois@faa.gov). The FAA is concerned over the loss of “stick and rudder” skills in commercial pilots. The FAA recently published HF-STD-004, Requirements for Human Factors Programs, and is updating HF-STD-001, Human Factors Design Standard. New technical operations standards for symbols, markings, and abbreviations are also being developed.
Tuesday-Thursday, 20-22 May 2014
SubTAG and Special Interest Group Meetings Attended at the TAG:
Technical Society/Industry SubTAG. The Technical Society/Industry (TS/I) SubTAG met on Tuesday morning and afternoon. The TS/I SubTAG meeting was co-chaired by Ms. Barbara Palmer (Booz Allen Hamilton, barbara_palmer@bah.com) and Mr. Stephen C. Merriman (The Boeing Company, stephen.c.merriman@boeing.com). There was one presentation in the morning session.
Development of a Human-Systems Integration Standard: Mr. Owen Seely, Senior Human Systems Engineer for USMC Systems, Naval Surface Warfare Center, Dahlgren, VA (owen.seely@navy.mil). The Director of Systems Engineering (Mr. Stephen Welby) and the
Defense Standardization Council are directing the development of a standard for Human-Systems
Integration (HSI). It will be a process standard, a standard practice (as opposed to a design
standard). MIL-STD-46855A and MIL-STD-882E are process standards for human engineering
and system safety, respectively. A DOD HSI Standard Working group has been established and
chartered. The first major task is to assess if there is a major gap in HSI requirements. The first
step in this gap analysis, to make an initial assessment of the available 32 government and industry
standards in HSI and HSI domains, has been completed. The second step is to take the nine
“keepers” from the first analysis and conduct a more thorough analysis. It is planned that a
government handbook will also be developed to help manage contractor HSI activities. The
working group will also be identifying potential cost savings that should result from applying the
new HSI standard in future military system acquisitions. It is thought that primary areas of cost
savings/avoidance may be in better coordination and reduced redundancy between domain
activities and products. Remaining activities include completing the gap analysis, conducting an
Analysis of Alternatives (AoA) by the end of FY14 and publishing a final Gap Analysis report.
Following the presentation, the TS/I meeting participants focused on nominating new chairs, co-chairs, or chair-elect officers. Ms. Laura Strater (SA Technologies, Inc.) volunteered to help with future TS/I meetings; her exact role will be determined in the future.
In the afternoon session of the TS/I SubTAG, society representatives briefly discussed future TAG
schedule options presented by the TAG ExComm. The TS/I group recommended cutting the
plenary session to 90 minutes, thereby freeing up one more technical session time-slot. This could
have the effect of shortening the overall TAG meeting by two or two and a half hours. Other
discussions centered on:
o Current representatives of several technical societies
o TAG Website importance to technical society / industry TAG participation
o Time-slot for TS/I – possibly a “regular” time-slot would be better.
o EXCOMM – Is there really a position open for the TS/I? If so, we need to participate.
o HSI Industry survey supporting the HSI Standard: Make sure we include items on “roadblocks to performing HSI, contractor HSI organizational issues, issues with contracting HSI.”
o TS/I Webpage: Need to update the current TS/I organization/representative listing.
Human Factors Standardization (HFS) SubTAG: The HFS SubTAG met on 20 May 2014, chaired by Mr. Alan Poston (aposton86@comcast.net). Following the introduction of attendees, the SubTAG continued with its agenda.
Status Reports:
a. MIL-STD-1474, Noise Limits. Mr. Bruce Amreim, RDECOM, asked: how loud are military systems? The recoilless rifle is 190 dBA! Impulse and steady-state noise are evaluated separately. Impulse noise criteria ignore the human ear’s non-linearity and also ignore the effects of hearing protection. The military standard is being revised. Two hundred fifty comments have already been adjudicated. Draft #2 is nearly ready for release. It will be published next fiscal year (FY15).
b. MIL-STD-1472, Human Engineering. Nothing new.
c. Developing Standardized Evidence-based Maximum Exertion Levels. Mr. Don Goddard (Donald.e.goddard.civ@mail.mil) is working on revising lift and carry criteria for MIL-STD-1472. Humans are built for moderate loading. It is believed that the NIOSH lifting equations are excessively conservative. It is recommended that evidence-based criteria be developed.
d. ANSUR II Results: Nothing new.
e. Occupant-centric Platform: Ms. Dawn Woods (Army Soldier Systems Center, Natick, MA) provided an update on the Occupant Centric Platform (OCP) program. This is a 6.3 effort, funded at multiple millions of dollars over five years. The market survey and technical assessments have been completed. Digital human models were created for the FCS program and then re-created in PRO_E. Encumbered anthropometry data collection has been completed and a final report has been drafted. Seat restraints were evaluated (10 different systems). A design-guide memo was developed for bench width on ground vehicles in two configurations – “just touching” and “compressed.” Openings for emergency hatches were determined (a 26” x 28” rectangle is recommended). Next, they are looking at vehicle door sizes and ramp configurations. Dr. Claire Gordon at Natick is involved as the chief anthropologist.
f. NASA-STD-3001: Nothing new.
g. NATO Human Systems Integration Handbook: Bill Kosnick (WPAFB) reported that the
handbook was completed and published in 2012 as a DTIC report (AD 593691). It is available
at: http://www.dtic.mil/docs/citations/AD593691.
h. Ambulance Standard: Mr. Jim Grove (Department of Homeland Security) reported on the Ambulance Design, Safety and Standards Project. This is a four-year-old program entering its last year (complete in FY15). It is intended to provide 30 mph crash safety criteria for ambulances. Existing SAE standards cover test pulses, seats, cots, equipment, mounts, cabinets and body structure. DHS is trying to fill in all of the gaps.
i. NIST GUI Standard: Suzanne Furman, PhD, reported on this “HSI Design Criteria Standard.” The first step reviewed existing HFE/HSI standards and published a paper. The second step addressed the application of user-centered design (NISTIR-7194). The third step is identifying interim steps to augment existing standards. There are five areas of need:
• Real-time non-local software (e.g., the “cloud”)
• Hand-held devices
• Touch interfaces
• Information for biometric collection devices
• Accessibility for DHS agents and the people they work with
j. SAE International G45 Human Systems Integration Committee: Steve Merriman
(Stephen.c.merriman@boeing.com) reported on recent progress and current activities of the
SAE International G45 committee. Within the last three years, the committee has completed
the following:
• Development and publication of a new DID for the HSI Program Plan
• Development and publication of a new DID for the HSI Report
• Revision and publication of the DID for the Human Engineering Program Plan
• Revision and publication of the DID for HEDAD-O
• Revision of SAE Human Engineering Bulletin HEB-1
Currently, the committee is working on revising the DID for the Human Engineering Design Approach Document-Maintainer. They have also converted the Navy’s old AD-1410 document into a draft industry standard on Design for Ease of Maintenance. Committee members are currently supporting development of a DOD HSI Standard and standing by to assist with revision of MIL-STD-1472G.
k. FAA Design Standard: Mr. Alan Poston reported that the current version of the standard dates from 2003. A task to update the standard was awarded in 2012. A final draft is expected by February 2015. Three chapters are currently complete and three more are in technical review.
l. Development of a Human-Systems Integration Standard: Mr. Owen Seely (owen.seely@navy.mil) presented the recent history and current status of the DOD HSI process standard. For details, please see the section on the Technical Society/Industry SubTAG (above).
m. Human Factors Standardization at the Nuclear Regulatory Commission: Dr. John O’Hare
(NRC) first identified the primary human factors standards at the NRC: NUREG-0711, Human
Factors Standard (Process) and NUREG-0700, Human Factors Standard (Design). They are in
the first phase of updating their standards now: phase I is updating standards based on other
standards and phase II is updating standards based on new research. Traceability to source is
retained in these standards and additional information is included to help in application of the
guidelines. NUREG-0700 sections on alarms and automation have been updated and are
currently in peer review.
n. Approaches to Requirements Verification: Dr. Cynthia Null (http://www.nasa.gov/offices/nesc/team/Cynthia_Null_bio.html#.U4etpSgU5kU) provided some good guidance to the attendees on constructing simple, verifiable requirements. She cautioned that a system can meet its requirements and still be unusable, so great care must be taken to ensure that requirements are complete, understandable and enforceable.
o. Human Reliability: Ms. Shannon L. Cole (US Coast Guard). In the Coast Guard, there is no standard process for analyzing human reliability; even so, such analysis is still useful in identifying areas where humans are the “weak link” in systems. Operational availability (Ao) is key to effectiveness, and reliability and maintainability are key to Ao. Typically, human components have represented the availability of personnel rather than the availability and reliability of qualified personnel. The USCG is now collaborating with other agencies to identify commonalities in human reliability analysis, identify gaps, and identify common tasks analyzed by different agencies. Liaison is also being established with the AIAA.
p. Human Factors Standardization at the Consumer Product Safety Commission: Ms. Bonnie Novak (CPSC) is the Director of Human Factors (http://www.cpsc.gov). The CPSC was established in 1972 as an independent agency. Currently, three commissioners are seated out of a total of five. Some of the different methods of ensuring human factors acceptability are:
• Voluntary Standards
• Mandatory Standards
• Recalls/Compliance
• Information/Education
• Partnerships
• Guidelines
The CPSC FY2014 budget is only $118 million. There are a total of 550 staff, with 400 in the Metro DC area and 150 in the field and at ports. Ms. Novak has transformed human factors at CPSC to bring in solid human factors/ergonomics rigor. The human factors discipline currently uses a voluntary standards committee, individual product standards and age determination guidelines. In the future it is hoped that there will be a standard practice for human factors program requirements. The biggest problem is that the audience is so broad, from infants to the very old, so a program (process) requirement is more practical than design guidelines. A Standard Practice effort is scheduled to start in 2015. With a staff of nine, Ms. Novak must leverage work by other agencies to have sufficient impact.
Human Modeling and Simulation SubTAG: The meeting was co-chaired by Mr. John
Rice (Society for Simulation in Healthcare, john.rice@noboxes.org) and Ranjeev Mittu
(ranjeev.mittu@nrl.navy.mil). The first speaker was Ranjeev Mittu, who spoke on the Fleet Integrated Synthetic Test and Training Facility, Pacific (FIST2PAC). The Navy has no end-to-end (target to weapon on target) testing and/or training facility. Challenges include Fast Attack Craft (FAC) and Fast Inshore Attack Craft (FIAC) defense – both are high-priority Fleet requirements.
The second presentation was by Dr. Bill Lawless (Paine College) who spoke on Assessing Human
Teams Operating Virtual Teams. He has taken a theoretical mathematical approach. He has
determined that the best defense is two coordinated teams plus an independent team.
The next presenter was Mr. Manny Diego, ARL-HRED-STTC (Orlando), who spoke on
Distributed Soldier Representation (DSR). Soldiers are not realistically portrayed in models.
They are generally “super humans,” whose performance doesn’t degrade unless they are killed. DSR, an ARL project, is an attempt to correct this situation by representing the Soldier with greater fidelity. It attempts to represent the Soldier’s cognition, stress, morale, physiology, resilience, decision-making, leadership, psychology, and related capabilities.
The next presenter was Dr. Marianne Paulson (NAWC-TSD-Orlando) who spoke on Performance Shaping Functions (PSF) Applied to Distributed Soldier Representations in Simulated Environments. The focus of this work was on improved prediction of human performance reflecting the impacts of environmental stressors. They developed algorithms that run in Micro Saint models (however, these have not yet been validated). The models attempt to represent thermal, motion, performance and task effects on humans.
The next presenter was Dr. Paul Deitz, USAMSAA (paul.h.deitz.civ@mail.mil), who spoke on A Major Contribution of the Human Factors Community to Military Analysis – the Military Mission Framework. This is a multi-level modeling network in which one entity may impact others.
The next presenter was Sylvain Bruni (sbruni@aptima.com) who spoke on the Live Virtual Constructive and Game-Assisted Experimental Designer. This is a decision aid that assists the experimenter in creating and selecting experimental designs and configurations, and in choosing the best modeling and simulation assets (tools) to use. It involves a 10-step interview process that asks about variables, constraints, experimental design preferences, etc. It is mostly oriented towards Army tools. He will be working with Military Academy folks at West Point.
The session concluded with a panel session involving Mr. Jack Sheehan (consultant), Dr. Martin
Steele (NASA), and Mr. Jesse Citizen (Director, Modeling and Simulation Coordination Office,
www.msco.mil). They discussed existing and emergent modeling and simulation issues. Dr. Steele referenced the NASA Standard for Models and Simulations (NASA-STD-7009).
System Safety/Health Hazards/Survivability SubTAG: The SubTAG was chaired by
John Plaga (john.plaga@us.af.mil). The first presenter was Dr. Paul Fedele, US Army Research
Laboratory, who spoke on Level-dependent Hearing Protector Model. This is an on-going
project. The Auditory Hazard Assessment Algorithm for Humans (AHAAH) is an electro-acoustic
model that calculates human hearing damage. It allows analysis of acoustic waveforms as they
move through the air. The stapes has a stiffening spring constant which prevents/minimizes
hearing damage. The end result is basilar membrane damage, either fatigue or failure. There are
three independent modes of pressure wave transmission, all three of which affect the hearing
protection. Level-dependent hearing protection resists flow with higher acoustic pressure. Low
impulse pressures are resisted less and higher pressures increase flow resistance. The Combat
Arms Hearing Protector is the standard, non-linear protection. ARL is continuing to test protectors
and compare their effectiveness. There are both passive and active protection technologies.
The second presenter was Mr. Rich Zigler, Army Research Laboratory Survivability/Lethality Analysis Directorate (SLAD), who spoke on Maximizing Performance with HSI/MANPRINT & Soldier Survivability. Most of the decisions affecting cost on an acquisition program are made by the time of the Materiel Development Decision (MDD). Seventy to 75% of cost-related decisions are made by Milestone A, and 85% of cost decisions are made before a prime contractor is on contract for system development. Authoring good Sections L and M of the development contract is critical to HSI/MANPRINT. Government HSI personnel should try to get on the SSEB. During Engineering and Manufacturing Development (EMD), the design solidifies and leaves little room for improvement after that. The real key is translating HSI requirements (e.g., reduce fratricide, reduce damage, reduce injury, minimize fatigue and workload) into verifiable design requirements.
The next presenter was John Plaga, who briefly summarized the Air Combat Command/Surgeon
General’s Review of Science, Technology, and Devices to Monitor Human Performance. The
primary author was William Albery, supported by Booz Allen.
The next presenter was Lloyd Tripp, PhD, USAF 711th Human Performance Wing, who spoke on The F-22 Helmet-mounted, In-flight Physiological Monitoring System and Beyond. The F-22 experienced a Class A mishap due to pilot hypoxia. Pilots were then required to wear finger-mounted pulse oximeters and wristwatch-like devices that provide real-time data. Dr. Tripp advised that use of a finger-mounted pulse oximeter was a bad idea because high g prevents blood circulation to the fingers, and when the hand is squeezed, artificially low readings are obtained. The sensor needed to be relocated above the heart, so an oxygen sensor was embedded into the helmet in a position to monitor blood flow. The USAF Scientific Advisory Board required flight test in 120 days, which was extremely difficult. After testing, it took about a year to move from a prototype helmet to flying a production helmet. The system name is TAPCOMS – Tactical Aircraft Physiological and Cognitive Monitoring System. The system is capable of monitoring acetone levels, hydration, blood flow, and oxygenation.
The last presenter was LtCol Jeff Parr, PhD candidate, Air Force Institute of Technology, who
spoke on Development of a Multi-axial Neck Injury Criterion for Aircraft Ejection (MANIC).
This work is focused on the evaluation of helmet mounted displays and escape system safety. The
goal is to provide systems engineers with better information for decision making. Multi-axis
forces and moments (Fx, Fy, Fz, Mx, My, Mz) are used to clearly predict injury (5% risk of AIS2
vs. 10% risk of AIS3). It is applicable to the full range of pilots/aircrew and appropriate to all
phases of ejection. The focus is on the Head/C1 juncture. The NIJ injury criteria, established by the National Highway Traffic Safety Administration, are the neck injury criteria used by the auto industry. Current US government-defined neck injury criteria (NIC) used for escape system design do not meet Air Force requirements. The objective is to reduce peak g loads on the neck. The MANIC model is based on live human and cadaver data. Using time history data from sled test shots, they compared traditional injury criteria to MANIC criteria. The MANIC model is very conservative relative to NIC. Also see: http://pro.sagepub.com/content/56/1/2070.full.pdf, and Perry, C. E., Rizer, A. R., Smith, J. S., and Anderson, B. (1997). “Biodynamic Modeling of Human Neck Response During Vertical Impact.” SAFE Journal, 27(3), 183-191.
Unmanned Systems Interest Group: Three different sessions were chaired by Mr. AJ Muralidhar (Naval Surface Warfare Center, Dahlgren, VA, ajoy.muralidhar@navy.mil). Sessions 1 and 3 were not attended; Session 2 was as follows. The first speaker was Dr. Michael Miller, AFIT, who spoke on A Method for Guiding Design of Systems Employing Adaptive Control. It is
difficult to select which tasks should be shifted between human and machine. What method should
be used to decide? If done correctly, adaptive control should prevent over-reliance on automation,
inappropriate trust, poor situation awareness and atrophy of skills. A simple game model was used
to explore the allocation of tasks and effects. Several technical papers have been authored thus far
and research is continuing.
The next presenter was Dr. Cynthia Null (NASA) who spoke on Autonomy: The Role of HSI. Dr. Null discussed similarities and differences between automation, autonomy, mission complexity, and other concepts.
The next presenters were Julian Abich and Dr. Lauren Reinerman-Jones, Institute for Simulation and Training, University of Central Florida, who spoke on Augmenting Robotic Behaviors Using Physiological Measures of State. The role of robots is transitioning from “tools” to “teams.” Better, more timely communication between humans and robots is required. They are currently examining questions of teaming between humans and robots, based on trust.
The next presenters were Alex Stimpson and Jason Ryan, Human and Automation Laboratory,
MIT, who spoke on Modeling Approaches for Human Supervisory Control Environments. The
focus is on modeling human/automated system teaming. The main question being examined is
“What sorts of data can be collected during training that will allow one to better predict real world
performance?” Machine learning may be able to help decide.
Human Factors Engineering/Human Systems Integration: Management and
Applications. The session was chaired by Pamelyn Maynard (pamelyn.maynard@navy.mil)
and LCDR Jeff Grubb (jeff.grubb@navy.mil). The first speaker was Dr. Sandro Scielzo (SA Technologies, Inc./UT Arlington) who spoke on a Subjective Multi-dimensional Workload Index for Distributed Teams. There is no consensus on how to conceptualize workload; there is no theoretical framework; and there are no valid workload measures for teams. Workload is being investigated in a broad context, not just in the cognitive dimension. A model was developed during phase 1 and validation was undertaken in phase 2. An abridged version of the Team Subjective Assessment of Workload (T-SAW) model was developed. Workload indices were developed to identify problems. Currently, a one-page summary report is generated from the model. Results are currently being examined as applied to medical teams and sports teams. The project will be completed in August 2014 and a report will be generated by the US Army Research Laboratory.
The next presenter was Dr. Bob Pokorny, Intelligent Automation, Inc./711th HPW, who spoke on Interactive Maintenance Assistant Job Aids that Provide System Visualization. This project was sponsored by ONR. The focus is on using an electronic tablet as a job aid to improve performance. Previous work was conducted by NAVSEA on the Adaptive Visualization Training System. In an experiment, 1 hour of training on the tablet was added to a 12-day training session; a 24% increase in maintainer performance resulted.
The next presenter was William Kosnick, HSI Directorate of the USAF 711th HPW, who spoke on
Human Systems Integration at 711th Human Performance Wing: What’s new? Bill provided a
quick history of HSI in the Air Force.
2004: USAF SAB made recommendations on implementing HSI
2005: HSIO established under the Vice Chief of Staff
2008: Established 711th HPW, HP Directorate, and HSI teams at MAJCOMs
2012: Decision Brief to SAF/AQ, AF/SG and AFLMC/CC
2013: Enhanced HSI training and standardized processes
2014: Established HSI Cell at Global Strike Command
The HP Directorate has increased from four to 27 Subject Matter Experts (SME). There are HSI
SMEs at 5 MAJCOMS where 10 programs are being addressed. Medical service initiatives are
focused on HSI requirements. There are 16 SMEs with HSI certification (NPS or WDETP) and
two are currently enrolled in the NPS program. HSI risk assessments are a major task conducted
for USAF Program Managers. Ground control stations still pose a big HSI problem for the Air
Force. Aircrew integrated ensembles are constantly being worked on. The HSIO’s primary role is
one of HSI advocacy, ensuring that HSI becomes a standard practice.
The next presenter was Mr. Frank Lacson (franklacson@pacific-science.com), Pacific Science and
Engineering Group, who spoke on HSI and Systems Engineering Technical Collaboration for
DOD Acquisition: The Multi-service Human Systems Integration Framework (HSIF).
Currently, there are multiple coordination and integration challenges, a large body of policy, standards, and guidance, and much inconsistency in their application. Organizations do not talk with each other, and there is a misalignment of priorities between program schedule, cost and performance. The HSI
framework consists of process diagrams that display HSI domain activities (at a very detailed
level) across the entire DoD acquisition life cycle. This project will be completed in August 2014.
The sponsor is Bill Kosnick at the USAF 711th HPW.
Design Tools and Techniques SubTAG: The DTT SubTAG meeting was co-chaired by
Mr. Stephen Merriman (The Boeing Company, stephen.c.merriman@boeing.com) and Dr. Michael
Feary (NASA Ames, Michael.s.feary@nasa.gov) on 22 May 2014. The meeting was attended by
18 participants. There were no changes to SubTAG leadership and there were no changes made to
the SubTAG charter. Ms. Chelsey Lever (chelsey.lever@navy.mil), NSWC, Dahlgren, VA, expressed interest in becoming a chair-elect for the DTT SubTAG.
The first presenter was Ms. Katrin Helbing (Department of Homeland Security, TSA,
Katrin.Helbing@dhs.gov) who spoke on Objective, Quantifiable Measurement Tool to Measure
On-display Image Quality. An inherent component of aviation security is the transportation
security officers’ (TSOs’) ability to detect potential threats in carry-on and checked baggage. TSOs review displayed X-ray (2D) and CT (3D) images of passengers’ bags. Visual inspection of these images plays a large part in security effectiveness. There are currently no objective methods to quantify X-ray or computed tomography (CT) image quality for both fixed and moving images as they apply to security screening. Limitations include:
• Image quality is not tied to operator performance or capabilities, nor is it tied to detection of various threat components.
• Image quality is currently assessed by running a ‘test kit’ (ASTM F792) through the X-ray machine and ‘seeing’ what the smallest resolvable element is on the screen.
• Images must be stationary for assessment.
Key research goals include:
• Use of COTS hardware with custom algorithms to develop a robust image quality
measurement system for use on fixed and dynamic X-ray and CT imagery.
• Development of novel edge-based image measures for effectively describing complex
structures in X-ray imagery to objectively quantify image quality.
• Development of techniques for robustly modeling the relationship between functional
image quality ratings and human performance; tie metrics to X-ray threat detection
performance.
The prototype was completed in April 2014.
The second presenter was Mr. Matt Wilson, Simventions, Inc. (mwilson@simventions.com) who
spoke on Mission Task Analysis Tool (MTAT). The sponsor for this work was NAVAIR 4.6. The
MTAT is a software-based tool that supports individuals and teams conducting task analyses. It
provides an integrated workspace that pulls together the various contributors to the task analysis.
MTAT consists of five modules: mission analysis, function analysis, function allocation, task
analysis and resource analysis. The mission analysis module provides the mission context for the
analysis; it supports identification of mission elements and development of scenarios. The function
analysis module identifies functions, maps them onto the scenarios, supports allocation of
functions and identifies associated tasks. The task analysis module captures the tasks, defines how
they are accomplished (e.g., timing and associations), and identifies required information and workload
impacts. The resource analysis module identifies and organizes resources within a logical
hierarchy in order to facilitate identification of KSAs. The MTAT tool has been used to evaluate
different versions of the F/A-18 aircraft. The tool can generate an export file for IMPRINT.
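The MTAT modules described above map naturally onto a mission → function → task hierarchy. The sketch below illustrates that kind of decomposition in Python; the class and field names are hypothetical, not MTAT's actual schema or export format:

```python
from dataclasses import dataclass, field

# Illustrative data model only -- not MTAT's actual schema.
@dataclass
class Task:
    name: str
    allocation: str          # "human", "machine", or "shared"
    duration_s: float = 0.0  # timing captured by the task analysis module
    workload: float = 0.0    # workload impact rating

@dataclass
class Function:
    name: str
    tasks: list = field(default_factory=list)

@dataclass
class Mission:
    name: str
    scenario: str
    functions: list = field(default_factory=list)

    def human_tasks(self):
        """Tasks allocated wholly or partly to the human operator."""
        return [t for f in self.functions for t in f.tasks
                if t.allocation in ("human", "shared")]

m = Mission("strike mission", "scenario A",
            [Function("navigate",
                      [Task("update waypoint", "human", 12.0, 0.4)]),
             Function("employ weapon",
                      [Task("consent to release", "shared", 2.0, 0.8)])])
```

Keeping the allocation decision on each task is what lets a tool roll operator workload up through functions to the mission level, which is the kind of summary an IMPRINT export would carry.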
!
The next presenter was Mr. Charles Dischinger, Jr. (charles.dischinger@nasa.gov), NESC
Human Factors Discipline Deputy, Marshall Space Flight Center, NASA, who spoke on the Jack
Library of Postures and Motions. Many NASA design projects use digital human anthropometric
models in the development of worksites, including for ground processing of launch vehicles.
Marshall Space Flight Center (MSFC) primarily uses Siemens Jack (Jack) for this sort of design
analysis for the Space Launch System (SLS). The SLS and other Launch Vehicles (LV) must be
designed for the human tasks associated with vehicle assembly and maintenance at the launch
site. Analysts have identified difficulties with achieving naturalistic postures and motions for the
digital human model (DHM). Because the commercial tool lacks behaviors appropriate to earth
gravity, the DHM does not automatically assume postures that humans adopt naturally; e.g., for
lifting a heavy object or stepping through a hatch. Attempts by analysts to position the DHM and
to have it perform the series of postures associated with the task (lift or step up and over, for
example) are time-consuming and often unsuccessful. The project described in this paper
integrated motion capture technologies with the Jack virtual modeling tool to create a library of
postures and motions that can be imported into the virtual environment and used to assess various
human factors aspects of LV worksite designs. This presentation discussed the process for data
collection and importation to Jack models, as well as the method of applying the model “behavior
primitives,” as they are called, to worksite analysis. The Jack postures are available via DVD
through Mr. Dischinger at NASA.
!
DOD HFE TAG Operating Board Meeting: The Operating Board meeting was chaired
by CDR Henry Phillips (NAVAIR 4.6T), Vice Chair of the DOD HFE TAG; LCDR Jeff Grubb;
and LCDR Greg Gibson.
!
Special Topic: Human Readiness Levels were mentioned during the Plenary session. The DOD
HFE TAG was requested to develop a POA&M, so a tiger team was recruited. The team will
consist of:
Dan Wallace, Navy
Hector Acosta, HQ Air Force Recruiting Service
Dawn Woods, Army Soldier Support Center, Natick
Diane Mitchell, US Army ARL
Jesse Chen
Janae Lockett-Reynolds, DHS
Shannon Cole, USCG
Kat Muse Duma
Bonnie Novak, CPSC
!
!
Caucus Reports: (including suggested places for future meetings)
• USAF: John Plaga is the incoming Caucus representative.
• Army: No report.
• USN: Brown Bag meetings are held via DCO.
• TS/I: Suggestion made to add a new Special Interest Group – Human Factors in Cyber
Security. We will submit a half-page draft charter for consideration by the EXCOMM. AJ
Muralidhar (NSWC Dahlgren) volunteered to help author the charter.
• FAA: No report.
• NASA: No report.
• DHS: Janae Lockett-Reynolds discussed HRLs, internal organizational topics, and
possible internships at DHS.
!
SubTAG Reports:
• Cognitive Readiness: The SubTAG is forming a Working Group to review the literature to
define cognitive readiness. It will define the field, scope, how to test, etc. Hopefully, an
operational definition will be ready by the next TAG meeting.
• Controls and Displays: Good meeting.
• Design Tools and Techniques: Good attendance. Three excellent presentations, all
releasable. One person volunteered to serve as chair-elect.
• Extreme Environments: No report.
• HFE/HSI: Two sessions were held at this TAG meeting (joint with Human Performance
Measurement)
• HF in Training: No report.
• Modeling and Simulation: Nine presentations. Considering a training workshop on
Modeling and Simulation on the Monday preceding the next TAG meeting.
• Personnel Selection: Good attendance. Good collaboration among the Army, Navy and Air
Force. A possible new activity may be begun to generate a taxonomy.
• Safety/Health Hazards/Survivability: All presenters were from WPAFB, concentrating
on head/neck injury. Also a presentation on the SAFE Association.
• Standardization: Lots of good presentations.
• Human Performance Measurement: No separate report.
• TS/I: Presentation by Owen Seely (NSWCDD, W16 HSI) on the newly begun DOD project
to develop a Human Systems Integration Standard.
• Unmanned Systems: There were three sessions and 14 presentations. Half the attendees
were new to the TAG. This SubTAG was recently “promoted” from a special interest
group. Chair and vice chair nominations will be made soon.
TAG-69: Next TAG meeting will probably be in the Orlando FL area, sometime between March
and June 2015.
!
TAG Website: The website will be updated in the near future. It will include a listing of TAG products.
!
!!
Submitted by:
Stephen C. Merriman - The Boeing Company
DOD HFE TAG, TS/I Credentialed Representative of SAE International, SAFE and AsMA/HFA
Co-Chair of the DTT SubTAG, Co-chair of the TS/I SubTAG
972-344-2017 (office)
214-533-9052 (cellular)
stephen.c.merriman@boeing.com (office)
scmerriman@tx.rr.com (home)
!
ATTACHMENT (1)
!
DOD HFE TAG Background
!
The DoD HFE TAG was established via a memorandum of agreement signed by the Service Secretaries
in November 1976. Goals of the TAG were established as follows:
!
!
• Provide a mechanism for exchange of technical information in the development and
application of human factors engineering.
• Enhance working level coordination among Government agencies involved in HFE
technology research, development and application.
• Identify human factors engineering technical issues and technology gaps.
• Encourage and sponsor in-depth technical interaction, including subgroups as required in
selected topical areas.
• Assist as required in the preparation and coordination of Tri-Service documents such as
Technology Coordinating Papers and Topical Reviews.
The TAG addresses research and technologies designed to impact man-machine system
development and operation throughout the complete system life cycle. Topics include:
!
!
• Procedures for use by HFE specialists, system analysts and design engineers in providing
HFE support during system development and modification
• Methodologies to identify and solve operator/maintainer problems related to equipment
design, operation and cost/effectiveness
• Mechanisms for applying HFE technologies, including formal and informal approaches to
validation and implementation, and the determination of time windows for application.
The TAG comprises technical representatives from Government agencies with research and
development responsibilities in the topical areas mentioned above. Additional representatives
from activities with allied interests affiliate with the TAG as appropriate. Technical experts in
special topic areas may augment attendance at specific meetings. Also participating in the TAG
are official representatives of technical societies (e.g., Human Factors and Ergonomics Society,
SAFE Association) and industrial associations (e.g., Government Electronics and Information
Technology Association) with a stated interest in HFE. These representatives may attend
subgroup and general plenary sessions, but they must be credentialed by the TAG prior to
attending any meetings.
!
To facilitate detailed technical information exchange, the TAG is composed of committees and
subgroups, or “SubTAGs.” Committees are established to address specific issues or problems and
are disestablished upon completion of their tasks. SubTAGs address problems of a general or
continuing nature within a specific field of HFE technology. Membership in SubTAGs and
committees may include non-government personnel involved in research, development and
application. Attendance by non-government individuals is possible if the person is either
sponsored by a government agency or if accepted by the TAG chair prior to the meeting. Chairing
of the various subgroups and committees is rotated among the Services, NASA, FAA, DHS and
TS/I members, as provided in individual charters.
!
The current sub-groups typically meeting at the HFE TAG meeting are as follows.
!
Sub-TAGs:
• Cognitive Readiness
• Controls and Displays
• Design: Tools and Techniques
• HFE/Human Systems Integration: Management and Applications
• Human Factors in Extreme Environments
• Human Factors Standardization
• Human Factors Test and Evaluation
• Human Factors in Training
• Human Modeling and Simulation
• Human Performance Measurement
• Personnel Selection and Classification
• System Safety/Health Hazards/Survivability
• Technical Society/Industry
• Unmanned Systems
• User-Computer Interaction
ATTACHMENT (2) Meeting Theme
!!
TAG-68 Theme
!
!
Collaboration among Agencies and HSI Domains to Maximize Performance
While intellectual collaboration has always been an integral part of human factors
engineering, collaboration among agencies continues to evolve. New systems and
technologies have prompted an increased need for collaboration to solve problems, with
experts from different agencies each bringing unique perspectives, skills, and expertise.
Collaborations between agencies allow a cross-pollination of methods and ideas and
provide opportunities to learn from other disciplines. Different agencies may work
together to address a single area, with each agency focusing on different aspects of the
same issue, or they may form a coalition to jointly address an issue.
!
!
The challenge for this TAG is to identify opportunities and benefits of inter-agency
collaboration.
ATTACHMENT (3)
Department of Defense
Human Factors Engineering Technical Advisory Group Meeting 68
19-22 May 2014, Aberdeen Proving Ground, MD
!
Monday, 19 May
• 0830 - 1000 Executive Committee meeting
• 1000 - 1100 New member orientation
• 1100 - 1300 Luncheon break
• 1300 - 1700 Plenary Session
• 1800 - 2000 No-Host Mixer (location TBD)
!
Tuesday, 20 May
• 0730 - 0830 Technical Society/Industry
• 0830 - 1100 Standardization Session I
• 0830 - 1100 HFE/HSI: Management and Applications
• 0930 - 1000 Networking, coffee
• 1100 - 1230 Luncheon Break
• 1230 - 1430 Standardization Session II
• 1230 - 1430 Training Session I
• 1230 - 1430 Test and Evaluation
• 1430 - 1500 Networking, coffee
• 1500 - 1700 Training Session II
• 1500 - 1700 Modeling and Simulation
• 1500 - 1700 Cognitive Readiness
• 1700 - 1800 Service Caucuses
  Tech Society/Industry
  DHS/FAA/NASA
  USN/USMC/USCG
  Army
  Air Force
!
Wednesday, 21 May
• 0830 - 1100 Unmanned Systems Session I
• 0830 - 1100 User-Computer Interaction
• 0830 - 1100 System Safety/Health Hazards/Survivability
• 0930 - 1000 Networking, coffee
• 1100 - 1230 Luncheon Break
• 1230 - 1430 Unmanned Systems Session II
• 1230 - 1430 Extreme Environments
• 1230 - 1430 Controls and Displays
• 1430 - 1500 Networking, coffee
• 1500 - 1700 Personnel
• 1500 - 1700 Human Performance Measurement
• 1800 - 2200 No-Host Dinner & Social (location TBD)
!
Thursday, 22 May
• 0830 - 1100 Design: Tools and Techniques
• 0830 - 1100 Unmanned Systems Session III
• 1100 - 1230 Luncheon Break
• 1230 - 1400 Operating Board Meeting
• 1400 - 1700 Tour of ARL/HRED
ATTACHMENT (4)
DOD HFE TAG Operating Board
Executive Committee
!
Chair (Army)
Dr. John Warner
john.d.warner38.civ@mail.mil
Vice Chair (Navy)
CDR Henry Phillips
henry.phillips@navy.mil
Immediate Past Chair (Air Force)
MAJ Eric Phillips
eric.phillips@wpafb.af.mil
Army Representative
Dawn Woods
dawn.l.woods6.civ@mail.mil
Navy Representative
AJ Muralidhar
ajoy.muralidhar@navy.mil
Air Force Representative
MAJ Jeff Scott
Jeffrey.j.scott22.mil@mail.mil
NASA Representative
Cynthia Null
Cynthia.h.null@nasa.gov
FAA Representative
Thomas McCloy
tom.mccloy@faa.gov
DHS Representative
Janae Lockett-Reynolds
janae.lockett-reynolds@dhs.gov
TS/I Representative
Stephen Merriman
stephen.c.merriman@boeing.com
OSD POC
Jill McQuade
jill.m.mcquade.civ@mail.mil
!
!
!
SubTAG Chairs
SubTAG
Chair/Co-Chair
Contact Information
Cognitive Readiness
Gregory Gibson (chair)
gregory.gibson@nrl.navy.mil
Controls & Displays
Marianne Paulsen (chair)
marianne.paulsen@navy.mil
Design Tools & Techniques
Michael Feary (co-chair)
Steve Merriman (co-chair)
michael.s.feary@nasa.gov
stephen.c.merriman@boeing.com
Extreme Environments
Mihriban Whitmore (chair)
mihriban.whitmore-1@nasa.gov
HFE/HSI
Pamelyn Maynard (chair)
pamelyn.maynard@navy.mil
Human Performance Measurement
Jeff Grubb (co-chair)
Rahel Rudd (co-chair)
jeff.grubb@navy.mil
rahel.rudd@wpafb.af.mil
Modeling & Simulation
Ranjeev Mittu (co-chair)
John Rice (co-chair)
ranjeev.mittu@nrl.navy.mil
John.rice@noboxes.org
Personnel
Hector Acosta (co-chair)
Brennan Cox (co-chair)
hector.acosta.2@us.af.mil
brennan.cox@med.navy.mil
Standardization
Al Poston (chair)
aposton86@comcast.net
Tech Society/Industry
Barbara Palmer (co-chair)
Steve Merriman (co-chair)
palmer_barbara@bah.com
stephen.c.merriman@boeing.com
Test & Evaluation
Amanda Bandstra (co-chair)
Darren Cole (co-chair)
amanda.bandstra@navy.mil
darren.cole.1@us.af.mil
Training
Beth Atkinson (chair)
beth.atkinson@navy.mil
Unmanned Systems
Ajoy Muralidhar (chair)
ajoy.muralidhar@navy.mil
User Computer Interaction
John Taylor (chair)
John.k.taylor3@navy.mil
!
ATTACHMENT (5) DoD HFE TAG Attendees
!
Will be added when available.
ATTACHMENT (6)
!
DoD HFE TAG Policies
1. Membership (General membership policies are outlined in the Operating Structure, under "Group Composition.")
!
1.1 Individuals who are not affiliated with Government agencies (but who are associated with technical societies or industrial associations with a stated interest in human factors engineering) wishing to affiliate with the TAG may contact the current Technical Society/Industry SubTAG Chair to ascertain eligibility under the TAG Operating Structure. Once eligibility has been ascertained, the individual should submit a letter on the organization's letterhead, confirming his/her status as the organization's representative, to the current Chair of the Technical Society/Industry SubTAG.
!
1.2 Emeritus Membership may be approved by the Executive Committee on a case-by-case basis for a former TAG member who is retired from government service or defense industry. Emeritus Membership is automatically deactivated during any period of re-employment with the government or defense industry.
!
2. Meeting Sites (Sites are recommended by the service caucus whose turn it is to host the TAG, with a view toward a balance in geographic location and meeting facilities.)
!
2.1 TAG members are encouraged to recommend potential meeting sites.
!
2.2 Organizations who wish to host the TAG should contact their Service Representative or the current TAG Chair.
!
3. Agenda (The agenda is determined approximately three months before the scheduled meeting. The Chair Select selects the topics from those recommended by the Service Representatives, hosting agency and the TAG Coordinator.)
!
3.1 TAG members are encouraged to suggest potential agenda topics or topics suitable for tutorial sessions to their Service Representative, the current TAG Chair, or the TAG Coordinator.
!
4. Registration (Registration fees and the date of the close of registration are announced in an information letter sent approximately two months before the scheduled meeting.)
!
4.1 All attendees are expected to pre-register and prepay by the announced close of registration.
!
4.2 Only individuals receiving late travel approvals may register on-site. Payments made at the meeting site must be in cash.
!
5. Minutes (The Minutes of each meeting serve as the principal mechanism for the reporting of TAG activities. The Minutes will be published as a draft document on the website.)
!
5.1 Individuals or agencies desiring to be included on the distribution list for a specific meeting should contact the TAG Coordinator.
!
6. SubTAGs and Committees (See the Operating Structure, section entitled "TAG SubTAGs," for specific information regarding the purposes and operating procedures of SubTAGs and committees.)
!
6.1 All SubTAGs and committees are encouraged to meet in conjunction with the TAG at least once each calendar year.
!
6.2 All SubTAGs and committees meeting in conjunction with the TAG are required to provide a chairperson for the specific meeting.
!
6.3 All SubTAG and committee chairpersons are to submit a brief report of each meeting to be included in the set of TAG Minutes covering the SubTAG/committee meeting time frame.
!
6.4 All SubTAGs and committees are required to provide the TAG Coordinator with an up-to-date list of their membership for use in the distribution of TAG announcements.
!
6.5 All SubTAGs are required to submit to the Executive Committee a Charter including, but not limited to, statements regarding:
• objectives
• scope
• membership policies
• chair selection/tenure
• meeting schedule
!
6.6 Committees are required to submit to the Executive Committee a document including, but not limited to, brief statements regarding:
• objectives
• membership policies
• chair selection/tenure
!
6.7 Rotation of the chair position is determined by SubTAG charter. If the position cannot be filled by the appropriate service at the election meeting, the SubTAG may progress to the next service willing to chair the SubTAG.
!
7. SubTAG Establishment
!
7.1 Groups interested in addressing technical areas not covered by existing SubTAGs may request the TAG Chair to provide meeting time.
!
7.2 Formal SubTAGs and committees may be established by recommendation of the Executive Committee.
!
8. Chair/Representative Selection (General selection procedures are outlined in the Operating Structure under "Conduct of Business.")
!
8.1 A Service caucus may be called by the TAG Chair or the current Service Representative.
!
8.2 Methods of determining the Chair Select and Service Representatives are Service dependent.
!
8.3 Unexpired terms of office will be filled by appointment by the Executive Committee, until a caucus of the Service can be called at the next regularly scheduled TAG meeting.
!
9. Funding
The funding required for the organization, conduct, franking, and documentation of all TAG meetings shall be provided jointly by the three Services and other selected agencies. The specific mechanisms to obtain and allocate funding from the Services/agencies shall be arranged by the Current Chair, Chair Select, and Immediate Past Chair.
!
10. Policy Changes
!
10.1 Additions to or amendments of the above policies may be recommended by submitting the suggested change(s) in writing to the TAG Chair.
!
10.2 Policies may be amended by a majority vote of those Operating Board members in attendance at the Operating Board meeting at which amendments have been proposed.
!
Amended 14 November 1989 at TAG-23, Killeen, Texas.
Amended 3 May 1994 at TAG-32, Oklahoma City, Oklahoma.
Amended 8 May 1996 at TAG-36, Houston, Texas.
Amended 7 November 2002 at TAG-48, Alexandria, Virginia.