Common Human Performance Vocabulary

This compilation of terms extends into display design, human-computer interaction, root cause
analysis, and other related human factors engineering fields.
Acceptable margin of error — A criterion, based on sample size, which helps determine the
level of confidence you can have in your research findings.
Accident — An unfortunate mishap especially one causing damage or injury.
Accountability — The expectation that an individual or an organization is answerable for
results; to explain its actions, or be subject to the consequences judged appropriate by others; the
degree to which individuals accept responsibility for the consequences of their actions, including
the rewards or sanctions.
Action — Externally observable, physical behavior (bodily movements or speech). (See also
behavior.)
Active errors — Errors that change equipment, system, or plant state triggering immediate
undesired consequences.
Active listening — Interviewing technique in which the interviewer rephrases participant’s
responses to confirm understanding. Rephrasing should be neutral and provide opportunities for
participants to describe issues in greater depth.
Active voice — The voice of a verb tells whether the subject of the sentence performs or
receives the action. In the active voice the subject performs the action and the direct object
receives the action. For example, “The user selects the drop-down box.” The active voice differs
from the passive voice where the subject receives the action, e.g., “The drop-down box is
selected by the user.” See passive voice.
Administrative Control — Direction that informs people about what to do, when to do it, where
to do it, and how well to do it, and which is usually documented in various written policies,
programs, and plans.
Adverse Trend — A series of occurrences in which the frequency combined with the
significance of the occurrences warrants further evaluation and/or corrective action.
Affinity diagram — A group decision-making technique designed to sort a large number of
ideas, concepts, and opinions into naturally related groups. Used in documenting task or content
relationships, often created from card sort activities or group brainstorming.
Affordance — When a control behaves as its appearance suggests. For example, a push button is
said to have good affordance when it looks clickable. Examples of poor affordance: a push
button that doesn’t look clickable, or a control that looks like a push button but is not. Good
affordance increases intuitiveness.
Alignment — The extent to which the values, processes, management, and existing factors
within an organization influence human performance in a complementary and non-contradictory
way; facilitating organizational processes and values to support desired safe behavior.
Analysis paralysis — Spending too much time reducing data or analyzing tasks at the cost of
overlooking emerging opportunities for design improvement.
Anatomy of an Event — A cause-and-effect illustration of the active and latent origins
(linkages) of plant events initiated by human action.
Apparent Cause — A problem- or condition-cause determination based on the
evaluator’s judgment and experience, where a reasonable effort is made to determine
WHY the problem occurred. This might include fact finding, analysis, interviewing,
benchmarking, reviewing data or maintenance history, or other methods as
appropriate.
Assumption — A condition taken for granted or accepted as true without verification of
the facts. (See also belief, mental model and unsafe attitudes.)
At-Risk Practice — A behavior or habit that increases the chance for error during an
action, usually adopted for expedience, comfort, or convenience.
Attitude — An unobservable state of mind, or feeling, toward an object or subject.
Auto-complete — A process whereby a drop-down list, combo box, or text entry field fills in
once the user has typed enough characters to have a complete match.
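As a rough illustration of the matching step behind auto-complete (the option list and function name are hypothetical, not from any particular toolkit), a minimal sketch in TypeScript:
```typescript
// Hypothetical option list; a real control would draw these from the field's data source.
const options = ["Maintenance", "Management", "Marketing"];

// Return the single option that completes the typed prefix, or null while the
// prefix is still ambiguous or matches nothing.
function autoComplete(typed: string): string | null {
  const matches = options.filter(o =>
    o.toLowerCase().startsWith(typed.toLowerCase())
  );
  return matches.length === 1 ? matches[0] : null;
}

console.log(autoComplete("Main")); // "Maintenance" - unique match, so the field can fill in
console.log(autoComplete("Ma"));   // null - still ambiguous, no auto-fill yet
```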
Barrier — Anything that keeps operations or processes within safe limits or protects a system or
person from a hazard. (See also controls and defense.)
Barrier Analysis — An analysis of the adequacy of physical and administrative barriers in
place to prevent the occurrence of an event. A barrier is any structure, rule, or work practice
that separates or safeguards equipment or personnel from a potential event. Barriers may be
physical (doors, lockout devices, guards) or administrative (procedures, forms, meetings).
Basic navigation model — An elementary model for navigation that defines how a user can
move within a site. The basic models are:
• Hierarchical drill-downs
• Sequential page model
• Persistent menus
• Search engines
When used properly, the models support users’ tasks. When used in combination, complex
navigation models are possible that support complex task requirements.
Behavior — The mental and physical efforts to perform a task; observable (movement, speech)
and non-observable (thought, decisions, emotional response, and so forth) activity by an
individual. Generally, we treat observable behavior as measurable and controllable.
Behavior Engineering Model —An organized structure for identifying potential environmental
and individual factors that impact performance at the job site, and for analyzing the
organizational contributors to those factors.
Belief — Acceptance of and conviction in the truth, existence, or validity of something,
including assumptions about what will be successful.
Belt and suspender rule — A metaphor for using attributes one at a time. For example, make
the header bold or increase the font size, not both. When using a belt, one doesn’t need
suspenders.
Benchmarking — A process of comparing products, processes, and practices against the best in
class, the toughest competitors or those companies recognized as industry leaders; discovering
innovative thinking or approaches.
Benchmark testing — Testing an application against a set of standard best practices or
established criteria.
Bookmarks — 1. A list of favorite Web sites stored by a Web browser, also called “favorites.”
2. Target of a link within a Web page, also called an “anchor.”
Bounded field/Unbounded field — The ability of a control to allow for freeform entry vs.
forced selection from a set of options. A bounded field (e.g., list box) forces selections making it
less error prone than a text entry field, which supports freeform entry. A text field with a format
mask gives the field a bounded quality, making it less error prone (e.g., date fields with format
slashes).
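A minimal sketch of how a format mask lends a text field a bounded quality; the MM/DD/YYYY pattern and function name are assumptions for illustration, not part of any particular toolkit:
```typescript
// Accept only values already shaped like MM/DD/YYYY; anything else is rejected
// before it can propagate as an input error.
const dateMask = /^\d{2}\/\d{2}\/\d{4}$/;

function isMaskedDateValid(value: string): boolean {
  return dateMask.test(value);
}

console.log(isMaskedDateValid("07/04/2024")); // true  - fits the bounded format
console.log(isMaskedDateValid("July 4"));     // false - rejected up front
```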
Branding — The deliberate process of creating individuality and market value around the
concept of a product name. Effective branding efforts enable companies to convey
distinctiveness and value to their various audiences.
Branding elements — Elements such as graphics, text, theme, etc. used to create branding.
Breadcrumbs — An auxiliary form of navigation consisting of a trail of links from the current
Web page back to the home page of a Web site.
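A minimal sketch of deriving a breadcrumb trail from a page path; the "Home" label and the separator are illustrative choices, not a standard:
```typescript
// Turn a page path into a breadcrumb trail back to the home page.
function breadcrumbTrail(path: string): string {
  const segments = path.split("/").filter(s => s.length > 0);
  return ["Home", ...segments].join(" > ");
}

console.log(breadcrumbTrail("/products/printers/laser"));
// "Home > products > printers > laser"
```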
Bulletin board service (BBS) — An online service where users can download files and games,
ask questions, post announcements, and hold discussions on a particular topic.
Cause (Causal Factor) — A condition or an event that results in an effect (anything that shapes
or influences the outcome). This may include noise in an instrument channel, a pipe
break, an operator error, or a weakness or deficiency in management or administration. In the
context of DOE Order 5000.3A there are seven major cause (causal factor) categories. These
major categories are subdivided into a total of 32 subcategories.
Cause Analysis — The methodologies used to systematically analyze a given event to determine
why it occurred.
Change Analysis — An analysis of the difference between what actually happened and what
was expected to happen. Also, change analysis can be used to compare the conditions at the time
of the event with conditions prior to the event.
Change Management — A methodical planning process to establish the direction of change,
align people and resources, and implement the selected modifications throughout an
organization, large or small.
Checking – The act of confirming the actions of a performer are correct, without error.
Chromatic aberration — The unequal refraction of light rays of different wavelengths passing
through a lens, producing a blurred image.
Chromostereopsis — The visual effect of vibration or floating when the eye is attempting to
accommodate at extreme ends of the color spectrum (e.g., when reds and blues are placed side by
side).
Coaching — The process of facilitating changes in behavior of another person through direct
interaction, feedback, collaboration, and positive relationships. (See also feedback.)
Cognitive (cognition) — Descriptive of mental activity related to sensing and thinking phases of
information processing; perception, awareness, problem-solving, decision-making, and
judgment.
Cognitive walkthroughs — A usability testing strategy in which a developer group
systematically evaluates each element on every screen in the context of the various tasks (e.g.,
how likely would a user be to click this button on Task A? What would happen if they did? etc.).
Color depth — The number of colors that can be displayed on a monitor at any given time. The
Video Graphics Array (VGA) standard allows a color depth of 256, for example.
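For example, a color depth of 8 bits per pixel yields 2^8 = 256 simultaneous colors, while 24-bit color yields 2^24, roughly 16.7 million.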
Common Cause — A single cause that is common to several events.
Common Mode Failure — Multiple structures, systems or components (SSC's) failing or
having the potential for failure in the same way or due to the same event.
Complacency — Self-satisfaction accompanied by unawareness of actual dangers, hazards, or
deficiencies; being unconcerned in a hazardous environment.
Configuration control – The management of plant operational configuration, physical
configuration, design configuration, and design bases to ensure that owner and regulatory
requirements are satisfied at all times and to ensure consistency among the design bases and
design requirements, the physical plant, and facility configuration information.
Contributing Factors — Those conditions or events that, in combination with the cause,
increase the consequences of an event or adverse trend or otherwise change the outcome.
Corrective Action — Actions taken to correct and prevent the recurrence of a problem or failure,
which may include both long- and short-term corrective actions.
Common region — The Gestalt principle of grouping that states items grouped within a region
(e.g., using a background plane) tend to be perceived as belonging together.
Comparison tests — Tests done at almost any stage of the development life cycle that compare
applications against a set of established criteria. These tests can be done with users (referred to as
a “within subjects design”) or by experts.
Computer expertise — Relative comfort with technology; adeptness with using a computer or
advanced technology.
Conceptual design — Represents a system structure as users perceive it. Begins the transition
from research and planning into precursors for design including task design and information
architecture. Sets the foundation for developing a site navigation framework by clearly defining
the users, their tasks and environment, and how they conceptualize information architecture.
Includes usability testing of task design and information architecture in its pre-prototype form.
Confirmation bias — The human predisposition to notice information that is consistent with our
current beliefs but to ignore information that conflicts with our current beliefs.
Connectedness — The Gestalt principle of grouping that states that items connected with visual
elements (e.g., lines) tend to be perceived as belonging together.
Conservative Decision-Making — Reaching conclusions by placing greater value on safety
than the production goals of the organization—decisions demonstrate recognition and avoidance
of activities that unnecessarily reduce safety margins.
Content design — A term given to a set of design areas that focuses on the information value of
content, as opposed to the presentation of it. The content topics include editorial style,
internationalization, and accessibility. The value of the term “content design” is relative; i.e., it is
intended to differentiate these topics from other topics for purposes of evaluation and
development.
Content graphic — A type of graphic designed for the purpose of providing specific content, as
differentiated from graphics that add aesthetic value or brand value. Examples of content
graphics include complex charts, maps, and product photographs. Of all the graphic types,
content graphics are the most likely to serve as a destination in their own right, as opposed to a
marker for entry into information (e.g. icon).
Context effect — The effect of surrounding elements on the perceived meaning or use of an
isolated element. For example, the meaning of an individual link within a menu of links is
greatly influenced by the links immediately around it.
Contextual inquiry — A direct data gathering method in which the usability analyst “shadows”
an end-user through their day/tasks. Helpful for developing a clear understanding of both the
context of the tasks and a comprehensive environmental analysis.
Controls — Administrative and engineering mechanisms that can affect the chemical, physical,
metallurgical or nuclear process of a nuclear facility in such a manner as to effect the protection
of the health and safety of the public and workers, or the protection of the environment. Also,
error-prevention techniques adopted to prevent error and to recover from or mitigate the effects
of error; to make an activity or process go smoothly, properly, and according to high standards.
Multiple layers of controls provide defense in depth.
Cookies — A small file with user-specific information that the server writes to the user’s hard
disk for later access. Although originally intended as a mechanism for customization
(remembering favorite purchases and wish lists, storing shopping cart contents), cookies have
raised serious concerns about privacy.
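A small browser-side sketch of the cookie mechanism; the cookie name and value are illustrative, and server-set cookies travel in HTTP headers instead:
```typescript
// Runs in a browser: write a small, user-specific value the site can read back later.
document.cookie = "lastVisitedPage=/orders; max-age=86400; path=/";

// Reading document.cookie returns every cookie for this page as one
// "name=value; name=value" string.
console.log(document.cookie);
```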
Critical activity/task – An engineering activity, an evolution, or a task that is vital to nuclear
safety, industrial safety, environmental protection, regulatory compliance, or plant/system
performance. This typically involves one or more critical attributes, such that undetected errors
with these activities/tasks will result in intolerable consequences to the plant or to personnel.
(Vital means the engineering product can have a direct, and possibly immediate, adverse impact
either during installation or testing or upon implementation of the product in question.)
Critical attributes – Risk-related aspects of engineering activities that could directly affect the
following.
• reduction in safety margins
• alignment of physical configuration and design requirements
• operability/functionality of risk-important systems and equipment, especially critical
components (such as Maintenance Rule equipment)
• protection against single-point failure vulnerabilities
• control of human error by the user at critical steps of related activities
• protection of the environment
• prevention of regulatory concern
• adequacy of installation and constructability
• control of security, generation, and economic risks
• past success instead of failure used as a basis for design
Critical Step — A procedure step, series of steps, or action that, if performed improperly, will
cause irreversible harm to equipment, people, or the environment.
Cross check (cross-validation) — Error checking technique usually applied to forms that
compares two or more field inputs.
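A minimal sketch of a cross check between two dependent form fields; the field pair (email and its confirmation) and function name are hypothetical:
```typescript
// Cross-validate two related form fields before accepting the form.
function emailsMatch(email: string, emailConfirm: string): boolean {
  return email.trim().toLowerCase() === emailConfirm.trim().toLowerCase();
}

console.log(emailsMatch("pat@example.com", "pat@example.com")); // true
console.log(emailsMatch("pat@example.com", "pat@exmaple.com")); // false - flag for correction
```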
Culture — An organization’s system of commonly held values and beliefs that influence the
attitudes, choices and behaviors of the individuals of the organization. (See also safety culture.)
Cultural Control — Leadership practices that teach (consciously and unconsciously) their
organizations how to perceive, think, feel, and behave.
Decision table — An information mapping technique that simplifies complex logic presented in
textual form by re-writing it as a visual table.
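As a minimal illustration (the rule and column labels are invented for this example), the textual rule “approve a request only when the requester is qualified and the part is in stock; defer when the part is out of stock; otherwise reject” could be mapped as:
Qualified?   In stock?   Action
Yes          Yes         Approve
Yes          No          Defer
No           Any         Reject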
Defect – An undesired result of an error committed earlier in the engineering process, which
becomes embedded in either the physical plant or design bases documentation.
Defense — Means or measures taken to prevent or catch human error, to protect people, plant, or
property against the results of human error, and to mitigate the consequences of an error.
“Defense” is a term used in much of the human performance literature; however, in DOE the term
“controls” is preferred. It is synonymous with “defenses,” and “controls” is the term
defined and used within the DOE ISMS. (See also barrier and controls.)
Defense-in-Depth — The set of redundant and diverse controls, barriers, and safeguards to
protect personnel and equipment from human error, such that the failure of one defense is
compensated for by another defensive mechanism to prevent or mitigate undesirable
consequences.
Dependency — The increased likelihood of human error due to the person’s unsafe reliance on
or relationship with other seemingly independent defense mechanisms. (See also team error.)
Design moves — Changes made to the design based on test results.
Design validation — A post hoc evaluation that the site has the functions and elements
identified as needed in the user analysis. Tests the correspondence of the design with the end
user’s actual needs.
Design verification — The process of confirming that the interface, as built, corresponds with
the design that was specified. In contrast, design validation tests correspondence of the design
with the end user’s actual needs.
Detailed design — After the high-level structure, navigation and architecture are worked out,
the design focus shifts to presentation, content, and interaction issues. Advanced prototypes are
developed to test detailed page elements (e.g., Web controls, color, graphics, wording) and the
basic building blocks of a Web standard can be defined.
Deuteranopia — A specific type of color weakness based on the reduced ability to perceive
colors within the green spectrum. Reportedly the most common type of color weakness.
Direct Cause — The active factor that triggered an undesirable event. If it had not
occurred, there would have been no event or condition. There is usually a single direct
cause: an initiating action or a failure mechanism.
Direct user data — User data collected through direct, face-to-face interaction with end-users.
Methods include direct interviews, focus groups and usability round tables.
Dithering of colors — Mixing dots of various colors to produce what appears to be a new color.
Dithering produces a variable color palette apart from the 256 Web-safe colors.
Domain expertise — User experience with the topic or focus of an application or tool. For
example, the domain of TurboTax is income tax preparation and law. Domain expertise is
distinct from computer expertise.
Early adopters — Individuals who integrate new technologies into their lifestyle and use them as
soon as they are available, often well before the general public begins to use the technology.
Efficiency — A usability metric that captures how easily a task is completed with a given
interface (e.g., time to completion, key-strokes to completion). Must be measurable in
quantitative terms.
Engineered Controls — Those physical items (hardware, software, and equipment) in the
working environment designed to modify behavior and choices, or limit the consequences of
undesired actions or situations. These controls may be active (requires action/change of state) or
passive (defense requires no action).
Engineering assumption – A hypothesis, theory, supposition, or premise that is accepted as
true without supporting documentation; design criteria accepted as true or conservative in order
to bound inputs. (Alternatively, an unverified assumption is an assumption that has not or cannot
be validated or trusted as correct without additional data or testing.)
Engineering judgment – The process of applying technical knowledge, experience, and
professional intuition to make sound decisions; a decision that would meet the standard of
acceptance when compared to a rigorous and analytical evaluation.
Environmental profile — A snapshot of the circumstances external to the users and their tasks
that relate to them accomplishing their goals with the system. Includes the setting, circumstances,
and physical systems used.
Error — an action (behavior) that unintentionally departs from an expected behavior according
to a standard.
Error analysis — A component of task analysis which strives to identify the frequency and type
of errors that occur for a specified set of task flows. Can include Errors of Omission, Errors of
Commission, Sequence Errors or Timing Errors.
Error-Likely-Situation — A task-related predicament involving a potential error provoked by
unfavorable jobsite conditions that reduce the chances for success; an error about to happen. A
work situation in which there is greater opportunity for error when performing a specific action
or task due to error precursors (also known as “error trap”).
Error Mechanism — A descriptor of the causal factors that produced or made the error detectable
(the error mode): communication, work practices, work organization and planning, supervisory
methods, managerial methods, etc.
Error Mode — A description of how the process step or task was performed incorrectly.
Error of commission — A type of error in which a user performs an act incorrectly. This could
involve providing an incorrect input, for example. [Guffaw: A type of error which is truly
entertaining to fellow workers. Usually occurs on a Monday morning or late on Friday. Quickly
forgotten by the subject (see repression) but oft retold by witnesses.]
Error of omission — An error in which a user fails to perform a specific task or step. Failure to
take an expected action.
Error Precursors — Unfavorable factors that increase the chances of error during the
performance of a specific task by a particular individual. Unfavorable prior conditions that
reduce the opportunity for successful behavior at the jobsite. Precursors provoke error.
(See also human nature, individual capabilities, task demands, and work environment.)
Error rate — Number, frequency or proportion of errors (relative to correct completions) for a
given task or interface.
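A minimal sketch of the proportion form of the metric; the counts are illustrative:
```typescript
// Error rate as the proportion of task attempts that ended in error.
function errorRate(errors: number, attempts: number): number {
  return attempts === 0 ? 0 : errors / attempts;
}

// 3 errors observed across 20 task attempts -> 0.15, i.e., a 15% error rate.
console.log(errorRate(3, 20)); // 0.15
```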
Event — An unwanted, undesirable consequence to the safe and reliable operation of the system;
an undesirable change in the state of structures, systems, or components or in
human/organizational conditions (health, behavior, controls) that exceeds established
significance criteria.
Event Analysis — A review of activities causing or surrounding a disturbance of the BPS.
Expectations — Established, explicit descriptions of acceptable organizational outcomes,
business goals, process performance, safety performance, or individual behavior (specific,
objective, and doable).
Factor — An existing condition that positively or adversely influences behavior. (See also
organizational factors.)
Facts — Data or information that have been independently verified.
Failure — The condition or fact of not achieving the desired end(s).
Failure Analysis — Physical analysis performed on components, sub-components, and material
to determine the failure mode and failure mechanism. Failure analysis does not determine root
cause, but provides data for performing a cause and effect analysis.
Failure Mechanism — The descriptor of the medium or vehicle the agent used to produce the
failure mode (e.g., failure mode — burnt coil; failure mechanism — heat).
Failure Mode — A description, containing at least a noun and a verb, of how the failure took
place. The failure mode should be defined in enough detail at the component level to make it
possible to select a suitable failure management policy (e.g., bearing seizes, impeller jammed by
foreign objects, blocked suction line).
Failure Scenario — A series of chronological events beginning with an initiating event and
ending with the identified failure mode.
Fallibility — A fundamental, internal characteristic of human nature to be imprecise or
inconsistent.
Feedback — Information about past or present behavior and results that is intended to improve
individual and organizational performance.
Feedback message — A message that tells the user that an action has been completed
successfully.
Fitness for duty – An evaluation of an individual’s physical and psychological health to
determine whether they are able to perform their essential job functions without creating undue
risk to themselves or others.
Flawed Controls — Defects with engineered, administrative, cultural, or oversight controls that,
under the right circumstances, fail to:
• Protect plant equipment or people against hazards;
• Prevent the occurrence of active errors; and
• Mitigate the consequences of error.
(See also anatomy of an event and defense-in-depth.)
Focal points of design — The four focal points of design that evolve during the design process
are navigation, content, presentation and interaction design.
Focus groups — A direct data gathering method in which a small group (8-10) of participants
is led in a semi-structured brainstorming session to elicit rapid feedback about an interface
under development. Focus group data is most useful for generating new ideas or functions for an
interface, rather than evaluating an existing one. Group dynamics often make focus group data
suspect.
Formative testing — Testing the design during development to answer and verify design
decisions. Results are used to modify the existing design and provide direction. Usually done
with paper prototypes.
Function — General means or action by which a system or subsystem fulfills its requirements.
Usually expressed in verb form, e.g., “enables access to the ‘contact us’ information.”
Functional allocation — The distribution of task responsibilities across humans and technology
for a given task or function.
Fusing data — Bringing multiple levels of information into a single view in order to simplify
the decision making process.
Gap analysis — A technique used to determine the difference between a desired state and an
actual state – often used in branding and marketing. Gap analysis may address performance
issues or perception issues. Smaller gaps are better. The process of comparison of actual results
or behavior with desired results or behavior, followed by an exploration of why the gap exists.
Gestalt principles — A set of principles developed by the Gestalt Psychology movement that
established rules governing how humans unconsciously create order from a complex field of
objects.
Global elements — Refers to page links that remain present across most pages of the site or
application; typically at the top of the page. Global elements are like top-level menus in windows
applications - important enough to be promoted to the top of the menu and to be made available
to users throughout their session (e.g., the LOGIN or LOGOUT links in a self-service Web
application).
Global navigation — Provides the means to access universal content or functions from every
page.
Grid systems — A system of horizontal and vertical lines providing the underlying structure for
page layout and design.
Heuristics — Encouraging a person to learn, discover, understand, or solve problems on his or
her own, as by experimenting, evaluating possible answers or solutions, or by trial and error: a
heuristic teaching method. Established principles of design and best practices in Web site design,
used as a method of solving usability problems by using rules of thumb acquired from human
factors experience.
Heuristic evaluation/review — Also known as an expert review. Systematic inspection of a
user interface design, measuring it against a set of usability heuristics in order to identify and
prioritize usability problems. Comparison of a site with a very short and simple set of general
principles. Heuristic reviews are quick and tend to catch a majority of the problems that will be
encountered by users. However, expert reviews seldom use real end users so they may miss some
interface issues.
Hierarchical drill-down — See hub-and-spoke.
Hierarchical structures (in information architecture) — The study of how humans expect
objects in the world to behave and how they interact with those objects as a result. In the context
of the web world, the study of how humans perceive and act on the affordances of artifacts.
Homophily — The propensity or tendency of similar individuals to gravitate toward one another.
Hooks — In journalism, a hook is a technique to grab the reader’s attention. For example, a
question: “Would you like to lose ten pounds this week?”
Hub-and-spoke — In Web site design, a type of structure where the user may jump from the
home page (the hub) to any number of pages (spokes) and back to the home page again.
Hues — The wavelength (or frequency) of light; what we normally refer to as the “color” of
an object.
Human Error — A phrase that generally means the slips, lapses, and mistakes of humankind.
Human Factors — The study of how human beings function within various work environments
as they interact with equipment in the performance of various roles and tasks (at the
human-machine interface): ergonomics, human engineering, training, and human resources. (See
also Human Factors Psychology.)
Human factors psychology — The study of the predispositions and constraints in human
cognition, perceptual and motor systems in the context of interface development. That is,
exploration of ways to develop safe and efficient technology and other artifacts such that they
provide the “best fit” for human interaction. Traditionally the focus of Human Factors has been
in engineering and industrial design systems such as aviation, military systems, manufacturing,
and automotive design. (See also Human Factors.)
Human-Machine Interface — The point of contact or interaction between the human and the
machine.
Human Nature — The innate characteristics of being human; generic human limitations or
capabilities that may incline individuals to err or succeed under certain conditions as they
interact with their physical and social environments.
Human Performance — A series of behaviors executed to accomplish specific results (HP = B
+ R).
Human Performance Evaluation System (HPES) — HPES is a system of charting and
comparing causal factors specific to human error. HPES was developed by the Institute of
Nuclear Power Operations (INPO) to identify causes of personnel performance errors and
inappropriate actions.
Human Reliability — The probability of successful performance of human activities, whether
for a specific act or in general.
Hybrid navigation model (hybrid structure) —The combination of basic navigation models
(e.g., a hub-and-spoke with a persistent model) that supports a user task flow. Hybrid structures
are typical of complex sites and often strive to flatten the information hierarchy to reduce the
number of steps to content.
Hyperlinks — In a hypertext system, a link that takes the user to another page or another
location within the same page. Hyperlinks are usually blue and underlined.
Icon graphic — A type of graphic designed for the purpose of representing an object or action
or marker for entry into information. Icons are usually selectable. To be differentiated from
graphics that purely add aesthetic brand value, or that offer content in and of themselves.
Image map — A graphic containing selectable links or target areas.
Image placeholders — The text that appears while an image is downloading. Provides users
with descriptive information about the graphic while they are waiting.
Immediate Action — An action taken immediately after the discovery of a condition, or any
time prior to Evaluation, that is intended to prevent the recurrence or further exacerbation of the
condition or consequences. Immediate actions may include compensatory actions, such as a
temporary change implemented to mitigate immediate risk, restore operability, or otherwise
enhance the capability of degraded or nonconforming structures, systems, or components until
appropriate cause determination and final corrective actions are complete.
Inappropriate Action — Human action or omission, either observed or unobserved, that:
• Resulted in an undesirable or unwanted condition/result
• Led the task or system outside its acceptable limits
• Was not desired by a set of rules or an external observer
• Was not necessarily the fault of the individual committing it
Independent – Freedom of thought between a performer and a verifier, created by separating the
actions of each individual by physical distance and time, such that audible or visual cues of the
performer are not detectable by the verifier before and during the work activity.
Indirect user data — User-centered data gathering methods which do not involve face-to-face
interactions with the users. Data may originate from surveys, user analysts, or marketing efforts.
Indirect data is not as valuable as direct data in the user-centered analysis process.
Individual — An employee in any position in the organization; that is, worker, supervisor, staff,
manager, and executive.
Individual Capabilities — Unique mental, physical, and emotional abilities of a particular
person that fail to match the demands of the specific task.
Ineffective Corrective Action — An action that did not prevent recurrence of an event for
similar causes or reduce the likelihood of recurrence due to the following inadequacies:
• The action did not address the cause(s) for the event.
• The action did not address the extent of cause for Root Cause evaluations.
• The action did not address the extent of condition.
• The action did not correct the original condition as stated in the CR Evaluation.
• The action introduced undesired effects.
• The Evaluation did not include sufficient interim actions until full implementation could
be completed.
• The action was not properly implemented or enforced after implementation, thus
increasing the likelihood for recurrence.
• The action was not embedded in a process to ensure sustainability (e.g., expectations
conveyed via email vs. implementing a procedural change).
Information architecture — Part of the conceptual design stage; primarily associated with
defining an organization for web site content (but can include characterizing task flows or task
relationships within a content organization). Includes the processes of defining site hierarchies,
content organization and labeling schemes for all types of menu systems, and the techniques for
creating and evaluating them.
Infrequently Performed Task — Activity rarely performed although covered by existing
normal or abnormal procedures.
Initiating Event — An action or condition that begins an event sequence that caused or
allowed the error to occur. A human action, whether correct, in error, or a violation, that results
in an event. (See also Anatomy of an Event.)
Interaction design — A term given to a set of design areas that focus on the interaction value of
content, as opposed to its presentation or information value. The interaction topics include Web
controls, error handling and feedback systems. The value of the term “interaction design” is
relative; i.e., it is intended to differentiate these topics from other topics for purposes of
evaluation and development.
Interim Action — Similar to compensatory actions in nature, interim actions are usually
based upon a documented cause determination at the Apparent or Root Cause level. They are
replaced by final corrective actions.
Internet-capable — A device capable of accessing the Internet.
Interocular test or Ocular significance — Only a statistician’s joke. Means the numbers jump
up and hit you between the eyes.
Interviews — One-on-one interactions between end-users and usability analysts designed to
elicit the user’s conceptual model of a system, the tasks and task flows, or other issues related to
design. Direct interviews are the best way to capture user-centered data.
Intranet — A private network, based on Internet technology, providing vital information to
employees of a company or organization.
Irreversible – Actions and related results that cannot be returned to original conditions by
reversing the initiating actions.
Iterative testing — Testing repeatedly as the design converges on a proper decision.
Job — A combination of tasks and duties that define a particular position within the organization
usually related to the functions required to achieve the organization’s mission, such as Facility
Manager or Maintenance Technician.
Job Site — The physical location where people touch and alter the facility.
Job-Site Conditions — The unique factors associated with a specific task and a particular
individual; factors embedded in the immediate work environment that influence the behavior of
the individual during work. (See also error precursors and organizational factors.)
Kerning — Adjusting the amount of space between characters so that the text displays with
optimal legibility (or with the desired effect).
Knowledge & Skill — The understanding, recall of facts, and abilities a person possesses with
respect to a particular job position or for a specific task.
Knowledge-based Performance — Behavior in response to a totally unfamiliar situation (no
skill, rule or pattern recognizable to the individual); a classic problem-solving situation that relies
on personal understanding and knowledge of the system, the system's present state, and the
scientific principles and fundamental theory related to the system.
Knowledge Worker — An individual who primarily develops and uses knowledge or
information (e.g. scientist, engineer, manager, procedure writer).
Labeling systems — The selection and placement of labels that best accommodates navigation.
Lapse — An error due to a failure of memory or recall. (See also slip and mistake.)
Late adopters — Individuals who are slower to adopt new technologies. They are typically
more challenging to design for because they tend to be more distracted by poor interface
usability. In addition, they are less goal-oriented and tend to want a “user experience.”
Latent Condition — An undetected situation or circumstance created by past latent errors that
are embedded in the organization or production system, lying dormant for periods of time and
doing no apparent harm. (See also latent organizational condition.)
Latent errors — (typically by management and staff) Errors resulting in undetected
organization-related weaknesses or equipment flaws that lie dormant. An error, act, or decision
disguised to the individual that results in a latent condition until revealed later, either in an event,
active error, testing, or self-assessment. (See also latent condition)
Latent Organizational Weaknesses — Hidden deficiencies in management control processes
(for example, strategy, policies, work control, training, and resource allocation) or values (shared
beliefs, attitudes, norms, and assumptions) creating workplace conditions that can provoke error
(precursors) and degrade the integrity of defenses (flawed defenses). The decisions and activities
of the station's managers and supervisors determine what is done, how well it is done, and when
it is done, either contributing to the health of the organization or further weakening its resistance
to error and events. Therefore, managers and supervisors should perform their duties with the
same uneasy respect for error-prone work environments as workers are expected to at a job site.
Because the organization plays a major role in the performance of a station, a second strategic
thrust to preventing events should be the identification and elimination of latent organizational
weaknesses.
Layout graphic — Graphics that help delineate, group, or divide content. A type of graphic
designed for the purpose of organizing content, making it easy to comprehend or scan. Layout
graphics are typically subtle and are least commented on by users, but can be used to support
brand or theme.
Leader — An individual who takes personal responsibility for his or her performance and the
facility’s performance, and attempts to influence the organization’s processes and/or the values
of others.
Leadership — The behavior (actions) of individuals to influence the behaviors, values, and
beliefs of others.
Leadership Practices — Techniques, methods, or behaviors used by leaders to guide, align,
motivate, and inspire individuals relative to the organization’s vision.
Leading — Leading is the vertical space between lines of text. Also called line spacing. It
directs the eye horizontally along the text line.
Learnability — A usability metric which measures how easy it is to begin productively using an
application or interface. That is, how much – if any – training is required?
Lesson learned – A good work practice, innovative approach, or negative experience shared to
promote positive information or prevent recurrence of negative events.
Localization — The process of adapting a product to meet the linguistic, cultural, and other
requirements of a specific target environment or market (or “locale”).
Lightness — The light or dark appearance of a color: i.e., the amount of perceived light present.
See luminance.
Likert scale — A type of survey question where respondents are asked to rate the level at which
they agree or disagree with a given statement on a numeric scale (e.g., 1-7, where 1 = strongly
agree and 7 = strongly disagree).
Line cues — A line placed strategically, usually between every five or six rows of text to aid in
visual scanning.
Line length — Refers to the number of characters per line, not the numeric measurement of the
line.
Local navigation — Relative to global navigation, local navigation refers to navigation within a
local area of a site or application. Includes sub-site navigation, and page-level navigation.
Lo-fi prototypes — Paper, PowerPoint or other non-interactive mockups of an interface
developed early in design. Useful for evaluating the effectiveness of the navigation infrastructure
and labels.
Luminance — The measurement of intensity of light. The subjective experience is brightness.
Management (manager) — That group of people given the positional responsibility and
accountability for the performance of the organization.
Management Practices — Techniques, methods, or behaviors used by managers to set goals,
plan, organize, monitor, assess, and control relative to the organization’s mission. (See also
practices.)
Marketing graphic — A type of graphic designed for the purpose of reinforcing or establishing
brand, including the company logo.
Memorability — A usability metric which measures how easy it is to remember how to use an
application or interface after a period of non-use. Memorability metrics assume that users have
used the interface successfully before.
Mental map or model — Structured organization of knowledge a person has about how
something works (usually in terms of generalizations, assumptions, pictures, or key words); a
mental picture of the underlying way in which a system functions, helping to describe causes,
effects, and interdependencies of key inputs, factors, activities, and outcomes. An internal
representation of one’s environment. Users form mental maps to help them navigate in space.
Mistake — Errors committed because the intent of the act was incorrect for the work situation,
typically defined by the condition of the physical plant; incorrect decision or interpretation. (See
also error and compare with slip.)
Modal/non-modal OR modality — Refers to a “mode” of a page or window that guides the
user interaction. A modal window requires the user to finish interaction on that page before a
new page can be accessed (e.g., a dialog box that requires OK or Cancel to be selected before
interaction can return to the primary window). A modeless window allows for continued
interaction with other application windows while the modeless window remains open.
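As one concrete, browser-side illustration of the two behaviors (the message, URL, and window name are assumptions):
```typescript
// A modal interaction: the page (and this script) is blocked until the user
// dismisses the dialog.
window.alert("Record saved.");

// A modeless interaction: the help window opens while the original page
// remains fully interactive.
window.open("/help", "helpWindow");
```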
Moderated usability testing — Each participant interacts with a Web site from his or her
location with a computer. A facilitator provides instructions and information about tasks to be
performed. A facilitator observes and participates as needed during the entire test. The moderator
and participant usually talk to each other by phone during the test. Often many remote observers
can see and hear the same activity as does the moderator.
Monochrome — Black and white.
Monospaced font — Typeface attribute in which every letter occupies the same lateral space,
thus the “I” has lots of room while the “m” is cramped. This font is commonly used for input
fields on forms, and in selected applications; but, otherwise, is the least readable category of
type.
Motives — The personal (internal) goals, needs, interests, or purposes that tend to stimulate an
individual to action.
Mouse-primary — When the use of the mouse takes precedence over use of the keyboard. A task
is said to be mouse-primary or keyboard-primary.
Navigation design — Based on task design and information architecture definitions developed
in conceptual design, navigation design marks the first “formal” step of design. It includes the
development of wireframes and graphical mockups to test site structure and visual direction. A
set of core navigation pages are designed, tested and iterated during this stage to ensure the user
interface structure is sound before investing in detailed design.
Near Miss — Any situation that could have resulted in undesirable consequences but did not;
ranging from minor breaches in controls to incidents in which all the available safeguards were
defeated, but no actual losses were sustained.
Norm — A behavior or trait observed as typical for a group of people.
Operating experience – Information that relates to the methods in which work is planned and
conducted and an organization’s missions are performed. Operating experience provides the
basis for knowledge and understanding that fosters development of lessons learned and
improvement of operational performance.
Operationalized variable — A variable or metric which has been defined clearly enough to be
observed and measured in a way that is replicable.
Organization — A group of individuals with a shared mission, set of processes, and values to
apply resources and to direct people's behavior toward safe and reliable operation.
Organizational Factors — 1) Task-specific sense: an existing job-site condition that influences
behavior and is the result of an organizational process, culture, and other environmental factors.
2) General sense: the aggregate of all management and leadership practices, processes, values,
culture, corporate structures, technology, resources, and controls that affect behavior of
individuals at the job site.
Organization schemes — What the topics or tasks in a group have in common; defines a logical group.
Organization structures — Relationships between content and groups.
Oversight Control — Methods to monitor, identify, and close gaps in performance.
Performance — Any activity that has some effect on the environment; the accomplishment of
work. (See also human performance.)
Page flow — Refers to the arrangement of elements on a page that suggests a hierarchy or
sequence.
Page templates — A term used to refer to a working model of a page for purposes of
implementation on a Web site. A page template is a framework for building a Web page type,
usually made available through a content management system.
Page types — A term used to refer to a set of page components that together form a page
designed to solve a specific user need (e.g., a “search and results” page allows a user to query a
database and review the results of the query). Page types form the basis of an effective Web
standard.
Paper prototypes — Non-interactive mockups of an interface developed early in design. Useful
for evaluating the effectiveness of the navigation infrastructure and labels.
Parallel construction — Using the same format for every text or graphic composition,
especially when constructing lists; e.g., items should be all sentences or all phrases, not a
combination of the two. If an item starts with a verb, use the same kind of verb throughout. For
example, action-object phrasing for menu items should be followed consistently for individual
menu items within a group (e.g., View report, Create new report, Edit report, Search reports).
Parse — To separate into component parts. For example, an SGML parser can parse an HTML
document to check for errors.
Passive voice — The voice of a verb tells whether the subject of the sentence performs or
receives the action. In the passive voice the subject receives the action of the verb. For example,
“The drop-down box is selected by the user.” Passive sentences are generally longer, more
complex, and more difficult to process cognitively. See active voice.
Path analysis and usage statistics — In the Web environment, various tools yield a wealth of
information about users’ behavior on the site.
Performance data — Data that focuses on user behavior and/or how (well) users complete a
task. Did the user get the “right” answer? Usability tries to focus on performance data over
preference data.
Performance Gap — The difference between desired performance and actual performance,
whether in terms of results or behavior.
Performance Improvement — A systematic process of identifying and analyzing gaps in
human performance, followed by developing and implementing interventions or corrective
actions to close the gaps.
Performance Indicators — Parameters measured to reflect the critical success factors of an
organization. A lagging Indicator is a measure of results or outcomes. A leading indicator is a
measure of system conditions or behaviors which provide a forecast of future performance (also
known as “metrics”).
Performance Mode — One of three modes a person uses to process information related to one's
level of familiarity and attention given to a specific activity. People will likely use multiple
modes to complete a task. (See also Skill-based, Rule-based, and Knowledge-based
performance.)
Performance Model — A systems perspective of the context of individual human performance,
showing how plant results and individual behavior are interrelated with organizational processes
and values through job-site conditions.
Performance Monitoring — Review and comparison of performance against expectations and
standards using problem reporting, feedback, reinforcement, coaching, observation data, event
data, trend data, and so on. (See also performance indicator, performance gap, and gap
analysis.)
Performance Problem — A discrepancy in performance with respect to expectations or
operating experience, or an opportunity to improve performance created by changes in
technology, procedures, or expectations. (See also performance gap.)
Persistent context — Structure used for navigating across a range of tasks that are performed in
conjunction with one another.
Personas — Detailed examples of potential end-users that represent a specific target audience
type. Personas help developers think in terms of users by providing a concrete characterization of
them and how they might use the site. Especially helpful when there are no current users of the
Web site.
Physical Plant — Systems, structures, and components of the facility.
Plant Results — The outcomes of the organization in terms of production, events, personnel
safety, external assessments, configuration, and so on.
Population Stereotype — The way members of a group of people expect things to behave; for
example, in the U.S., up, right (direction), or red implies on or energized.
Positive Control — Active measure(s) to ensure that what is intended to happen is what
happens, and that is all that happens.
Practicality test — A test of your task flow’s feasibility, given the nature of the users and their
environment.
Practices — Behaviors usually associated with a role that can be applied to a variety of goals in
a variety of settings. (See also work practices.)
Preference data — Data that focuses on user perceptions. Do they feel like the task was easy?
Do they like the way the interface works? Usability tries to focus on performance data over
preference data.
Prevention Behaviors — Behaviors or practices oriented toward the prevention of errors or
events. (See also production behaviors.)
Primary Effect — The action or condition to be prevented. Typically the event or adverse
consequence that initiated the investigation. Usually the subject of the problem statement.
Principles — A set of underlying truths that can be used to guide both individual performance
and the management of human performance.
Proactive — Preemptive measures to prevent events or avoid error by identifying and
eliminating organizational and job-site contributors to performance problems before they occur;
preventing the next event.
Process — A series of actions organized to produce a product or service; tangible structures
established to direct the behavior of individuals in a predictable, repeatable fashion as they
perform various tasks.
Production Behaviors — Behaviors oriented toward creating the organization’s product from
the resources provided (corollary to prevention behaviors).
Proportional font — A category of type with variable horizontal space between each character.
The shape and width of the character determines the amount of space needed on either side of it.
This makes it easier and more pleasing to read.
Protocol simulation (user-performance testing) — Individual users are asked to complete a
series of representative tasks using a prototype. While they work, they talk out loud. This gives
the researcher a clear understanding of the user’s thought processes.
Proximity — The Gestalt principle of grouping that states items that are placed close together
tend to be perceived as belonging together.
Rating scales — A testing tool used to capture the user’s subjective impression. For example,
measuring user satisfaction with a feature may have responses that range from “strongly agree”
to “strongly disagree.”
Reactive — Taking corrective action in response to an event or error.
Readiness — An individual’s mental, physical, and emotional preparedness to perform a job as
planned.
Reading grade level (RGL) — The level of education required by the user in order to
understand a particular document. It is important to adapt your writing to the appropriate reading
grade level (e.g., grade 6) of your target users.
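One widely used estimate of reading grade level is the Flesch-Kincaid formula; the sketch below uses its published coefficients, assumes the three counts are already available, and the example counts are invented:
```typescript
// Flesch-Kincaid grade level from word, sentence, and syllable counts.
function fleschKincaidGrade(words: number, sentences: number, syllables: number): number {
  return 0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59;
}

// Example: 120 words, 8 sentences, 170 syllables -> roughly grade 7.
console.log(fleschKincaidGrade(120, 8, 170).toFixed(1)); // "7.0"
```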
Reinforcement — The positive consequences one receives when a specific behavior occurs that
increases the probability the behavior will occur again.
Reliability — In survey methodology, will a question elicit the same response over and over
again? Example: “What is your shoe size?” is generally a reliable question. “What’s the date?” is
not a reliable question.
Representative sampling — The process of determining and selecting a group of participants
from a larger population that represents your target market.
Research and planning — The first stage of user-centered design, characterized by an
evaluation of precursor designs and the gathering of business and user objectives for a new site.
Typically includes setting business goals, defining user requirements, and understanding brand
objectives.
Rigor — Completeness and accuracy in a behavior or process; cautiously accurate, meticulous,
exhibiting strict precision during the performance of an action.
Rollovers — An element on a page is replaced by a new element when the mouse rolls over it.
For example, in a navigation button bar, as the mouse passes over each button, the original image
is replaced with a modified version of that image. Implementing rollovers usually requires
scripting.
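A minimal sketch of an image-swap rollover; the element id and image file names are assumptions for illustration:
```typescript
// Runs in a browser: grab the button image to be swapped.
const button = document.querySelector<HTMLImageElement>("#homeButton")!;

// Swap to the highlighted artwork while the pointer is over the image,
// then restore the original when it leaves.
button.addEventListener("mouseover", () => { button.src = "home-highlight.gif"; });
button.addEventListener("mouseout",  () => { button.src = "home.gif"; });
```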
Root Cause — the most basic reason(s) for an undesirable condition or problem which, if
eliminated or corrected, would have prevented it from existing or occurring.
Round-tables — A data gathering technique where a group of users assemble to discuss and
analyze design concepts.
Rule-Based Performance — Behavior based on selection of a defined path forward derived
from one's recognition of the situation; follows an IF (symptom X), THEN (action Y) logic.
Safety Culture — An organization’s values and behaviors—modeled by its leaders and
internalized by its members—that serve to make safety the overriding priority. (See also values
and culture.)
Safety Significance — The actual or potential effect on human, environmental, or equipment
safety.
Sans serif — A category of type where the font characters are without serifs (see serif). More
readable for isolated text such as labels and instructions.
Satisficing — A theory of human problem-solving that says people minimize expended effort by
using shortcuts to make decisions. For instance, humans tend to select the first correct answer
they encounter rather than rationally and systematically evaluating all possible answers prior to
selection. This concept was first presented by Herb Simon.
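The contrast with exhaustive evaluation can be sketched as taking the first option that clears a threshold rather than scoring every option; the options, scores, and threshold below are made up:

    // Invented options with an illustrative "goodness" score already attached.
    const options = [
      { name: "Option A", score: 62 },
      { name: "Option B", score: 71 },
      { name: "Option C", score: 95 },
    ];

    // Satisficing: accept the first option that is "good enough" (threshold of 60).
    const satisficed = options.find((o) => o.score >= 60);

    // Optimizing: evaluate every option and keep the best one.
    const optimal = options.reduce((best, o) => (o.score > best.score ? o : best));

    console.log(satisficed?.name); // "Option A" - the first acceptable answer
    console.log(optimal.name);     // "Option C" - the best after evaluating all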
Saturated color — Saturation refers to the amount of the hue present, relative to gray.
Scenario — 1. A concrete, often narrative description of a user performing a task in a specific
context. Often a use scenario describes a desired or to-be-built function. This contrasts with a
task-scenario which describes a currently implemented function. 2. A prescribed set of
conditions under which a user will perform a set of tasks to achieve an objective defined by the
developer.
Self-Assessment — Formal or informal processes of identifying one’s own opportunities for
improvement by comparing present practices and results with desired goals, policies,
expectations, and standards. (See also benchmarking and performance monitoring.)
Sequence error — A specific error type in which a user attempts to complete a component of a
modal or ordered task out of sequence, resulting in a system error.
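For an ordered task, a sequence error can be detected by comparing the attempted step against the next expected step; the step names below are hypothetical:

    // Hypothetical ordered task: steps must be completed in this sequence.
    const requiredSequence = ["enter address", "choose shipping", "enter payment", "confirm order"];

    // Returns an error message if the attempted step is out of sequence, otherwise null.
    function checkStep(completedCount: number, attemptedStep: string): string | null {
      const expected = requiredSequence[completedCount];
      return attemptedStep === expected
        ? null
        : `Sequence error: expected "${expected}", got "${attemptedStep}"`;
    }

    console.log(checkStep(1, "enter payment")); // the user skipped "choose shipping"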
Serif — The cross-lines at the end of a font character stroke that help distinguish letters and
provide continuity for the reader’s eye. See sans serif.
Shortcut — An action, perceived as more efficient by an individual, that is intended to
accomplish the intent of actions rather than the specific actions directed by procedure, policy,
expectation, or training. (See also violation.)
Signal/noise ratio — The proportion of meaningful content to extraneous interference. Writing
is more powerful when the signal (message) is high and the noise (verbiage) is low. Maximize the
signal/noise ratio.
Significance — A statistical term indicating that groups of numbers differ to a degree that is
unlikely to be due to chance. Also, the actual or potential effect on Nuclear/Equipment Safety,
including Design Basis, Industrial (Human) Safety, Radiation Safety, Cost and Budget, Regulatory
Confidence, Environmental Stewardship, and External Perception.
Similarity — The Gestalt principle of grouping that states items with the same size, shape, color
or shade tend to be perceived as belonging together.
Site map — A map of the Web site, displaying the navigation structure and the interrelationship
between pages.
Situation Awareness — The accuracy of a person’s current knowledge and understanding of
actual conditions compared to expected conditions at a given time.
Skill-Based Performance — Behavior associated with highly practiced actions in a familiar
situation executed from memory without significant conscious thought.
Skill of the Craft — The knowledge, skills, and abilities possessed by individuals as a result of
training or experience. Activities related to certain aspects of a task or job that an individual
knows without needing written instructions.
Slip — A physical action different than intended. (See also error, lapse, and compare with
mistake.)
Standdown — A period of time devoted by an organization toward the education, training, and
sensitization of personnel on issues associated with performance improvement.
Statement of work (SOW) — A contractual document specifying the work activities or tasks to
be conducted for successful completion of a project. Used by a contractor to size, plan and
complete a project and used by the organization that procures the services to monitor and control
the project.
Storyboards — Sketches or other visuals that help depict the design concept you have planned.
Summative testing — Testing done to measure the success of the design in terms of human
performance and preference.
Supervisor — That member of first-line management who directs and monitors the performance
of individual contributors (front-line workers) in the conduct of assigned work activities.
Surveys — An indirect user-centered analysis method for gathering information from a large
number of users. Issues in survey design include: reaching a representative sample; participant
self-filtering; question development and measurement bias; attracting enough responses.
System — A network of elements that function together to produce repeatable outcomes; the
managed transformation of inputs (resources) into outputs (results) supported with monitoring
and feedback.
Systems analysis — Reducing a system into its simpler constituents for the purpose of better
understanding the whole system, its function and the functions of its constituents. The
constituents include personnel, hardware, and software and the functions include both system
operation and maintenance functions.
Systems Thinking — Consideration of the multiple, diverse, and interrelated variables and their
patterns that come to bear on a worker at the job site; knowledge of the interdependencies of
processes and leadership dynamics on performance—the organizational nature of human
performance. (See also Performance Model.)
Task — An activity with a distinct start and stop made up of a series of actions of one or more
people; sometimes a discrete action.
Task analysis — 1. The process of reducing a task or activity to determine the conditions in which
the task is conducted and the criteria for successful completion. Task analysis provides the
foundation for appropriate functional allocation with a logically sequenced task flow. Task
analysis serves to optimize task efficiency by reflecting the user’s understanding and expectations
for the task. This provides the infrastructure for the information architecture. Task analysis is
also useful for spotting potential errors and bottlenecks in the current task process by identifying
limiting factors which will likely preclude successful task completion. 2. A cause analysis
method where tasks are broken down into sub-tasks to compare expected actions with actual
actions performed during the event.
Task Demands — Specific mental, physical, and team requirements that may either exceed the
capabilities of the individual assigned to perform the task or challenge the limitations of human
nature. (See also error precursor.)
Task design — Part of the conceptual design stage; refers to the design of a task flow using
various formats (task lists, task flow diagrams, etc.) prior to prototyping. Task design follows a
task analysis and embodies the new task flows; these flows can be tested for practicality before
any formal page design has been done.
Task flow diagrams — Diagrams that show the various user tasks and their interrelationships.
Task profile — An overview of a given task outlining the task characteristics which impact
usable design, including importance, frequency, sequence, dependency, flow and mission
criticality.
Task scenario — A concrete, often narrative description of a user interacting with an interface.
Task scenarios typically describe functions which currently exist on an interface. This contrasts
with a use-scenario which describes a future use or function of an interface that is under
development.
Task statement — One representation for documenting tasks in user-centered task analysis.
Task statements should include an actor, an action and a goal or outcome: e.g., User enters
address; Manager runs report.
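The actor/action/goal structure maps naturally onto a small record; the field names below are one possible representation, not a prescribed template:

    // One way to represent a task statement: actor + action + goal or outcome.
    interface TaskStatement {
      actor: string;
      action: string;
      goal: string;
    }

    const tasks: TaskStatement[] = [
      { actor: "User", action: "enters", goal: "address" },
      { actor: "Manager", action: "runs", goal: "report" },
    ];

    tasks.forEach((t) => console.log(`${t.actor} ${t.action} ${t.goal}`));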
Team Error — A breakdown of one or more members of a work group that allows other
members of the same group to err due to either a mistaken perception of another’s abilities or a
lack of accountability within the individual’s group.
Technical rigor — Completeness and accuracy in both the process and the delivered product;
cautiously accurate and meticulous; exhibiting strict precision during the performance of an action.
Thematic graphic — A type of graphic designed for the purpose of reinforcing a theme carried
throughout the site. For example, the picture of a beach might enhance the theme of “vacation.”
Think-aloud protocol — An interview strategy in which participants are asked to narrate their
activities as they complete a task, so that the interviewer can develop a better
understanding of the user’s mental model, decision criteria and expectations for a task or task
flow.
Thumbnail images — On the Web, miniature images that can be enlarged if desired (usually by
clicking). Using thumbnail images instead of large graphics saves space and reduces file size.
Typeface families — Collections of typefaces that are designed and intended to be used
together.
Typography — The arrangement and appearance of printed materials.
Uneasiness — An attitude of apprehension and wariness regarding the capacity to err when
performing specific human actions on plant components.
Uncertainty — The presence of doubt, confusion, or questions about a work situation.
Unsafe Attitudes — Unhealthy beliefs and assumptions about workplace hazards that blind
people to the precursors to human error, personal injury, or physical damage to equipment.
Usability-centric — Refers to a mindset that focuses primarily on usability rather than features.
Usability round table — A meeting in which a group of end users is invited to bring specific
work samples and discuss the validity of an early prototype.
Use case — A user-centered design method in which critical tasks are systematically
documented with their prerequisites, the user steps and system steps, and the task outcome. Use
cases are typically described in the abstract, which makes them particularly helpful in object-oriented design. Scenarios are concrete instantiations of use cases.
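One way to capture a use case’s prerequisites, steps, and outcome is as a structured record; the field names and sample content below are illustrative only:

    // Structured record for a use case: prerequisites, user and system steps, outcome.
    interface UseCase {
      name: string;
      prerequisites: string[];
      steps: { actor: "user" | "system"; description: string }[];
      outcome: string;
    }

    const resetPassword: UseCase = {
      name: "Reset password",
      prerequisites: ["User has a registered account"],
      steps: [
        { actor: "user", description: "Requests a password reset" },
        { actor: "system", description: "Sends a reset link by email" },
        { actor: "user", description: "Sets a new password" },
      ],
      outcome: "User can sign in with the new password",
    };
    console.log(resetPassword.steps.length); // 3 documented steps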
User-centered analysis — A method of collecting data to develop an understanding of user
intentions and interface use patterns. User-centered analysis provides concrete data to prioritize
and drive interface design.
User-centered design — Design methodology in which interviews and empirical tests of users’
needs determine the characteristics of a design or computer application.
User interface structure — A term used to refer to the basic content organization of the site and
its navigation model. To be differentiated from the page-level content, the user interface
structure defines the “containers” for content and means for navigation to it.
User profile — A general description of a user group for a specific interface. Typically includes
characteristics which may influence design choices, such as: Demographic Characteristics,
Education, Language, Computer Experience, Domain Experience, Motivation, or Expectations.
User satisfaction — A metric of usability which focuses on how well the user perceives the
interface to work and how well it meets his/her needs.
Validity — The extent to which an object does what it was designed to do, or the extent to which a
question measures what it was intended to measure.
Values — The central principles held in high esteem by the members of the organization around
which decisions are made and actions occur, such as reactor safety. (See also culture and safety
culture.)
Verification — The act of confirming that the condition of a component, or other product of
human performance, conforms to the condition required by a guiding document.
Violation — A deliberate, intentional act to evade a known policy or procedure requirement,
deviating from sanctioned organizational practices. (See also Shortcut.)
Vision — A picture of the key aspects of an organization’s future that is both desirable and
feasible (the kind of organization people would aspire to), clear enough to guide employees’
choices without explicit direction, yet understandable enough to encourage initiative.
Visual hierarchy — Refers to the overall page layout and its ability to lead the user’s attention
through the page elements. Effective visual hierarchies create an appropriate balance in
composition that draws users to top levels of the hierarchy while optimizing visual access to
important page level elements.
Vulnerability — Susceptibility to external conditions that either aggravate or exceed the
limitations of human nature, enhancing the potential to err; also the weakness, incapacity, or
difficulty to avoid or resist error in the presence of error precursors. (See also error precursor.)
Watermarks — A graphic design element that appears as a page background.
Widgets — Slang term for “controls and displays.”
Work Environment — General influences of the work place, organizational, and cultural
conditions that affect individual behavior at the job site. (See also error precursors.)
Work Execution — Those activities related to the preparation for, performance of, and feedback
on planned work activities.
Worker — An individual who performs physical work on equipment, having direct contact
(touching) with equipment, and is capable of altering its condition. (Compare with knowledge
worker.)
Work Practices — Methods an individual uses to perform a task correctly, safely, and
efficiently including equipment/material use, procedure use, and error detection and prevention.
(See also practices.)
WYSIWYG — “What you see is what you get.” Describes a computer program capable of
displaying content on-screen exactly as it will appear when printed.
Sources:
Cause Analysis Methods for NERC, Regional Entities, and Registered Entities. Published
October 2011. North American Electric Reliability Corporation.
DOE STANDARD: Human Performance Improvement Handbook Volume 1: Concepts and
Principles. DOE-HDBK-1028-2009
DOE STANDARD: Human Performance Improvement Handbook Volume 2: Human Performance
Tools for Individuals, Work Teams, and Management. DOE-HDBK-1028-2009
HFI-Certified Usability Analyst Program: Glossary of Usability Terms ©2002 Human Factors
International, Inc.