Usability Requirements: Compliance with ANSI/INCITS-354

Stephen Jackson
Cheryl Kieliszewski
Abstract
ANSI standard ANSI/INCITS-354, the Common Industry Format (CIF), adopted in 2001, provides a
standard format for sharing usability-related data. As a newly ratified standard, the CIF has
yet to gain industry-wide support and is still being evaluated for roll-out within IBM. However,
adoption of the CIF within IBM is important for several reasons. Several large companies (both
competitors and customers) support the CIF, and it may become a requirement for sales (similar
to the Government Section 508 accessibility requirements). Early adopters of the CIF include
Boeing, Kodak, Oracle Corporation, and State Farm Insurance. The CIF will lead to improvements
in the User Centered Design (UCD) process within IBM through the standardization of reports
across teams and products. The CIF will also become a necessity for maintaining competitiveness
with companies that have already adopted it (e.g., Oracle Corporation). The poster presents the
history and requirements of the CIF document, IBM corporate strategy regarding the CIF, a
comparison to an existing process, UCD process improvements, and the benefits to IBM.
What is the ANSI Standard ANSI/INCITS-354
Common Industry Format (CIF)?

• Usability standard for direct comparison between competitive products
• Most likely performed by an independent testing organization
• Conducted after a product is released
• Compared with competitive products for usability
• Evaluate the comparison and weigh differences for purchasing decisions
• Helps procurement and purchasing for large companies
• The document audience is primarily usability experts
• The CIF does not tell you what to do; it tells you how to report on what you did
• The CIF has a dual nature: it highlights both product strengths and weaknesses
History
• 1996 – NIST recognized the need to:
  • Encourage software suppliers and consumer organizations to work together to understand user needs and tasks
  • Develop a common usability reporting format for sharing usability data with consumer organizations
  • Conduct a pilot trial to determine how well the usability reporting format works and to determine the value of using this format in software procurement
• Keith Butler (Boeing) started and drove the standards work group
• The IBM UCD Advisory Council and Microsoft provided feedback to the core CIF team during development of the standard
• Ziff Davis Publishing wanted to make this a usability seal of approval, but that idea was denied
• A government tie-in to the ANSI standard is anticipated (similar to accessibility requirements)
• Being considered for inclusion in ISO Standard 9241 (usability standard)
Support of the CIF and Early Adopters
• Boeing
• Kodak
• Oracle
• State Farm Insurance
• Microsoft
• HP
Standard Report Format
• Title Page
• Executive Summary
• Introduction
  • Full Product Description
  • Test Objectives
• Method
  • Participants
  • Context of Product Use in the Test
  • Experimental Design
  • Usability Metrics
• Results
  • Data Analysis
  • Presentation of the Results
• References
• Appendices
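
For teams adopting this outline, the section list can double as a simple completeness check before a report is circulated. The sketch below is illustrative only: the section and subsection names come from the outline above, while the dictionary layout and helper function are assumptions, not part of the CIF.

    # Sketch of a CIF-outline completeness check (Python).
    # Section names mirror the outline above; everything else is illustrative.
    CIF_OUTLINE = {
        "Title Page": [],
        "Executive Summary": [],
        "Introduction": ["Full Product Description", "Test Objectives"],
        "Method": ["Participants", "Context of Product Use in the Test",
                   "Experimental Design", "Usability Metrics"],
        "Results": ["Data Analysis", "Presentation of the Results"],
        "References": [],
        "Appendices": [],
    }

    def missing_sections(report):
        """Return outline entries that a draft report does not yet contain."""
        missing = []
        for section, subsections in CIF_OUTLINE.items():
            if section not in report:
                missing.append(section)
                continue
            missing.extend(f"{section} / {sub}" for sub in subsections
                           if sub not in report[section])
        return missing

    draft = {"Title Page": {}, "Method": {"Participants": "..."}}  # toy example
    print(missing_sections(draft))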
Title Page
• Identify the report as a Common Industry Format (CIF) document
• State the CIF version
• State contact information (e.g., ‘Comments and questions: iusr@nist.gov’)
• Product name and version/release tested
• Research lead and contact information
• Date(s) the test was conducted
• Date the report was completed
• Report author
Executive Summary
• Provides a high-level summary of the test
• Intent is to provide information for procurement decision-makers in customer organizations
• Identification and description of the product
• Summary of the method(s) used in the test
• Results expressed as mean scores or another suitable measure of central tendency
• Reason for and nature of the test
• Tabular summary of performance results
Introduction
• Full Product Description
  • Formal product name and release or version
  • What parts were evaluated
  • Intended user population
  • Any groups with special needs
  • Brief description of the environment in which the product should be used
  • The work that is supported by the product
Test Objectives
• Describes the objectives for the test
• Functions and components the user directly or indirectly interacted with during the test
• Whether or not the function or component tested was a subset of the total product; if so, provide the reason for testing a subset
Methods
• Key technical section of the report
• Must provide enough information to allow an independent tester to replicate the procedure used in testing
Participants
• Description of the user population and test sample
• Total number of participants tested
• Segmentation of user groups tested
• Key characteristics and capabilities of the user groups
• How participants were selected and whether they met essential characteristics and capabilities
• Whether or not the participant sample included representatives of groups with special needs
Example participant summary table, with one row per participant (P1, P2, …, Pn) and columns for: Business Sector, Company, Job Title, Storage Experience/Responsibilities, Storage Software Experience, Storage Hardware Experience, and Elements of Current Storage Environment.
Methods
• Context of Product Use in the Test
  • Description of the tasks, scenarios, and conditions in which the test was performed
  • Tasks
    • Description of the task scenarios used for testing
    • Explanation of why the scenarios were selected
    • Description of the source of tasks
    • Description of task data provided to the participants
    • Completion or performance criteria established for each task
  • Test Facility
    • Physical description of the test facility
    • Details of relevant features or circumstances that may affect the quality of the results (e.g., recording equipment, one-way mirrors)
  • Participant’s Computing Environment
    • Software and hardware configuration details, display details, audio details, and/or manual input details
  • Test Administrator Tools
    • Describe any hardware or software used to control the test or record data
    • Describe any questionnaires used to collect data
Methods
• Experimental Design
  • Describes the logical design of the test
  • Procedure
    • Provide independent or control variables, operational definitions of measures, and any policies or procedures for task time limits, training, assistance, intervention, or responding to questions
    • Provide the sequence of events from greeting to dismissing participants
    • Provide the steps the evaluation team followed to execute the study and record data, and the roles they played
    • State details of non-disclosure agreements, informed consent/human subjects’ rights, and compensation
• Usability Metrics
  • Explain what measures were used for each category of usability metrics: completion rates, errors, assists, time on task, and satisfaction ratings (a computational sketch follows the metric categories below)

3 Required CIF Usability Metric Categories
• Effectiveness: Empowering users to succeed in their tasks
• Efficiency: Enabling people to work faster to save time and money
• Satisfaction: Reducing frustration and under-utilization

3 Additional IBM Usability Metric Categories
• Flexible: Allowing people to work in ways that match their situation
• Easy to learn: Reducing time to value with or without training
• Safe: Preventing accidents and business errors
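
To make these categories concrete, the sketch below shows one way the core CIF measures (completion rate, time on task, satisfaction rating) might be summarized from raw test logs. The data and field names are hypothetical; the CIF prescribes what to report, not how to store observations.

    from statistics import mean

    # Hypothetical per-participant, per-task observations; field names are
    # illustrative only, not CIF-mandated.
    observations = [
        {"completed": True,  "errors": 1, "assists": 0, "time_s": 182, "satisfaction": 6},
        {"completed": True,  "errors": 3, "assists": 2, "time_s": 431, "satisfaction": 4},
        {"completed": False, "errors": 5, "assists": 1, "time_s": 600, "satisfaction": 2},
    ]

    # Effectiveness: completion rate (errors and assists would also be reported).
    completion_rate = mean(o["completed"] for o in observations)

    # Efficiency: mean time on task, here limited to completed attempts.
    mean_time = mean(o["time_s"] for o in observations if o["completed"])

    # Satisfaction: mean post-task rating (a measure of central tendency,
    # as the Executive Summary section expects).
    mean_satisfaction = mean(o["satisfaction"] for o in observations)

    print(f"Completion rate: {completion_rate:.0%}")
    print(f"Mean time on task (completed tasks): {mean_time:.0f} s")
    print(f"Mean satisfaction rating: {mean_satisfaction:.1f}")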
Results
• Second major technical section of the report
• Data Analysis
  • Describes how the data were scored, reduced, and analyzed, and provides the major findings in quantitative formats
  • Provide sufficient detail to allow replication of the data scoring, data reduction, and analysis methods by another organization
• Presentation of the Results
  • Required to report effectiveness, efficiency, and satisfaction results in tabular and graphical presentations to describe the data
A mixed-factor design was used for this study with 16 participants. The dependent variables were acceptability scores and quality ratings. The independent variables were:
• Drives (Company A, Company B, Company C, and Company D)
• Environment (Office and Lab)
• Gender (Male and Female)
The design classification was:

                    Company A           Company B           Company C           Company D
Office   Male       P1, P2, P3, P4      P1, P2, P3, P4      P1, P2, P3, P4      P1, P2, P3, P4
         Female     P5, P6, P7, P8      P5, P6, P7, P8      P5, P6, P7, P8      P5, P6, P7, P8
Lab      Male       P9, P10, P11, P12   P9, P10, P11, P12   P9, P10, P11, P12   P9, P10, P11, P12
         Female     P13, P14, P15, P16  P13, P14, P15, P16  P13, P14, P15, P16  P13, P14, P15, P16
The treatment order was a partially counterbalanced Balanced Latin Square design, used to control for presentation order bias and gender bias:

Presentation Order      1         2         3         4
P1, P5, P9, P13         Drive 1   Drive 2   Drive 4   Drive 3
P2, P6, P10, P14        Drive 2   Drive 3   Drive 1   Drive 4
P3, P7, P11, P15        Drive 3   Drive 4   Drive 2   Drive 1
P4, P8, P12, P16        Drive 4   Drive 1   Drive 3   Drive 2
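
The order above is the standard Balanced Latin Square for four conditions: each drive appears once in each presentation position and immediately follows every other drive exactly once. The Python sketch below (not part of the study materials) reproduces the same square and generalizes to any even number of conditions.

    def balanced_latin_square(n):
        """Balanced Latin Square for an even number of conditions.

        Row i is the presentation order for participant group i; each
        condition appears once in every position and immediately follows
        every other condition exactly once.
        """
        # First row follows the classic 1, 2, n, 3, n-1, 4, ... pattern.
        first = []
        for j in range(n):
            if j == 0:
                first.append(1)
            elif j % 2:                      # odd positions: 2, 3, 4, ...
                first.append((j + 3) // 2)
            else:                            # even positions: n, n-1, ...
                first.append(n - (j - 2) // 2)
        # Remaining rows are cyclic shifts of the first row (add 1, mod n).
        return [[(c - 1 + i) % n + 1 for c in first] for i in range(n)]

    groups = ["P1, P5, P9, P13", "P2, P6, P10, P14",
              "P3, P7, P11, P15", "P4, P8, P12, P16"]
    for label, order in zip(groups, balanced_latin_square(4)):
        print(label, "->", ", ".join(f"Drive {d}" for d in order))
    # P1, P5, P9, P13 -> Drive 1, Drive 2, Drive 4, Drive 3
    # P2, P6, P10, P14 -> Drive 2, Drive 3, Drive 1, Drive 4
    # ...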
An Analysis of Variance (ANOVA) was first performed to determine whether there were significant differences in the perceived quality of the drives, based on the overall quality score for each drive (question 3 of the survey). A regression analysis was then performed to determine which factors affected perceived quality, based on the sound attribute scores (question 2 of the survey). An alpha level of 0.05 was used for all analyses.
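
As a rough illustration of this analysis, the sketch below runs a one-way ANOVA across the four drives and an ordinary least squares regression of overall quality on sound-attribute ratings. All scores, attribute names, and group sizes are placeholders rather than values from the study, and the original within-subjects design could equally call for a repeated-measures analysis.

    import numpy as np
    from scipy import stats
    import statsmodels.api as sm

    # Placeholder overall quality scores (survey question 3), one per
    # participant for each drive in one setting (not data from the study).
    quality = {
        "Company A": [3, 5, 6, 4, 7, 5, 8, 4],
        "Company B": [5, 6, 4, 5, 7, 6, 5, 6],
        "Company C": [8, 9, 7, 8, 9, 8, 7, 9],
        "Company D": [7, 8, 7, 8, 6, 8, 8, 8],
    }

    # One-way ANOVA: do the drives differ in perceived quality? (alpha = 0.05)
    f_stat, p_value = stats.f_oneway(*quality.values())
    print(f"ANOVA: F = {f_stat:.2f}, p = {p_value:.4f}")

    # Regression of overall quality on hypothetical sound-attribute scores
    # (survey question 2); the three attribute columns are made up here.
    attributes = np.random.default_rng(0).uniform(1, 10, size=(32, 3))
    overall = np.concatenate(list(quality.values()))
    model = sm.OLS(overall, sm.add_constant(attributes)).fit()
    print("Regression coefficients:", model.params)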
In general, a significant difference was found between the overall quality scores of the drives for both the office and lab
settings. Mean scores, standard deviations, and confidence levels for each drive in each setting were as follows:
                     Office Setting                                      Lab Setting
              Mean   Standard Deviation   Confidence Level   Mean   Standard Deviation   Confidence Level
Company A     5.3    2.6049               2.1778             6.4    0.7440               0.6220
Company B     5.5    1.7728               1.4821             6.5    2.3299               1.9479
Company C     8.2    1.8886               1.5789             8.5    1.7728               1.4821
Company D     7.5    1.4880               1.2440             9.4    0.7289               0.6093
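
The Confidence Level values are consistent with a t-based 95% confidence-interval half-width computed from eight scores per cell; the poster does not state this explicitly, so treat it as an assumption. The sketch below shows that calculation with placeholder scores.

    import math
    from statistics import mean, stdev
    from scipy import stats

    def ci_half_width(scores, confidence=0.95):
        """Half-width of a t-based confidence interval for the mean."""
        n = len(scores)
        t_crit = stats.t.ppf(1 - (1 - confidence) / 2, df=n - 1)
        return t_crit * stdev(scores) / math.sqrt(n)

    # Placeholder overall quality scores for one drive in one setting (n = 8).
    scores = [4, 6, 5, 8, 3, 7, 5, 4]
    print(f"Mean = {mean(scores):.1f}, SD = {stdev(scores):.4f}, "
          f"95% CI half-width = {ci_half_width(scores):.4f}")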
Appendices
• Detailed study materials
  • Customer questionnaires, participant general instructions, participant task instructions, release notes
References
Common Industry Format for Usability Test Reports (version
1.1, October 28, 1999). Available from iusr@nist.gov
P. Englefield (personal communication, June 6, 2003)
D. Gonzalez (personal communication, April 25, 2003)
E. Reinke (personal communication, June 9, 2003)
K. Vredenburg (personal communication, May 13, 2003)
Revenue Comparison
Improvements in Usability Coverage
• Usability tests designed to meet CIF requirements
• Standardizes UCD reporting
• Standardizes a core set of UCD metrics
• Provides a standard means to measure competitive products’ usability
Benefits and Competitiveness
• Improvements to the UCD process will help drive user-friendly products that meet user requirements.
• The CIF will provide a yardstick for comparing our usability to the competition, highlighting areas where we can improve on or exceed the competition.
• Usability is a product differentiator.