Usability Engineering and Its Role in the Software Industry

Qaiser S. Durrani
FAST-NU, Lahore
Workshop on Usability Engineering
Feb 21-23, 2011 at SEECS NUST
Agenda
 Usability Engineering?
 Why do we need it?
 What are its measures?
 Where UE fits in the SDLC
 Can we integrate or map UELC with SDLC?
 Experience and Emotional Measures – Role?
 Case Study
 Current practices in the software industry with respect to UE
Why Usability Engineering?
 Functional perspective
 User perspective
Usability
 ‘‘the capability to be used by humans easily and
effectively’’
 ‘‘quality in use’’
 ‘‘the effectiveness, efficiency, and satisfaction with
which specified users can achieve goals in particular
environments’’
 Context dependent (shaped by the interaction between
tools, problems, and people)
 A process through which usability characteristics are
specified and measured throughout the software
development lifecycle.
Key Research Questions in HCI
 How to work with and improve the usability of interactive systems?
 Guidelines for improving the usability of systems?
 Methods for predicting usability problems?
 Techniques to test the usability of systems?
 Discussions on how to measure usability
Neglecting Usability Engineering
Non-functional requirements:
 Product requirements: usability, efficiency (performance, space), reliability, portability
 Organizational requirements: delivery, implementation, standards
 External requirements: interoperability, ethical, legislative (privacy, safety)
Usability into Software Development
 When integrating usability into the system design process,
early focus on users and tasks, empirical measurement,
and iterative design principles are suggested
 This integration, however, is not a trivial task, as numerous
obstacles have been reported
 First of all, introducing a new method into a software
development organization is typically a delicate problem
 User-centered design techniques have been reported to
remain the speciality of visionaries, isolated usability
departments, enlightened software practitioners, and large
organizations, rather than the everyday practice of software
developers
Usability Engineering and
Experience Design
Models for Usability Engineering
Lifecycle
 Star Lifecycle Model
 ISO 13407 Model
 Usability engineering lifecycle
by Deborah J. Mayhew
Usability Engineering Lifecycle
Requirements Analysis Phase
 User Profiling – Cognitive & Non-Cognitive measures
 Task Analysis
 SW/HW/Environment Constraints
 General Design Principles
 Usability Goals
Design, Development & Evaluation
 Conceptual Level Design
 Detail Level Design
 Screen Standards
 Iterative Evaluation
Usability Activities
Adaptation of Usability Activities into Software Engineering Development Process
Allocation of Usability Techniques to
Development Activities
Shneiderman’s Golden Rules
 R1: Strive for consistency
 R2: Offer shortcuts
 R3: Give effective feedback
 R4: Reduce short-term memory load
 R5: Provide reversal of actions
 R6: Design dialogues to yield closure
 R7: Provide locus of control
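As an illustration of R5, a minimal Python sketch of action reversal using a command stack; the Editor and AppendText names are illustrative only, not taken from any real toolkit.

# Minimal sketch of R5 ("provide reversal of actions"): every executed
# command is recorded so that it can later be undone.
class Command:
    def do(self, doc): ...
    def undo(self, doc): ...

class AppendText(Command):
    def __init__(self, text):
        self.text = text
    def do(self, doc):
        doc.append(self.text)
    def undo(self, doc):
        doc.pop()

class Editor:
    def __init__(self):
        self.doc, self.history = [], []
    def execute(self, cmd):
        cmd.do(self.doc)
        self.history.append(cmd)   # record the action...
    def undo(self):
        if self.history:
            self.history.pop().undo(self.doc)   # ...so it can be reversed

editor = Editor()
editor.execute(AppendText("hello"))
editor.execute(AppendText("world"))
editor.undo()
print(editor.doc)   # ['hello']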
Practices - MEASURING USABILITY
(A review of 180 studies)
Measures of effectiveness
Measures of Efficiency
Measures of Satisfaction
Measures of Effectiveness
 Binary task completion
 Accuracy
 Recall
 Completeness
 Quality of outcome
 Experts' assessment
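As an illustration only, a short Python sketch of how several of these effectiveness measures can be computed from per-task records; the log format and values below are assumptions, not data from the reviewed studies.

# Invented per-task records for three tasks.
tasks = [
    {"completed": True,  "errors": 0, "items_recalled": 8,  "items_total": 10},
    {"completed": False, "errors": 3, "items_recalled": 5,  "items_total": 10},
    {"completed": True,  "errors": 1, "items_recalled": 10, "items_total": 10},
]

# Binary task completion: share of tasks finished successfully.
completion_rate = sum(t["completed"] for t in tasks) / len(tasks)

# Recall / completeness: proportion of required items the user produced.
recall = sum(t["items_recalled"] for t in tasks) / sum(t["items_total"] for t in tasks)

# Accuracy proxy: mean number of errors per task (lower is better).
mean_errors = sum(t["errors"] for t in tasks) / len(tasks)

print(f"completion={completion_rate:.2f}, recall={recall:.2f}, errors/task={mean_errors:.2f}")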
Comments
1- 22% of the studies reviewed do not report any measure of
effectiveness nor do these studies control effectiveness.
Frøkjær et al. argued that the HCI community might not
succeed in trying to make better computing systems
without employing measures of effectiveness in all studies
2- Research shows that measures of the quality of the
outcome of the interaction are used in only 16% of the
studies. For example, experts’ assessment of work
products seems a solid method for judging the outcome of
interaction with computers and has been used in a variety
of fields as an indicator of the quality of work products,
for example with respect to creativity. Yet, in this sample
only 4% of the studies use such measures
Comments
3- New kinds of devices and use contexts require new
measures of usability. Especially, it has been argued
that the notion of task underlying any effectiveness
measure will not work in emerging focuses for HCI,
such as home technology
4- A number of studies combine usability measures into a
single measure, report the combined values, and make
statistical tests on the combinations
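As a sketch of this kind of combination (a z-score average over invented data, not the method of any particular study):

from statistics import mean, stdev

def zscores(values, higher_is_better=True):
    # Standardize a measure so that different units become comparable.
    m, s = mean(values), stdev(values)
    z = [(v - m) / s for v in values]
    return z if higher_is_better else [-v for v in z]

completion   = [1.0, 0.8, 0.9, 0.6]   # higher is better
task_time    = [120, 150, 110, 200]   # seconds, lower is better
satisfaction = [4.2, 3.8, 4.5, 2.9]   # higher is better

# One combined usability score per participant: the mean of the three z-scores.
combined = [mean(triple) for triple in zip(
    zscores(completion),
    zscores(task_time, higher_is_better=False),
    zscores(satisfaction),
)]
print([round(c, 2) for c in combined])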
Measures of efficiency
 Time
 Input rate
 Mental effort
 Usage patterns
 Communication effort
 Learning
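A minimal sketch, assuming a simple event-log format, of how two of these measures (time and input rate) can be derived:

# Invented interaction log for one task.
events = [
    {"t": 0.0,  "type": "task_start"},
    {"t": 2.1,  "type": "keypress"},
    {"t": 3.4,  "type": "keypress"},
    {"t": 41.7, "type": "task_end"},
]

start = next(e["t"] for e in events if e["type"] == "task_start")
end   = next(e["t"] for e in events if e["type"] == "task_end")
task_time = end - start                      # seconds on task

keystrokes = sum(1 for e in events if e["type"] == "keypress")
input_rate = keystrokes / (task_time / 60)   # keystrokes per minute

print(f"time={task_time:.1f} s, input rate={input_rate:.1f}/min")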
Comments
1- Some of the efficiency measures are obviously related to
the quality of interactive computer systems, because they
quantify resources (e.g., time or mental effort) that are
relevant in many contexts for many users
2- A second comment on the studies reviewed pertains to the
measurement of time. A surprising pattern apparent from the
table is that while objective task completion time is
measured by 57% of the studies, little attention is paid to
users' experience of time
 However, in this sample of 180 studies, only one study
directly measures the subjective experience of time
Comments
3- The reviewed studies differ in how task completion
times, and efficiency measures in general, are
reasoned about. In the ISO definition of usability and
in most of the studies reviewed, time is considered a
resource of which successful interfaces minimize
consumption
 However, in a handful of studies higher task
completion times are considered as indicators of
motivation, reflection, and engagement
Comments
4- A striking pattern among the studies reviewed is that
few studies (5) concern learning of the interface.
 Only five studies measure changes in efficiency over
time
5- In the studies reviewed, the median time of
working with the user interfaces evaluated was 30
min
Measures of Satisfaction
 Standard questionnaires
 Preferences
 Satisfaction with the interface
 User attitudes and perceptions
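For example, one widely used standard questionnaire is the System Usability Scale (SUS); a small Python sketch of its published scoring rule follows (the responses are invented):

def sus_score(responses):
    # SUS: 10 items rated 1-5; odd items contribute (score - 1),
    # even items contribute (5 - score); the sum is scaled by 2.5
    # to give a 0-100 score.
    assert len(responses) == 10
    total = 0
    for i, r in enumerate(responses, start=1):
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))   # 85.0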
Comments
1- The measurement of satisfaction seems in a state of
disarray. A host of adjectives and adverbs are used, few
studies build upon previous work, and many studies
report no or insufficient work on the validity and
reliability of the instruments used for obtaining
satisfaction measures
 Another indication of the disarray is in the limited use
of standardized questionnaires
Comments
2- A second comment on the satisfaction measures used is
that studies vary greatly in the phenomena that are
chosen for objective performance measures and those that
are investigated by asking subjects about their perceptions
and attitudes.
 One question is why users' perception of phenomena is
measured when those phenomena could perhaps more fittingly
have been assessed by objective measures
3- The review shows that in practice subjective satisfaction is
taken to mean a questionnaire completed after users have used
the interface. Only eight studies (4%) measure satisfaction
during use without using questionnaires
CHALLENGES IN MEASURING
USABILITY
Subjective and objective measures
of usability
 Some measures of usability concern users' perception of or
attitudes towards the interface; these are called subjective
usability measures
 Other measures concern aspects of the interaction that do not
depend on users' perception; these are called objective
usability measures
 Such a distinction has been argued to simplify the nature
of measurement in science
 We suggest using the distinction to reason about how to
choose usability measures and find more complete ways of
assessing usability
 The two kinds of measures may lead to different conclusions
regarding the usability of an interface
Measures of learnability and
retention
 For measures of efficiency in particular, we find it relevant to compare
them to recommendations on how to measure usability
The well-known textbook by Ben Shneiderman (1998, p.15) recommends measuring (1) time to
learn, (2) speed of performance, (3) rate of errors by users, (4) retention over time, and (5)
subjective satisfaction.
Nielsen (1993, p. 26) similarly recommends measuring (a) learnability, (b) efficiency, (c)
memorability, (d) errors, and (e) satisfaction
 Most of the reviewed studies follow part of the recommendations by
measuring task completion time (points 2 and b above), accuracy
(points 3 and d), and satisfaction with the interface (points 5 and e):
92% of the studies measure at least one of these; 13% of the studies
measure all three
Measures of learnability and retention
 The majority of studies make no attempt to measure
learnability or retention
 This challenge is most relevant for studies or research
addressing systems that users should be able to
learn quickly or that will be intensively used
 Overall, usability studies could put more emphasis on
measures of learning, for example by measuring the
time needed to reach a certain level of proficiency
 In addition, measures of the retention of objects
and actions available in the interface (i.e., the
ability of users to come back and successfully use the
interface) are important in gaining a more complete
picture of usability
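A hedged sketch of such a learning measure: trials to reach a proficiency criterion, plus a simple power-law-of-practice fit (T_n = T_1 * n^-alpha); the trial times and the criterion below are assumptions for illustration.

import math

trial_times = [95.0, 71.0, 60.0, 52.0, 47.0, 44.0, 42.0]   # seconds per trial
criterion = 50.0                                           # target time-on-task

# Trials to criterion: first trial at or below the target time.
trials_to_criterion = next(
    (i for i, t in enumerate(trial_times, start=1) if t <= criterion), None)

# Least-squares fit of log T = log T1 - alpha * log n (power law of practice).
xs = [math.log(n) for n in range(1, len(trial_times) + 1)]
ys = [math.log(t) for t in trial_times]
mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
alpha = -slope   # larger alpha means faster learning

print(f"trials to criterion: {trials_to_criterion}, fitted alpha: {alpha:.2f}")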
Measures of usability over time
 The studies reviewed show that users typically interact only briefly with
interfaces under investigation; as mentioned earlier the median duration
of users’ interaction was 30 min; only 13 studies examined interaction that
lasts longer than five hours
 The brief period of interaction in the studies reviewed explains the lack of
focus on measures of learning and retention
 The observation also suggests that we know little about how usability
develops as the user spends more time interacting with the interface and
how tradeoffs and relations between usability aspects change over time
 From research, we need a fuller understanding of how
the relation between usability aspects develops over time
Extending, validating and standardizing measures of
satisfaction
 The disarray of measures of satisfaction presents
special challenges
 One is to extend the existing practice of measuring
satisfaction almost exclusively by post-use questions;
 another is to validate and standardize the questions
used
 Validation may be achieved through studies of
correlation between measures
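A minimal sketch of such a correlation study, assuming invented data and using Spearman's rank correlation from SciPy to relate a subjective measure (post-use satisfaction) to an objective one (task time):

from scipy.stats import spearmanr

satisfaction = [4.5, 3.2, 4.8, 2.1, 3.9, 4.0]   # questionnaire scores (1-5)
task_time    = [60, 95, 55, 140, 80, 70]        # seconds

rho, p = spearmanr(satisfaction, task_time)
# A strong negative rho would suggest the two measures tap related
# aspects of usability; a weak rho suggests they capture different aspects.
print(f"rho={rho:.2f}, p={p:.3f}")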
Micro and macro measures of
usability
 Usability at a micro level
 Such measures cover tasks that are usually of short
duration (seconds to minutes), have a manageable
complexity (most people will get them right), and often
focus on perceptual or motor aspects (visual scanning,
mouse input); time is usually a critical resource
 Usability at a macro level
 Such measures cover tasks that are longer (hours, days,
months), are cognitively or socially complex (require
problem-solving, learning, critical thinking, or
collaboration)
A working model for usability measures and research
challenges
Affective Requirement
 The need to make something fun, engaging, or
enjoyable is usually not considered in requirements
elicitation
 Software requirements for these and other affective
factors are never truly captured in an official manner
 Juran is credited with coining the phrase "fitness for
purpose"
 If a system is intended to be a leisure product then
the ‘fitness for purpose’ must also extend to affect
Rebirth of Affect in Design
 The idea of affect in design is not new, but affect has re-emerged as a
potentially desirable design characteristic
 One of the visionaries of this re-emergence was Robert
Glass from Sun Microsystems, who said:
“If you’re still talking about ease of use then you’re behind. It is all about
the joy of use. Ease of use has become a given – it’s assumed that your
product will work.”
(Glass, 1997)
Summary of research into affective
factors
Exploring Affect……Theories
 Three theories have each been said to contribute to computer
game enjoyment
Usability:
 In ISO 9241-11 (ISO, 1998), usability is characterized as consisting
of three elements:
 effectiveness, efficiency, and satisfaction
 Grice (2000) attempted to apply these three elements to
computer game design
 His hypothesis was that computer games that were enjoyable
will have high levels of efficiency, effectiveness, and
satisfaction
 Some minor experiments conducted under his supervision
seemed to indicate that this hypothesis was true
Exploring Affect…Theories
Flow:
 Csikszentmihalyi describes flow as "the holistic sensation that
people feel when they act with total involvement"
 In the state of flow, actions flow without conscious
intervention by the actor
 The term flow was used because people in this state often said
that they “were in the flow of [the activity]”.
 the characteristics of flow-inducing activities are:
 must feel capable of completing the task
 must have the ability to concentrate on task
 clearly recognizes the goals of the task
 receives immediate feedback about task performance
 has a sense of control over their actions
 has the sense of time altered: hours can seem like minutes
Exploring Affect…Theories
Heuristics for internally motivating interfaces:
 Malone (1983), in agreement with Csikszentmihalyi,
believes that fun and enjoyment only arise from
activities that are intrinsically motivated
 Computer games are thought to be played because
of intrinsic motivation, with no expectation of a
reward other than the activity itself
 Malone and Lepper (1987) developed seven heuristics
for the design of intrinsically motivated interfaces
Exploring Affect…Theories
The four major heuristics are:
 Challenge - offer multiple layers of challenge so that the user feels initial
success and continues to see improvements
 Curiosity - users should believe that their knowledge structures (or skills) are
incomplete or inconsistent
 Control - the interface should make the user feel that the outcomes are
determined by the user's own actions
 Fantasy - evoke mental images of physical or social situations
Other, minor heuristics are Competition, Cooperation, and Recognition
Results
 The results being referred to are the learnability and ‘losing
time’ reasons
 Loss of Time
 Learnability
Measures of specific attitudes towards the
interface (Experience Design) – from the 180
studies reviewed
Current Usability Practices in
Pakistan Software Industry
Basic Software Industry Data
 Number of software organizations surveyed: 26
 Number of respondents: 35
 Project types: multiple, from Web to IS
Research Questions
 Does the organization include estimates for usability activities in the
planning phase?
 Does the organization involve users during SDLC phases? If yes, what
kind of user involvement does it have?
 (a) Are usability activities integrated into the requirements phase of the
SDLC? (b) Are usability activities integrated into the design phase of the
SDLC? (c) Are usability activities integrated into the implementation
phase of the SDLC? (d) Is usability testing done in the organization?
 Does the organization collect feedback from users for a product?
 Does the organization calculate return on investment for the usability
activities?
 Do organizations intend to introduce or enhance UELC activities in the
SDLC?
User Involvement
(Figures 1-18 report percentages of respondents.)
Fig1: Usability activities in the planning phase (yes/no)
Fig2: User involvement in the SDLC (yes/no)
Fig3: User involvement across SDLC phases (planning, requirement gathering, design, implementation, testing, installation)

Requirement Phase
Fig4: Usability requirements (functional, non-functional, and usability requirements)
Fig5: User profile (yes/no)

User Characteristics
Fig6: User contextual inquiry (psychological characteristics, e.g. attitude and motivation; knowledge and experience, e.g. typing skill and task experience; job and task characteristics, e.g. frequency of use and task structure; physical characteristics, e.g. age, sex, physical limitations; industry and experience)

Usability Goals
Fig7: Usability goals (yes/no)
Fig8: User interface development platform (choice of software developers, most available software, specialized software according to user interface requirements, according to user request)

Usability Roles in Requirement Phase
Fig9: Usability roles (project manager, business analyst, usability engineer, software developer)

Design Flexibility and Usability Roles in Design Phase
Fig10: Screen design standards (yes/no)
Fig11: Design flexibility (yes/no)
Fig12: Usability roles (software developer, user interface designer)

Detail Design of User Interface
Fig13: Detailed design of the user interface (all interactions with input devices, menu bars, message boxes, dialog boxes, identification of all pathways/links/message flows between windows)
Fig14: Usability testing

Users Feedback and Experiences
Fig15: User feedback (acceptance testing, interviews, focus groups, questionnaires, usage studies, video taping)
Fig16: User experiences
Fig17: Defect types in user feedback (functionality, performance, cosmetic, usability)

ROI Calculation
Fig18: ROI calculation (yes/no)
Challenges in Measuring Usability
 Subjective and objective measures of usability
 Measures of learnability and retention
 Measures of usability over time
 Extending, validating and standardizing measures of
satisfaction
Recommendations
 Developers must consider user interaction from the
beginning of the development process.
 Practice of Usability Testing
 Practice of Cost-Justifying Usability tasks
 Don’t try to do a full-scale usability process from the
beginning.
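To illustrate the cost-justification recommendation, a back-of-the-envelope ROI sketch; every figure below is an assumption for illustration, not data from the survey.

usability_cost = 40_000            # e.g. usability testing plus a UI specialist's time

# Assumed annual benefits attributed to the usability work:
saved_support_calls = 2_000 * 12   # fewer support calls * cost per call
saved_training_days = 50 * 400     # fewer training days * cost per day
productivity_gain   = 30_000       # estimated value of faster task completion

total_benefit = saved_support_calls + saved_training_days + productivity_gain
roi = (total_benefit - usability_cost) / usability_cost

print(f"benefit={total_benefit}, ROI={roi:.0%}")   # 85% in this made-up case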
References
 Kasper Hornbæk, "Current practice in measuring usability: Challenges to
usability studies and research," Int. J. Human-Computer Studies (2006)
 Xavier Ferre, "Integration of Usability Techniques into the Software
Development Process"
 Juho Heiskari, Marjo Kauppinen, Mikael Runonen, Tomi Mannisto,
"Bridging the Gap Between Usability and Requirements Engineering,"
17th IEEE International Requirements Engineering Conference (2009)
 Todd Bentley, Lorraine Johnston, Karola von Baggo, "Putting Some
Emotion into Requirements Engineering," AWRE'2002
 Samia Asloob, Qaiser S. Durrani, "Usability Engineering Practices in
SDLC," Technical Report, FAST-NU, Lahore (2010)
Questions?
Bottom Line benefits
 Increased Productivity
 Decreased user training
 Decreased user errors
 Decreased need for ongoing technical support
 Incorporating business and marketing goals while
catering to user needs (especially for mobile, Web,
and gaming applications)
Time Constraints for the Application of Usability
Activities and Techniques
Subjective and objective measures
of usability
 Challenges in research are to develop subjective
measures for aspects of quality-in-use that are
currently measured by objective measures, and vice
versa, and evaluate their relation
 In studies of usability, we suggest paying special
attention to whether subjective or objective
measures are appropriate, and whether a mix of
those two better covers the various aspects of quality-in-use
Definition of Process Increments
 Seven deltas were defined in order to get a better match with the
general stages of an iterative software development process
 D1: Early Analysis
 D2: Usability Specifications
 D3: Early Usability Evaluation
 D4: Regular Analysis
 D5: Interaction Design
 D6: Regular Usability Evaluation
 D7: Usability Evaluation of Installed Systems
Affective Requirement
 The same functional requirements may undergo a similar design process by the
same designers, yet the need to convey a different affective response can
greatly change the entire product
 Given that requirements constrain how a system should behave, it is
important that 'affective requirements' are considered a valid category
of requirement
Accepting that affective factors make valid requirements raises
the following questions:
 How does an organization elicit and document affective requirements?
 How does an organization design to meet affective requirements?
 How does an organization validate that the design elicits the required
affective response?
Motivations
Research focus on how to measure usability has three motivations:
 First, what we mean by the term usability is to a large
extent determined by how we measure it
 Second, usability cannot be measured directly, so we must find
aspects of usability that can be measured
 Third, which measures of usability to select is consequently
central in many approaches to the design and development
of user interfaces
Studies of correlations between measures
 A weak understanding of the relation between usability
measures gives rise to many of the issues
 With a better understanding, we could make more
informed choices about which usability measures to
employ
 Studies of correlation between measures may improve this
understanding by informing us whether our measures
contribute something new and what their relation is to
other aspects of usability
 There is need for a better understanding of the relation
between usability measures, for which studies of
correlations between measures would be one contribution