
Some Usability Engineering Methods

Randolph Bias

2/17/2011

R. G. Bias | School of Information | SZB 562B | Phone: 512 471 7046 | rbias@ischool.utexas.edu


Today

• Divide the day into thirds:

– First third – RB on usability evaluation methods

– Second third – Yahoo speaker on accessibility

– Third third – Reconvene for discussion about / work on . . .

• Wiki

• Etc.


Remember . . .

Our approach, within usability engineering of web sites and other user interfaces, is:

• Empirical

• Iterative

• User-Centered Design


The Methods of Usability Engineering . . .

• Are employed to enable you to bring user data (empiricism) to bear on the emerging product design.

• You (the usability engineer) become an advocate for the user in the product development process.


One big problem

• Cost

Three Methods to address cost

• Remote End-user testing (lab testing)

• Heuristic Evaluations

• Usability Walkthroughs

Today let’s talk about WHY and WHEN we employ one method or another, and HOW to carry them out.


Next week -- End-user Testing

• Also called “lab testing”

• Can be done on a paper-and-pencil design, a prototype, early code, or an existing product


EUT - Benefits

• Gather performance and satisfaction data (a minimal tallying sketch follows this slide)

– Performance data: time on task, error rates, # calls to the help desk, # references to the documentation, . . .

– Satisfaction data: end-of-test questionnaire

• Can be find-and-fix or benchmarking

• Ensure coverage of certain parts of the UI – we have good control over the tasks
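
The performance and satisfaction measures listed above reduce to simple tabulation once sessions are logged. Here is a minimal Python sketch of that tabulation; the session records and field names are illustrative assumptions, not the output of any particular lab-testing tool.

from statistics import mean

# One record per participant per task; the fields mirror the measures above
# (time on task, errors, help-desk calls, documentation references, and an
# end-of-test satisfaction rating). Values are made up for illustration.
sessions = [
    {"participant": "P1", "task": "create project", "seconds": 212, "errors": 1,
     "help_desk_calls": 0, "doc_lookups": 2, "satisfaction": 4},
    {"participant": "P2", "task": "create project", "seconds": 185, "errors": 0,
     "help_desk_calls": 0, "doc_lookups": 1, "satisfaction": 5},
    {"participant": "P3", "task": "create project", "seconds": 340, "errors": 3,
     "help_desk_calls": 1, "doc_lookups": 4, "satisfaction": 2},
]

# Performance data
print("mean time on task (s):", round(mean(s["seconds"] for s in sessions), 1))
print("total errors:", sum(s["errors"] for s in sessions))
print("help-desk calls:", sum(s["help_desk_calls"] for s in sessions))
print("documentation references:", sum(s["doc_lookups"] for s in sessions))

# Satisfaction data (e.g., a 1-5 end-of-test questionnaire item)
print("mean satisfaction:", round(mean(s["satisfaction"] for s in sessions), 2))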


EUT - Limitations

• Artificial situation

• Successful test doesn’t “prove” the product works

– Aside – It’s ALL about confidence.

• Need representative users!

• Ease of learning vs. ease of use

• Hard to test longitudinally


EUT -- What to test?

• Can rarely cover all of the UI.

• I like to test:

– critical tasks

– frequent tasks

– nettlesome tasks


Rubin’s 4 Types

• Exploratory (working on the “skeleton” – maybe the Information Architecture)

• Assessment test (working on the “meat and flesh” of the UI)

• Validation test (does it meet the objectives?)

• Comparison test (compare two competing designs)

– Rubin, J. (1994). Handbook of Usability Testing: How to Plan, Design, and Conduct Effective Tests. New York, NY: Wiley. (Superseded by Rubin and Chisnell.)


Set up

• Create environment (ambient setting, HW, SW)

• Identify participants

• Establish test roles (test monitor/administrator, data logger, timer, video operator, product experts (SMEs), other observers)

• Create test plan

• Prepare observers

• Prepare test materials


What materials?

• Instructions

• Informed consent form

• NDA

• Permission to videotape

• Test scenarios

• Questionnaire(s)


Conduct the Test

• Welcome the test participant

• Communicate that this is not a test of THEM, and that they can leave at any time

• The scenarios should match the real-world setting

• Ask the test participants to “think aloud” so you can better understand their intent

• Offer a post-test questionnaire, and debrief


After the Test

• Get quick data to the product team

• Assign severities and build recommendations

• Build archival report

• Serve as change agents!


Ah, but REMOTE

• Saves tons of travel money

• Allows you to get otherwise hard-to-get test participants.

• Allows them to be in their own environments.

• Might allow product designers/developers to watch from their own office.

• But . . . lose some fidelity of the test environment (video?)

• Some added set-up cost (time)


What is a Heuristic Evaluation?

Evaluators systematically inspect the application interface to check for compliance with recognized usability guidelines (heuristics). (Thus, an INSPECTION method.)

- Identifies major and minor usability problems

- Conducted by three to five experienced usability engineers (or one!)

- Each problem is reported along with the heuristic it violates


Problems Identified

The probability of one evaluator finding . . .

– A major usability problem - 42% *

– A minor usability problem - 32%

More evaluators = more problems identified (see the sketch below)

*from www.useit.com
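
To see why more evaluators find more problems, here is a minimal Python sketch. It assumes, purely for illustration, that each evaluator independently finds any given major problem with the 42% single-evaluator probability cited above.

# Expected share of major problems found by k independent evaluators.
p_major = 0.42  # single-evaluator detection rate cited above

for k in range(1, 6):
    # chance that at least one of k evaluators finds a given problem
    found = 1 - (1 - p_major) ** k
    print(f"{k} evaluator(s): ~{found:.0%} of major problems found")

Under that simplifying independence assumption, three to five evaluators would be expected to catch roughly 80-93% of the major problems, which is one reason HEs are typically staffed with three to five evaluators.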


Strengths of an HE

- Done by people experienced in usability, not just “dumb” users

- Can identify both major and minor usability problems

- Can be done relatively quickly and inexpensively

- Unlike EUT, can sometimes cover every corner of a UI or web site


Weaknesses of an HE

- If done at end of design, designers may be resistant to changes

- Some designers/developers may be unmoved by “just opinions”

- Experienced usability evaluators may miss content problems that actual users would find

- Can HELP address this issue by using SMEs


Typical Methodology

- Interface is exercised

– 1st pass to develop the big picture

– 2nd pass to accomplish typical tasks

- Each problem is reported along with the heuristic it violates

- Comments are consolidated (a small tallying sketch follows this slide)

- Severity levels – Critical, Major, Moderate, Minor
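
Consolidating comments and severities is mostly bookkeeping; the following Python sketch tallies findings into the kind of severity-by-area summary shown a few slides below. The findings, areas, and field names here are illustrative assumptions only.

from collections import Counter

# Each finding records where it was seen, the heuristic it violates, and a severity.
findings = [
    {"area": "Visual Layout", "heuristic": "Consistency and standards", "severity": "Major"},
    {"area": "Visual Layout", "heuristic": "Aesthetic and minimalist design", "severity": "Moderate"},
    {"area": "Installation", "heuristic": "Visibility of system status", "severity": "Moderate"},
    {"area": "Code Editor", "heuristic": "Error prevention", "severity": "Major"},
]

# Tally by (area, severity), then overall by severity.
counts = Counter((f["area"], f["severity"]) for f in findings)
for (area, severity), n in sorted(counts.items()):
    print(f"{area:15} {severity:10} {n}")

print("Totals by severity:", dict(Counter(f["severity"] for f in findings)))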


Nielsen’s Usability Heuristics

• Visibility of system status

– The system should always keep users informed about what is going on, through appropriate feedback within reasonable time.

• Match between system and the real world

– The system should speak the users' language, with words, phrases and concepts familiar to the user, rather than system-oriented terms. Follow real-world conventions, making information appear in a natural and logical order.

• User control and freedom

– Users often choose system functions by mistake and will need a clearly marked "emergency exit" to leave the unwanted state without having to go through an extended dialogue. Support undo and redo.

• Consistency and standards

– Users should not have to wonder whether different words, situations, or actions mean the same thing. Follow platform conventions.

• Error prevention

– Even better than good error messages is a careful design which prevents a problem from occurring in the first place.


Nielsen’s Heuristics (cont’d.)

• Recognition rather than recall

– Make objects, actions, and options visible. The user should not have to remember information from one part of the dialogue to another. Instructions for use of the system should be visible or easily retrievable whenever appropriate.

• Flexibility and efficiency of use

– Accelerators -- unseen by the novice user -- may often speed up the interaction for the expert user such that the system can cater to both inexperienced and experienced users. Allow users to tailor frequent actions.

• Aesthetic and minimalist design

– Dialogues should not contain information which is irrelevant or rarely needed. Every extra unit of information in a dialogue competes with the relevant units of information and diminishes their relative visibility.

• Help users recognize, diagnose, and recover from errors

– Error messages should be expressed in plain language (no codes), precisely indicate the problem, and constructively suggest a solution.

• Help and documentation

– Even though it is better if the system can be used without documentation, it may be necessary to provide help and documentation. Any such information should be easy to search, focused on the user's task, list concrete steps to be carried out, and not be too large.


Severity Levels Used

• Critical Usability Issue
– Loss of user data
– System shutdown
– Abandoned task

• Major Usability Issue
– Completed task, but with considerable frustration or extra steps

• Moderate Usability Issue
– Moderate workaround or multiple attempts

• Usability Suggestion


Sample Summary of Results

Type               Critical   Major   Moderate
Installation           0         0        3
Project Creation       0         0        5
Visual Layout          0         3       16
Code Editor            0         1        1
Run Project            0         0        1
Debug                  0         0        1
Main IDE               0         0        7
Icons                  0         0        2
Help                   0         0        2
Totals                 0         4       38


How to . . .

• http://www.useit.com/papers/heuristic/heuristic_evaluation.html


Other Heuristics

• http://www.stcsig.org/usability/topics/articles/he-checklist.html


Next Week

• IX Lab demo

• Due dates:

– Book review – up on the wiki today

– White paper – 2 weeks

– Project test plan – 4 weeks
