Heuristic Evaluation
Sources for today’s lecture:
Professor James Landay:
http://bmrc.berkeley.edu/courseware/cs160/fall98/lectures/heuristic-evaluation/heuristic-evaluation.ppt
Jakob Nielsen’s web site:
http://www.useit.com/papers/heuristic/heuristic_evaluation.html
Nielsen articles linked to course web site
Heuristic evaluation (what is it?)
- Method for finding usability problems
- Popularized by Jakob Nielsen
- "Discount" usability engineering
- Used with a working interface or a scenario
- Convenient, fast, and easy to use
Heuristic evaluation (how?)
- Small set of evaluators (3-5)
- Each one works independently
- Find problems with an interface using a small number of heuristics (principles)
- Aggregate findings afterward

Use multiple evaluators
- These people can be novices or experts: "novice evaluators", "regular specialists", or "double specialists" (Nielsen)
- Each evaluator finds different problems
- The best evaluators find both hard and easy problems
[Figure: proportion of usability problems found by different numbers of evaluators (Nielsen)]
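The curve behind this figure is often modeled (Nielsen & Landauer) as Found(i) = N(1 - (1 - λ)^i), where λ is the chance that a single evaluator finds a given problem. A minimal Python sketch, assuming Nielsen's reported average of λ ≈ 0.31:

    # Sketch of the Nielsen-Landauer model: each evaluator independently
    # finds a given problem with probability lam, so i evaluators are
    # expected to find 1 - (1 - lam)**i of all problems.
    def proportion_found(i: int, lam: float = 0.31) -> float:
        """Expected proportion of usability problems found by i evaluators."""
        return 1 - (1 - lam) ** i

    for i in (1, 3, 5, 10):
        print(f"{i:>2} evaluators: {proportion_found(i):.0%}")
    # 0.31 is the average lam Nielsen reports; a lower value (~0.24)
    # reproduces the "5 evaluators find ~75%" rule of thumb below.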
Heuristic Evaluation - Advantages
- Evaluators can be experts.
- A working system is not required.
- Evaluators all evaluate the same system or scenario.
- About 5 evaluators can often discover around 75% of the problems.
Principles (Nielsen's original set)
- Simple and natural dialog
- Speak the users' language
- Minimize users' memory load
- Be consistent
- Provide feedback
- Provide clearly marked exits
- Provide shortcuts
- Good error messages
- Prevent errors
Sample Heuristics (we'll be using these)
1. Visibility of system status
2. Match between system & real world
3. User control and freedom
4. Consistency & standards
5. Error prevention
6. Recognition rather than recall
7. Flexibility & efficiency of use
8. Minimalist design
9. Help error recovery
10. Help & documentation
(PRS pp. 408-409)
Revised principles (PRS pp. 408-409)
1. Visibility of system status
Example status message: "searching database for matches"
What is "reasonable time"?
- 0.1 sec: feels immediate to the user. No additional feedback needed.
- 1.0 sec: tolerable, but doesn't feel immediate. Some feedback needed.
- 10 sec: the maximum duration for keeping the user's focus on the action.
- For longer delays, use percent-done progress bars.
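These thresholds lend themselves to a simple feedback policy. A hypothetical sketch (the thresholds are from the slide; the function and feedback names are invented for illustration):

    # Hypothetical feedback policy based on the response-time
    # thresholds above (0.1 s, 1 s, 10 s).
    def choose_feedback(expected_seconds: float) -> str:
        if expected_seconds <= 0.1:
            return "none"            # feels immediate; no feedback needed
        if expected_seconds <= 1.0:
            return "busy cursor"     # tolerable, but show something
        if expected_seconds <= 10.0:
            return "spinner"         # near the limit of user attention
        return "percent-done bar"    # longer waits need % done progress

    print(choose_feedback(0.05))   # -> none
    print(choose_feedback(4.0))    # -> spinner
    print(choose_feedback(45.0))   # -> percent-done bar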

2. Match between the system and the real world
Natural dialog?
Socrates: Please select command mode
Student: Please find an author named Octavia Butler.
Socrates: Invalid Folio command: please

Another example: dragging a diskette into the trash to eject it.
(Stay tuned for the lecture on metaphors!)
3. User control and freedom
- Provide exits for mistaken choices
- Enable undo and redo
- Don't force users to take a particular path
4. Consistency and standards
See also: SPSS menus ("OK" is inconsistently located).
5. Error prevention
People make errors, yet we can try to prevent them.
How might you go about trying to prevent errors?
(One approach: add forcing functions; a sketch follows.)
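One concrete forcing function, sketched below in Python: the destructive action is blocked until the user retypes the target's name. The scenario and names are invented for illustration:

    # Hypothetical forcing function: the destructive action cannot
    # proceed until the user retypes the file name, preventing
    # accidental deletion (heuristic 5: error prevention).
    def delete_file(name: str, confirmation: str) -> None:
        if confirmation != name:
            raise ValueError(
                f"To delete '{name}', retype its name exactly to confirm."
            )
        print(f"{name} deleted.")

    delete_file("thesis.doc", "thesis.doc")   # proceeds
    # delete_file("thesis.doc", "yes")        # blocked before harm is done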
6. Recognition rather than recall
Example: information can't be copied from one window to another, so the user must remember it.
Violates: minimize the users' memory load.
(See also Norman's book.)
7. Flexibility and efficiency of use
 Provide
short cuts
 Enable macros
 Provide multiple ways of accomplishing the
same thing
8. Aesthetic and minimalist design
[Example screenshot: a cluttered page that is decidedly NOT minimalist]
9. Help users recognize, diagnose, and
recover from errors
Unhelpful error messages (Unix):
SEGMENTATION VIOLATION! Error #13
ATTEMPT TO WRITE INTO READ-ONLY MEMORY!
Error #4: NOT A TYPEWRITER
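By contrast, a message that follows this heuristic states the problem in plain language and suggests a recovery. A hypothetical sketch:

    # Hypothetical rewrite of an unhelpful error (heuristic 9):
    # state what happened in plain language and suggest a recovery.
    def report_readonly_error(path: str) -> str:
        # Bad:  "ATTEMPT TO WRITE INTO READ-ONLY MEMORY! Error #4"
        # Good: precise, constructive, no error codes or shouting.
        return (f"Could not save changes: '{path}' is read-only. "
                "Remove the read-only attribute or save a copy "
                "under a different name.")

    print(report_readonly_error("report.txt"))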
10. Help and documentation
Heuristics adapted to web site evaluation (PRS p. 415):
Adapt the general heuristics provided by Nielsen to the particular domain!
Phases of heuristic evaluation
1. Pre-evaluation training: give evaluators needed domain knowledge and information on the scenario (readings, this lecture!)
2. Have them evaluate the interface independently
3. Classify each problem & rate it for severity
4. Aggregate results (Matt will do this; one way to merge findings is sketched below)
5. Debrief: report the results to the interface designers
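Step 4 above amounts to merging the evaluators' independent problem lists and collapsing duplicates. A minimal sketch of one way to do this (the data layout and findings are invented for illustration):

    # Hypothetical aggregation step: merge each evaluator's findings,
    # collapsing duplicate reports of the same problem.
    from collections import defaultdict

    findings = {
        "eval1": ["no undo on delete", "jargon in save dialog"],
        "eval2": ["no undo on delete"],
        "eval3": ["jargon in save dialog", "OK button moves around"],
    }

    reported_by = defaultdict(list)
    for evaluator, problems in findings.items():
        for problem in problems:
            reported_by[problem].append(evaluator)

    for problem, evaluators in reported_by.items():
        print(f"{problem}: found by {len(evaluators)} of {len(findings)}")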
Severity ratings
Each evaluator rates individually:
0 - don’t agree that this is a usability problem
1 - cosmetic problem
2 - minor usability problem
3 - major usability problem; important to fix
4 - usability catastrophe; imperative to fix
In giving a rating, consider both the flaw’s
impact and its frequency.
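The individual ratings are then combined across evaluators, commonly by averaging per problem so the worst flaws surface first. A hypothetical sketch using the 0-4 scale above:

    # Hypothetical severity aggregation on the 0-4 scale above:
    # average each problem's ratings across evaluators and sort,
    # so the most severe problems surface first.
    from statistics import mean

    ratings = {
        "no undo on delete":      [4, 3, 4],
        "jargon in save dialog":  [2, 2, 1],
        "OK button moves around": [1, 0, 2],
    }

    for problem, scores in sorted(ratings.items(),
                                  key=lambda kv: mean(kv[1]),
                                  reverse=True):
        print(f"{mean(scores):.1f}  {problem}")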
Conclusion
- Heuristic evaluation is a great "discount" method. (You will try out this method in Assignment #2.)
- But it's not perfect: some reported "problems" may not matter, and some real problems will be missed.
- For best results, use heuristic evaluation in combination with user testing!