CS 430 / INFO 430
Information Retrieval
Lecture 24
Usability 2
Course Administration
Assignment 3
Grades were returned yesterday
The Design/Evaluate Process
[Diagram: an iterative cycle. From start: Requirements (needs of users and other stakeholders) → Design (creative application of design principles) → Implementation (may be prototype) → Evaluation → back to Requirements, repeating until release.]
Usability Factors in Searching: User
Interface
Design of an interface for a simple fielded search.
Interface: Fill in boxes, text string, ... ?
Presentation of results ... ?
Manipulation of results ... ?
Functions: Specify field(s), content, operators, ... ?
Retain results for manipulation ... ?
Query options ... ?
Data:
Metadata formats ... ?
Data structures and file structures ... ?
Systems:
Performance ... ?
How do we evaluate various designs?
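To make these questions concrete, here is one possible, purely illustrative internal representation of a fielded query, kept separate from how the fill-in boxes are laid out; the field names and operators are assumptions, not taken from any particular system in the lecture.
```python
from dataclasses import dataclass, field
from typing import List

# Illustrative sketch only: one way a fielded query might be represented
# internally. Field names and operators are assumptions.
@dataclass
class FieldClause:
    field_name: str              # e.g. "title", "author", "date"
    value: str                   # the text typed into the box
    operator: str = "contains"   # e.g. "contains", "equals", "before"

@dataclass
class FieldedQuery:
    clauses: List[FieldClause] = field(default_factory=list)
    combine_with: str = "AND"    # how the clauses are combined

# Example: title contains "usability" AND author equals "Nielsen"
q = FieldedQuery(clauses=[
    FieldClause("title", "usability"),
    FieldClause("author", "Nielsen", operator="equals"),
])
print(q)
```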
Usability Factors in Searching:
Ordering of results
The order in which the hits are presented to the user:
• Ranked by similarity of match (e.g., term weighting)
• Sorted by a specified field (e.g., date)
• Ranked by importance of document as calculated by some
algorithm (e.g., Google PageRank)
• Duplicates shown separately or merged into a single record
• Filters and other user options
What impact do these choices have on the usability?
How do we evaluate various designs?
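The orderings listed above amount to choosing different sort keys over the same hit list. A minimal Python sketch with invented records and scores (duplicate merging and user filters are omitted):
```python
# Invented hit records; scores and dates are for illustration only.
hits = [
    {"id": "d1", "similarity": 0.82, "date": "2006-11-02", "importance": 0.40},
    {"id": "d2", "similarity": 0.91, "date": "2005-03-17", "importance": 0.75},
    {"id": "d3", "similarity": 0.67, "date": "2006-01-09", "importance": 0.10},
]

# Ranked by similarity of match (e.g., term weighting)
by_similarity = sorted(hits, key=lambda h: h["similarity"], reverse=True)

# Sorted by a specified field (e.g., date, most recent first)
by_date = sorted(hits, key=lambda h: h["date"], reverse=True)

# Ranked by document importance (e.g., a PageRank-style score)
by_importance = sorted(hits, key=lambda h: h["importance"], reverse=True)

for name, ordering in [("similarity", by_similarity),
                       ("date", by_date),
                       ("importance", by_importance)]:
    print(name, [h["id"] for h in ordering])
```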
Evaluation
What is usability?
Usability comprises the following aspects:
• Effectiveness – the accuracy and completeness with which
users achieve certain goals
Measures: quality of solution, error rates
• Efficiency – the resources expended in relation to the
effectiveness achieved
Measures: task completion time, learning time, number of clicks
• Satisfaction – the users’ comfort with and positive attitudes
towards the use of the system
Measures: attitude rating scales
From ISO 9241-11
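As a rough illustration of how these measures might be computed from test-session data, here is a small Python sketch; the session records and the 1-5 satisfaction scale are invented for the example:
```python
from statistics import mean

# Invented session records: one per user per task.
sessions = [
    {"completed": True,  "errors": 0, "seconds": 95,  "satisfaction": 4},
    {"completed": True,  "errors": 2, "seconds": 140, "satisfaction": 3},
    {"completed": False, "errors": 5, "seconds": 300, "satisfaction": 2},
]

# Effectiveness: accuracy and completeness (completion rate, error rate)
completion_rate = mean(1 if s["completed"] else 0 for s in sessions)
mean_errors = mean(s["errors"] for s in sessions)

# Efficiency: resources expended to achieve that effectiveness
mean_time = mean(s["seconds"] for s in sessions)

# Satisfaction: attitude rating scale (invented 1-5 questionnaire score)
mean_satisfaction = mean(s["satisfaction"] for s in sessions)

print(completion_rate, mean_errors, mean_time, mean_satisfaction)
```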
Evaluation
• The process of determining the worth of, or assigning a
value to, a system's usability on the basis of careful
examination and judgment.
• Making sure that a system is usable before launching it.
• Iterative improvements after launch.
• Categories of evaluation methods:
– Analytical evaluation: without users
– Empirical evaluation: with users
– Measurements of operational systems
Evaluation without Users
Assessing systems using established theories and methods
Evaluation techniques
• Heuristic Evaluation (Nielsen, 1994)
– Evaluate the design using “rules of thumb”
• Cognitive Walkthrough (Wharton et al., 1994)
– A formalized way of imagining people’s thoughts and
actions when they use the interface for the first time
• Claims Analysis – based on scenario-based analysis
– Generating positive and negative claims about the
effects of features on the user
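A heuristic evaluation typically produces a list of findings, each tied to the heuristic it violates and given a severity rating. A minimal sketch of recording such findings (the heuristic names follow Nielsen's published list; the problems and severities are invented):
```python
# Minimal sketch of recording heuristic-evaluation findings.
# Heuristic names follow Nielsen (1994); findings and severities are invented.
findings = [
    {"heuristic": "Visibility of system status",
     "problem": "No feedback while a search is running",
     "severity": 3},   # 0 = not a problem ... 4 = usability catastrophe
    {"heuristic": "Error prevention",
     "problem": "An empty query can be submitted and returns a confusing page",
     "severity": 2},
]

# List the most severe problems first so they are addressed first.
for f in sorted(findings, key=lambda x: x["severity"], reverse=True):
    print(f"[{f['severity']}] {f['heuristic']}: {f['problem']}")
```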
Evaluation with Users
Testing the system, not the users!
Stages of evaluation with users:
Preparation
Conducting the sessions
Analysis of results
User testing is time-consuming and expensive.
Evaluation with Users
Preparation
• Determine goals of the usability testing
“The user can find the required information in no more
than 2 minutes”
• Write the user tasks
“Answer the question: how hot is the sun?”
• Recruit participants
Use the descriptions of users from the requirements phase
to identify suitable participants
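One way to keep the preparation concrete is to encode each task together with its success criterion, so the later analysis can check each goal directly. A small sketch using the task and the 2-minute goal quoted above (the data layout and threshold handling are assumptions):
```python
# Sketch: encode each test task with its success criterion. The layout
# is an assumption; the task text and 2-minute goal come from the slide.
tasks = [
    {
        "id": "T1",
        "description": "Answer the question: how hot is the sun?",
        "time_limit_seconds": 120,   # "no more than 2 minutes"
    },
]

def goal_met(task, observed_seconds, completed):
    """A goal is met if the task was completed within its time limit."""
    return completed and observed_seconds <= task["time_limit_seconds"]

print(goal_met(tasks[0], observed_seconds=95, completed=True))    # True
print(goal_met(tasks[0], observed_seconds=150, completed=True))   # False
```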
Usability Laboratory
Concept: monitor users while they use the system
[Diagram: evaluators observe the user through a one-way mirror.]
Evaluation with Users
Conducting the Sessions
• Conduct the session
– Usability Lab
– Simulated working
environment
• Observe the user
– Human observer(s)
– Video camera
– Audio recording
• Collect satisfaction data from the users
Evaluation with Users
Results Analysis
• If possible, use statistical summaries
• Pay close attention to areas where users
– were frustrated
– took a long time
– couldn't complete tasks
• Respect the data and users' responses; don't make excuses
for designs that failed
• Note designs that worked and make sure they're
incorporated in the final product
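A statistical summary of this kind might be computed as in the following sketch; the completion times and outcomes are invented, and the "mean plus one standard deviation" cut-off for flagging slow sessions is just an illustrative choice:
```python
from statistics import mean, median, stdev

# Invented data: task completion time (seconds) and outcome per participant.
times = [85, 110, 95, 240, 130, 90, 300, 105]
completed = [True, True, True, False, True, True, False, True]

print("completion rate:", sum(completed) / len(completed))
print("mean time:", mean(times))
print("median time:", median(times))
print("std deviation:", round(stdev(times), 1))

# Flag sessions worth a closer look: failed tasks or unusually long times.
cutoff = mean(times) + stdev(times)
to_review = [i for i, (t, c) in enumerate(zip(times, completed))
             if not c or t > cutoff]
print("sessions to review:", to_review)
```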
Measurements on operational systems
Analysis of system logs
• Which user interface options were used?
• When was the help system used?
• What errors occurred and how often?
• Which hyperlinks were followed (click through data)?
Human feedback
• Complaints and praise
• Bug reports
• Requests made to customer service
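A sketch of this kind of log analysis, assuming a simple tab-separated log with a timestamp, an event type, and a detail field (the log format and the sample lines are invented for illustration):
```python
from collections import Counter
import io

# Assumed log format: timestamp <TAB> event <TAB> detail.
# The sample lines are invented for illustration.
log = io.StringIO(
    "2006-11-20T10:01\tclick\trank=3\n"
    "2006-11-20T10:02\thelp\tquery syntax\n"
    "2006-11-20T10:03\terror\tempty query\n"
    "2006-11-20T10:04\tclick\trank=1\n"
    "2006-11-20T10:05\tclick\trank=1\n"
)

events = Counter()            # how often help was used, errors occurred, ...
clicks_by_rank = Counter()    # click-through data: which hits were followed
for line in log:
    timestamp, event, detail = line.rstrip("\n").split("\t")
    events[event] += 1
    if event == "click":
        clicks_by_rank[detail] += 1

print(events)
print(clicks_by_rank)
```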
The Search Explorer Application:
Reconstruct a User Session
Refining the design based on
evaluation
Designers and evaluators need to work as a team
Designers are poor evaluators of their own work, but know the
requirements, constraints, and context of the design:
• Some user problems can be addressed with small changes
• Some user problems require major changes
• Some user requests (e.g., lots of options) are incompatible
with other requests (e.g., simplicity)
Do not allow evaluators to become designers and vice versa
Experiment on the Google Interface
Methodology
• 10 information seeking tasks in 2 categories
• Users randomized across tasks
• Click through data to see what the user did
• Eye tracking data to see what the user viewed
• Google results presented with ranks changed or
reversed
An example of interdisciplinary information science
research by Cornell's Human Computer Interaction
Group and Computer Science Department
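The manipulation of the result order can be sketched as a function that permutes the normal top-10 list according to the experimental condition. The condition names and the exact permutations below are assumptions based on the slide's wording, not the study's published protocol:
```python
import random

def present(results, condition):
    """Return the result list in the order shown to the user.

    "normal"   - the original Google order
    "swapped"  - top two hits exchanged (one reading of "ranks changed")
    "reversed" - the whole top-10 list reversed
    """
    results = list(results)
    if condition == "swapped" and len(results) >= 2:
        results[0], results[1] = results[1], results[0]
    elif condition == "reversed":
        results.reverse()
    return results

# Each participant is randomly assigned a condition (details assumed).
rng = random.Random(0)
condition = rng.choice(["normal", "swapped", "reversed"])

top10 = [f"hit{r}" for r in range(1, 11)]
print(condition, present(top10, condition))
```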
Evaluation Example: Eye Tracking
[Two slides of eye-tracking example screenshots.]
Google Evaluation
Click-Through Data
[Chart: number of users who clicked on a link, by rank of hit.]
Google Evaluation
Eye Tracking Data
[Chart: number of users who viewed the short record before the first click (0-350), by rank of hit (1-10).]
Google Evaluation:
Eye Tracking Data
Part of short record viewed before first click (% of users)
Title: 17.4%
Snippet: 42.1%
Category: 1.9%
URL: 30.4%
Other: 8.2% (includes cached, similar pages, description)
Google Experiment: Click Through Data with Ranks Reversed
[Chart: percentage of users who clicked on a link (0-80%), by rank of hit (1-10), for the reversed-ranks condition.]