
“But I’m Not a Usability Expert”:
Testing Tips for the Resource Sharing Professional
Uta Hussong-Christian
Jane Nichols
Jennifer Ward
September 15, 2011
overview
• What > Jane
• How-to > Jennifer
• Applying > Uta
• Q&A
what is usability?
(slide graphic: Photocopy? … User Centered … Docline?)
what is usability?
• Usability refers to how well your users can learn and use your web site
• It also refers to the methods you can use to evaluate your site and your users’ experience
Usability.gov, “Usability Basics,” http://usability.gov/basics/index.html
5 usability attributes
• Ease of learning
• Efficiency of use
• Memorability
• Error frequency and severity
• Subjective satisfaction
U.S.D.H.H.S., “Usability Basics”
Ease of learning
How quickly can a user who has never seen your site before learn it well enough to accomplish basic tasks?
How do I request something from ILL?
U.S.D.H.H.S., “Usability Basics”
Efficiency of use
Once users have learned the design, how quickly can they perform tasks?
How quickly can I fill out a form?
Nielsen, “Usability 101”; U.S.D.H.H.S., “Usability Basics”
Memorability
When your users return to your site after not using it, how easily can they reestablish proficiency?
Can I get back to ILL forms?
Nielsen, “Usability 101”; U.S.D.H.H.S., “Usability Basics”
Error frequency and severity
How often do users make errors, how serious are the errors, and how easily do they recover?
When going from the OpenURL resolver to the ILL form, I get lost (badly). How do I get back?
Nielsen, “Usability 101”; U.S.D.H.H.S., “Usability Basics”
Subjective satisfaction
How much does the user like using the system? Fun factor, or at least pleasing.
U.S.D.H.H.S., “Usability Basics”
usability and user experience
Usability
• “Can the user accomplish their goal?”
• Did they choose the right form (book, article, etc.) for their request?
User experience
• “Did the user have as delightful an experience as possible?”
• Were they able to move from identifying a citation in a database to filling out a request form and then return to the database or their next task with ease?
Mifsud, “The difference between usability and user experience”
more about user experience
Morville, “User Experience Design”
start small and expand
Some rights reserved by jakeandlindsay
how do i…
heuristic evaluation
• What
– systematic inspection of a user interface design for usability
• When
– any time! Especially useful before you start redesigning or testing with users
• Why
– it’s cheap and easy, with a very high return on investment
• What you’ll get from it
– a (page or process) design that’s had a lot of eyes looking at it intently for flaws; guaranteed improvement and therefore a good baseline from which to start testing with users
Nielsen, “Heuristic Evaluation”
10 heuristics
1. Visibility of system status
2. Match between system and the real world
3. User control and freedom
4. Consistency and standards
5. Error prevention
6. Recognition rather than recall
7. Flexibility and efficiency of use
8. Aesthetic and minimalist design
9. Help users recognize, diagnose, and recover from errors
10. Help and documentation
Nielsen, “10 Usability Heuristics”
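Findings from a heuristic evaluation are easier to prioritize if each problem is logged against the heuristic it violates, along with a severity rating. The Python sketch below is purely illustrative and not part of the original slides: the heuristic names come from Nielsen’s list above, while the 0-4 severity scale (in the spirit of Nielsen’s severity ratings) and the example findings are assumptions.

# Minimal sketch: logging heuristic-evaluation findings against Nielsen's 10 heuristics.
# The severity scale (0 = not a problem ... 4 = usability catastrophe) and the sample
# findings below are illustrative assumptions, not data from this presentation.
from collections import Counter

HEURISTICS = [
    "Visibility of system status",
    "Match between system and the real world",
    "User control and freedom",
    "Consistency and standards",
    "Error prevention",
    "Recognition rather than recall",
    "Flexibility and efficiency of use",
    "Aesthetic and minimalist design",
    "Help users recognize, diagnose, and recover from errors",
    "Help and documentation",
]

# Each finding: (heuristic index, severity 0-4, note)
findings = [
    (1, 3, "'No online holdings' message uses catalog jargon instead of plain language"),
    (2, 2, "No obvious way back to the ILL form from the resolver page"),
    (8, 3, "Error page offers no link to request the item through ILL"),
]

# Print findings worst-first, then tally how many problems hit each heuristic.
for idx, sev, note in sorted(findings, key=lambda f: -f[1]):
    print(f"[severity {sev}] {HEURISTICS[idx]}: {note}")
per_heuristic = Counter(h for h, _, _ in findings)
print("problems per heuristic:", {HEURISTICS[i]: n for i, n in per_heuristic.items()})

Sorting worst-first gives a head start on the severity-versus-priority discussion that comes up later when deciding what to fix.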
Sorry, no online holdings were found for this title.
Please see additional options below for finding this title.
task-based usability testing
• What:
– involves watching people try to use something for its intended purpose
• When:
– because it’s expensive, usually done once you have a design/process/architecture that was created using good design principles and is as user-centered as you can make it
• Why:
– EVERYTHING should be useful, usable, desirable, valuable, findable, accessible, and credible
• What you’ll get from it:
– a very humbling experience
– data that can be used to inform (re)design decisions
Rubin & Chisnell
example usability test workflow
1. develop scripts
2. develop a scenario-based task list (see the sketch after this list)
3. recruit participants
4. conduct tests
   o welcome/disclaimer (we're testing the system, not you)
   o pre-test survey (sometimes)
   o conduct tasks (somewhere between 7 and 10, depending on complexity)
   o post-task activity (survey, Xs/Os) for participant while facilitator talks with observers
   o thank you, give the participant an incentive
5. share results (reports, videos, oral briefings, etc.)
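The scenario-based task list from step 2 (and the facilitator script from step 1) can be kept together as simple structured data. A minimal Python sketch, not from the original presentation: the scenario wording and success criteria are hypothetical placeholders loosely based on the ILL tasks described later in the deck.

# Minimal sketch: a scenario-based task list for a usability test session.
# Scenario wording and success criteria are hypothetical placeholders.
from dataclasses import dataclass

@dataclass
class Task:
    number: int
    scenario: str           # read aloud to the participant
    success_criterion: str  # what the facilitator/observers watch for

TASKS = [
    Task(1, "You need a copy of this article for a reading due tomorrow.",
         "Participant uses the 'check catalog' link."),
    Task(2, "You need this article for a grant proposal due in several weeks.",
         "Participant uses the 'submit a request' link."),
    # ... aim for roughly 7-10 tasks total, depending on complexity
]

def facilitator_script() -> None:
    """Print the welcome line and each task in order, for a consistent session."""
    print("Welcome! Remember: we're testing the system, not you.")
    for t in TASKS:
        print(f"Task {t.number}: {t.scenario}")
        print(f"  (observe: {t.success_criterion})")

if __name__ == "__main__":
    facilitator_script()

Printing the script before each session is one low-tech way to keep facilitators consistent across participants.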
Xs/Os
• What:
– a method that gives the user the power to design in a very low-stakes way
• When:
– we tend to use it as part of a usability test session, but it can be done at any time; it would be easy to take it to the campus union building and get quick feedback
• Why:
– it’s easier for users to tell us what to add/remove/change in response to a paper prototype than to answer the dreaded “what would you change?” question
– it’s fun for the user – they get to use The Red Pen and tell us what to do!
• What you’ll get from it:
– a pretty clear indication of what users say they will/won’t use and what they’d like to see in the design
Xs/Os
“Now I'd like to know more about what specifically you use/don't use or like/dislike about the content on this page.
Instructions:
• circle any items on the page that you like and would use the most
• cross out any items on the page that you don't like and don't use
• add any items that are not on the page that you would like to have there for your own use”
applying usability
got data, now what?
• Convey findings
• Decide on/implement changes
• Sit back and relax? No: RETEST
“The first few iterations can probably be expected to result in major gains in usability as the true ‘usability catastrophes’ are found and fixed.”
Jakob Nielsen, “Iterative user-interface design”
convey findings
• Keep notes
– Who tested
– Usability task
– Completion rate
– Observations
• Summarize findings
– Show video if available
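Notes like these are easiest to summarize when each observation is recorded as one row: who tested, which task, whether it was completed, and what was observed. A minimal Python sketch, not from the original presentation, that tallies per-task and overall completion rates; the rows are made-up placeholders, not the actual session data.

# Minimal sketch: summarizing usability-test notes into per-task completion rates.
# The rows below are illustrative placeholders, not real session data.
from collections import defaultdict

# (participant, task, completed?, observation)
notes = [
    ("P1 (undergrad)", "Find article due tomorrow", True,  "Used 'check catalog' link right away"),
    ("P2 (undergrad)", "Find article due tomorrow", False, "Stopped at 'no online holdings' message"),
    ("P1 (undergrad)", "Request article for grant", False, "Did not notice 'submit a request' link"),
    ("P2 (undergrad)", "Request article for grant", True,  "Found the request form after some scrolling"),
]

attempts = defaultdict(lambda: [0, 0])  # task -> [completed, attempted]
for _participant, task, completed, _observation in notes:
    attempts[task][1] += 1
    if completed:
        attempts[task][0] += 1

for task, (done, tried) in attempts.items():
    print(f"{task}: {done}/{tried} completed ({100 * done / tried:.0f}%)")

overall_done = sum(d for d, _ in attempts.values())
overall_tried = sum(t for _, t in attempts.values())
print(f"Overall: {overall_done}/{overall_tried} ({100 * overall_done / overall_tried:.0f}%)")

Keeping the observation text beside each completion flag also makes it easy to pull participant quotes for the summary.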
decide on/implement changes
• Prioritize problems to fix
– head slappers
– cheap hits
– severity vs. priority
• Figure out how to fix them
– test subjects may supply this
Clark, “Issue Priority and Severity”
Krug, Don’t Make Me Think
“The important things that you learn from usability testing usually just make sense. They tend to be obvious to anyone who watches the sessions.”
- Steve Krug, Don’t Make Me Think
testing
• 3 undergrads / 3 grads & faculty
• 50% task completion rate overall
– Find a copy of an article for a reading due tomorrow
• Hoped they would use the “check catalog” link
– Find a copy of an article needed for a grant due in several weeks
• Hoped they would use the “submit a request” link
as the user sees it!
“‘No online full text…’ line is loud and clear. It just says to me that it is not available at OSU.”
“…tempted to just choose first option instead of going to sentence below.”
“Don’t use too many words. I don’t need to read so much.”
“…put them side by side.”
the bottom line
Difficult web forms/processes result in:
• Lost research productivity
• Lost study time
• Lost staff time
• “Image” problem
?
references & resources
Clark, P. (n.d.). Issue priority and severity. Retrieved from
http://www.stickyminds.com/sitewide.asp?Function=FEATUREDCOLUMN&ObjectId=10119&ObjectType=ARTCOL&btntopic=artcol
Gaffney, M. (2009). Reflecting usability engineering goals in interlibrary loan user interfaces. Journal of Interlibrary Loan, Document Delivery &
Electronic Reserve, 19, 291-298. doi: 10.1080/10723030903278366
Krug, S. http://www.sensible.com/
Krug, S. (2000). Don't make me think!: A common sense approach to web usability. Indianapolis, Ind.: Que Corp.
Lehman, T., & Nikkel, T. (2008). Making Library Web Sites Usable: a LITA Guide. New York: Neal-Schuman Publishers. Retrieved from
http://www.neal-schuman.com/uploads/pdf/0139-making-library-web-sites-usable--a-lita-guide.pdf
Morville, P. http://findability.org/ or http://semanticstudios.com/
Morville, P. (2004). User experience design. Retrieved from http://semanticstudios.com/publications/semantics/000029.php
Nielsen, J. http://www.useit.com/
Nielsen, J. (2005). 10 Usability Heuristics. Retrieved from http://www.useit.com/papers/heuristic/heuristic_list.html
Nielsen, J. (2005). Heuristic Evaluation. Retrieved from http://www.useit.com/papers/heuristic
Nielsen, J. (1993). Iterative user-interface design. Computer, 26(11), 32-41. doi: 10.1109/2.241424
Nielsen, J. (n.d.). Usability 101: Introduction to usability. Retrieved from http://www.useit.com/alertbox/20030825.html
Mifsud, J. (2011). The difference (and relationship) between usability and user experience. Retrieved from http://usabilitygeek.com/thedifference-between-usability-and-user-experience/
Ribeirinho, A. (2007). The User Experience Honeycomb. Retrieved from http://blog.delaranja.com/the-user-experience-honeycomb/
Rubin, J., & Chisnell, D. (2008). Handbook of usability testing: How to plan, design, and conduct effective tests. Indianapolis, IN: Wiley Pub.
“start”. Flickr user jakeandlindsay. Retrieved from http://www.flickr.com/photos/jakeandlindsay/5524669257/sizes/m/in/photostream/
University of Washington Libraries Usability http://www.lib.washington.edu/usability/resources/howto
Usability4lib email list http://www.library.rochester.edu/usability4lib
U.S.D.H.H.S. (n.d.). Usability Basics. Retrieved from http://www.usability.gov/basics/index.html
Uta: uta.hussong-christian@oregonstate.edu
Jane: jane.nichols@oregonstate.edu
Jennifer: jlward1@uw.edu