
Apples and Oranges: Lessons From a Usability Study of Two Library FAQ Web Sites

Susan [Gardner] Archambault

Kenneth Simon

Loyola Marymount University

• Private Catholic university in Los Angeles, California

• 5,900+ undergraduates and 1,900+ graduate students

• William H. Hannon Library Information Desk open 24/5

Research Question

• What is the most effective way to provide access to our Library FAQs?

• A comparison of two products, How Do I? and LibAnswers: Which features do students prefer, and which features lead to better performance?

How Do I?

LibAnswers

Auto-Suggest Feature

Related Questions Feature

Methodology

• Conducted usability testing with 20 undergraduate students at LMU

• Population equally represented each class (freshmen through seniors) and had a 60:40 female-to-male ratio

Methodology

• Used a combination of the Performance Test methodology and the Think-Aloud methodology

Methodology

• Students given 10 performance tasks to complete at a computer twice: once using LibAnswers as the starting point, and once using How Do I?

• After each performance task, students were given a questionnaire measuring satisfaction with the site

Performance Task Questions

How to print in the library from a laptop

How to request a research consultation

How long can a graduate student check out a book

Where are the library copy machines

How to request a book from basement storage

Can a Loyola law school student reserve a group study room in advance

How to search for a book by the author’s name

How to tell what books are on reserve for a class

Where to access CRSPSift software in the library

How much does it cost for an undergrad to request a magazine article from another library

Satisfaction Scale

Methodology

• Audio recorded and computer screen activity captured via “ScreenFlow” screencasting software

Additional Questions

• How likely would you be to use each page again?

• What was your favorite aspect of each site?

• What was your least favorite aspect?

• Overall, do you prefer LibAnswers or How Do I?

Performance Scoring: Speed

• Start the clock when the person begins searching for the answer to a new question on the home page of the site they are testing

• Stop the clock when they copy the URL with the answer

Performance Scoring: Accuracy

Was the answer… (check off the one that applies):

• Completely accurate: found the answer

• On the correct path to the information, but did not go far enough or took a wrong subsequent path

• On the correct page, but did not see the answer (supersedes everything else they tried on other attempts to answer)

• Pointed to a related question under the correct category, but incorrect page

• Incorrect and off topic

• Gave up: never found an answer

Performance Scoring: Efficiency

• Count the number of times the person made a new attempt, or started down a new path, by returning to the home page *after* a previous attempt away from or on the homepage failed
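
To make the scoring rubrics concrete, here is a minimal Python sketch of how the speed and efficiency scores could be computed from a coded event log. The Event format and the sample log are illustrative assumptions; in the study, scores were coded by hand from the screen recordings. The sample log reproduces the 46-second, one-wrong-path score shown in the sample scoring table below.

```python
from dataclasses import dataclass

@dataclass
class Event:
    t: float    # seconds since the task began
    kind: str   # "home" = (re)turned to the site home page,
                # "click" = followed a link,
                # "answer" = copied the URL containing the answer

def score_task(events: list[Event]) -> tuple[float, int]:
    """Return (speed in seconds, number of wrong paths).

    Speed: the clock runs from the first event on the home page until
    the participant copies the answer URL. Efficiency: every return to
    the home page after the first counts as a failed attempt.
    """
    start = events[0].t
    end = next(e.t for e in events if e.kind == "answer")
    wrong_paths = sum(1 for e in events[1:] if e.kind == "home")
    return end - start, wrong_paths

# One wrong path, answer found at 46 seconds.
log = [Event(0, "home"), Event(12, "click"), Event(20, "home"),
       Event(30, "click"), Event(46, "answer")]
print(score_task(log))  # (46, 1)
```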

Sample Scoring Video

bit.ly/usabilityvideo

Site         Speed        Accuracy              Efficiency
How Do I?    46 seconds   Completely Accurate   +1 (clicked 1 wrong path)
LibAnswers   36 seconds   Completely Accurate   +1 (clicked 1 wrong path)

Performance Results

Speed                LibAnswers   How Do I?
Average (seconds)    40.55        33.90

Efficiency           LibAnswers   How Do I?
Total wrong paths    30           40

Performance Results

Accuracy (20 students × 10 tasks = 200 trials per site):

• Completely accurate: LibAnswers 182 (91%), How Do I? 175 (87.5%)

• Correct path but did not go far enough or took a wrong subsequent path: LibAnswers 5 (2.5%), How Do I? 15 (7.5%)

• Correct page, but did not see the answer: LibAnswers 3 (1.5%), How Do I? 3 (1.5%)

• Pointed to a related question under the correct category, but incorrect page: LibAnswers 6 (3%), How Do I? 3 (1.5%)

• Incorrect and off-topic: LibAnswers 0, How Do I? 3 (1.5%)

• Gave up, never found answer: LibAnswers 4 (2%), How Do I? 1 (0.5%)
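
As a sanity check on the accuracy figures, each percentage is out of 20 students × 10 tasks = 200 trials per site. A quick sketch of the arithmetic, with the raw counts copied from the table above:

```python
# Raw accuracy counts per site, in the same row order as the list above.
counts = {
    "LibAnswers": [182, 5, 3, 6, 0, 4],
    "How Do I?":  [175, 15, 3, 3, 3, 1],
}

for site, tallies in counts.items():
    total = sum(tallies)  # 200 trials for each site
    print(site, total, [f"{100 * n / total:g}%" for n in tallies])

# LibAnswers 200 ['91%', '2.5%', '1.5%', '3%', '0%', '2%']
# How Do I? 200 ['87.5%', '7.5%', '1.5%', '1.5%', '1.5%', '0.5%']
```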

LibAnswers Features Used

Feature                  Number Who Used (of 20)   Percent
Search Box               16                        80%
Auto-Suggest             12                        60%
Popular Answers          9                         45%
Cloud Tag                8                         40%
Related Questions        4                         20%
Change Topic Drop-down   2                         10%
Recent Answers           2                         10%

Satisfaction

Likely to use again   LibAnswers   How Do I?
Very unlikely         0            0
Unlikely              3 (15%)      3 (15%)
Undecided             5 (25%)      3 (15%)
Likely                5 (25%)      5 (25%)
Very likely           7 (35%)      9 (45%)

Satisfaction

Overall preference   LibAnswers   How Do I?
Response             8 (40%)      12 (60%)

Patterns

• Overall, 9 of 20 performed worse with the site they said they preferred.

• 4 of 5 freshmen performed worse with the site they said they preferred. Upperclassmen were more consistent.

• Females tended to perform better with their preferred site; males did not.

• 75% of the males preferred How Do I? over LibAnswers, while females were evenly divided.

LibAnswers

Likes

• Keyword search “like a search engine”

• Auto-suggest in search bar

• Popular topics list

• Friendly / pleasant to use

• Don’t have to read through categories

Dislikes

• Overwhelming interface / cluttered

• Long list of specific questions, but hard to find the info you want

• Less efficient than the “How Do I” page

• Once you do a search, you lose your original question

• Auto-suggestions are ambiguous or too broad, and sometimes don’t function properly

How Do I?

Likes

• Fast / efficient to use

• Everything is right there in front of you: “I don’t have to type, just click”

• Simple, clearly laid-out categories

• Organized and clean looking

Dislikes

• Less efficient than the LibAnswers page: have to read a lot

• Too restricted: needs a search box

• Have to guess a category to decide where to look

• Limited number of too-broad questions

• Boring / basic appearance

Sharing results with Springshare

• Retain question asked in search results screen.

• Add stopwords to search, so typing “How do I” doesn’t drop down a long list of irrelevant matches, and “Where is” and “Where are” aren’t mutually exclusive (see the sketch after this list).

• Remove “related LibGuides” content to reduce clutter.

• Control the list of “related questions” below an answer: they seem to be based only on the first topic assigned to a given question.
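
The stopword change could work roughly like the sketch below: strip leading question words before matching, so “How do I print” and “print” surface the same FAQs, and “Where is” and “Where are” hit the same entries. This is illustrative Python under assumed inputs (the stopword list and FAQ strings), not Springshare’s actual code.

```python
import string

# Question words to ignore when matching queries against FAQ entries
# (an assumed list, for illustration).
STOPWORDS = {"how", "do", "i", "where", "is", "are", "the", "a", "can"}

def keywords(text: str) -> set[str]:
    """Lower-case, strip punctuation, and drop stopwords."""
    words = text.lower().translate(
        str.maketrans("", "", string.punctuation)).split()
    return {w for w in words if w not in STOPWORDS}

def suggest(query: str, faqs: list[str]) -> list[str]:
    """Return FAQ entries sharing at least one meaningful keyword."""
    q = keywords(query)
    return [f for f in faqs if q & keywords(f)]

faqs = ["How do I print in the library from a laptop?",
        "Where are the library copy machines?"]
print(suggest("How do I print", faqs))
# ['How do I print in the library from a laptop?']
print(suggest("Where are the copy machines", faqs))
# ['Where are the library copy machines?']
```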

Take the best of… How Do I

Take the best of… LibAnswers

But wait…

There is another.

Take the best of… Get Help

The best of all worlds

Conclusions

• Ended up with a balance between two extremes rather than one or the other

• Think-aloud method: gave up control; no preconceived ideas could influence outcome

• Sitting in silence watching the participants made them nervous; next time, maybe leave the room and run a self-guided test

• Efficiency is difficult to measure: moved away from counting clicks

Acknowledgements

Thank you:

• Shannon Billimore

• Jennifer Masunaga

• LMU Office of Assessment/Christine Chavez

• Springshare

Bibliography

• Ericsson, K. A., & Simon, H. A. (1980). Verbal Reports as Data. Psychological Review, 87(3), 215-251.

• Smith, Ashleigh, Magner, Brian, & Phelan, Paraic. (2008, Nov. 20). Think Aloud Protocol Part 2. Retrieved May 3, 2012 from http://www.youtube.com/watch?v=dyQ_rtylJ3c&feature=related

• Norlin, Elaina. (2002). Usability Testing for Library Web Sites: A Hands-On Guide. Chicago: American Library Association.

• Porter, J. (2003). Testing the Three-Click Rule. Retrieved from http://www.uie.com/articles/three_click_rule/

• Willis, G. B. (2005). Cognitive Interviewing: A Tool for Improving Questionnaire Design. Thousand Oaks, CA: Sage Publications.

Additional Information

Presentation Slides

• bit.ly/gardnersimon

Contact Us

Ken Simon

Reference & Instruction Technologies Librarian

Loyola Marymount University

Twitter: @ksimon

Email: ksimon@me.com

Susan [Gardner] Archambault

Head of Reference & Instruction

Loyola Marymount University

Twitter: @susanLMU

Email: susan.gardner@lmu.edu
