BibEval – A framework for usability evaluations of online library services
Thomas Weinhold, Bernard Bekavac, Sonja Hamann*
Swiss Institute for Information Research (SII), Namics AG*
Libraries in the Digital Age (LIDA) 2014
16-20 June 2014, Zadar (Croatia)
e-lib.ch – Swiss electronic library
• Innovation and cooperation initiative with 20 sub-projects
• Vision: creation of a national portal to improve access to and retrieval of scientific information (www.e-lib.ch)
[Map of Switzerland (Basel, Bern, Zürich, Fribourg, Genève, Chur, Martigny) showing the e-lib.ch sub-projects: Web portal e-lib.ch, swissbib, e-rara.ch, e-codices, retro.seals.ch, Kartenportal.ch, Infoclio.ch, Infonet Economy, Multivio, DOI desk, Metadata servers, E-Depot, Long-term preservation, Best-Practices, Marketing e-lib.ch, Information literacy, Search skills, ACCEPT, RODIN, ElibEval]
Sub-project "ElibEval"
• Usability evaluations of websites and applications developed in the context of e-lib.ch
• Conception and creation of analytical tools to support information providers in carrying out their own evaluations
Situation of libraries
• Changes in environment and increasing competition
• Mission:
"[..] encourage the library and information sector to work with partners and users to maximize the potential of digital technology to deliver services that enable seamless and open access by users to cultural and information resources."
(IFLA Strategic Plan 2010-15, http://www.ifla.org/files/hq/gb/strategic-plan/2010-2015.pdf)
• Offer the same ease of use, robustness and performance as internet search engines, combined with the quality, trust and relevance traditionally associated with libraries
Challenges for libraries
[Diagram: the library at the centre of many interacting systems – website, library catalogue, databases, physical collection, digital collection, indexing, archiving, presentation, support, additional information, management]
• Merging of heterogeneous information
• Organizing the interaction of these various systems so that users can pursue their objectives without hindrance ("don't burden users with library internals")
User-perceived quality of library online services (Tsakonas & Papatheodorou, 2006)
"The extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency and satisfaction in a specified context of use."
(ISO 9241-11: Guidance on usability)
Usability evaluation methods
Two main criteria to categorize usability evaluation methods:
• When (formative vs. summative evaluation)
• Who (user-oriented/empirical vs. expert-oriented/analytical methods)
Usability evaluation of online library services
• As in our own project, libraries generally use a wide spectrum of methods for usability evaluations
• Kupersmith (2012) provides a good overview
• According to this literature review, the most commonly used method is user observation / usability testing
  – Observation of real user behaviour
  – Time-consuming and expensive
• Heuristic evaluation is also a widely used instrument (cheaper, quicker)
Heuristic evaluation
• Experts examine whether an interface complies with established usability principles (the "heuristics")
• Nielsen's heuristics (1994):
1. Visibility of system status
2. Match between system and real world
3. User control and freedom
4. Consistency and standards
5. Error prevention
6. Recognition rather than recall
7. Flexibility and efficiency of use
8. Aesthetic and minimalist design
9. Help users recognize, diagnose and recover from errors
10. Help and documentation
(http://www.nngroup.com/articles/ten-usability-heuristics/)
Motivation for the development of library-specific heuristics
• Most studies limit themselves to common heuristics, e.g. Nielsen's 10 heuristics
• Lack of library-specific recommendations (e.g. Clyde 1996, Clausen 1999, Raward 2001)
• Problems of common heuristics:
  – too generic for an in-depth analysis
  – extensive knowledge in the field of user interface design is needed
• Our goal: develop more detailed heuristics which
  – are suited to the specific requirements of library websites
  – are easy to use and enable even non-experts to make a judgement
  – assist developers in building user-friendly library websites
Methodical approach
• Three cornerstones:
  – literature review "usability evaluations in libraries"
  – best-practice analysis of library websites
  – focus group (to discuss and further refine our concept)
• Result:
  – modular, hierarchically structured list of evaluation criteria
  – all website elements and evaluation criteria classified as mandatory or optional
This concept aims at maximizing the applicability of the heuristics for libraries of different sizes and types.
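As a rough sketch only (not BibEval's actual implementation), such a modular, hierarchical criteria catalogue with a mandatory/optional classification could be modelled like this; all names and example questions are illustrative assumptions:

```python
from dataclasses import dataclass, field

@dataclass
class Criterion:
    question: str
    mandatory: bool = True  # optional criteria can be skipped by smaller libraries

@dataclass
class Component:
    """A node in the hierarchy: sector, sub-sector or component."""
    name: str
    mandatory: bool = True
    criteria: list = field(default_factory=list)
    children: list = field(default_factory=list)

def applicable_criteria(component, include_optional=False):
    """Collect the evaluation questions relevant for a given library profile."""
    if not (component.mandatory or include_optional):
        return []
    questions = [c.question for c in component.criteria
                 if c.mandatory or include_optional]
    for child in component.children:
        questions.extend(applicable_criteria(child, include_optional))
    return questions

# Example: the "search & explore" sector with two hypothetical components
simple = Component("simple search", criteria=[
    Criterion("Is the search box prominently placed on every page?")])
advanced = Component("advanced search", mandatory=False, criteria=[
    Criterion("Can users combine search fields with Boolean operators?",
              mandatory=False)])
sector = Component("search & explore the collection(s)",
                   children=[simple, advanced])

print(len(applicable_criteria(sector)))                         # mandatory only
print(len(applicable_criteria(sector, include_optional=True)))  # full catalogue
```

The point of the mandatory/optional flag is exactly what the slide states: a small library without an advanced search simply evaluates the mandatory subset.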
BibEval – Structure
• 4 sectors divided into sub-sectors
• Different components in each sub-sector
• Questions/evaluation criteria for each hierarchy level
[Diagram: the four sectors – information & communication; search & explore the collection(s); personalization & customization; user participation – with, as an example, the sub-sector "search & exploration" containing the components "simple search" and "advanced search", next to the sub-sector "presentation & access"]
Usage of BibEval – Selection of sectors and components
http://www.cheval-lab.ch/en/usability-of-library-onlineservices/criteria-catalogue-bibeval/
BibEval – Severity rating and comments
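The slide shows BibEval's severity-rating and comment interface. The deck does not specify the exact rating scheme, so as a hedged illustration, here is how findings could be recorded and summarised using Nielsen's common 0–4 severity scale; the findings themselves are invented examples:

```python
from collections import Counter

# Hypothetical findings: (criterion, severity, comment)
# Severity per Nielsen's scale: 0 = no problem ... 4 = usability catastrophe
findings = [
    ("Visibility of search box", 0, "Search box clearly visible"),
    ("Advanced search labels", 2, "Field labels use library jargon"),
    ("Error messages", 3, "Zero-hit searches give no recovery hints"),
]

# Summarise: how many findings at each severity, and what to fix first
by_severity = Counter(sev for _, sev, _ in findings)
worst = max(findings, key=lambda f: f[1])

print(dict(by_severity))           # {0: 1, 2: 1, 3: 1}
print("Fix first:", worst[0])      # Fix first: Error messages
```

Aggregating ratings like this is what makes the free-text comments actionable: the counts prioritise, the comments explain.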
BibEval – Reports / Export functions
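The slide shows BibEval's report and export functions. As an illustration of the idea only (not the tool's actual export format), evaluation results could be serialised to CSV for further processing in a spreadsheet; the field names and rows are assumptions:

```python
import csv
import io

# Hypothetical evaluation results to export
findings = [
    {"component": "simple search", "criterion": "Search box visible?",
     "severity": 0, "comment": "OK"},
    {"component": "advanced search", "criterion": "Boolean operators?",
     "severity": 2, "comment": "Operators undocumented"},
]

# Write a CSV report into an in-memory buffer (a file would work the same way)
buf = io.StringIO()
writer = csv.DictWriter(buf,
    fieldnames=["component", "criterion", "severity", "comment"])
writer.writeheader()
writer.writerows(findings)

report = buf.getvalue()
print(report)
```

A plain-text export like this is the simplest way to let libraries archive an evaluation or compare results across re-evaluations.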
BibEval – Project Administration
Project Administration
Conclusions and further work
• One criticism levelled against heuristic evaluation: in-depth knowledge of HCI is required to apply this method correctly (Blandford et al. 2007; Warren 2001)
• In formulating our evaluation criteria, we focused on the end-user perspective
  – Enables libraries to conduct self-evaluations
• Continuous improvement of our criteria catalogue to keep it up to date
• Extension of our web application (e.g. deleting questions, adding own criteria)
  – Further development through the community
Your Questions?
thomas.weinhold@htwchur.ch
bernard.bekavac@htwchur.ch
sonja.hamann@namics.com
http://www.cheval-lab.ch
http://www.e-lib.ch