Design of & Usability for ERM Data
Amy Fry, Electronic Resources Coordinator
Bowling Green State University http://www.slideshare.net/amyfry2000/training-daypresentation-7884397
• What best practices should we follow for databases webpages?
• How do libraries structure full resource records, and what do users look for in them?
• Usability testing: tips and resources
• Portal or landing page for all databases
• Databases A-Z list (separate from e-journals)
• Databases-by-subject pages (usually separate from other course and subject guides)
• Full resource records – information pages about each individual database

(Screenshot examples: BGSU, Kent State University, Case Western Reserve, Wright State University, OhioLINK)
2010 survey of ARL library websites
• Databases A-Z list
• Databases-by-subject lists
• Full resource records
• Software
• Discovery layer or federated search
• Link name
• Order of databases-by-subject lists
• Use of icons/graphics
• Cohen and Calsada (2003)
Found that 66 of 114 academic ARLs used database-driven webpages to present their e-resources in 2002.
• Shorten (2006)
Found that 88.6% of ARL libraries had databases A-Z lists in 2003, and 10.5% also categorized them by type.
• Caudle and Schmitz (2007)
Found that 97% of the 99 American academic libraries in ARL had a databases A-Z list, 96% had databases-by-subject lists and 27% had federated searching.
Type of system   # of libraries   %*
Homegrown        81               71.1%
Metalib          14               12.3%
Innovative       8                7%
LibGuides        4                3.5%
Xerxes           4                3.5%
WebFeat          2                1.75%
LibData          1                <1%

(Screenshot: Kent State University)

*Percentages are based on 114 libraries (excluding 7 national/special libraries and 4 libraries whose databases pages were behind a login)
Types of access              # of libraries   %
Databases A-Z                111              97
Databases-by-subject lists   91               80
Full resource records        83               73
All three                    73               64

(Screenshot: University of Missouri-Columbia)
Subject list order   # of libraries   %
By relevance         38               41.8%
By format            7                7.7%
Alphabetical only    46               50.5%

(Screenshots: University of Connecticut, University of Missouri-Columbia)
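Half the libraries ordered their subject lists alphabetically and most of the rest by relevance. As a minimal sketch of one way a site could combine the two orderings (the `SubjectListEntry` shape, the `relevanceRank` field, and `orderSubjectList` are hypothetical illustrations, not anything from the survey):

```typescript
// Hypothetical model of one database entry on a subject page.
// relevanceRank is an assumed, librarian-assigned field:
// lower = more relevant; undefined = not ranked.
interface SubjectListEntry {
  title: string;
  relevanceRank?: number;
}

// Put ranked titles first (by rank), then everything else
// alphabetically - ranked order with an alphabetical fallback.
function orderSubjectList(entries: SubjectListEntry[]): SubjectListEntry[] {
  return [...entries].sort((a, b) => {
    const ra = a.relevanceRank ?? Number.MAX_SAFE_INTEGER;
    const rb = b.relevanceRank ?? Number.MAX_SAFE_INTEGER;
    if (ra !== rb) return ra - rb;
    return a.title.localeCompare(b.title);
  });
}

// Example: the ranked title floats to the top; the rest sort A-Z.
const film = orderSubjectList([
  { title: "Web of Science" },
  { title: "MLA International Bibliography", relevanceRank: 1 },
  { title: "Academic Search Complete" },
]);
```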
Libraries using icons or graphics: 64 (56%)

Icon                                             # of libraries
Access restrictions                              38
More information                                 27
Full text                                        9
Magnifying glass (Metalib: search in database)   5
Tutorials                                        4
Funding source                                   3
Format (audio, etc.)                             3
Plus sign (Metalib: add to a set)                2
Social media                                     2
Metasearch                                       2
Logo/screenshot                                  2
RefWorks                                         1
New                                              1
Plus-star                                        1
SFX                                              1

(Screenshot: University of Cincinnati)
Link title begins with…   # of libraries   %          Examples
“Databases”               47               41%        Databases (30), Databases A-Z (8)
“Articles”                22               18.6%      Article Databases (4), Articles & Databases (8)
“E” or “Electronic”       16               13.6%      E-Resources (7), Electronic Resources (5)
“Find”                    8                6.8%       Find Articles (3), Find Articles & Databases (1)
“Research”                8                6.8%       Research Databases (3)
“Search”                  4                3.4%       Search & Find (2), Search a Database (1)
“Indexes”                 2                1.7%       Indexes & Databases (1), Indexes & Databases (Articles) (1)
“Journal”                 2                1.7%       Journal Articles (2)
Branded names             2                1.7%       Vera: E-Journals & Databases, Galileo @ UGA
Other                     4                <1% each   Resource Gateway – Resources, More Databases, All Databases A-Z and Database Finder, Online Research Resources (Databases)

(Screenshots: OhioLINK, Wright State, BGSU)
What types of information are currently collected in your library's ERM system, and to whom does that information display? (Check all that apply.)

Answer options                        In ERM?   Display to public?   Display to staff?
Formats
  Databases                           14        8                    13
  Electronic journals                 12        5                    11
  Electronic books                    8         4                    7
“Public” info
  Resource descriptions               14        5                    12
  License information (permissions)   14        5                    13
  Coverage dates                      6         7                    6
  Resource advisories                 7         6                    7
  Trial information                   8         2                    8
  Tutorials/user guides               5         2                    5
“Library” info
  Vendor/contact information          8         4                    4
  Login/passwords                     10        0                    10
  Renewal information                 9         0                    10
  Purchase approval information       4         0                    8
  Payment history                     4         0                    —
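The survey distinguishes what is stored in the ERM from what displays to the public versus staff. As a hedged sketch of a per-field visibility model (the `ErmField` and `visibleFields` names are assumptions for illustration; the field choices echo the survey categories, but this is not any vendor's actual schema):

```typescript
// Hypothetical per-field visibility model mirroring the survey's
// three questions: stored in the ERM, shown to the public, shown to staff.
type Audience = "public" | "staff";

interface ErmField<T> {
  value: T;
  displayTo: Audience[]; // empty array = stored but never displayed
}

interface ErmResource {
  // "Public" info categories from the survey
  description: ErmField<string>;
  licensePermissions: ErmField<string>;
  coverageDates: ErmField<string>;
  // "Library" info categories - typically staff-only or hidden
  vendorContact: ErmField<string>;
  adminLogin: ErmField<string>;
  renewalDate: ErmField<Date>;
}

// Return only the fields a given audience is allowed to see,
// so one record can drive both public and staff displays.
function visibleFields(resource: ErmResource, audience: Audience) {
  return Object.entries(resource).filter(([, field]) =>
    (field as ErmField<unknown>).displayTo.includes(audience)
  );
}
```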
Student comments on a resource record from BGSU’s 2010 usability study
Fields in resource records (participants rated each field as Important, Confusing, or Not needed)
• Most important fields: Description (14), Dates (10), Full text (7)
• Most confusing fields: Mobile access, Coverage load, On-campus access
• Least important fields: User support, Mobile access, Local contact
BGSU usability study: steps and timeline
1. Identify goals (December 2009)
2. Complete Human Subjects Review Board (HSRB) training (January 2010)
3. Submit HSRB application, including script, recruitment materials, consent form (January 2010)
4. Obtain funding for incentives (January 2010)
5. Test the instrument (February 2010)
6. Recruit participants (February 2010)
7. Complete the testing (February-March 2010)
8. Analyze results (March-April 2010)
9. Present findings and recommendations (April-May 2010)
Recommended usability resources:
• Lehman & Nikkel, 2008
• Krug, 2006
• Foster & Gibbons, 2007
• Hammill (2003)
Did common task testing with 52 users at Florida International University Libraries, including finding a named database.
• Krueger, Ray and Knight (2004)
Did common task testing with 134 users at the University of the Pacific Library.
• Fuller, Livingston, Brown, Cowan, Wood and Porter (2009)
Did three rounds of testing with five users each on the databases pages at the University of Connecticut Libraries.
Recommended changes (annotated on page screenshots):
• Change to “Databases A-Z”
• Change to “Databases by subject”
• Add “Film, Television & Media Studies”
• Change to “Videos & Images”
• Remove search box
• Add a connect button
Redesigned resource record fields (see the sketch below):
• Database title
• Contains
• Notes
• Tutorials & help
• Journal titles in this database
• Access for mobile devices
• Alternate on-campus link
• Dates included
• View this title
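As a minimal sketch of how these display fields might be modeled as a data structure (the labels come from the slide; the property names and types are assumptions for illustration):

```typescript
// Hypothetical shape of a full resource record, using the display
// labels above as field names. Optional fields can be omitted when
// a database has no such data.
interface ResourceRecord {
  databaseTitle: string;          // "Database title"
  contains: string;               // "Contains" - what the database covers
  notes?: string;                 // "Notes"
  tutorialsAndHelp?: string[];    // "Tutorials & help" - URLs to guides
  journalTitlesUrl?: string;      // "Journal titles in this database"
  mobileAccessUrl?: string;       // "Access for mobile devices"
  alternateOnCampusLink?: string; // "Alternate on-campus link"
  datesIncluded?: string;         // "Dates included" - coverage dates
  viewThisTitleUrl: string;       // "View this title" - link into the database
}
```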
• July 2010
• Twelve participants
– 4 graduate students
– 4 incoming freshmen
– 2 undergraduates
– 1 staff member
– 1 faculty member
• Ask your administrative office or Friends to fund the incentives
• Recruit with signs in the library or grab people as they go by
• Design for minimal prep and minimal analysis
• Don’t worry about technology
• Make sure people are committed to change
(both intellectually and with resources).
• Have a plan to assess the impact of your changes.
• Build time into your future schedule to do more testing.