
University of Glasgow Library Website Usability Evaluation

Group10 Lab02
Student IDs: <2973128H> Tao He
<2979894Z> Xin Zhang
<2981006L> Xiang Luo
<2974555X> Haokun Xie
<2979404Z> Tian Zhang
<2928136S> Haoxuan Shi
1.Product description:
The University of Glasgow Library website (https://www.gla.ac.uk/myglasgow/library/)
is a platform that provides comprehensive resources and services for students, staff and
visitors. The website provides functions such as library account access, resource
searches, opening hours, floor plans, library news and contact information. Users can
find e-books, journals, databases and support information on learning, research and
discovery through the website.
2.Evaluation objectives:
The main research questions of this evaluation include:
1. User experience (UX) and usability: Is the website's interface intuitive? Can users
easily find the information they need? Is the navigation structure clear? How accurate
and efficient is the search function?
2. Content comprehensiveness and update frequency: Does the website provide
comprehensive information about resources and services? Is the content updated in a
timely manner, especially in terms of library opening hours, news and events?
3. Access speed and technical performance: How quickly does the website load? How
well does it perform across different devices and browsers? Are there any technical
glitches or broken links?
4. Accessibility and multilingual support: Is the website designed to be accessible to all
users? Are there any multilingual options to cater for non-native English speakers?
2.1 Research significance:
As the digital age progresses, university library websites have become a key tool for
students and staff to access academic resources and services. An in-depth evaluation of
the University of Glasgow Library website can help to understand its performance in
terms of user experience, content quality and technical performance. This will not only
provide the library with suggestions for improvement and enhance user satisfaction, but
also serve as a reference for other academic institutions when designing and optimising
their library websites. In addition, considering the needs of a diverse user base, it is also
important to evaluate the accessibility features and multilingual support of the website
to ensure that all users can equally and conveniently access and utilise library resources.
2.2 Pilot summary
In the evaluation of the University of Glasgow Library's website, attention was focussed
on the user experience, comprehensiveness of content, speed of access, technical
performance and accessibility of the site. During the pilot, it was found that the
advanced search features were cumbersome to use and could limit users' ability
to fully utilise the site's resources. Therefore, the experimental tasks were simplified to
improve efficiency and ensure a smoother user experience. In addition, the
questionnaire design was adjusted to focus on the usability and functionality of the
website to improve the accuracy and reliability of the survey results. Through these
adjustments, the evaluation team aimed to gain a clearer understanding of user needs,
thereby improving the overall experience and functional effectiveness of the website.
The aim of streamlining the evaluation process and optimising the questionnaire
structure is to provide more valuable feedback to help the website improve user
experience and overall performance.
3 Evaluation Plan
3.1 Evaluation Overview
3.1.1 Study Background
The objective of this evaluation is to assess the usability of the University of Glasgow
Library website, specifically focusing on the ease of use of its search functionality and
overall user experience.
To gather data, we distributed a questionnaire to five students from other groups in the
same course. Participants were asked to perform specific tasks on their own computers
and provide feedback through the survey.
This study will analyze both quantitative data (task success rate, error rate) and
qualitative feedback (user satisfaction, usability experience).
3.1.2 Evaluation Method
This experiment was conducted as a remote usability study. Participants used their
own computers to complete the tasks and questionnaire, with no restrictions on
experiment location or browser, and visited the UofG Library website. Each
participant was asked to complete the tasks independently within 30 minutes and then
fill in the questionnaire. The researchers drew conclusions by analysing the
questionnaire data.
3.2 Variables
The study examines the impact of different search methods—basic search (entering
keywords in the main search bar) and advanced search (using filters such as fields, date,
content type, and discipline)—on task performance. These different search methods are
the independent variables of this experiment.
The dependent variables include task success rate (percentage of participants
completing tasks successfully), error rate (number of mistakes), and user satisfaction
(measured on a 1-5 Likert scale). Additionally, website performance, specifically page
loading speed, is treated as an uncontrolled extraneous variable, since it may affect
task completion efficiency and user satisfaction.
Control variables ensure consistency, such as a fixed task order, uniform task
instructions, and acknowledgment of device and network variability, which, although
uncontrolled, will be accounted for in the analysis due to the remote nature of the
experiment.
3.3 Briefing and Debriefing
In this experiment, participants are five students from other groups in the same Human
Computer Interaction Design and Evaluation course, who have a similar level of
understanding of usability principles. Before the experiment, participants are informed
about the purpose of the study, which is “We are evaluating the usability of the UofG
Library website”. They are informed that the study should take 20-30 minutes, and they
can withdraw at any time. After the experiment, participants complete their own
questionnaire and researchers analyze the data collected and responses.
3.4 Task Design
Participants will complete the following tasks on the UofG Library website:
1. Log in to the University of Glasgow Library website (Success/Failure)
2. Use the Advanced Search function (Apply filters such as date and category)
3. Use the Specific Search function (Locate a book using its title or author)
4. Find the e-resource "Human Computer Interaction Design" (Assess the
accessibility of digital resources)
5. Find and Access a Specific Academic Database (e.g., Law, Chemistry, or AI-related papers)
6. Navigate to the Help page (Success/Failure)
7. Evaluate Website Performance (User experience)
3.5 Data Collection Methods and Type
The data results of this experiment are the answers to the questionnaire, some of which
are Binary Choice, that is, "Yes/No" type of answers, some are scored using the Likert
Scale, and some are open-ended answers. Open-ended responses will be analyzed to
identify common usability issues and areas for improvement. Additionally, the impact
of website performance, such as page loading speed, on user experience will be
examined through qualitative feedback.
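The descriptive statistics reported in Section 4 can be derived from these response
types in a straightforward way. The following is an illustrative sketch (not the
group's actual analysis script) of how Binary Choice answers map to success rates
and Likert responses map to average scores; the response values shown are
hypothetical placeholders, not the collected data.

```python
# Illustrative sketch of the questionnaire analysis described above.
# All response values below are hypothetical placeholders.

def success_rate(binary_answers):
    """Percentage of 'Yes' answers for a Binary Choice question."""
    return 100 * sum(1 for a in binary_answers if a == "Yes") / len(binary_answers)

def likert_mean(scores):
    """Average of 1-5 Likert scale scores."""
    return sum(scores) / len(scores)

# Hypothetical responses from five participants
logged_in = ["Yes", "Yes", "Yes", "Yes", "Yes"]
search_satisfaction = [5, 4, 5, 4, 4]  # 1-5 Likert scale

print(f"Login success rate: {success_rate(logged_in):.0f}%")          # prints: Login success rate: 100%
print(f"Search satisfaction (avg.): {likert_mean(search_satisfaction):.1f}")  # prints: Search satisfaction (avg.): 4.4
```

Open-ended answers, by contrast, cannot be reduced to a single number and are
coded thematically, as reflected in the qualitative feedback of Section 4.2.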
4 Experiment Result
4.1 Descriptive Statistics & Findings
4.1.1 Login Process
Question                           Success Rate (%)
Successfully logged in             100%
Encountered login difficulties     0%
Finding: All participants successfully logged in, and none reported any login difficulties,
indicating that the authentication process is smooth.
4.1.2 Website Navigation
Question                                                Avg. Score (1-5)
Ease of finding the search bar                          5.0
Ability to find e-books or academic papers              4.5
Ease of accessing the databases (law, chemistry, etc.)  4.0
Finding: Users found the search bar easily (avg. 5.0), and the ability to find e-books
was rated high (4.5). However, accessing the database received a lower score (4.0),
indicating a potential area for improvement.
4.1.3 Task Completion
Task                                                    Success Rate (%)
Found "Human Computer Interaction Design" e-resource    50%
Found an AI-related academic paper                      100%
Found an IoT security paper using the database          100%
Used Advanced Search successfully                       100%
Used Specific Search successfully                       100%
Finding: While participants were generally successful in searching for papers, only 50%
were able to locate the "Human Computer Interaction Design" e-resource, suggesting
possible indexing or accessibility issues.
4.1.4 Search Functionality Satisfaction
Question                                    Result
Satisfaction with search functionality      4.5 (avg. score, 1-5)
Encountered difficulties while searching    0 participants
Finding: Overall satisfaction with the search function is high (4.5), and no participants
reported difficulties in performing searches.
4.1.5 User Experience
Question                                   Avg. Score (1-5)
Clear and understandable interface         4.5
Website provides sufficient information    5.0
Website loading speed satisfaction         3.0
Finding: While users found the interface clear (4.5) and the information sufficient (5.0),
website speed received the lowest rating (3.0), indicating a key area for improvement.
4.1.6 Overall Satisfaction
Question                                   Avg. Score (1-5)
Overall website usability satisfaction     4.0
Finding: The overall satisfaction rating (4.0) suggests a generally positive experience,
with minor areas for improvement.
4.2 Qualitative Feedback
4.2.1 Multilingual Support Needed
One participant mentioned that the library website should offer more language options
to accommodate international students.
4.2.2 User Interface Improvements
a) "Back to Top" button should be a floating icon rather than requiring users to scroll
down.
b) The "More" button lacks clear functionality, leading to confusion.
4.2.3 Performance Issues
Participants noted that the website loads slowly, impacting usability.
4.2.4 New User Guidance
A tutorial for first-time users would be beneficial to help them navigate various sections.
5 Discussion
The results of the experiment showed that the University of Glasgow Library website
had a good user experience, with a clear navigation structure and an efficient search
function: participants generally found the search bar easy to locate (average score:
5.0) and rated the ability to find e-books and academic papers highly (4.5).
However, finding specific resources was not always smooth (for example, only half of
the participants successfully found an e-resource related to Human Computer
Interaction Design), indicating that the site's indexing or categorisation needs to be
optimised. In addition, while users rated the clarity of the website interface (4.5) and
the adequacy of information (5.0) quite highly, some found it difficult to access certain
databases (4.0), indicating that there is still room for improvement in usability.
While the site performed well in many areas, we found some key improvements through
this research. First, the website load speed score is the lowest (3.0), indicating that
optimisation is needed to improve responsiveness. Secondly, the lack of multilingual
support can create barriers for international students. In addition, users suggested
optimising some interface details, such as making the "Back to Top" button a floating
element instead of requiring users to scroll to reach it, and making the function of the
"More" button more explicit to prevent confusion. These findings
demonstrate that we can further enhance the user experience by improving search
indexing, optimising site performance, and enhancing accessibility. These
enhancements not only help to improve the satisfaction of current users, but also
provide a valuable reference for other universities when designing and optimising
library websites.
6 Summary and Reflection
This experiment evaluated the usability of the University of Glasgow Library website,
and the overall process was relatively smooth. Most participants were able to complete
the main tasks and provide feedback. Nevertheless, we also encountered a few
problems: the advanced search offers so many filter conditions that some users may
be overwhelmed, and the information on the results page is relatively brief and does
not meet the needs of some users. Moreover, users mentioned that the website loads
slowly and the waiting time is quite long. Users also suggested creating personalised
user guides and implementing automatic translation tools to support users whose
native language is not English.
Reflecting on this experiment, we found that it is crucial to conduct more pilot studies
before the formal evaluation. Although the tasks were clear, some participants failed
to meet the requirements; these problems might have been avoided if more thorough
testing had been done ahead of time. We also observed that a simple guide within the
task, telling participants what to focus on, helped them evaluate more efficiently.
Moreover, not all participants had the same exposure to the course materials, and the
tasks themselves were computer-related; the results may differ for students from other
disciplines searching literature in their own fields. Finally, out of consideration for
participants' time, the task coverage was somewhat limited; for example, it did not
include the use of literature citation tools. If we conduct another product evaluation in
the future, we will try to use direct observation to record participants' actions,
hesitations, and errors during the experiment. After the test, we will also interview
participants to obtain more in-depth results. This evaluation improved our skills in
experimental design, data analysis, and user research, and made us realise the
importance of HCI evaluation for interactive systems.
Appendix