BLACKBOARD PILOT FINAL REPORT
SPRING-SUMMER 2014
Penn State University, University Park, PA
ABSTRACT
Report on the Spring-Summer 2014 Blackboard Learn Pilot, presented to the
eLearning Strategic Committee and the Office of the Vice Provost for Information
Technology at Penn State in partial fulfillment of the committee charge to
develop a strategy for sustaining a common eLearning environment across the
university.
Table of Contents
Executive Summary ............................................................................................................... 4
About the Pilot ...................................................................................................................... 7
History and Objectives .................................................................................................................................. 8
Assessment Strategy ................................................................................................................................... 10
Assessment Measures and Strategies .................................................................................. 11
Post-Training Survey ................................................................................................................................... 11
Direct Observations .................................................................................................................................... 12
Focus Groups............................................................................................................................................... 13
Mid-Term Pilot Survey ................................................................................................................................ 13
Section Discussion....................................................................................................................................... 13
Demographic Information ................................................................................................... 14
Section Findings .......................................................................................................................................... 14
Pilot Participation by Campus................................................................................................................. 14
Pilot Participants..................................................................................................................................... 18
Course Delivery Format .......................................................................................................................... 22
Years of Professional Experience ............................................................................................................ 26
Gender of the Respondents .................................................................................................................... 29
Age of the Student Respondents............................................................................................................. 33
Courses Where Blackboard Was Used .................................................................................................... 35
Level of Comfort in Using Technology .................................................................................................... 39
Network Devices ..................................................................................................................................... 43
Section Discussion....................................................................................................................................... 48
Ease of Use .......................................................................................................................... 50
Section Findings .......................................................................................................................................... 50
Section Discussion....................................................................................................................................... 55
Pedagogy ............................................................................................................................ 56
Section Findings .......................................................................................................................................... 56
Section Discussion....................................................................................................................................... 60
Accessibility ........................................................................................................................ 61
Section Findings .......................................................................................................................................... 61
Section Discussion....................................................................................................................................... 61
Course Migration................................................................................................................. 62
Section Findings .......................................................................................................................................... 62
Section Discussion....................................................................................................................................... 64
Course Administration ......................................................................................................... 65
Section Findings .......................................................................................................................................... 65
Section Discussion....................................................................................................................................... 87
Functionality of the LMS Products ....................................................................................... 89
Section Findings .......................................................................................................................................... 91
Section Discussion..................................................................................................................................... 100
Functionality of the Mobile Learn Products ........................................................................ 102
Section Findings ........................................................................................................................................ 102
Section Discussion..................................................................................................................................... 114
Overall Feelings About Blackboard Mobile Learn ............................................................... 116
Section Findings ........................................................................................................................................ 116
Section Discussion..................................................................................................................................... 120
Overall Feelings About the LMS by WCLD........................................................................... 122
Section Findings ........................................................................................................................................ 122
Section Discussion..................................................................................................................................... 126
Suggestions and Recommendations ................................................................................... 128
Section Findings ........................................................................................................................................ 128
Section Discussion..................................................................................................................................... 137
Technical Issues ................................................................................................................. 139
Section Findings ........................................................................................................................................ 139
Section Discussion..................................................................................................................................... 145
Resources Used to Resolve Technical Problems .................................................................. 147
Section Findings ........................................................................................................................................ 147
Section Discussion..................................................................................................................................... 153
Recommendations For Help & Support .............................................................................. 154
Section Findings ........................................................................................................................................ 154
Section Discussion..................................................................................................................................... 157
Appendixes ....................................................................................................................... 158
Appendix A: Faculty Mid-Term Survey: Blackboard Pilot, Summer 2014 ................................... 158
Appendix B: Staff MidTerm Survey: Blackboard Pilot, Summer 2014 .......................................... 167
Appendix C: Student MidTerm Survey: Blackboard Pilot, Summer 2014 .................................... 173
Appendix D: Faculty End-Term Survey: Blackboard Pilot, Summer 2014 ................................... 181
Appendix E: Student End-Term Survey: Blackboard Pilot, Summer 2014 ................................... 189
Appendix F: Blackboard Pilot Survey - “Lessons Learned” ........................................................... 195
Appendix G: Focus Group with Support Staff ................................................................................. 196
Executive Summary
eLearning, the delivery of courses entirely online or in a hybrid format, continues to be
an area of great potential for meeting the changing landscape of higher education.
Driven by both monetary and competitive reasons, universities are reviewing how
programs of study are delivered in order to maximize the pedagogical benefit while
controlling costs. In August 2009, the Provost and the Vice Provost for Information
Technology charged the eLearning Strategic Committee with shaping the future of Penn
State’s learning environment to meet the needs of the Penn State community and find a
replacement for ANGEL, Penn State’s course management system. The four major
criteria for this search were:
• Pedagogical: providing the tools needed for faculty and students.
• Technology Management: system architecture, scalability, security, development, quality assurance, etc.
• Organizational Administration: policy issues (academic, operational), data retention, user support, training, etc.
• Cost: hardware, software, lifecycle, operations and maintenance, staff, etc.
During the 2010 fall semester, the Learning Management System (LMS) Pilot Team
worked with instructors, students, designers, technologists, and support staff to explore
the capabilities of both Desire2Learn and Moodle within the Penn State environment. In
January 2011, the support team continued pilot research to assess how the Blackboard
system supports teaching, learning, and collaboration among faculty and students at
Penn State. The LMS Pilot Team then conducted a fourth pilot in the 2012 spring
semester to explore Canvas.
In 2009, Penn State did not have the option of continuing vendor support of ANGEL
beyond 2014 because ANGEL's parent company, Blackboard, planned to discontinue
support of the product. However, in March 2012, Blackboard reversed that decision
and committed to indefinite support of ANGEL. In light of this new development, the
eLearning Strategic Committee recommended renewing the ANGEL agreement through
2017, which included a co-production agreement to use both ANGEL and the
Blackboard Learn LMS. Both the Provost and the Vice Provost for Information
Technology accepted this recommendation, allowing the committee to fulfill its
charge and disband.
Since spring of 2012, Blackboard has made extensive enhancements to the Learn
platform. As a result, Craig Weidemann, Vice President for Outreach and Vice Provost
for Online Education, Rob Pangborn, Vice President and Dean for Undergraduate
Education, and Information Technology Services (ITS) recommended piloting a suite of
Blackboard products during the 2014 summer semester. The applications that were
piloted include:
• Blackboard Learn (LMS)
• Blackboard Collaborate (a virtual space that enables real-time collaborative work)
• Blackboard Mobile (for Android, iOS, and Blackberry)
The purpose of the 2014 pilot was to evaluate the potential of these products to
positively impact the teaching and learning experience at Penn State. The criteria used
to measure this impact included ease of use, pedagogy, course administration,
functionality, accessibility, and migration. Students, faculty, and staff from across the
Commonwealth were invited to participate. Active members of the pilot included
instructors, students, instructional designers, and support staff responsible for
training, documentation, and answering help desk inquiries.
The assessment strategy for the pilot included online surveys conducted several times
over the course of the pilot, direct observations of the faculty and instructional
designers during the training portion of the pilot, and focus groups with faculty and
support staff conducted virtually at the end of the pilot project as a part of “Lessons
Learned.”
In the mid-term surveys, faculty indicated that they found the platform useful for their
teaching needs. However, faculty respondents' feelings about the usefulness of
Blackboard for teaching and learning were less positive in the end-term surveys.
The major disadvantages of Blackboard Learn as reported by the pilot participants were
accessibility, navigation, and ANGEL-Blackboard migration.
Originally, an additional pilot of the suite of Blackboard products was also scheduled
for the 2014 fall semester. However, this pilot was cancelled due to a change by
Blackboard in July 2014 regarding the product direction for Learn and other associated
applications.
About the Pilot
This report analyzes the experiences of the instructors, instructional designers, and
support staff during the Spring-Summer 2014 pilot. It encompasses the time period
beginning with the vendor-delivered training in April-June 2014 (ID and support staff
training was conducted from April 30 through May 2, 2014; faculty training was
conducted June 10-12, 2014).
The report is broken out into sections based on each of the criteria evaluated. Each
section begins with the findings and ends with a discussion of the results. The
supporting documentation for our pilot processes is located in the Appendixes found at
the end of the document.
The report was prepared by the LMS Pilot Assessment Team of Jeff Swain and Olga
Buchko and presented to Terry O'Heron, the eLearning Strategic Committee Chair.
Special thanks to the Evaluation Team who assisted with survey development and
delivery, including Louise Sharrar, Andrea Gregg, Jane Keary-Thomas, Oranuj
Janrathitikarn, Dominic Pugliese and Janet May Dillon. Another special thank you to
Andrea Gregg for sharing the report she prepared, with assistance from Dominic
Pugliese, on the World Campus Learning Design (WCLD) experience in the pilot. This
information was included as a separate section called "Overall Feelings About the LMS
by WCLD." Finally, a very special thank you goes to all the instructors, students and
support staff who participated, with special acknowledgement to Janet Duck for her
support with the student focus group interview. Your patience, understanding, and
diligence will go a long way toward benefitting the entire Penn State community.
History and Objectives
The landscape of higher education is dramatically changing. Driven by both monetary
and competitive reasons, universities are reviewing programs of study and how
they are delivered in order to maximize the pedagogical benefit while controlling costs.
eLearning, the delivery of courses entirely online or in a hybrid format, continues to be
an area of great potential for meeting these needs. Several factors are driving this
phenomenon, including:
• The growth of eLearning at the K-12 level. According to the Sloan Consortium, an advocacy group for online education, over 1 million students took an online course at the K-12 level during the 2007-08 school year, an increase of 47% from a similar survey done two years earlier.
• The growing demand for online courses at the college level. Interest in attending Penn State continues to grow. In order to accommodate as many qualified candidates as we can, offering courses partially or entirely online provides a way for students to take classes that they normally would have difficulty enrolling in.
• The pedagogical benefits of eLearning. According to a 2009 study by the U.S. Department of Education, preliminary results indicate learning benefits for students taking courses online. Whether a course is delivered entirely online or whether online constructs are used to supplement face-to-face classes, eLearning tools such as a learning management system (LMS) allow instructors to build materials that offer students multiple ways to engage with content and collaborate with their peers.
To meet the needs of our community, the Provost and Vice Provost for Information
Technology charged the eLearning Strategic Committee with shaping the future of Penn
State's learning environment by exploring what a future LMS may look like at the
University. The committee, composed of key representatives from across the colleges,
campuses, and Information Technology Services (ITS), became responsible for
developing a strategy for sustaining a common eLearning environment across the
University.
To accomplish this task, the committee identified four major criteria that needed to be
addressed in order to make a recommendation. They are:
• Pedagogical: providing the tools needed for faculty and students.
• Technology management: system architecture, scalability, security, development, quality assurance, etc.
• Organizational administration: policy issues (academic, operational), data retention, user support, training, etc.
• Cost: hardware, software, life cycle, operations and maintenance, staff, etc.
To accomplish this charge, the eLearning Strategic Committee formed two advisory
committees, pedagogical and technical, to explore the capabilities of several existing
and emergent LMS products. Membership of the strategic and two advisory committees
is provided in Appendix A. The LMS product explored was Blackboard Learn. The
advisory committees developed scorecards (criteria) and evaluated areas such as
features, functionality, the user interface, scalability, performance, security, etc., and
reported their findings back to the eLearning Strategic Committee. Scorecard criteria are
referred to within this report as LMS critical core elements.
Assessment Strategy
The assessment strategy for the pilot included the methodological approach of data
triangulation, a means of collecting data through a variety of sources and applying a
common coding system in order to synthesize their meaning. This method is very
effective in explaining the richness and complexity of human behavior because it
provides a more detailed and balanced picture of the situation. In our pilot study, data
was triangulated through surveys, direct observations, and focus groups.
• Online surveys were conducted several times over the course of the pilot with the goal of collecting a cross-sectional body of participant experiences.
• Direct observations of the faculty and instructional designers took place during the training (both face-to-face and virtual) in order to gain a deeper understanding of the course construction process.
• Focus groups were designed to assess the strengths and weaknesses of the platform in a group setting; however, due to time limitations and a scheduling conflict, it was not possible to conduct a focus group with students. The focus group with faculty and support staff was conducted virtually at the end of the pilot project as a part of "Lessons Learned."
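To make the common-coding step described above concrete, the short sketch below shows one way coded excerpts from the three data sources could be pooled and checked for themes that appear in more than one source. It is an illustrative example only: the theme codes, excerpts, and the two-source threshold are hypothetical and do not reproduce the pilot team's actual coding scheme.

```python
from collections import defaultdict

# Illustrative sketch of data triangulation: excerpts from each data source are
# tagged with a shared set of theme codes, and a theme is flagged as
# "triangulated" when it is supported by at least two independent sources.
# The codes and excerpts below are hypothetical, not actual pilot data.
coded_excerpts = [
    {"source": "survey",      "code": "navigation", "text": "Hard to find the Grade Center."},
    {"source": "observation", "code": "navigation", "text": "Instructor searched several menus."},
    {"source": "survey",      "code": "migration",  "text": "Quiz pools were lost on import."},
    {"source": "focus_group", "code": "migration",  "text": "ANGEL content did not transfer cleanly."},
    {"source": "observation", "code": "training",   "text": "Asked for more hands-on practice."},
]

sources_by_code = defaultdict(set)
for excerpt in coded_excerpts:
    sources_by_code[excerpt["code"]].add(excerpt["source"])

for code, sources in sorted(sources_by_code.items()):
    status = "triangulated" if len(sources) >= 2 else "single-source"
    print(f"{code}: {status} ({', '.join(sorted(sources))})")
```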
Assessment Measures and Strategies
The LMS support team conducted six types of assessment measures over the course of
the pilot. Participation was voluntary and self-selected by participants. That is, there
was no point in the pilot where partaking in any assessment measure was required.
Rather, participants were able to contribute to assessment measures if they chose to do
so. Additionally, the completion of one assessment measure was not connected to any
others. Assessment measures were guided by the scorecard categories developed by the
eLearning Strategic Committee.
The following were assessment measures that took place during the course of the pilot
in the order they were performed:
• Post-training survey: April-June 2014
• Direct observations: April-May 2014
• Midterm surveys: July 2014
• Focus groups: August 2014
• End-of-pilot surveys: August 2014
• Lessons learned (focus group with support staff): August 2014
Assessment Measures

Survey               | Instructors | IDs | Support Staff | Students
Post-Training Survey | x           | x   | n/a           | n/a
Direct Observation   | x           | x   | x             | x
Midterm Survey       | x           | n/a | x             | x
Focus Groups         | x           | x   | x             | x
End of Pilot Survey  | x           | n/a | x             | x

Table 1: Assessment Measure and Targeted Role
Post-Training Survey
Pilot participants were invited to participate in the platform training prior to the start of
the summer semester, in April-June 2014: ID and support staff training was conducted
from April 30 through May 2, 2014, and faculty training from June 10-12, 2014. The main
objective of the trainings was to focus on the course building aspects of Blackboard Learn.
Agenda topics for IDs included:
• Designing Course Structure
• Presenting Dynamic Content
• Creating and Managing Assignments
• Creating and Managing Tests and Surveys
• Mastering the Grade Center
• Designing Engaging Discussions
• Building Communities Online

Agenda topics for faculty included:
• Building Course
• Assessing Learners
• Managing Communication and Collaboration
Upon completion of training, participants were asked to take a short online survey
about their experience and impressions of the Blackboard platform they would be piloting.
Results of the post-training survey were used to create an initial benchmark for
pedagogical impressions, develop webinars, create help desk materials, and determine
the quality of the training provided by the vendor.
Direct Observations
To gain a deeper understanding of the course construction process, the LMS support
team observed how instructors built their course(s) in Blackboard Learn during the
training. Observations took place prior to the start of the summer semesters.
The objectives of conducting these observations were to:
• Capture/document what happens during the course building process.
• Gather a rich description of the experience.
• Identify the general level of intuitiveness of each application.
• Create support materials from frequently asked questions/common issues.
Focus Groups
In August 2014, the LMS pilot team held a focus group to assess the strengths and
weaknesses of the platform. A focus group was also planned for students but,
regrettably, could not be conducted due to a last-minute scheduling conflict. The focus
group for the support staff was designed as a Lessons Learned session, conducted
virtually on August 13, 2014 by Brett Bixler (see Appendix G).
Mid-Term Pilot Survey
An online midterm pilot survey was administered to faculty, students, and support
staff in July 2014. Ideally, a response rate of approximately 20% is desirable in order to
make inferences. We were fortunate to exceed that percentage in each of our pilot
demographics.
Section Discussion
The triangulated approach to collecting data offered several advantages. The data
collection process was carried out with regard to the rhythm of the semester. The
post-training survey allowed us to identify initial concerns and teaching plans. Direct
observation provided the opportunity to study the actual course building process, and
the focus groups allowed individuals to come together, share ideas, and form a consensus
opinion about the products. The data collection process was also emergent, with each
mode of collection informed by its predecessors. This allowed the support staff the
opportunity to be participants as well as observers in the pilot, because we could
address questions and training needs as they developed.
Demographic Information
Section Findings
This section supplies demographic information about the pilot participants, including
gender, age, campus affiliation and other categories.

Pilot Participation by Campus

The Spring-Summer 2014 pilot of Blackboard Learn included course sections from
campuses across the Commonwealth. Figure 1 illustrates this participation.
Figure 1: Pilot Participation by Campus
In total, 33% of faculty, 82% of staff and 31% of students were from University Park;
25% of faculty and 33% of students were from World Campus; the remainder were from
12 other Penn State campuses.
Pilot Participation by Campus Survey Results

Campus                    | Faculty Mid-Term (%) | Faculty End-Term (%) | Students Mid-Term (%) | Students End-Term (%) | Staff Mid-Term (%)
AA (Altoona)              | 0 (0%)               | 0 (0%)               | 0 (0%)                | 0 (0%)                | 1 (1%)
AN (Lehigh Valley)        | 0 (0%)               | 0 (0%)               | 0 (0%)                | 0 (0%)                | 2 (2%)
BD (Erie)                 | 1 (8%)               | 1 (8%)               | 20 (7%)               | 20 (7%)               | 3 (3%)
BK (Berks)                | 1 (8%)               | 1 (8%)               | 36 (13%)              | 36 (13%)              | 3 (3%)
DE (Brandywine)           | 0 (0%)               | 0 (0%)               | 0 (0%)                | 0 (0%)                | 1 (1%)
CL (Harrisburg)           | 0 (0%)               | 0 (0%)               | 0 (0%)                | 0 (0%)                | 1 (1%)
HY (Hershey Med Center)   | 0 (0%)               | 0 (0%)               | 0 (0%)                | 0 (0%)                | 1 (1%)
KP (Great Valley)         | 0 (0%)               | 0 (0%)               | 0 (0%)                | 0 (0%)                | 1 (1%)
NK (New Kensington)       | 1 (8%)               | 1 (8%)               | 6 (2%)                | 6 (2%)                | 1 (1%)
OZ (Abington)             | 1 (8%)               | 1 (8%)               | 23 (8%)               | 21 (8%)               | 3 (3%)
UP (University Park)      | 4 (33%)              | 4 (33%)              | 88 (31%)              | 84 (30%)              | 81 (82%)
WC (World Campus)         | 3 (25%)              | 3 (25%)              | 93 (33%)              | 91 (33%)              | 0 (0%)
WS (Worthington Scranton) | 1 (8%)               | 1 (8%)               | 18 (6%)               | 18 (7%)               | 1 (1%)

Table 2: Pilot Participation by Campus
It is worth mentioning that one course (CAMS045) had two sections and was delivered
by World Campus and University Park. Below is a graphical representation of the
survey results for the pilot participation by campus.
Figure 2: Campus Distribution
Active members of the pilot included instructors, students, instructional designers, and
support staff responsible for training, documentation, and answering help desk
inquiries. The following section of the report details the pilot sample. Table 3
indicates the total number of pilot participants grouped according to their role.
Pilot Participation by Role

Role                   | Mid-Term | End-Term
Course Sections        | 14       | 14
Faculty                | 11       | 11
Support Staff          | 98       | n/a
Students (Enrollments) | 284      | 276

Table 3: Pilot Participation by Role
Out of 407 pilot participants, 11 were faculty, 98 were support staff, and 284 (mid-term)
and 276 (end-term) were students enrolled in 14 course sections. Below is a graphical
representation of the survey results for pilot participation by role.
Figure 3: Pilot Participation by Role
PILOT PARTICIPANTS
In the pilot there were three groups of participants – faculty, staff and students.
Faculty
The faculty group was represented by instructors.
Instructors
Instructors are defined as those who enhance the educational experience of students
through their teaching, research and service. They were recruited for the pilot through a
convenience sampling method, meaning participants were invited to take part or
self-selected based on an e-mail invitation.
Faculty Role in the Pilot Mid-Term Survey

Participant         | Number | Percent
Instructor          | 7/7    | 100
Assistant Professor | 0/7    | 0
Associate Professor | 0/7    | 0
Professor           | 0/7    | 0
Other               | 0/7    | 0

Table 4: Faculty Role in the Pilot
All faculty members that participated in the pilot were instructors. Below is a graphical
representation of the survey results for the roles faculty had in the pilot study.
Figure 4: Faculty Role in the Pilot
Staff
Staff were represented by Instructional Designers (IDs), Instructional Production
Specialists (IPS) and support staff.
Instructional Designers
Instructional Designers (IDs) are defined as those who support instructors in the design
and implementation of online courses, and include individuals designated as
Instructional Technologists. In the LMS pilot, IDs were members by ascription. For
example, if an instructor received regular support from an ID, then that ID was by
default also part of the pilot.
Instructional Production Specialist
Instructional Production Specialists (IPS) are defined as those who provide
administrative, technical, and production support to instructional designers, faculty,
educational technologists, etc. Examples of Instructional Production Specialists
included Penn State IT managers and IT assistants.
Support Staff
Support staff, those responsible for training, documentation, and answering help desk
inquiries, also provided feedback about this experience in a focus group. Examples of
support staff members included Penn State help desk employees, graduate assistants,
managers, and consultants.
Out of 28 staff respondents, 18 were Instructional Designers, 6 were Instructional
Production Specialists, 2 were Support Staff, and 2 were other.
Staff in the Pilot (Mid-Term Survey)

Participants                        | Number | Percent
Instructional Designer              | 18/28  | 64
Instructional Production Specialist | 6/28   | 21
Support Staff                       | 2/28   | 7
Other                               | 2/28   | 7

Table 5: Staff Role in the Pilot (Mid-Term Survey)
Below is a graphical representation of the survey results for the roles staff had in the
pilot.
Figure 5: Staff Role in the Pilot (Mid-Term Survey)
Students
Students were also ascribed participants of the pilot (i.e., because of their role as
students in the class of a faculty/instructor who had volunteered, they were
automatically pilot participants).
In the mid-term survey, students indicated the following academic levels: 21% (16/77)
were first-year undergraduates, 18% (14/77) second-year undergraduates, 17% (13/77)
third-year undergraduates, 14% (11/77) fourth-year or beyond, and 30% (23/77)
Masters students.
In the end-term survey, the student respondents were 30% (18/61) freshmen, 16%
(10/61) sophomores, 20% (12/61) juniors, 11% (7/61) seniors, and 23% (14/61)
Masters students.
Student Academic Level

Student Academic Level                             | Mid-Term Number (%) | End-Term Number (%)
First-year undergraduate (Freshman)                | 16/77 (21%)         | 18/61 (30%)
Second-year undergraduate (Sophomore)              | 14/77 (18%)         | 10/61 (16%)
Third-year undergraduate (Junior)                  | 13/77 (17%)         | 12/61 (20%)
Four or more years undergraduate (Senior)          | 11/77 (14%)         | 7/61 (11%)
Masters student (MA, MS, MBA, MFA, MSW, MPA, etc.) | 23/77 (30%)         | 14/61 (23%)
Doctoral Student (EdD, PhD, etc.)                  | 0/77 (0%)           | 0/61 (0%)

Table 6: Student Academic Level
Below is a graphical representation of the survey results for Student Academic Level.
Figure 6: Student Academic Level
COURSE DELIVERY FORMAT
All respondents were asked to indicate in what form the course was delivered. Tables 7,
8 and 9 describe the formats of course delivery reported by faculty, staff and students.
Faculty indicated that most of the courses were delivered online with no face-to-face
interaction (mid-term survey: 57% (4/7); end-term survey: 60% (6/10)). None of the
courses were delivered online with face-to-face interaction only for exams.
Course Delivery Format by Faculty

Course Delivery Format                                                   | Mid-Term Number (%) | End-Term Number (%)
Face-to-face                                                             | 1/7 (14%)           | 2 (20%)
In a hybrid format using a blend of face-to-face and online interaction  | 2/7 (29%)           | 2 (20%)
Online with face-to-face interaction only for exams                      | 0/7 (0%)            | 0 (0%)
Only online with no face-to-face interaction                             | 4/7 (57%)           | 6 (60%)

Table 7: Course Delivery by Faculty
Below is a graphical representation of the survey results for course delivery by faculty.
Figure 7: Course Delivery by Faculty
Staff indicated that 78% (14/18) of courses were delivered only online with no
face-to-face interaction, 17% (3/18) in a hybrid format using a blend of face-to-face and
online interaction, and 6% (1/18) face-to-face.
Course Delivery Format by Staff (Mid-Term)

Course Delivery Format                                                   | Number | Percent
Face-to-face                                                             | 1/18   | 6
In a hybrid format using a blend of face-to-face and online interaction  | 3/18   | 17
Online with face-to-face interaction only for exams                      | 0/18   | 0
Only online with no face-to-face interaction                             | 14/18  | 78

Table 8: Course Delivery by Staff (Mid-Term Survey)
Below is a graphical representation of the survey results for course delivery by staff.
Figure 8: Course Delivery by Staff (Mid-Term Survey)
In the mid- and end-term surveys, students indicated that most of the courses they
participated in – 59% (48/81) and 56% (35/63) – were delivered online with no
face-to-face interaction (these were all of the courses except BI SC 002, ECON 102,
HIST 021 and FD SC 105). Courses such as CAMS 045, IST 110, BI SC 002, ECON 102 and
FD SC 105 were also delivered in a hybrid format using a blend of face-to-face and
online interaction (2-23%). Only two courses offered face-to-face interaction – PSYCH
100 and FD SC 105 (14% and 22%). Three courses offered online delivery with
face-to-face interaction only for exams: BI SC 002, CAMS 045 and IST 110 (4%). IMBA
523 was delivered only online with no face-to-face interaction; however, one
respondent indicated that he studied in a hybrid format.
Course Delivery Format by Students

Course Delivery Format                                                   | Mid-Term Number (%) | End-Term Number (%)
Face-to-face                                                             | 11/81 (14%)         | 14/63 (22%)
In a hybrid format using a blend of face-to-face and online interaction  | 19/81 (23%)         | 14/63 (22%)
Online with face-to-face interaction only for exams                      | 3/81 (4%)           | 0/63 (0%)
Only online with no face-to-face interaction                             | 48/81 (59%)         | 35/63 (56%)

Table 9: Course Delivery by Students
Below is a graphical representation of the survey results for course delivery by students.
Figure 9: Course Delivery by Students
YEARS OF PROFESSIONAL EXPERIENCE
Faculty and staff were asked how many years they had been in higher education.
Most faculty respondents (more than 50%) indicated that they had 11 to 20 years of
experience. None had 1 year or less of experience, and none had more than 30 years.
Faculty Work Experience

Years              | Mid-Term Number (%) | End-Term Number (%)
1 year or less     | 0/7 (0%)            | 0/10 (0%)
2 - 5 years        | 2/7 (29%)           | 1/10 (10%)
6 - 10 years       | 1/7 (14%)           | 3/10 (30%)
11 - 20 years      | 4/7 (57%)           | 5/10 (50%)
21 - 30 years      | 0/7 (0%)            | 1/10 (10%)
More than 30 years | 0/7 (0%)            | 0/10 (0%)

Table 10: Faculty Work Experience
Below is a graphical representation of the survey results regarding faculty work
experience.
Figure 10: Faculty Work Experience
Staff respondents indicated that 32% of them had work experience of 2 - 5 years, 25%
had 11 - 20 years, 21% had 6 - 10 years, 14% had 1 year or less, and 7% had 21 - 30
years. None of the respondents had work experience of more than 30 years.
Staff Work Experience

Years              | Number | Percent
1 year or less     | 4/28   | 14
2 - 5 years        | 9/28   | 32
6 - 10 years       | 6/28   | 21
11 - 20 years      | 7/28   | 25
21 - 30 years      | 2/28   | 7
More than 30 years | 0/28   | 0

Table 11: Staff Work Experience (Mid-Term Survey)
Below is a graphical representation of the survey results regarding staff work
experience.
Figure 11: Staff Work Experience (Mid-Term Survey)
GENDER OF THE RESPONDENTS
All respondents were asked about their gender. Among faculty, three respondents
(43%) identified themselves as female and four as male (57%) in the mid-term survey;
in the end-term survey, three respondents (30%) identified themselves as female and
seven as male (70%).
Gender by Faculty

Gender | Mid-Term Number (%) | End-Term Number (%)
Female | 3/7 (43%)           | 3/10 (30%)
Male   | 4/7 (57%)           | 7/10 (70%)
Other  | 0/7 (0%)            | 0/10 (0%)

Table 12: Gender by Faculty
Below is a graphical representation of the survey results for faculty gender
representation.
Figure 12: Gender by Faculty
Staff respondents were split evenly between the two gender groups (50%/50%).
Gender by Staff (Mid-Term)

Gender | Number | Percent
Female | 14/28  | 50
Male   | 14/28  | 50
Other  | 0/28   | 0

Table 13: Gender by Staff (Mid-Term Survey)
Below is a graphical representation of the survey results for staff gender representation.
Figure 13: Gender by Staff (Mid-Term Survey)
Student respondents were split evenly between the two gender groups in the mid-term
survey; however, one respondent indicated that he/she identified as agender. In the
end-term survey, the percentage of females was higher than that of males (58% and 42%).
Gender by Student

Gender          | Mid-Term Number (%) | End-Term Number (%)
Female          | 40/81 (49%)         | 36/62 (58%)
Male            | 40/81 (49%)         | 26/62 (42%)
Other (Agender) | 1/81 (1%)           | 0/62 (0%)

Table 14: Gender by Student
Below is a graphical representation of the survey results for student gender
representation.
Figure 14: Gender by Student
AGE OF THE STUDENT RESPONDENTS
Students were asked to indicate their age. More than 55% of respondents were under
24. No students fell into the 55 - 59, 65 - 70, or 71-and-over age groups. In the mid-term
survey, the 40 - 44, 50 - 54 and 60 - 64 groups were each represented by one individual.
Age by Students

Age Ranges | Mid-Term Number (%) | End-Term Number (%)
Under 24   | 45/81 (56%)         | 38/63 (60%)
25 - 29    | 11/81 (14%)         | 5/63 (8%)
30 - 34    | 9/81 (11%)          | 8/63 (13%)
35 - 39    | 11/81 (14%)         | 4/63 (6%)
40 - 44    | 1/81 (1%)           | 3/63 (5%)
45 - 49    | 2/81 (2%)           | 2/63 (3%)
50 - 54    | 1/81 (1%)           | 2/63 (3%)
55 - 59    | 0/81 (0%)           | 0/63 (0%)
60 - 64    | 1/81 (1%)           | 1/63 (2%)
65 - 70    | 0/81 (0%)           | 0/63 (0%)
71 & Over  | 0/81 (0%)           | 0/63 (0%)

Table 15: Age by Students
Below is a graphical representation of the survey results for student age groups.
Figure 15: Age by Students
COURSES WHERE BLACKBOARD WAS USED
All respondents were asked to indicate in which course(s) they were using Blackboard
during the summer semester. Tables 16-18 present this information.
Courses Where Blackboard Learn Was Used by Faculty

Courses                  | Mid-Term Number (%) | End-Term Number (%)
BI SC002(BBMS_BISC002)   | 1/7 (14%)           | 1/10 (10%)
CAMS 045(BBMS_CAMS045)   | 0/7 (0%)            | 1/10 (10%)
CMPSC200(BBES_CMPSC200)  | 0/7 (0%)            | 1/10 (10%)
COMM 150(BBMS_COMM150)   | 1/7 (14%)           | 1/10 (10%)
ECON 102(BBMS_ECON102)   | 1/7 (14%)           | 1/10 (10%)
FD SC105(BBMS_FDSC105)   | 1/7 (14%)           | 1/10 (10%)
HIST 021(BBES_HIST021)   | 1/7 (14%)           | 1/10 (10%)
IST 110(BBMS_IST110)     | 1/7 (14%)           | 1/10 (10%)
PSYCH100(BBMS_PSYCH100)  | 0/7 (0%)            | 1/10 (10%)
SPAN 131(BBMS_SPAN131)   | 1/7 (14%)           | 1/10 (10%)

Table 16: Courses Where Blackboard Learn Was Used by Faculty
Below is a graphical representation of the survey results for courses that used
Blackboard Learn by faculty.
Figure 16: Courses where Blackboard Learn was used by Faculty
Table 17 describes in which courses staff worked with Blackboard Learn.
Courses Where Blackboard Learn Was Used by Staff

Courses                                 | Number | Percent
ADTED 531                               | 1/28   | 4
BA 321                                  | 1/28   | 4
NURS 580                                | 1/28   | 4
COMM 409                                | 1/28   | 4
CAMS 045                                | 2/28   | 7
ECON102                                 | 1/28   | 4
GEV (General Evaluation and Assessment) | 1/28   | 4
HIST 021                                | 2/28   | 7
HRER 802                                | 2/28   | 7
iMBA 523                                | 2/28   | 7
PHP 597A                                | 1/28   | 4
PSYCH 485                               | 1/28   | 4
None / N/A                              | 12/28  | 43

Table 17: Courses where Blackboard Learn was used by Staff
Below is a graphical representation of the survey results for courses where staff used
Blackboard Learn.
Figure 17: Courses Where Blackboard Learn Was Used by Staff
Table 18 describes in what courses students worked with Blackboard Learn.
Courses Where Blackboard Learn Was Used by Students

Courses                  | Mid-Term Number (%) | End-Term Number (%)
BI SC002(BBMS_BISC002)   | 4/82 (5%)           | 3/62 (5%)
BI SC004(BBMS_BISC004)   | 0/82 (0%)           | 1/62 (2%)
CAMS 045(BBMS_CAMS045)   | 16/82 (20%)         | 9/62 (15%)
CMPSC200(BBES_CMPSC200)  | 0/82 (0%)           | 1/62 (2%)
COMM 150(BBMS_COMM150)   | 3/82 (4%)           | 4/62 (6%)
ECON 102(BBMS_ECON102)   | 4/82 (5%)           | 6/62 (10%)
FD SC105(BBMS_FDSC105)   | 7/82 (9%)           | 9/62 (15%)
IMBA 521(BBMS_IMBA521)   | 1/82 (1%)           | 0/62 (0%)
IMBA 522(BBMS_IMBA522)   | 1/82 (1%)           | 0/62 (0%)
IMBA 523(BBMS_IMBA523)   | 23/82 (28%)         | 14/62 (23%)
IST 110(BBMS_IST110)     | 13/82 (16%)         | 5/62 (8%)
PSYCH100(BBMS_PSYCH100)  | 6/82 (7%)           | 8/62 (13%)
SPAN 131(BBMS_SPAN131)   | 3/82 (4%)           | 1/62 (2%)
None                     | 1/82 (1%)           | 1/62 (2%)

Table 18: Courses Where Blackboard Learn Was Used by Students
Below is a graphical representation of the survey results for courses in which students
used Blackboard Learn.
Figure 18: Courses Where Blackboard Learn Was Used by Students
LEVEL OF COMFORT IN USING TECHNOLOGY
All respondents were asked to indicate their level of comfort in using different types of
technology. Tables 19-21 present this information for faculty, staff and students.
Between 57% and 60% of faculty indicated that they were "very comfortable" using
technology. However, 10-14% of respondents indicated that they were "very
uncomfortable" or "somewhat uncomfortable" using technology. In the end-term
survey, 30% of respondents indicated that their level of comfort was "somewhat
comfortable," up from 14% in the mid-term survey.
Level of Comfort Using Technologies by Faculty

Comfort Level          | Mid-Term Number (%) | End-Term Number (%)
Very Uncomfortable     | 1/7 (14%)           | 0/10 (0%)
Somewhat Uncomfortable | 1/7 (14%)           | 1/10 (10%)
Somewhat Comfortable   | 1/7 (14%)           | 3/10 (30%)
Very Comfortable       | 4/7 (57%)           | 6/10 (60%)
Other                  | 0/7 (0%)            | 0/10 (0%)

Table 19: Level of Comfort Using Technologies by Faculty
Below is a graphical representation of the survey results for the level of comfort using
technologies by faculty.
Figure 19: Level of Comfort Using Technologies by Faculty
Staff opinion regarding the level of comfort using technology matched the faculty
experience: 82% of staff indicated that they were "very comfortable" using technology.
Staff: Level of Comfort Using Technologies (Mid-Term)

Comfort Level          | Number | Percent
Very Uncomfortable     | 2/28   | 7
Somewhat Uncomfortable | 1/28   | 4
Somewhat Comfortable   | 2/28   | 7
Very Comfortable       | 23/28  | 82
Other                  | 0/28   | 0

Table 20: Level of Comfort Using Technologies by Staff (Mid-Term Survey)
Below is a graphical representation of the survey results for the level of comfort using
technologies by staff.
Figure 20: Level of Comfort Using Technologies by Staff (Mid-Term Survey)
Less than 50% of students (48%) indicated in the mid-term survey that they were "very
comfortable" using technology; this figure dropped to 30% by the end of the semester.
Between 32% and 43% of respondents rated their level of comfort as "somewhat
comfortable." Overall, then, the experience was very or somewhat comfortable for most
students. However, 10-17% of respondents felt "very" or "somewhat" uncomfortable
using technology.
Level of Comfort Using Technologies by Students

Comfort Level          | Mid-Term Number (%) | End-Term Number (%)
Very Uncomfortable     | 8/81 (10%)          | 11/63 (17%)
Somewhat Uncomfortable | 8/81 (10%)          | 6/63 (10%)
Somewhat Comfortable   | 26/81 (32%)         | 27/63 (43%)
Very Comfortable       | 39/81 (48%)         | 19/63 (30%)
Other                  | 0/81 (0%)           | 0/63 (0%)

Table 21: Level of Comfort Using Technologies by Students
Below is a graphical representation of the survey results for the level of comfort using
technologies by students.
Figure 21: Level of Comfort Using Technologies by Students
NETWORK DEVICES
All respondents were asked what type(s) of network device(s) they used on a regular
basis. Tables 23-25 present this information.
In the mid-term survey, faculty respondents indicated that they mainly worked with
their laptops (100%), mobile phones (86%) and tablets (71%). In the end-term survey,
instructors indicated that they used their tablets (30%) and mobile phones (40%) less
than at the beginning of the project. Interestingly, the 42% of respondents who had
used a portable media player had stopped using one entirely by the end of the semester.
The eBook reader was used by only one respondent at the beginning of the semester,
and that respondent had stopped working with it by the end of the semester.
Network Devices by Faculty

Devices                 | Mid-Term Selected (%) | End-Term Selected (%) | Mid-Term Not Selected (%) | End-Term Not Selected (%)
Mobile phone            | 6/7 (86%)             | 4/10 (40%)            | 1 (14%)                   | 6 (60%)
Portable media player   | 3/7 (42%)             | 0/10 (0%)             | 4 (57%)                   | 10 (100%)
Tablet                  | 5/7 (71%)             | 3/10 (30%)            | 2 (29%)                   | 7 (70%)
eBook reader            | 1/7 (14%)             | 0/10 (0%)             | 6 (86%)                   | 10 (100%)
Laptop/Netbook computer | 7/7 (100%)            | 9/10 (90%)            | 0 (0%)                    | 1 (10%)
Desktop computer        | 4/7 (57%)             | 4/10 (40%)            | 3 (43%)                   | 6 (60%)
Other                   | 0/7 (0%)              | 1/10 (10%)            | 7 (100%)                  | 9 (90%)

Table 23: Network Devices by Faculty
Below is a graphical representation of the survey results for type(s) of network device(s)
used on a regular basis during the pilot project indicated by the faculty.
Figure 23: Network Devices by Faculty
According to staff responses, all types of network device were in use when the project
started. The most popular network devices were mobile phones (100%) and laptops
(100%). Tablets were used by 82% of respondents, and more than 50% of respondents
used a portable media player (57%) or a desktop computer (61%).
Network Devices by Staff (Mid-Term)

Devices                 | Selected Number (%) | Not Selected Number (%)
Mobile phone            | 28/28 (100%)        | 0/28 (0%)
Portable media player   | 16/28 (57%)         | 12/28 (43%)
Tablet                  | 23/28 (82%)         | 5/28 (18%)
eBook reader            | 11/28 (39%)         | 17/28 (61%)
Laptop/Netbook computer | 28/28 (100%)        | 0/28 (0%)
Desktop computer        | 17/28 (61%)         | 11/28 (39%)
Other                   | 0/28 (0%)           | 28/28 (100%)

Table 24: Network Devices by Staff (Mid-Term Survey)
Below is a graphical representation of the survey results for type(s) of network device(s)
used on a regular basis during the pilot project indicated by the staff.
Figure 24: Network Devices by Staff (Mid-Term Survey)
Students, like faculty and staff, mainly worked with their laptops (93% according to the
mid-term survey and 86% according to the end-term survey) and mobile phones (88%
according to the mid-term survey). It is worth noting that mobile phones were used less
at the end of the semester; the percentage dropped from 88% to 24%. The same
happened with the portable media player (from 40% to 3%), the tablet (from 52% to
24%) and the eBook reader (from 11% to 0%). The use of the desktop computer was
practically unchanged during the semester.
Network Devices by Students

Devices                 | Mid-Term Selected (%) | End-Term Selected (%) | Mid-Term Not Selected (%) | End-Term Not Selected (%)
Mobile phone            | 71/81 (88%)           | 15/63 (24%)           | 10/81 (12%)               | 48/63 (76%)
Portable media player   | 32/81 (40%)           | 2/63 (3%)             | 49/81 (60%)               | 61/63 (97%)
Tablet                  | 42/81 (52%)           | 15/63 (24%)           | 39/81 (48%)               | 48/63 (76%)
eBook reader            | 9/81 (11%)            | 0/63 (0%)             | 72/81 (89%)               | 63/63 (100%)
Laptop/Netbook computer | 75/81 (93%)           | 54/63 (86%)           | 6/81 (7%)                 | 9/63 (14%)
Desktop computer        | 30/81 (37%)           | 22/63 (35%)           | 51/81 (63%)               | 41/63 (65%)
Other                   | 1/81 (1%)             | 0/63 (0%)             | 80/81 (99%)               | 63/63 (100%)

Table 25: Network Devices by Students
Below is a graphical representation of the survey results for type(s) of network device(s)
used on a regular basis during the pilot project indicated by the students.
Figure 25: Network Devices by Students
Section Discussion
This section provided the demographic data regarding the gender of the participants,
their years of work experience, the age of the students, the campuses participants were
affiliated with, and information on which courses participated in the project and in
what form they were delivered. The response rate of the surveys distributed to faculty
was high enough: 67% (6/11) in the mid-term survey and 91% (10/11) in the end-term
survey. However, the response rate of staff and students was 29% (Staff: 28/98;
Students: 81/284) in the mid-term surveys, and 22% (Students: 62/276) in the end-term
survey. Compared to studies such as the national telephone Survey of Consumer
Attitudes, which generated a response rate of 48% (Curtin, Presser & Singer, 2005)1,
this rate is low. In order to achieve a high response rate, we followed some of Quinn's
(2002)2 strategies for online surveys. First, we extended the duration of the survey's
availability. Second, answering individual questions was optional. Third, the anonymity
of the responses was assured. In addition, we also used Zúñiga's (2004)3 set of 'best
practices for increasing response rates to online surveys' from the US Teaching and
Learning with Technology/Flashlight Group. These were (1) pushing the survey
(providing respondents with the survey URL in an email sent directly to them); (2)
providing frequent reminders (at least three); (3) involving academics (reminders from
the faculty); (4) persuading respondents that their responses would be used (in the
emails); and (5) creating surveys that seek constructive criticism (not just
multiple-choice questions or items requiring a simple numerical rating, but open-ended
questions as well).
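As a quick check on the arithmetic behind the rates quoted above, the minimal sketch below recomputes the staff and student response rates from the respondent and population counts reported in this section (rounding to whole percents follows the report's convention).

```python
# Recompute response rates (respondents / invited population) from the counts
# reported in this section; values are rounded to whole percents.
survey_counts = {
    "Staff (mid-term)":    (28, 98),
    "Students (mid-term)": (81, 284),
    "Students (end-term)": (62, 276),
}

for group, (respondents, invited) in survey_counts.items():
    rate = 100 * respondents / invited
    print(f"{group}: {respondents}/{invited} = {rate:.0f}%")

# Output:
# Staff (mid-term): 28/98 = 29%
# Students (mid-term): 81/284 = 29%
# Students (end-term): 62/276 = 22%
```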
1 Curtin, R., Presser, S., & Singer, E. (2005). Changes in telephone survey nonresponse over the past quarter century. Public Opinion Quarterly, 69(1), 87-98.
2 Nulty, D.D. (2008). The adequacy of response rates to online and paper surveys: What can be done? Assessment & Evaluation in Higher Education, 33(3), 301-314. Retrieved from https://www.uaf.edu/files/uafgov/fsadmin-nulty5-19-10.pdf
3 Zúñiga, R.E. (2004, March). Increasing response rates for online surveys—a report from the Flashlight Program's BeTA Project. Retrieved from http://www.tltgroup.org/resources/F-LIGHT/2004/03-04.html (accessed 26 September 2014).
The demographic section of the survey also collected data about participants' level of
comfort in using different types of technology and about what type(s) of network
device(s) participants used on a regular basis. Regarding the level of comfort using
technology, most respondents indicated that they were "very" or "somewhat"
comfortable. Regarding network devices, respondents indicated that the most
commonly used were laptops, mobile phones and tablets.
Ease of Use
A critical success factor for the adoption of any new technology is its ease of use,
defined here as the extent to which a product can be used by specified users to achieve
specified goals with effectiveness, efficiency, and satisfaction in a specified context of
use. Of all the assessment criteria applied to the pilot, ease of use and the user interface
(covered in the next section) are the most subjective and the most critical to successful
adoption. Simply put, the easier it is to figure out and manipulate a device, the more
likely that device is to be successfully incorporated into the lives of the community.
Section Findings
This section addresses participant perceptions of intuitiveness, adaptability,
acclimation, and transition to the pilot platform. Questions in this section were asked as
a way of gauging participants’ perceptions of whether or not the system and its specific
functionalities were easy or difficult to learn. Because the overwhelming majority of
pilot participants had prior experience using ANGEL, comparisons with it were
unavoidable. This was especially apparent in the feedback received from students.
Faculty Feedback on Ease of Use
Instructors were asked to rate the ease of use of the pilot tools on a four-point rating
scale (Difficult to Use, Slightly Easy to Use, Moderately Easy to Use and Very Easy to
Use). They generally agreed that the platform they piloted was easy to use (57-60%).
However, 14-20% of respondents indicated that the platform was "difficult to use."
Ease of Use by Faculty (Mid-Term and End-Term)

Ease of Use            | Mid-Term Number (%) | End-Term Number (%)
Difficult to Use       | 1/7 (14%)           | 2/10 (20%)
Slightly Easy to Use   | 1/7 (14%)           | 2/10 (20%)
Moderately Easy to Use | 4/7 (57%)           | 6/10 (60%)
Very Easy to Use       | 1/7 (14%)           | 0/10 (0%)

Table 26: Ease of Use by Faculty
Figure 26 presents a graphical representation of faculty feedback on the ease of use of
the Blackboard LMS.
Figure 26: Ease of Use by Faculty
Several difficulties prevented faculty from using the LMS effectively: (a) lack of time to
learn the platform's functionality ("Learning a new system in a short period of time."), (b)
the complexity of the platform ("Soooooooo many settings and it's confusing when choosing
one how it will affect another.") and (c) a perceived lack of ease of use ("Blackboard is NOT
intuitive to use.").
Staff Feedback on Ease of Use
Staff members were also asked to rate the ease of use of the pilot tools on a four-point
rating scale. More than 50% of staff respondents indicated that the Blackboard platform
was "moderately" or "very" easy to use (54% and 12%). However, 31% indicated that
they had some difficulty working with the platform, describing it as "slightly easy to use."
Ease of Use by Staff (Mid-Term)
Ease of Use | Number | Percent
Difficult to Use | 1/26 | 4
Slightly Easy to Use | 8/26 | 31
Moderately Easy to Use | 14/26 | 54
Very Easy to Use | 3/26 | 12
Table 27: Ease of Use by Staff
Below is a graphical representation of the overall ease of use of Blackboard LMS rated
by staff.
Figure 27: Ease of Use by Staff (bar chart of the mid-term ratings shown in Table 27)
Some of the instructional designers (IDs) stated that the Blackboard platform was more intuitive
than ANGEL:
“Smoother & more intuitive than ANGEL.”
“After getting past a few differences in navigation and terminology, it appears as though
it would be intuitive to use.”
Students’ Feedback on Ease of Use
Students were asked to rate the ease of use of the pilot tools as well. Table 28 presents
this information:
Ease of Use by Students
Ease of Use | Mid-Term Number | Mid-Term Percent | End-Term Number | End-Term Percent
Difficult to Use | 10/81 | 12 | 9/63 | 14
Slightly Easy to Use | 19/81 | 23 | 16/63 | 25
Moderately Easy to Use | 39/81 | 48 | 25/63 | 40
Very Easy to Use | 13/81 | 16 | 13/63 | 21
Table 28: Ease of Use by Students
In comparison with faculty and staff, student respondents experienced more difficulty
using the platform: 12-14% of respondents described the platform as "difficult to use,"
and 23-25% as only "slightly easy to use." However, more than 60% of respondents rated
the platform as "moderately easy to use" (48% and 40%) or "very easy to use" (16% and 21%).
The following quotes reflect the feelings of the 9-11% of student respondents (6/55 (11%) in the
mid-term survey; 4/46 (9%) in the end-term survey) who commented positively on the ease of use:
"Easy to use, well organized, information is easily accessible"; "Ease of use without issues like in ANGEL".
Below is a graphical representation of the overall ease of use of Blackboard LMS rated
by students.
Figure 28: Ease of Use by Students (bar chart of the mid-term and end-term ratings shown in Table 28)
In their responses to open-ended questions, only 4-7% of students reported that the LMS
"…seems more intuitive to navigate than ANGEL" and that they found it easy to locate the
various course tools as well as the course content:
“Blackboard seems to do a good job of surfacing content and allowing one to switch from
area to area with ease. It was relatively easy to move back and forth between various
sections.” (Student)
“The ease of finding items saves time to do other things like study.” (Student)
“Blackboard is accessible and easy to use. It's also pretty well organized. All of these
features are nice because they just make the course work easier to complete.” (Student)
Section Discussion
Generally, the pilot participants agreed that the platform they piloted was easy to use
(Faculty: 57-71% at mid-term and 60% at end-term; Staff: 54-66%; Students: 48-64% at mid-term
and 40-61% at end-term), but not without a learning curve. However, in comparison with staff
and instructors, students had a harder time using the platform. When a decision is made as to
the future direction of eLearning at the university, it would be good to revisit this section
to identify support (training and documentation) opportunities.
Pedagogy
A critical objective for the pilot was to identify the pedagogical affordances of
Blackboard Learn. For the pilot, pedagogy was defined as the affordances the LMS
lends to teaching and learning. In order to understand the pedagogical affordances of
each LMS, instructors/IDs were asked how well the tools available in the pilot platform
suited their pedagogical needs.
Section Findings
A severe limitation in the analysis of the pedagogical affordances of each LMS was time.
Participants only had access to the LMS for a single semester. Since they were also learning
the LMS during this time, most of their energy was directed toward the course design and
administration practices needed to keep the course moving forward.
Faculty About the Usefulness of Blackboard
The faculty respondents were asked to describe how useful Blackboard was for their
needs.
Usefulness of Blackboard in Teaching
Faculty was asked to rate the overall usefulness of Blackboard for teaching on a four-point rating scale. Table 29 presents the data:
Usefulness for Teaching by Faculty
Usefulness | Mid-Term Number | Mid-Term Percent | End-Term Number | End-Term Percent
Not At All Useful | 0/6 | 0 | 0/10 | 0
Slightly Useful | 1/6 | 17 | 2/10 | 20
Moderately Useful | 0/6 | 0 | 3/10 | 30
Highly Useful | 5/6 | 83 | 5/10 | 50
Table 29: Usefulness for Teaching by Faculty
Most of the respondents indicated that they found the platform useful for their teaching
needs. None of the respondents rated the usefulness of the platform as “not at all
useful”. Below is a graphical representation of the overall usefulness of Blackboard for
teaching rated by faculty.
Figure 29: Usefulness for Teaching by Faculty (bar chart of the mid-term and end-term ratings shown in Table 29)
Benefits of Blackboard for Teaching and Learning
Table 30 describes the data obtained from the faculty when they rated how beneficial
Blackboard was to teaching and learning.
Faculty About Blackboard Benefits (percent of respondents; Mid-Term / End-Term)
Benefit Statement | Strongly Disagree | Disagree | Neither Agree nor Disagree | Agree | Strongly Agree
1. Blackboard enables me to do what I wanted for my course(s). | 0 / 10 | 29 / 10 | 43 / 10 | 29 / 40 | 43 / 30
2. Blackboard is easy for my students to learn how to use. | 0 / 10 | 17 / 0 | 29 / 20 | 17 / 40 | 43 / 20
3. Blackboard increases my efficiency as a teacher. | 0 / 10 | 0 / 10 | 29 / 30 | 57 / 30 | 17 / 10
4. Blackboard increases my effectiveness as a teacher. | 0 / 10 | 0 / 10 | 57 / 50 | 29 / 10 | 17 / 20
5. Using Blackboard is beneficial to my students' overall learning. | 0 / 10 | 0 / 10 | 17 / 30 | 83 / 20 | 17 / 20
6. Blackboard was a valuable aid to me in my teaching. | 0 / 10 | 0 / 0 | 17 / 40 | 83 / 30 | 17 / 20
Table 30: Faculty About Blackboard Benefits
Comparing the results of both surveys, it is apparent that in the end-term survey faculty
respondents' feelings about the usefulness of Blackboard for teaching and learning were less
positive (see the "Strongly Disagree," "Disagree," and "Neither Agree nor Disagree" columns).
For example, only 40% and 30% of end-term respondents (see the "Agree" and "Strongly Agree"
columns) indicated that Blackboard increased their efficiency (74% in the mid-term survey) and
their effectiveness (46% in the mid-term survey) as teachers. There was a sharp decrease, from
100% (mid-term) to 40% (end-term), when faculty respondents evaluated how beneficial Blackboard
was to their students' overall learning (Statement 5). Similarly, comparing respondents'
feedback on how valuable Blackboard was to their teaching (Statement 6), there was a decrease
from 100% to 50% (see the "Agree" and "Strongly Agree" columns).
However, 70% of respondents agreed (or strongly agreed) that Blackboard enabled them to do
what they wanted to do for their course(s). This is similar to the mid-term survey figure (72%).
Below are graphical representations of these data.
Figure 30: Faculty About Blackboard Benefits (1) (bar chart of the Strongly Disagree, Disagree, and Neither Agree nor Disagree percentages from Table 30, mid-term and end-term)
Figure 31: Faculty About Blackboard Benefits (2) (bar chart of the Agree and Strongly Agree percentages from Table 30, mid-term and end-term)
This quote, from question 22, describes the frustrating experience of one of the online
instructors:
“I have been teaching online for 14 years for PSU and I built the course ([Course name])
we ran through the pilot. I'm pretty savvy when it comes to technology, when it comes to
adapting to new systems (such as Plone, when I rewrote everything into Flash, etc.).
Despite my experience and openness to this pilot, Blackboard was misery for me the
whole summer session. BB was a total disaster for my course. It made all aspects of my
teaching more difficult, less efficient, and, most importantly, it had a very negative
impact on my students. I had at least a dozen students who would have failed this course
due to BB had I not done all kinds of Jerry-rigging of their scores to make up for the ways
that BB made their experience so complicated and difficult. I got so many emails
complaining about not being able to find deadlines, not being able to access certain
features in certain browsers, and not being able to find essential features of the course,
that I made the decision to inflate all grades a bit to make up for it. Let's not forget: these
are Penn State students; they are smart people who are perfectly familiar with ANGEL,
so switching to a new LMS should not be so crippling to so many students. They are our
focus, but this pilot threatened a lot of GPAs and certain did nothing for the overall
strong reputation of my class.”
Section Discussion
The data collected showed that toward the end of the semester the platform received less
favorable responses from the pilot participants in the areas of Blackboard efficiency
(Statement 3), effectiveness (Statement 4), and helpfulness in teaching (Statement 6).
Evaluating how beneficial Blackboard was to their students' overall learning (Statement 5),
respondents lowered their rating scores from 100% to 40% (see the "Agree" and "Strongly Agree"
columns). A deep dive into the pedagogical affordances of the platform was limited due to the
length of the pilot and the learning curve involved for all participants. The bulk of the time
over the semester was spent learning how to use the course design and administrative tools.
This may also be an indication of how instructors use the platform in general.
Accessibility
With 11,747 active courses in ANGEL serving 84,985 students, Web-based tools that are
accessible to all members of our community are a crucial component of the future
design of eLearning at Penn State. Toward that goal, an accessibility pilot was planned with
the specific purpose of evaluating the overall accessibility of the system.
Section Findings
Unfortunately, there is no information from the summer pilot to provide. Due to the
cancellation of the pilot, the accessibility pilot/review was also cancelled. A review is
scheduled for the beginning of the fall semester, when there will be more content in Learn to
test. The courses being planned for the fall had more variation in content and structure, as
well as content placed directly in the Learn system rather than simply framed in from an
outside CMS.
The table had been left open to review both Canvas and any new version of Blackboard
Learn that would be piloted (M. Brooks, personal communication, January 9, 2015).
Section Discussion
We will return to the discussion of this section when the review is conducted and new
data are collected.
Course Migration
A critical success factor when adopting a new LMS is the ability to move content from a
legacy system, in our case ANGEL, into a new LMS (Blackboard). Ideally the migration
would occur in batch format where large amounts of data are exported from ANGEL,
imported into the new LMS, and populated without requiring much configuration from
the instructor after the fact.
Migrating course content usually requires converting course data into a common format that can
be output from the old database and input into the new database. Since the new database may be
organized differently, it may be necessary to write a program that can process the migrated
files. The purpose of including course migration in the pilot process was to test each LMS by
making sure that course content could be migrated in a way that:
• allowed participants to explore the distinct features of the new LMS;
• did not require old settings to be changed; and
• ensured that current applications continued to work in the new environment.
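As a purely illustrative sketch of the kind of processing program mentioned above (it is not part of the pilot and makes assumptions about the export format), the snippet below inspects a hypothetical exported course package, assumed to be a zip archive containing an IMS-style imsmanifest.xml, and lists the resources the package declares so that gaps can be spotted before content is imported into the new LMS. The file name and manifest layout are assumptions for illustration only.

# Illustrative sketch: list the resources declared in a hypothetical exported
# course package (assumed to be a zip archive with an IMS-style imsmanifest.xml).
import sys
import zipfile
import xml.etree.ElementTree as ET

def list_resources(package_path):
    with zipfile.ZipFile(package_path) as package:
        manifest_xml = package.read("imsmanifest.xml")
    root = ET.fromstring(manifest_xml)
    resources = []
    # Resource elements are usually namespaced, so match on the local tag name.
    for element in root.iter():
        if element.tag.split("}")[-1] == "resource":
            resources.append((element.get("identifier", ""),
                              element.get("type", ""),
                              element.get("href", "")))
    return resources

if __name__ == "__main__":
    # Usage (hypothetical file name): python inspect_package.py exported_course.zip
    for identifier, resource_type, href in list_resources(sys.argv[1]):
        print(identifier, resource_type, href, sep="\t")

A report like this could then be compared against the source course in ANGEL to see which items (for example, gradebook settings or group configurations) would still need to be recreated by hand.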
Section Findings
This section of the report summarizes the experience of pilot participants as indicated in
three surveys: first, the survey (see Appendix F) conducted by the World Campus (WC) team with
the Learning Design staff involved in the Bb pilot, covering their evaluation of the pilot as
well as the conversion tool specifically; and second, the mid-term and end-term surveys
conducted with faculty and staff by the Blackboard pilot team.4
World Campus Survey Feedback
The number of courses that migrated content from ANGEL to Blackboard was low (3). See the
table below.
4 Information was obtained from Andrea Gregg (personal communication, January 16, 2015).
Method Used to Build the Course (World Campus Survey)
Method | Number | Percent
Import/Export from ANGEL to Bb | 3/12 | 23%
Started from scratch | 5/12 | 38%
Other | 4/12 | 31%
Table 30: Method Used to Build the Course
Replying to the question "If you used built in import/export options, what worked and
what didn't work?", respondents mentioned mainly three issues:
(1) "links were not always preserved"
(2) "gradebook did not transfer"
(3) "groups in course, group settings dropped during process"
Everything else seemed to work; the participants who migrated content from ANGEL to
Blackboard were satisfied with the way the content came across. One of the respondents
mentioned that "all of the quizzes and assessments came through which was a huge
time saver."
Mid/End Term Survey Feedback
Some of the respondents to the mid-term and end-term surveys reported that they experienced
difficulties with importing course materials:
“Importing items from Angel. Some items carried over, some did not.”
“Course importing was not very intuitive.”
Staff participants reported the process to be time consuming and generally
unsuccessful.
“Importing from ANGEL to Bb Learn - The process of importing ANGEL content from
ANGEL, specifically exams and tests into Bb Learn was poor.”
“No significant improvement in the area of scalability - still need to copy courses one at a
time, can't copy out to multiple courses at once.”
In addition, faculty and support staff reported that the course migration process was not
able to meet their needs.
Section Discussion
The current course migration process was not able to meet the needs of our pilot
participants. This may indicate a need to design a migration tool/program specifically to
move content from ANGEL into Blackboard Learn, and to pilot this product when it becomes
available.
Course Administration
Course administration refers to the tools and methods used to manage the course
throughout the semester using the Blackboard Learn platform.
Section Findings
Several areas concerning course administration functions came up for discussion during
the pilot including issues of course administration and management.
Faculty About the Usefulness of Blackboard
Level of Satisfaction with Blackboard Tools
Faculty respondents were asked to rate their level of satisfaction with the Blackboard
Learn tools and features designed to support the teaching and course management
tasks described in Table 31:
Level of Satisfaction with Blackboard Tools by Faculty (percent of respondents; Mid-Term / End-Term)
Teaching and Course Management Task | Do Not Use | Not at all Satisfied | Slightly Satisfied | Moderately Satisfied | Highly Satisfied
1. Creating and publishing the course syllabus (Content) | 43 / 10 | 0 / 0 | 0 / 10 | 14 / 50 | 43 / 30
2. Creating a course calendar (Course Calendar) | 57 / 50 | 0 / 20 | 0 / 0 | 14 / 10 | 29 / 20
3. Posting course announcements (Announcements) | 29 / 20 | 0 / 10 | 0 / 10 | 14 / 10 | 43 / 50
4. Uploading, organizing, and sharing course files (Control Panel >> Content Collection >> Course Name) | 0 / 0 | 14 / 0 | 14 / 10 | 29 / 60 | 43 / 30
5. Posting audio/video lectures or other multimedia (Control Panel >> Content Collection >> Course Name) | 43 / 30 | 0 / 0 | 0 / 10 | 14 / 20 | 43 / 40
6. Creating course web pages (Content Area > Blank Page) | 29 / 30 | 14 / 0 | 0 / 30 | 14 / 20 | 43 / 20
7. Organizing course content, activities, and assessments into a series of modules or lessons (Content Area, Learning Modules) | 14 / 20 | 0 / 10 | 14 / 20 | 29 / 30 | 43 / 20
8. Posting assignments (Content > Assignment) | 0 / 10 | 0 / 0 | 29 / 20 | 29 / 30 | 43 / 40
9. Assigning individual and collaborative writing tasks (Journals, Wikis, Blogs) | 43 / 80 | 14 / 0 | 0 / 0 | 29 / 0 | 14 / 20
10. Assigning peer reviews on student work (Self and Peer Assessment, Wikis, Blogs) | 100 / 10 | 0 / 0 | 0 / 0 | 0 / 0 | 0 / 0
11. Creating and administering online quizzes, tests, and/or surveys (Tests, Surveys, and Pools) | 14 / 10 | 0 / 10 | 14 / 0 | 43 / 30 | 29 / 40
12. Facilitating graded and ungraded discussions (Discussions) | 43 / 50 | 14 / 10 | 0 / 10 | 14 / 0 | 29 / 30
13. Giving feedback on and/or grading student submissions (Grade Center > Needs Grading) | 14 / 10 | 14 / 20 | 29 / 30 | 14 / 0 | 29 / 40
14. Creating and using rubrics to grade student work (Rubrics, Grade Center) | 100 / 10 | 0 / 0 | 0 / 0 | 0 / 0 | 0 / 0
15. Setting up and using the gradebook to enter and track student grades (Grade Center) | 14 / 0 | 14 / 40 | 57 / 20 | 14 / 20 | 0 / 10
16. Monitoring course activity and student progress (Course Reports, Performance Dashboard, Retention Center) | 29 / 60 | 14 / 20 | 14 / 0 | 29 / 10 | 14 / 10
17. Creating and managing groups for group assignments, group discussions, and/or group projects (Groups) | 57 / 60 | 29 / 10 | 14 / 20 | 0 / 0 | 0 / 10
18. Conducting online chat sessions (Blackboard Collaborate >> Course Room) | 71 / 90 | 0 / 0 | 0 / 0 | 14 / 0 | 14 / 10
19. Keeping track of your course tasks (Calendar, To Do, Needs Attention) | 29 / 40 | 29 / 10 | 14 / 10 | 14 / 20 | 14 / 10
20. Importing or exporting course content (Packages and Utilities) | 29 / 10 | 43 / 20 | 0 / 40 | 0 / 20 | 29 / 0
21. Integrating an external learning tool or platform with my course, e.g., SoftChalk Cloud, Piazza, etc. (Web Link) | 71 / 90 | 0 / 0 | 0 / 0 | 14 / 0 | 14 / 10
22. Customizing the navigation, look, and feel of your course (Quick Setup Guide, Teaching Style) | 29 / 50 | 29 / 0 | 14 / 20 | 0 / 20 | 29 / 10
23. Connecting or encouraging students to connect with Blackboard users and groups within or outside of your course (Blackboard Global Learning Network) | 86 / 80 | 0 / 10 | 14 / 0 | 0 / 10 | 0 / 0
24. Sending and receiving messages to and from students using Course Messages | 57 / 70 | 0 / 20 | 14 / 0 | 14 / 0 | 14 / 10
25. Sending and receiving messages to and from students and groups using Send Email | 29 / 30 | 29 / 20 | 14 / 10 | 0 / 0 | 57 / 40
26. Using Turnitin originality checking on assignments (Turnitin Direct Assignment) | 86 / 90 | 0 / 0 | 14 / 0 | 0 / 10 | 0 / 0
27. Using SafeAssign originality checking on assignments (SafeAssign Direct Assignment) | 100 / 90 | 0 / 0 | 0 / 10 | 0 / 0 | 0 / 0
Table 31: Level of Satisfaction by Faculty
In the mid-term survey, the tools and features that were not used at all (100% "Do Not Use")
were (1) assigning peer reviews on student work (Statement 10), (2) creating and using rubrics
to grade student work (Statement 14), and (3) using SafeAssign originality checking on
assignments (Statement 27). However, the situation changed by the end of the semester, and
tools such as Self and Peer Assessment, Wikis, Blogs, Rubrics, and Grade Center (Statements 10
and 14) were not used by only 10% of respondents. SafeAssign Direct Assignment was still not
used by 90% of faculty (Statement 27).
In addition, 80-90% of respondents did not use features such as (1) connecting or encouraging
students to connect with Blackboard users and groups within or outside of the course
(Statement 23; 86% in the mid-term survey; 80% in the end-term survey) and (2) using
Turnitin originality checking on assignments (Statement 26; 86% in the mid-term
survey; 90% in the end-term survey).
Tools such as the Course Room (Statement 18; conducting online chat sessions) and Web
Links (Statement 21; integrating an external learning tool or platform with the course)
were not used by 71% of respondents in the mid-term survey and 90% in the end-term survey.
More than half of faculty did not use tools such as (1) the Course Calendar (Statement 2;
57% in the mid-term survey; 50% in the end-term survey), (2) Groups (Statement 17;
57% in the mid-term survey; 60% in the end-term survey), and (3) Course Messages
(Statement 24; 57% in the mid-term survey; 70% in the end-term survey).
Below is a graphical representation of the Blackboard tools and features that were not
used by faculty.
Figure 32: Level of Satisfaction by Faculty (bar chart of the "Do Not Use" percentages for Statements 1-27 from Table 31, mid-term and end-term)
Below is another graphical representation, showing the Blackboard tools and features with
which faculty were "not at all satisfied" in supporting their teaching and course management
tasks.
In the mid-term survey, 43% of respondents were "not at all satisfied" with Packages and
Utilities (Statement 20); in the end-term survey, only 20% of respondents were not satisfied
with this tool.
Comparing the data from both surveys, the proportion of respondents who were "not at all
satisfied" with Groups (Statement 17), Calendar/To Do/Needs Attention (Statement 19), Quick
Setup Guide and Teaching Style (Statement 22), and Send Email (Statement 25) dropped from 29%
in the mid-term survey to 10%, 10%, 0%, and 20%, respectively, in the end-term survey.
Figure 33: Level of Satisfaction by Faculty (bar chart of the "Not at all Satisfied" percentages for Statements 1-27 from Table 31, mid-term and end-term)
Below is a graphical representation of the Blackboard tools and features with which faculty
were "moderately satisfied" or "slightly satisfied." For some features, the level of
satisfaction increased by the end of the pilot: for example, "moderately satisfied" ratings
for creating and publishing the course syllabus (Statement 1) and for uploading, organizing,
and sharing course files (Statement 4) rose from 14% and 29% to 50% and 60%, respectively.
However, "slightly satisfied" ratings for the gradebook (Statement 15) dropped from 57% to 20%.
Figure 34: Level of Satisfaction by Faculty (bar chart of the "Slightly Satisfied" and "Moderately Satisfied" percentages for Statements 1-27 from Table 31, mid-term and end-term)
Below is a graphical representation of the Blackboard tools and features with which faculty
were "highly satisfied."
Figure 35: Level of Satisfaction by Faculty (bar chart of the "Highly Satisfied" percentages for Statements 1-27 from Table 31, mid-term and end-term)
Figure 35 shows that respondents were "highly satisfied" with such Blackboard tools as
Announcements (Statement 3; 43% in the mid-term survey, 50% in the end-term survey) and Send
Email (Statement 25; 57% in the mid-term survey, 40% in the end-term survey).
The level of satisfaction with some tools decreased by 23-29 percentage points in the
end-term survey. Among these tools were (1) creating course web pages (Statement 6),
(2) organizing course content, activities, and assessments into a series of modules or lessons
(Statement 7), (3) importing or exporting course content (Statement 20), and (4) customizing
the navigation, look, and feel of your course (Statement 22).
Usefulness of Online Documentation
Faculty was also asked to rate the overall usefulness of Blackboard’s online
documentation on a five-point rating system. Table 32 presents the data:
Usefulness of Online Documentation by Faculty
Online Documentation Use | Mid-Term Number | Mid-Term Percent | End-Term Number | End-Term Percent
Do Not Use | 4/7 | 57 | 3/10 | 30
Not at all Useful | 0/7 | 0 | 1/10 | 10
Slightly Useful | 1/7 | 14 | 2/10 | 20
Moderately Useful | 1/7 | 14 | 3/10 | 30
Highly Useful | 1/7 | 14 | 1/10 | 10
Table 32: Usefulness of Online Documentation by Faculty
In the mid-term survey, 57% of respondents indicated that they did not use the online
documentation; in the end-term survey, however, 60% of respondents found it at least slightly
useful. One respondent (10%) rated the online documentation as "not at all useful" in the
end-term survey. Comparing responses from both surveys, the number of those who rated the
documentation as "slightly," "moderately," or "highly" useful increased.
Below is a graphical representation of the overall usefulness of Online Documentation
by faculty.
Figure 36: Usefulness of Online Documentation by Faculty (bar chart of the mid-term and end-term ratings shown in Table 32)
Staff About the Usefulness of Blackboard
Level of Satisfaction with Blackboard Tools
Staff was asked to rate their level of satisfaction with the Blackboard tools and features
designed to support the following teaching and course management tasks:
Level of Satisfaction with Blackboard Tools by Staff (End-Term Survey; percent of respondents)
Teaching and Course Management Task | Do Not Use | Not at all Satisfied | Slightly Satisfied | Moderately Satisfied | Highly Satisfied
1. Creating and publishing the course syllabus (Content) | 20 | 8 | 20 | 32 | 20
2. Creating a course calendar (Course Calendar) | 38 | 13 | 25 | 4 | 21
3. Uploading, organizing, and sharing course files (Control Panel >> Content Collection >> Course Name) | 21 | 4 | 21 | 29 | 25
4. Posting audio/video lectures or other multimedia (Control Panel >> Content Collection >> Course Name) | 50 | 0 | 17 | 25 | 8
5. Creating course web pages (Content Area > Blank Page) | 17 | 4 | 21 | 38 | 21
6. Organizing course content, activities, and assessments into a series of modules or lessons (Content Area, Learning Modules) | 13 | 4 | 30 | 30 | 22
7. Creating assignments (Content > Assignment) | 9 | 0 | 35 | 26 | 30
8. Assigning individual and collaborative writing tasks (Journals, Wikis, Blogs) | 52 | 4 | 13 | 13 | 17
9. Creating and administering online quizzes, tests, and/or surveys (Tests, Surveys, and Pools) | 17 | 0 | 29 | 50 | 4
10. Creating rubrics to grade student work (Rubrics, Grade Center) | 50 | 4 | 13 | 21 | 8
11. Setting up the gradebook (Grade Center) | 33 | 8 | 33 | 21 | 4
12. Creating and managing groups for group assignments, group discussions, and/or group projects (Groups) | 33 | 0 | 25 | 33 | 8
13. Managing course tasks (Calendar, To Do, Needs Attention) | 50 | 0 | 29 | 8 | 13
14. Importing or exporting course content (Packages and Utilities) | 33 | 8 | 17 | 33 | 8
15. Integrating an external learning tool or platform with the course, e.g., SoftChalk Cloud, Piazza, etc. (Web Link) | 67 | 4 | 4 | 21 | 4
16. Customizing the navigation, look, and feel of your course (Quick Setup Guide, Teaching Style) | 25 | 13 | 25 | 25 | 13
Table 33: Level of Satisfaction by Staff (End-Term Survey)
Below is a graphical representation of the tools and features that staff did not use or were
not at all satisfied with.
Figure 37: Level of Satisfaction by Staff (End-Term Survey) (bar chart of the "Do Not Use" and "Not at all Satisfied" percentages from Table 33)
Thus, 50-52% of respondents did not use features such as (1) posting audio/video
lectures or other multimedia (Statement 4), (2) assigning individual and collaborative
writing tasks (Statement 8), (3) creating rubrics to grade student work (Statement 10),
and (4) managing course tasks (Statement 13). Sixty-seven percent of respondents did not use
the integration of an external learning tool or platform with the course, e.g., SoftChalk
Cloud, Piazza, etc. (Statement 15).
Below is a graphical representation of the tools and features that staff were slightly,
moderately, or highly satisfied with. Among them were (1) creating assignments
(Statement 7; "slightly satisfied" – 35%, "moderately satisfied" – 26%, and
"highly satisfied" – 30%) and (2) creating and administering online quizzes, tests,
and/or surveys (Statement 9; "slightly satisfied" – 29%, "moderately satisfied" – 50%,
and "highly satisfied" – 4%).
Figure 38: Level of Satisfaction by Staff (End-Term Survey) (bar chart of the "Slightly," "Moderately," and "Highly Satisfied" percentages from Table 33)
Students About the Usefulness of Blackboard
Overall Usefulness of Blackboard in Learning
Students were also asked to rate their level of satisfaction with the Blackboard tools and
features on a five-point rating scale. Table 34 describes their responses:
Usefulness of Blackboard in Learning by Students (percent of respondents; Mid-Term / End-Term)
Statement | Strongly Disagree | Disagree | Neither Agree nor Disagree | Agree | Strongly Agree
1. Blackboard helps me to learn the course materials/content. | 5 / 8 | 17 / 10 | 31 / 37 | 31 / 19 | 17 / 26
2. Blackboard helps me to study for exams/tests. | 5 / 12 | 34 / 18 | 32 / 30 | 26 / 18 | 16 / 23
3. Blackboard helps me to complete course assignments. | 5 / 7 | 9 / 7 | 21 / 19 | 41 / 31 | 24 / 37
4. Blackboard helps me to take quizzes/exams. | 4 / 7 | 7 / 5 | 20 / 19 | 36 / 25 | 33 / 43
5. Blackboard helps me to make efficient use of my time in the course. | 13 / 8 | 13 / 16 | 32 / 26 | 25 / 28 | 18 / 21
6. Blackboard helps me to be in control of my own learning in the course. | 8 / 10 | 10 / 18 | 30 / 20 | 33 / 20 | 19 / 37
7. Blackboard helps me to communicate with my professor. | 6 / 19 | 10 / 16 | 26 / 21 | 36 / 23 | 21 / 21
8. Blackboard expands access to learning materials/resources available to me. | 9 / 11 | 10 / 13 | 32 / 26 | 31 / 18 | 17 / 33
9. Blackboard is beneficial to my overall learning in the course. | 9 / 11 | 8 / 11 | 37 / 28 | 30 / 25 | 16 / 28
Table 34: Usefulness of Blackboard in Learning by Students
Below are graphical representations of the student level of agreement regarding the
usefulness of Blackboard in learning.
Figure 39: Usefulness of Blackboard in Learning by Students (1) (bar chart of the Strongly Disagree, Disagree, and Neither Agree nor Disagree percentages from Table 34, mid-term and end-term)
Overall, students tended to be neutral about the usefulness of the Blackboard tools and
features (see the "Neither Agree nor Disagree" column). However, 30-34% of
respondents did not find Blackboard helpful in preparing for exams and tests
(Statement 2).
Figure 40: Usefulness of Blackboard in Learning by Students (2) (bar chart of the Agree and Strongly Agree percentages from Table 34, mid-term and end-term)
Overall, students rated the usefulness of the Blackboard tools positively. Blackboard was most
helpful for completing course assignments (Statement 3; "agree" and "strongly agree": 41% and
24% at mid-term, 31% and 37% at end-term) and for taking quizzes/exams (Statement 4; 36% and
33% at mid-term; 25% and 43% at end-term).
Usefulness of Online Documentation
Students were also asked to rate the overall usefulness of Blackboard's online
documentation on a five-point rating scale. Overall, roughly 70% of students found the
Blackboard online documentation useful to some degree (slightly, moderately, or highly).
Usefulness of Online Documentation by Students
Online Documentation Usefulness | Mid-Term Number | Mid-Term Percent | End-Term Number | End-Term Percent
Do Not Use | 10/81 | 12 | 12/63 | 19
Not at all Useful | 8/81 | 10 | 7/63 | 11
Slightly Useful | 23/81 | 28 | 11/63 | 17
Moderately Useful | 29/81 | 36 | 17/63 | 27
Highly Useful | 11/81 | 14 | 16/63 | 25
Table 35: Usefulness of Online Documentation by Students
Below is a graphical representation of the usefulness of Blackboard's online documentation as
rated by students.
Figure 41: Usefulness of Online Documentation (bar chart of the mid-term and end-term ratings shown in Table 35)
Usefulness of Blackboard Products
Students were also asked to rate how useful Blackboard features were in contributing to
their learning. Unfortunately, due to technical issues, the data from the end-term survey
could not be retrieved. Table 36 presents the data from the mid-term survey:
Usefulness of Blackboard Features in Learning by Students (Mid-Term Survey; percent of respondents)
Feature | Do Not Use This Feature | Not at all Useful | Slightly Useful | Moderately Useful | Highly Useful
1. Announcements (for reading announcements and other timely news and information posted by your instructor or department) | 0 | 6 | 24 | 42 | 27
2. Assignments (for submitting individual or group assignments) | 0 | 6 | 21 | 45 | 28
3. Calendar (for managing your personal calendar and viewing course events and due dates) | 26 | 15 | 16 | 25 | 19
4. Chat (for live text messaging with classmates and other Blackboard users) | 54 | 8 | 14 | 11 | 13
5. Course Messages (for sending and receiving messages to and from your instructor and other students) | 15 | 12 | 25 | 26 | 22
6. Groups (for collaborating with a specific group of students on assignments, discussions, blogs, wikis, or projects) | 26 | 11 | 24 | 25 | 19
7. Journal (for keeping a learning journal shared with your instructor) | 56 | 17 | 14 | 5 | 9
8. Content Collection > My Content (for storing personal files related to your course work) | 43 | 10 | 20 | 10 | 15
9. My Grades (for viewing a list of the graded items in the course and the grades you received) | 10 | 6 | 14 | 32 | 38
10. Quizzes/Tests (for taking and receiving feedback on online quizzes, tests, and self-assessments) | 20 | 7 | 14 | 25 | 33
11. Roster (for viewing a list of the other people in the course) | 43 | 12 | 17 | 17 | 10
12. Rubrics (for understanding how your work will be or was graded) | 7 | 7 | 19 | 31 | 37
13. Send Email (for sending messages to the external email account of other course members) | 19 | 10 | 21 | 25 | 25
14. Surveys (for taking online surveys) | 8 | 16 | 26 | 28 | 21
15. Tasks (for completing a list of tasks prepared by the instructor) | 32 | 7 | 17 | 19 | 23
16. Discussions/Discussion Board (for participating in online discussions with the entire class) | 31 | 15 | 16 | 26 | 12
17. Discussions/Discussion Board (for participating in online discussions in small groups) | 11 | 13 | 26 | 18 | 33
18. Blog (for individual and group writing tasks assigned by your instructor) | 21 | 12 | 25 | 21 | 21
19. Wikis (for individual and group writing tasks assigned by your instructor) | 19 | 16 | 21 | 23 | 21
20. Collaborate (for participating in virtual classrooms and meeting spaces (web conferencing)) | 30 | 10 | 17 | 22 | 21
21. Self and Peer Assessment (for providing and receiving feedback from peers) | 25 | 11 | 20 | 20 | 25
Table 36: Usefulness of Blackboard Features in Learning by Students (Mid-Term Survey)
Below is a graphical representation of the tools that students did not use. Among them
are Chat (Statement 4; 54%) and Journal (Statement 7; 56%); Content Collection
(Statement 8; 43%) and Roster (Statement 11; 43%); Discussions (Statement 16; 31%) and
Tasks (Statement 15; 32%), as well as Collaborate (Statement 20; 30%).
Figure 42: Usefulness of Blackboard Features in Learning by Students (Mid-Term Survey) (bar chart of the "Do Not Use This Feature" and "Not at all Useful" percentages from Table 36)
Below is a graphical representation of the features that students found useful. Among the
most useful were Announcements (Statement 1), Assignments (Statement 2), My Grades
(Statement 9), Quizzes/Tests (Statement 10), and Rubrics (Statement 12).
Figure 43: Usefulness of Blackboard Features in Learning by Students (Mid-Term Survey) (bar chart of the "Slightly," "Moderately," and "Highly Useful" percentages from Table 36)
Section Discussion
Summing up this section, the overall level of satisfaction with the Blackboard tools varied
widely, from quite useful to largely unused, and differed between the two surveys. In the
mid-term survey, faculty respondents did not use features such as (1) assigning peer reviews
on student work (Statement 10), (2) creating and using rubrics to grade student work
(Statement 14), and (3) using SafeAssign originality checking on assignments (Statement 27).
However, as they became more familiar with the Blackboard tools, 90% of faculty were using
some of these tools by the end of the semester (for example, Self and Peer Assessment, Wikis,
Blogs, Rubrics, and Grade Center; Statements 10 and 14). Other features that 80-90% of faculty
respondents did not use were (1) connecting or encouraging students to connect with Blackboard
users and groups within or outside of the course (Statement 23), (2) using Turnitin originality
checking on assignments (Statement 26), and (3) conducting online chat sessions (Statement 18).
Among the tools not used by more than 50% of faculty were (1) the Course Calendar (Statement 2),
(2) Groups (Statement 17), and (3) Course Messages (Statement 24). The features with which
faculty respondents were "highly satisfied" were (1) Announcements (Statement 3) and (2) Send
Email (Statement 25).
The staff respondents were slightly, moderately, or highly satisfied with such tools as (1)
creating assignments (Statement 7) and (2) creating and administering online quizzes, tests,
and/or surveys (Statement 9). However, more than 50% of staff indicated that they did not use
features such as (1) posting audio/video lectures or other multimedia (Statement 4),
(2) assigning individual and collaborative writing tasks (Statement 8), (3) creating rubrics
to grade student work (Statement 10), and (4) managing course tasks (Statement 13). Sixty-seven
percent of respondents did not use the integration of an external learning tool or platform
(e.g., SoftChalk Cloud, Piazza, etc.) with the course (Statement 15).
Students tended to be neutral about the usefulness of the Blackboard tools and features.
Among the tools that more than 50% of students did not use were Chat (Statement 4) and
Journal (Statement 7); more than 40% of students did not use Content Collection (Statement 8)
and Roster (Statement 11). Among the most useful tools were Announcements (Statement 1),
Assignments (Statement 2), My Grades (Statement 9), Quizzes/Tests (Statement 10), and Rubrics
(Statement 12).
Functionality of the LMS Products
The following section focuses on specific functionalities and how much pilot participants
utilized them. These functionalities were grouped under the category of Blackboard
Learn products, whose core functions included:
Customization
Journals
Discussions
Dropbox
Grades
Tests
Announcements
Push Notifications
Content
Tasks
Blogs
Roster
Customization allows users to decide for themselves what is most important in the app. The
things used most can be renamed, hidden, reordered, and color-coded, and students and
instructors can go straight to the things they use most (e.g., favorite blog posts, folders,
and announcements). This approach saves users' time; they should not have to click through a
whole course to get to the places they visit most often.
Being a key course communication tool for engaging students, the Discussion Board allows
students to ask and answer questions while instructors chime in, lets students and instructors
easily read and contribute to Discussions from Blackboard Mobile Learn, and allows media to be
uploaded from a user's mobile device as part of a Discussion attachment.
The Grade tool allows students to find out how they did on their last midterm or homework
assignment. When a grade is posted, students are sent a Push Notification.
The Announcements tool is the place where instructors post the news students need to know,
for example, assignment due dates, class cancellations, etc. Students have instant, on-the-go
access to the latest announcements, and instructors can post announcements anywhere, anytime.
The Content tool provides students with access to content uploaded by their instructors.
Moreover, they can interact with it on their device, using Blackboard Mobile Learn or any
other application that supports those documents.
The Blogs tool lets students read their classmates' blog posts and interact with each other
by posting comments and uploading media as attachments on blogs (Android and iOS), as well as
uploading non-media files (Android).
The Journal tool is designed to help students reflect on their course and comment on peers'
journals. Instructors can use this tool to comment on student journals.
The Dropbox is linked to Blackboard Mobile Learn on mobile devices (e.g. iOS and
Android). Students and instructors can easily manage critical course documents from
their mobile devices, without ever leaving the Blackboard Mobile Learn app. Not only
can students and instructors save their course content to their personal Dropbox, but
they can also upload documents to discussions and blogs with a single click.
The Test tool provides the opportunity to take Mobile Tests either via Blackboard
Mobile Learn on users’ iOS or Android device, or on their desktop computer.
The Push Notification tool allows students to receive automatic, personalized
notifications delivered straight to their mobile devices to help them stay informed.
Students can receive notifications for new announcements, new graded items, a test
being posted, and many other course activities.
The Task tool is used for tracking and managing the progress of various tasks, from turning
in homework assignments to midterm reminders to purchasing textbooks. Students can mark when
they have started a task and when it is complete.
The Roster allows students to quickly view their entire class list, making it much easier to
organize study groups.
Section Findings
This section presents the pilot participants' feedback on the core functions of the
Blackboard Learn tools.
Overall Comments on Blackboard Learn: Pros and Cons
The major disadvantages of the system reported by the pilot participants were (1)
accessibility, (2) navigation, and (3) ANGEL-to-Blackboard migration.
There was a lot of discussion about accessibility, which was indicated as a major issue by
many of the participants:
“Nothing was easy access it was like I needed to take a college course on the best way to
use this useless program.”
“Could not access few of the critical course content.”
A lot of students also indicated that they had a difficult time finding things and that a lot
of time was spent just "clicking" to find what they were looking for:
“The system is cumbersome. There are too many options that simply clutter the course. I
have not found one place that will let me know all the new things that have taken place. I
am never sure where our group puts things because there are too many places to do that.
When it takes more time to organize your small group than it does to do the work, there is
a problem. SIMPLIFY BLACKBOARD. Eliminate or allow some of the tools to be
hidden. I don't want to spend my team searching for the needed information or
communications. I have too little time as is and this LMS has simply increase the amount
of time I have to spend.” (Student)
“Not very streamlined, there are too many different sections to check for each assignment,
have to go back and forth to view things.” (Student)
Another problem students experienced was loading and uploading content:
“The videos never load so I have to find the movies for my class on other websites.
Something on the website gave me a virus on my computer.”
“The homework, because it's hard to upload the homework.”
All respondents mentioned navigation as one of the major issues. Typical comments
are captured here:
“What I like least about Blackboard is that sometimes it can be confusing to navigate.”
(Student)
“There was a lot of duplicate information or links to the same areas. It was difficult to
navigate and find the pertinent information like grading rubrics. Someone even had to
ask how to find the syllabus and I'm glad they did because it was not intuitive. Even
when I did find the information I was looking for, it was difficult to repeat the steps to
find it a second time.” (Student)
“I really disliked how much room the navigation took up - it left very little real estate for
the main content.” (Support Staff)
Comparisons to ANGEL were unavoidable. Many took the form of not being able to
find or do things the way participants were used to doing them in ANGEL.
“I wish there were a feature analogous to ANGEL Private Team Journal (i.e. a single
discussion forum with posts sorted by group).” (Staff)
“With Blackboard Learn, for a single assignment (e.g. Lesson 1 Discussion) every group
must have its own discussion forum. This means that some courses would have dozens of
discussion forums, which would be cumbersome to manage.” (Staff)
“Still had trouble organizing content the way I wanted to. Very lineal.” (Staff)
“Instructor contacts (pages) live at the course level Email & Course Messages - Neither
offered an ideal solution for what our users need. Lack of granularity with permissions for
discussion forums - Could not allow one group to have write access and another group to
only have view access. Grade Center - complex and not intuitive.” (Staff)
“Content looked very plain - need to explore how to build in interactive elements.”
(Staff)
“Not having as many variations of the discussion boards such as "Post First". This is
something that has been useful in ANGEL, but something I'll gladly give up to move to a
more up to date system.” (Staff)
“Difficult to add a group later in the process, not all of the edit features were intuitive -had some trouble finding a few items. Scalability issues, display of content (esp.
graphics).” (Staff)
“Overly complex to complete certain tasks - For example there are four separate options
to edit parts of tests 1.)Editing Test 2.) Editing Test Options 3.) Within the Test -> Edit
4.) Edit Question Settings.” (Staff)
Another major weakness of the system was the migration process from ANGEL to
Blackboard Learn (see the Course Migration section).
Discussing the convenience of the system, only 5-7% of the student respondents (3/55
(5%) in the mid-term survey; 3/46 (7%) in the end-term survey) remarked that they were
highly satisfied with the possibility of “having everything handy in one location”:
“[I Like] Combined Lesson with Activities under once folder.”
“Easy to use and has everything that you need in one place.”
Having "the tables on the left for finding everything" was indicated by many respondents
as a very helpful feature.
Among the strengths of the LMS was the user interface. Students reported that, in comparison
with ANGEL, Blackboard Learn was "more graphic," "more modern," and "more sophisticated and
intuitive." Staff also expressed satisfaction with the platform interface (although it was
also described as a "cluttered interface, difficult to find things") and with the variety of
features available to meet their needs:
“User interface is simple and easy to use.”
“I think that it has some more modern features that are better integrated into the system
than ANGEL.”
Other comments included:
“I'm also not a fan of the pale yellow background.”
“Look & Feel - Overall look and feel (UI, icons etc) were not modern.”
Faculty Feelings About the Blackboard Learn Tools and Features
Among the benefits of the platform tools/features, instructors mentioned the following:
(1) Retention Center, (2) Integrated Bb Collaborate, (3) Discussion Tool (particularly the
possibility to see unread posts), (4) Announcements, (5) Student Preview tool, (6) Grade
Center, (7) Interface (“Has a more modern look and feel than Angel.”), (8) Integration with
the external publisher content, and (9) Push Notifications (“In an online course being able
to send those push notifications to a student's mobile device is a huge benefit.”)
Among the major disadvantages was the Mobile app, which prevented faculty from making
effective use of (1) the Gradebook, (2) Group Discussions, and (3) Collaborate.
Staff Feelings About the Blackboard Learn Tools and Features
The staff respondents remarked that the strengths of Blackboard were (1) the possibility of
having all the content in one system and (2) the integration of third-party tools such as
SafeAssign, Crocodoc, Blogs, Wikis, and Collaborate. However, the latter did not fully meet
the IDs' expectations and needs.
“Integration of 3rd party tools - Safe Assign, Crocodoc Ability to reorder left-hand menu
items and content pieces easily.“
“I like the addition and integration of certain tools such as Blogs, Wikis, Collaborate, etc.
But I don't think they were "integrated" in the best way.”
Other features and tools that received favorable feedback from the staff participants were
(3) Assessments, (4) Test Availability Exceptions, (5) Achievements capabilities, (6) Student
Preview, (7) Group settings, and (8) the Retention Center.
“One thing I really liked about BB was the idea of assignments containing everything
needed for a task - instructions, related documents, rubrics, plus avenues for submission,
grading, etc., all combined into one "thing". “
“Test Availability Exceptions - The ability to provide individual or a group of students
with specific test extensions or extended time limits.”
Staff respondents were sharply divided in their views of the Gradebook. One group expressed
particular pleasure with it, especially regarding its ease of use in comparison with ANGEL's:
“Grade Center. It's easier to use than ANGEL's, and performs consistently.”
“In-line grading capability, ability to provide exceptions for assessments.”
However, the other group had the opposite opinion and described the Gradebook as "pretty
complex," "very confusing," "complex and not intuitive," and "large and cumbersome." It was
also mentioned that "setting up grading seemed rather challenging." The reason for this
feedback might be that there was not enough time to investigate the tool and "really learn it."
Discussing the disadvantages of the platform, the staff respondents reported that they were
not satisfied with such tools as (1) Journal, (2) Discussion, (3) Content, (4) Email, (5)
Course Messages, and (6) Groups. In many cases much of the difficulty was due to a lack of
experience and training on how these tools worked.
Respondents (staff) indicated that they faced difficulties with the Exam tool; exam
questions “were dropped randomly with no indication of what exam they were dropped from,
exams were duplicated.”
This quote reflects an issue some participants (staff) had regarding Communication:
“Never received communications.”
Students' Feelings About the Blackboard Learn Tools and Features
The main strengths of the platform, according to the participants' replies, were
Blackboard Collaborate and Blackboard Mobile. Video chat rooms and group projects
were reported as the most useful collaborative tools:
“The Blackboard Collaborate is very useful to my Group Projects and communication.”
“I like the video chat room, because it makes it really easy to communicate with your
partners from far away.”
However, respondents also reported experiencing some technical issues when using
Collaborate:
“The collaborate meeting space is also buggy, sometimes I would get kicked off, or the
connection was slow. When trying to collaborate with other people this can hinder
productivity.”
Pilot participants expressed particular pleasure with such course tools as Assessments,
Grades and the Calendar. Quiz, Test and Exam settings were described as well organized
and easy to use. The Quiz tool was characterized as “convenient”, “easy to manipulate”,
“easy to take”, etc. The ability to see grades immediately after submission was very much
appreciated by the participants.
“Quizzes and exams were easy to take and much easier than using ANGEL.”
“[I Like that] Quizzes and assignments [are] well organized.”
One of the respondents remarked that he was not satisfied with the tool because he had
“to go through multiple questions to backtrack” and couldn’t “just skip to the skipped
question.”
Some technical issues with the Peer Assessment tool were also indicated:
“Had a hard time completing peer assessments (the assessment never showed up).”
Students indicated their satisfaction with the Grade tool, particularly the ability to see
how successful they had been on their last midterm or homework assignment.
“I like that I am able to see my grades as they are completed, take quizzes on my own
time, and communicate with my professor when needed.”
“[I like] The organization of grades.”
The Calendar was indicated by 4% of respondents (2/55) in the mid-term survey as a
very efficient application for keeping track of important course dates:
“[I like] Ease of use. Blackboard is very straightforward, and has a much better
calendar/tasks function than ANGEL.”
“I like the tasks and calendar feature!”
Students gave favorable feedback on such communication tools as Announcements
and Push Notifications. The Announcements tool provided students with instant, on-the-go
access to the latest announcements. The Push Notification tool allowed them to
receive automatic, personalized notifications delivered straight to their mobile devices
to help them stay informed. Students appreciated getting reminders/notifications for
new announcements, new graded items, a test being posted, and many other course
activities.
“What I like most about blackboard is the ability to keep up to date with my assignments
and communicate with others and my professor.”
“[I liked] The manner in which one can see due dates at a glance, the manner in which all
assignments are listed for grading, the grouping of announcements.”
“The customizable lists of impending assignments are incredibly helpful.”
However, it was also mentioned that these tools did not always work as expected:
“It has a feature for letting you know when assignments are due, but it didn't work but it
would have been nice.”
“There is a feature for assignment due dates and this worked on a sporadic basis. It never
showed what was due for the week until the day before it was due, and it did not capture
all of the assignments. Some were duplicated. If this feature is going to be useful, it has to
work 100% of the time. In general, it seemed to lack the organization and intuitive
features that most applications have today. Also, it was difficult to communicate because
there was not a feature that shows if you have a new message inside or outside of
blackboard.”
“Announcement area is a bit confusing.”
The Discussion Board, one of the communication tools, frustrated 11-20% of students
(6/54 (11%) in the mid-term survey; 9/45 (20%) in the end-term survey). The tool was
described as “cumbersome,” “not user friendly,” “poorly structured,” and time-consuming.
Typical comments were the following:
“The discussion board is poor structured and not user friendly and when I first enter …
[it] cannot tell if there are comments in the discussion areas.”
“Blackboard does not offer an easy way to see new content. You have to look in the
individual spots. Discussion boards and blogs are cumbersome as you have to scan the
entire thread and can't easily see new responses.”
“Cumbersome and did not like the discussion board forum at all. Required bulky extra
work to search and return to home page.”
“The discussion board does not register that a message has been read on the 'thread' until
you physically click each box and say 'mark as read' which is time-consuming and hard
to tell when there is something new.”
Tools such as Email and Blog were problematic to work with:
“E-mail is difficult because it requires multiple steps. With my team working in both
ANGEL and Blackboard, we ended up opting to do the work for this course through
ANGEL in one of the other courses.”
“I sometimes had issues connecting to the blog from Blackboard. Sometimes the link on
the right wouldn't work and I would have to access through activities, and sometimes it
would be the other way around.”
“The email section of blackboard is a bit annoying. If it uses a mail formal similar to
Gmail, it will be great.”
“I don't like that I can't forward the course email to my Penn State email account.”
“The email and the iPhone issues I mentioned above are also VERY high on my list.”
It was also mentioned that the Blogs do not encourage interaction the way the
Discussion Boards do in ANGEL:
“The ability to post a blog but the blogs do not encourage interaction as much as
discussion boards in other Angel.”
The positive feedback included: “The blog and class discussion tools are easy to use and help
you stay current on the discussion.”
Only 4% of respondents (2/55 (4%) in the mid-term survey; 2/46 (4%) in the end-term
survey) were satisfied with the Discussion Board.
“It is really easy to post things to the discussion boards and the calendar is helpful.”
“Group discussions were very easy to manage.”
“Group chats because it shows what most people are thinking.”
Section Discussion
Pilot participants indicated that the major disadvantages of Blackboard Learn are (1)
accessibility, (2) navigation and (3) the ANGEL-to-Blackboard migration. Respondents
reported spending a great deal of time simply clicking through buttons and tabs to find
the items they needed, and many experienced trouble loading and uploading items. The
migration process from ANGEL to Blackboard Learn was a major issue; staff participants
reported it to be time consuming and generally unsuccessful. With regard to the issues
mentioned above, ANGEL received more favorable feedback and was reported to be
more intuitive and organized.
Among the things that participants enjoyed most was the user interface, which was
described as “more graphic”, “more modern”, and “more sophisticated and intuitive” in
comparison with ANGEL. However, it was also described as a “cluttered interface, difficult
to find things.” Having “the tables on the left for finding everything” was indicated by many
respondents as a very helpful feature.
Discussing the efficiency of the Blackboard Learn tools and features, the faculty and
staff respondents indicated that the Retention Center, Collaborate, the Discussion tool
(particularly the possibility to see unread posts), Announcements, Group settings, Push
Notifications, the Student Preview tool, the Grade Center, Test Availability Exceptions, and
Achievements capabilities are benefits of the system. Integration with external publisher
content and with third-party tools (e.g. Safe Assign, Crocodoc, Blogs, Wikis, Collaborate)
was also remarked on as a strength of Blackboard by faculty and staff.
Students expressed particular pleasure with such course tools as Assessments, Grades,
the Calendar, Collaborate and Blackboard Mobile. For staff and faculty, however, the
Mobile app was reported as needing considerable improvement. Other tools that staff
pointed out as needing improvement were Journal, Discussion, Content, Email, Course
Messages, Exam and Groups. For students, Email, Blog and the Discussion Board were
the most problematic to work with.
Functionality of the Mobile Learn Products
Given the importance of virtual access to learning from a variety of mobile devices, the
Blackboard Learn platform was integrated with Blackboard Mobile™ Learn, which
provided the pilot participants with instant access to their courses, content and the
university itself. This access was possible on a variety of mobile devices, including iOS®,
Android™, BlackBerry®, and HP webOS devices. The section below provides the
respondents’ feedback on how effective and intuitive they found the Mobile app.
Section Findings
The Mobile Learn app provides the opportunity for students and instructors to access
documents in multiple formats, create threaded discussion posts, upload media as
attachments to discussion boards and blogs, create content items within the course map,
and comment on blogs and journals. The following section focuses on functionalities of
the Mobile Learn Products and how much pilot participants utilized them.
Faculty About the Usefulness of the Blackboard Mobile Learn Products
In both surveys, faculty were asked to rate the usefulness of the Blackboard Mobile
Learn products listed in the table below:
Usefulness of Blackboard Mobile Learn Products by Faculty
(All values are percentages, shown as Mid-Term / End-Term.)
Blackboard Mobile Learn Products | Do Not Use | Not at all Satisfied | Slightly Satisfied | Moderately Satisfied | Highly Satisfied
1. Customization | 50 / 75 | 25 / 0 | 25 / 25 | 0 / 0 | 0 / 0
2. Discussions | 25 / 25 | 25 / 25 | 0 / 0 | 0 / 0 | 50 / 50
3. Grades | 50 / 75 | 0 / 0 | 25 / 0 | 0 / 25 | 0 / 0
4. Announcements | 0 / 25 | 25 / 25 | 0 / 0 | 0 / 0 | 50 / 50
5. Content | 0 / 25 | 25 / 25 | 0 / 0 | 0 / 25 | 75 / 25
6. Blogs | 75 / 75 | 0 / 0 | 0 / 0 | 0 / 0 | 25 / 25
7. Journals | 75 / 75 | 0 / 0 | 0 / 0 | 0 / 0 | 25 / 25
8. Dropbox | 50 / 75 | 0 / 0 | 0 / 0 | 25 / 25 | 25 / 0
9. Tests | 75 / 25 | 0 / 50 | 25 / 0 | 0 / 25 | 0 / 0
10. Push Notifications | 50 / 25 | 0 / 0 | 0 / 0 | 25 / 25 | 25 / 50
11. Tasks | 75 / 100 | 0 / 0 | 0 / 0 | 0 / 0 | 25 / 0
12. Roster | 25 / 0 | 0 / 0 | 50 / 25 | 0 / 50 | 25 / 25
Table 37: Usefulness of Blackboard Mobile Learn Products by Faculty
Below is a graphical representation of the faculty feedback on the usefulness of
Blackboard Mobile Learn products.
[Figure: bar chart of the “Do Not Use” and “Not at all Satisfied” percentages (mid-term and end-term) for each of the 12 Mobile Learn products.]
Figure 44: Usefulness of Blackboard Mobile Learn Products by Faculty (1)
Half of the Blackboard Mobile Learn products were not used by 75% of faculty, among
them Customization, Grades, Blogs, Journals and Dropbox. Tasks seemed to be especially
frustrating for the participants: toward the end of the semester, none of the instructors used it.
[Figure: bar chart of the “Slightly”, “Moderately” and “Highly Satisfied” percentages (mid-term and end-term) for each of the 12 Mobile Learn products.]
Figure 45: Usefulness of Blackboard Mobile Learn Products by Faculty (2)
The tools with which faculty were most satisfied by the end of the semester were
Announcements and Discussions (50% highly satisfied for each). Push Notifications and
Roster were rated as moderately or highly useful by 75% of faculty.
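The combined “moderately/highly useful” figures above are simply the sum of the two satisfaction categories reported in Table 37. The short sketch below is a minimal, purely illustrative worked example of that arithmetic in plain Python; the percentages are copied from Table 37, and the snippet is not part of any Blackboard tooling.

# Illustrative only: combined "moderately + highly satisfied" shares for the
# end-term faculty ratings, using the percentages reported in Table 37.
end_term = {
    "Announcements":      {"moderately": 0,  "highly": 50},
    "Discussions":        {"moderately": 0,  "highly": 50},
    "Push Notifications": {"moderately": 25, "highly": 50},
    "Roster":             {"moderately": 50, "highly": 25},
}

for product, ratings in end_term.items():
    combined = ratings["moderately"] + ratings["highly"]
    print(f"{product}: {combined}% moderately/highly satisfied")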
Faculty About the Reasons for Not Using the Mobile App
Most of the respondents indicated that they did not use most of the features. Among the
reasons why these features were not used, faculty indicated the following:
(1) No need to work with them:
“Did not need to.”
(2) Shortage of time:
“I did not try the mobile app at all. I was teaching two classes this summer and did not
find enough time to do this.”
“Given the short period of time in this course, I was focused entirely on setting up and
running the course. Did not get to explore much beyond the basis necessary to run a
course.”
(3) Lack of information about these features:
“Didn't know it was available. Plus, I'm sure some students don't have smart phones or
tablets. “
Staff About Blackboard Mobile Learn Products
Staff About the Usefulness of the Blackboard Mobile Learn Products
Staff were also asked to indicate how useful they found the Blackboard Mobile Learn
products. The table below presents the data:
Table 38: Usefulness of Blackboard Mobile Learn Products by Staff
(All values are percentages.)
Blackboard Mobile Learn Products | Do Not Use This Feature | Not at all Useful | Slightly Useful | Moderately Useful | Highly Useful
1. Customization | 33 | 22 | 22 | 11 | 11
2. Discussions | 33 | 0 | 11 | 33 | 22
3. Grades | 38 | 13 | 13 | 13 | 25
4. Announcements | 25 | 0 | 13 | 25 | 38
5. Content | 0 | 13 | 50 | 0 | 38
6. Blogs | 50 | 0 | 25 | 25 | 0
7. Journals | 63 | 0 | 25 | 13 | 0
8. Dropbox | 38 | 13 | 13 | 25 | 13
9. Tests | 25 | 25 | 13 | 38 | 0
10. Push Notifications | 25 | 0 | 0 | 38 | 38
11. Tasks | 75 | 13 | 13 | 0 | 0
12. Roster | 50 | 0 | 25 | 13 | 13
Below is a graphical representation of the staff feedback on the usefulness of Blackboard
Mobile Learn products.
[Figure: bar chart of the “Do Not Use This Feature” and “Not at all Useful” percentages reported by staff for each of the 12 Mobile Learn products.]
Figure 46: Usefulness of Blackboard Mobile Learn Products by Staff (1)
Out of 12 tools, 4 were not used by 50-75% of staff respondents: Blogs (50%), Roster
(50%), Journals (63%) and Tasks (75%).
[Figure: bar chart of the “Slightly”, “Moderately” and “Highly Useful” percentages reported by staff for each of the 12 Mobile Learn products.]
Figure 47: Usefulness of Blackboard Mobile Learn Products by Staff (2)
Push Notifications (76%), Announcements (63%) and Discussions (55%) were rated as
“Moderately” and “Highly” useful.
Staff About the Reasons for Not Using the Mobile App
Among the reasons why these features were not used, staff respondents indicated the
following:
(1) Lack of training on the use of these features:
“We didn't get that far. I was waiting for the 3 day training that was cancelled to try out
different features.”
(2) Lack of information about these features:
“The mobile app info was passed along to me after we completed our main testing phase
(and after the course had started).”
(3) Preference for working on a PC:
“Focus has been on building of courses which is much easier on computer.”
(4) No need to work with the app:
“Not involved in this aspect of the pilot.”
“Not my role.”
“No need.”
“We had a specific team to focus on mobile aspects of the system.”
(5) Shortage of time (e.g. being busy with course development):
“I haven't had the time to experiment with it just yet.”
“I have not had time to go down this path.”
“Course was still in development, not far enough along to test on mobile.”
(6) Having no opportunity to explore the features (e.g. having no access or device):
“Did not get access until June and now training has been cancelled.”
“We have not completed our testing, which is currently on hold, so we haven't had the
opportunity to test all features yet.”
“Never got access.”
“No opportunity.”
(7) Experiencing difficulties with the app:
“I have viewed the courses via the app, but couldn't really make any changes because it
kept crashing.”
“The mobile app is atrocious and inconsistent. If you want to display content that isn't
housed in Learn, there are several ways to do so, but each is bad in their own way. If you
want to display content within Learn, you are in the same situation, each way to do so is
bad in its own unique way.”
“There are a LOT of tradeoffs with pretty much everything in Learn. If you want a
mobile-friendly test, it has to first be created as such. At that point you lose all ability to
create semantic markup, so it becomes less Accessible. The H tag structure in Learn
technically violates Accessibility standards.”
(8) Other:
“I have not explored this option yet.”
“Just never tried it.”
“Only working with laptop.”
Students About Blackboard Mobile Learn Products
Students About the Usefulness of the Blackboard Mobile Learn Products
Table 39 presents students’ feedback about the usefulness of the Blackboard Mobile Learn
products.
(All values are percentages, shown as Mid-Term / End-Term.)
Blackboard Mobile Learn Products | Do Not Use | Not at all Satisfied | Slightly Satisfied | Moderately Satisfied | Highly Satisfied
1. Customization | 42 / 64 | 12 / 14 | 19 / 5 | 15 / 9 | 12 / 9
2. Discussions | 27 / 45 | 8 / 9 | 35 / 9 | 19 / 27 | 12 / 9
3. Grades | 8 / 14 | 4 / 9 | 24 / 0 | 36 / 9 | 47 / 68
4. Announcements | 4 / 18 | 8 / 14 | 12 / 9 | 44 / 14 | 32 / 45
5. Content | 8 / 9 | 8 / 9 | 16 / 14 | 28 / 23 | 40 / 45
6. Blogs | 48 / 55 | 12 / 9 | 12 / 9 | 12 / 9 | 16 / 18
7. Journals | 56 / 77 | 12 / 5 | 12 / 9 | 12 / 0 | 8 / 9
8. Dropbox | 29 / 59 | 8 / 5 | 21 / 0 | 17 / 5 | 25 / 32
9. Tests | 29 / 50 | 17 / 5 | 13 / 5 | 17 / 9 | 25 / 32
10. Push Notifications | 29 / 50 | 4 / 14 | 29 / 5 | 17 / 5 | 21 / 27
11. Tasks | 33 / 50 | 8 / 8 | 25 / 17 | 17 / 8 | 17 / 12
12. Roster | 50 / 64 | 14 / 14 | 9 / 5 | 9 / 5 | 18 / 14
Table 39: Usefulness of Blackboard Mobile Learn Products by Students
Below is a graphical representation of the student feedback on the usefulness of
Blackboard Mobile Learn products.
[Figure: bar chart of the “Do Not Use” and “Not at all Satisfied” percentages (mid-term and end-term) reported by students for each of the 12 Mobile Learn products.]
Figure 48: Usefulness of Blackboard Mobile Learn Products by Students
By the end of the semester, 8 of the 12 tools were not used by at least half of the
students: Blogs, Dropbox, Tests, Push Notifications and Tasks (50-59%), Customization
and Roster (64%), and Journals (77%).
[Figure: bar chart of the “Slightly”, “Moderately” and “Highly Satisfied” percentages (mid-term and end-term) reported by students for each of the 12 Mobile Learn products.]
Figure 49: Usefulness of Blackboard Mobile Learn Products by Students
Overall, there were only three tools with which students were highly satisfied: Grades
(68%), Announcements (45%) and Content (45%).
Students About the Reasons for Not Using the Mobile App
Students were also asked about their reasons for not using some of the Blackboard
Mobile Learn features, as well as about what additional features they would like to see
in the Mobile app. The following reasons were indicated in both surveys:
(1) Products were not/never tried.
(2) There was no knowledge about the existence of such products (7 responses in the
mid-term survey; 2 – in the end-term survey):
“I didn't know it existed”; “Didn't know that was an option”; “I didn't know we could”;
“I had no knowledge such app existed”; “I never knew there was an app”; “Never heard
of it” etc.
(3) There was no need/desire to use it (5 respondents in the mid-term survey; 6 – in
the end-term survey):
“I've had no need (so far) to access Blackboard from by mobile phone through the
Blackboard Learn app.”
“Less than interested to install the app-plus, I don't see why I would need it on my
phone.”
“Not necessary, website needs to be fixed first.”
“I never needed to have it on my phone because I have access to it on the computer.”
(4) Preference to work with computers/laptops.
(5) Lack/No time:
“I wasted enough time navigating blackboard for needed information. I didn't want to
waste anymore time.”
“Didn't take the time to use it. Worked a home on course.”
(6) Learning from overseas (Africa) where mobile devices are “not as good with
internet.”
(7) Ease of use issues:
“I really dislike blackboard it is really hard to use and the help desk is barely helpful.”
(8) Accessibility (to the content and device):
“Could not access few of the critical course content.”
“Did not have access to smartphone.”
(9) Issues with the app:
“I have an iPhone, and when I was trying to use the program on my iPhone it was very […]”
“The app is even worse than the actual online version.”
(10) Smartphones or tablets are not used/owned:
“I do not use a smartphone or tablet.”
“Since I own neither of the listed devices, I never had a chance to test it.”
(11) Preference not to use mobile devices for learning purposes:
“Would not work on my phone.”
“I am not using mobile app for academic studies.”
(12) There was no satisfaction with the speed of the app:
“Because I was afraid what's already slow would become even slower on my phone.”
“Connection speeds were terrible, uploading was cumbersome and sluggish. I didn't have
the patience for it. “
(13) No Wi-Fi access.
“Don't have Wi-Fi access.”
(14) Negative previous experience with ANGEL:
“I tried using angel on my iPhone and it didn't work well so I haven't tried it with this
platform either.”
“I attempted Angel once and it did not appear to be very compatible with the iOS.”
(15) Issues with the phone (e.g. storage limitations and installation issues):
“My storage is full on my phone.”
“My phone had issues installing it.”
(16) No interest in investigating the app features (14 responses in the mid-term survey;
6 – in the end-term survey):
“Never got around to downloading it”; “No interest”; “No desire to use”; “Haven't
wanted to”; “Haven't tried”; “did no use at all”; “I don’t have them so did not use them”
etc.
(17) Lack of support from Help Desk:
“I really dislike blackboard it is really hard to use and the help desk is barely helpful.”
(18) Preference to work with a PC/laptop or tablet:
“I downloaded the app, but I use Blackboard so much more on my computer, so I haven't
really explored the app yet.”
“I do not use my tablet very often and I do not like using my phone for such actions when
I carry a laptop everywhere I go.”
(19) Other:
“It is too cumbersome to use on a computer let alone attempt to use it on a smart
device.”
“Blackboard was so cumbersome on the laptop that I could not imagine trying to use it on
a smart device.”
Section Discussion
While nearly all instructors/IDs indicated usage of core Blackboard Learn elements at
some point during the pilot, data results indicate a steady decline in usage from
beginning to the end of the pilot. This was a consistent trend for almost all scorecard
functionalities. Thus, out of 12 tools from 4 to 9 tools were not used by 50-75% (and
sometimes even 100% like “Tasks” by faculty) of the respondents. Here they are: (1)
Customization (75% of faculty; 64% of students), (2) Grades (75% of faculty), (3) Blogs
(75% of faculty; 50% of staff; 55% of students), (4) Journals (75% of faculty; 63% of staff;
77% of students), (5) Drop-box (75% of faculty; 59% of students), (6) Roster (50% of staff;
64% of students), (7) Tasks (75% of staff; 50% of students; 100% of faculty), (8) Tests
(50% of students), and (9) Push Notifications (50% of students).
There were 6 tools with which the respondents were at least moderately satisfied: (1)
Announcements (50% of faculty; 63% of staff; 45% of students), (2) Discussions (50% of
faculty; 55% of staff), (3) Push notifications (75% of faculty; 76% of staff), (4) Roster (75%
of faculty), (5) Grades (68% of students), and (6) Content (45% of students).
Core functions that were difficult to set up and use, or that did not function on mobile
devices as expected, had a steep learning curve. These difficulties sometimes forced
respondents to move back to their laptops and desktop computers. The main reasons for
not using the Blackboard Mobile Learn products included (1) no need, (2) shortage of
time, (3) lack of training, and (4) technical issues. In addition to these reasons, students
mentioned (5) lack of interest in exploring the app, (6) negative previous experience
with ANGEL, (7) preference for working on a PC/laptop or tablet, (8) lack of support
from the Help Desk, and (9) issues with Blackboard Learn functionality. To conclude,
many of the issues were knowledge (training/documentation) or configuration (pilot)
related.
Overall Feelings About Blackboard Mobile Learn
Below are the pilot participants’ overall feelings about the intuitiveness and
effectiveness of the Mobile app. The data were captured from the surveys distributed in
the middle and at the end of the semester.
Section Findings
Comments from instructors, staff and students who participated in the Blackboard
Learn pilot underlined that they encountered technological problems and functional
inconveniences.
Comments on Blackboard Mobile Learn
Overall, faculty and staff were not satisfied with accessibility via mobile devices.
Faculty reported that the core functions that were difficult to set up and use, or that did
not function on mobile devices as expected, were the following:
(1) Gradebook:
“Gradebook is difficult to use and some things are not intuitive.”; “It is not intuitive for
trying to set up categories and percentages.”
(2) Mobile App:
“Mobile App works for some features and other open like browser page.”
(3) Group Discussions:
“Group discussions do not allow discussions to be easily organized into smaller groups.”
(4) Blackboard Collaborate:
“Every Group should have option for Bb Collaborate room automatically.”
These quotes from staff respondents reflect issues some participants had with regard to
the Mobile app:
“The mobile app is atrocious and inconsistent. If you want to display content that isn't
housed in Learn, there are several ways to do so, but each is bad in their own way. If you
want to display content within Learn, you are in the same situation, each way to do so is
bad in its own unique way.”
“There are a LOT of tradeoffs with pretty much everything in Learn. If you want a
mobile-friendly test, it has to first be created as such. At that point you lose all ability to
create semantic markup, so it becomes less Accessible. The H tag structure in Learn
technically violates Accessibility standards.”
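The “H tag structure” concern in the comment above refers to heading levels being skipped in generated markup (for example, jumping from an h1 directly to an h3), which undermines the document outline that screen-reader users navigate by. The sketch below is a minimal illustration of one such automated check; it assumes a plain HTML string as input and is not based on Blackboard Learn’s actual output.

import re

def skipped_heading_levels(html):
    # Collect the numeric levels of all <h1>-<h6> tags in source order,
    # then report any place where the level jumps by more than one.
    levels = [int(m) for m in re.findall(r"<h([1-6])\b", html, flags=re.IGNORECASE)]
    return [(prev, cur) for prev, cur in zip(levels, levels[1:]) if cur > prev + 1]

# Hypothetical fragment: jumping from <h1> straight to <h3> skips a level.
sample = "<h1>Course</h1><h3>Week 1</h3><h2>Readings</h2>"
print(skipped_heading_levels(sample))  # [(1, 3)]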
Students expressed their satisfaction with the Mobile Learn app, which allowed them to
access and keep up with their course materials anytime and anywhere:
“It integrates better with my mobile phone.”
“[I like] that there's an iPad/Android app for this.”
“I like working on the course material from the mobile app (iPhone, iPad), receiving
notifications, and I look forward to testing out and using the collaborate video
conferencing feature. I have yet to use this. “
“Easier to use on mobile. All others features are similar to the other system used.”
Some of the respondents indicated that they were “not being able to listen to recorded
collaborate sessions on iPhone,” which seems to be an app issue.
Comments on Additional Features of the Mobile App
Discussing what additional features they would like to see in the Mobile app, staff
provided the following feedback:
“Ideally none. The app should do either everything that the desktop experience can do, or
do none of it. The LMS should be working on a responsive stylesheet and only use App
features that make sense on mobile. That way the app can direct to the responsive site,
and simply link up with the mobile features. At this stage, it does very little well.”
“Only certain pieces of the course are mobile friendly. The majority of the content is the
same as if you accessed the course on a mobile web browser.”
“Less cluttered interface.”
“More test types, better ability to handle graphics in Bb (I don't like the way Bb displays
content/handles graphics), better integration with video players (like YouTube). Perhaps
an option for high- or low-bandwidth at the user's discretion to fit their needs.”
Out of 6 responses, two staff respondents replied “N/A.”
Recommendations provided by the students regarding what additional features they
would like to see in the Mobile app were the following:
(1) Improvement of the interface: “User interface needs work.”
(2) Making grades visible on Discussion and Journal pages.
(3) Making pre-recorded Collaborate sessions playable:
“Currently my iPhone does not have the software necessary to listen to recorded sessions.
I do have and use the app for live Collaborate sessions.”
“ Currently, they won't load due to the need for java to run on my phone. Can't get it to
work.”
(4) Improving the Email feature:
“Ability to forward email messages to another email account in order to view and reply to
the messages without logging into Blackboard.”
(5) Improvement of the Mobile app:
“The mobile app would not allow me to scroll anywhere on it. I wanted to be able to read
through the lessons while on the road and had to read the print out pages instead because
I was not able to read past the first couple of paragraphs.”
(6) Improvement of the Quiz tool:
“The ability to take quizzes on the app, make the features of the websites oriented to the
app instead of the app taking you to the website for some features. Allow for zooming on
the screen in both the app and the website display through the app (when the app takes
you to the website, you cannot zoom in which makes it very difficult to find and touch
what you want).”
Among other additional features that students would like to see in the Mobile app
were:
(7) Browser view:
“The mobile app would not allow me to scroll anywhere on it. I wanted to be able to read
through the lessons while on the road and had to read the print out pages instead because
I was not able to read past the first couple of paragraphs.”
(8) Video player:
“Video player (for iPhone if there is one for Android)”.
(9) Choice of Usage Levels:
“Perhaps the ability to change the dynamics, from a simplified experience to advanced.
Having the option in my experience gives the user a flexible experience.”
(10) Speed:
“Any reasonable speed. It was really bad, would hang up and crash often. “
(11) Meeting space:
“The collaborate meeting space is buggy on different networks, The choice of meeting
spaces would help. Also, more details regarding grades would be nice as well.”
(12) Additional communication options:
One of the respondents indicated that it would be good to add more communication
options; however, he did not specify which options he meant.
(13) Email and announcement alarm:
There was also a recommendation to add an alarm for email and announcements. In
addition, it was recommended to provide the opportunity to have “An integrated email
account.”
(14) Other:
“Would like to be able to distinguish between blogs I've read and new posts or replies.”
Many student respondents did not provide any recommendations or feedback, replying
“N/A” (8 responses in the mid-term survey and 7 in the end-term survey), “None”, “I don't
know”, etc.
Section Discussion
Students expressed great pleasure at being able to use mobile devices to access and
keep up with their course materials anytime and anywhere. Faculty and staff, in turn,
indicated frustration with using their mobile devices to set up and use the LMS tools
and features. Faculty reported difficulties setting up and using the Gradebook, Group
Discussions and Collaborate. Staff pilot participants described the Mobile app as
“atrocious and inconsistent” and as one that did not meet their needs.
Recommendations included (1) making grades visible on Discussion and Journal pages;
(2) making pre-recorded Collaborate sessions playable; (3) making it possible to forward
emails; (4) making the Quiz tool more user-friendly; (5) increasing browser choice; (6)
improving speed; and (7) adding an alarm for emails and announcements.
When a decision is made as to the future direction of eLearning at the university it
would be good to revisit this section to identify support (training and documentation)
opportunities.
Overall Feelings About the LMS by WCLD
This section offers the feedback from the World Campus Learning Design (WCLD)
team that worked closely with the Blackboard Pilot Team throughout the project to
represent and coordinate the needs of the World Campus. These data were collected
and analyzed by Andrea Gregg, with assistance from Dominic Pugliese, to capture best
practices and lessons learned from the WCLD team's experiences and to improve internal
unit practices in future pilots. The data sources for this section are the documents and
knowledge accumulated over the
roughly 8 months of the Blackboard pilot, the debriefs with all of the sub-teams of the
WCLD Blackboard Core group, the debrief held with the central TLT group, and the
results of the survey distributed to the WCLD unit.
Section Findings
The section findings describe key successes and challenges that WCLD experienced
during the pilot. Feedback on the practices recommended to repeat or implement is also
discussed.
Key Successes
These were the key successes WCLD experienced during the pilot.
1) People in the WCLD unit mostly experienced the pilot project as a positive
challenge.
2) New project management processes emerged that greatly improved the
efficiencies of the project (e.g. clarifying project scope, prioritizing urgent and
important tasks, creating clear sub-projects and tasks with project managers and
task owners, streamlining communications).
3) We successfully piloted one WCLD course in the SU14 semester and coordinated
with the college design shops on the two other World Campus courses that were
also piloted in the SU14 semester.
4) The courses for FA14 that were originally slated to be in the pilot were opened
successfully in their backup ANGEL version after the pilot was cancelled, 11 working
days before the start of the FA14 semester.
5) There was significant University collaboration throughout the pilot that helped
develop business literacy: across design teams, positions, units within World
Campus (e.g. Marketing, PP&M, Advising, Registrar), college design shops, and
other departments in the University (e.g. TLT, ITS training services).
6) Relationship bridges were built that will continue beyond the pilot (e.g. between
WCLD and the College design shops; WCLD and TLT; World Campus ITS Help
Desks; WCLD and ITS programmers; WCLD and ITS training services).
7) There is now a greater understanding throughout the university of how Penn
State serves the needs of the adult online distance learner population.
8) The pilot provided the opportunity to evaluate the pros and cons of using a
template for course design.
9) In-house trainings were conducted within WCLD to train FA14 piloteers on the
system.
10) The pilot gave people in WCLD new opportunities to stretch and grow
professionally.
Key Challenges
These were the key challenges WCLD experienced during the pilot.
1) People experienced time pressures given the amount of work the pilot entailed
balanced against their existing job responsibilities.
2) The compressed timeline between when the contract was signed between Penn State
and Blackboard and when the pilot SU14 courses needed to open in Blackboard
created time constraints for learning and building courses in the system, and
therefore generated additional pressure for staff.
3) Given the speed at which things evolved and changed throughout the pilot, there
were instances when communications were out of sync (e.g. which courses were
in the pilot, whether or not the FA14 pilot was canceled).
4) The most effective communications channels (e.g. website, email, Quickbase)
were still evolving when the pilot was canceled.
5) Some project management practices weren’t implemented at the very beginning
of the project, so there were lost efficiencies (e.g. communications that weren’t
streamlined, lack of clarity on who was responsible for what).
6) At times people experienced shifting goals (e.g. fully testing the system versus
using a standardized template).
Successful Practices to Repeat
These were successful practices that were a part of the pilot from the WCLD perspective
that are recommended to repeat for other pilots.
1. Get the “right people on the bus” – A project of this size, scope, and complexity
benefits from having people with certain skills and temperaments: flexible, agile,
positive, with good communication and strong relationship-building skills.
2. Assign a technical lead to help run the pilot - Jeanette played this role within
WCLD throughout the Blackboard pilot, as there were a number of technology
variables (from how courses get set up, to developing new Evo templates, to
working with systems administrators and programmers from ITS, to being
connected to OIT) that require someone with that level of vision in the unit.
3. Work collaboratively with relevant stakeholders throughout the University – It’s
important in a project of this size to work collaboratively and build on
relationships between World Campus and the rest of the university. This
sometimes requires a proactive approach regarding the unique demographics of
the online distance student population.
4. Pay attention to stress management and the affective side of the pilot - Pilots are
always stressful but especially stressful for people who do not like the unknown
and unexpected changes. Monitoring and attending to this can ease some of the
stress.
5. Build backup versions of the pilot courses as part of the process - When the FA14
pilot was canceled, we had to quickly convert back to the ANGEL version of the
courses; having most backups already created was tremendously helpful.
Suggested Practices to Implement
These suggested practices from the WCLD perspective were based on lessons learned
throughout the pilot.
1) Follow established project management principles from the beginning of the
project. Some practices weren’t implemented until later in the pilot and it would
have been more effective were they in place from the start.
a) Establish explicit goals and metrics for project success
b) If there is no official project manager assigned to the project, ask someone
to take on these coordinating tasks
c) Create a project plan with explicit project and task scopes and ensure that
sub-projects and tasks have clear charges and owners
d) Define what is out of scope of the pilot project (e.g. planning for the
migration)
e) Identify project risks (e.g. company changing direction, conflicting
communication coming from different sources)
f) Establish an explicit communication plan at the start; e.g. who, what,
where, when, and how
g) Select a project management system (e.g. Quickbase, Microsoft Project) and
use it from the beginning
2) Consider impact on existing workload – Multiple people mentioned that this
project was of such a large scale, it might make sense to consider removing some
duties from people’s day jobs so that they could focus on this pilot work.
3) Make sure there is consistent management understanding and support of various
initiatives – This is especially true for things like the template that are a
significant change in unit process and existing practices. This caused some
confusion and stress for people.
4) Accept that you’ll never have unanimous support/make everyone happy - A
project like this naturally involves conflict and process changes. We have a range
of perspectives in the unit about the frequency of meetings and communication.
Some very much want information solely on a “need to know” basis, others want
to be in the loop on more. Being explicit about the plan for both communication
and meeting frequency can help manage expectations.
5) Whenever possible, use a Google doc rather than emailing attachments - With an
LMS pilot that spans the university and multiple sources working on things, it is
very easy for information to get out of sync. This happened multiple times and
was ameliorated when we started consistently using Google docs.
6) Continue work on the Use Cases. John Butler took the lead on collecting LMS
“use cases” at play in WCLD courses. These should continue to be developed as
there was a lot of work that can be leveraged for the benefit of the unit as a
whole.
7) Continue to evaluate the use of a Template – We only started identifying the pros
and cons of a templated approach for course design within the unit.
8) Start the planning process for the eventual migration to a new LMS – Rather than
waiting until a new LMS is selected, it makes sense to start the planning process
now.
Section Discussion
The section discussed a variety of challenges experienced by WCLD. The positive ones
included the successfully piloted WCLD course, effective University collaboration
(including established relationships with other teams, units and services), efficient
project management and new opportunities to grow professionally. Other challenges
included (1) time pressure given the amount of work the pilot entailed balanced against
their existing job responsibilities, (2) the compressed timeline between when the
contract was signed with Penn State and Blackboard and when pilot SU14 courses
needed to open in Blackboard, (3) lack of clarity in communication (e.g. which courses
were in the pilot, whether or not the FA14 pilot was canceled; lack of clarity on who
was responsible for what) and (4) shifting goals (e.g. fully testing the system versus
using a standardized template).
The successful practices recommended to repeat included (1) having people in the pilot
with the right skills and temperaments (e.g. flexible, agile, positive, with good
communication and strong relationship-building skills); (2) assigning a technical lead to
help run the pilot (e.g. to oversee how courses get set up, develop new Evo templates,
and work with systems administrators and programmers from ITS); (3) working collaboratively with
relevant stakeholders throughout the University; (4) paying attention to stress
management and the affective side of the pilot; and (5) building backup versions of the
pilot courses as part of the process.
Among suggested practices to implement were the following: (1) follow established
project management principles from the beginning of the project; (2) consider impact on
existing workload; (3) make sure there is consistent management understanding and
support of various initiatives; (4) accept that it is impossible to have unanimous support
or to make everyone happy; (5) whenever possible, use a Google doc rather than
emailing attachments; (6) continue work on the Use Cases; (7) continue to evaluate the
use of a Template; and (8) start the planning process for the eventual migration to a new
LMS.
When a decision is made as to the future direction of eLearning at the university it
would be good to revisit this section.
Suggestions and Recommendations
This section of the report offers participants’ suggestions and recommendations
regarding the LMS, particularly what needs to be improved before any possible
implementation. Data were captured from the mid-term and end-term pilot surveys.
Section Findings
In the open-ended question #22, “Is there anything else you would like to tell us about
your experience using Blackboard this semester?”, participants described their
experience working with the LMS. The responses indicate that most of the negative
experience was caused by a lack of training and bugs in the system. Since the
participants’ experience was captured in detail in the previous sections, this section
focuses only on the recommendations and suggestions provided by the pilot
participants.
Faculty Recommendations and Suggestions
The feedback from the faculty can be divided into three segments – (1) training-related,
(2) LMS-related and (3) pilot-management-related.
First, it was recommended to combine group training with one-on-one sessions (after
the training and during the semester):
“The training I got in preparation was extraordinarily thorough. Too much so probably.
While good to know all the settings possible it was way too much to take in all at once. I'd
suggest a model where there is an initial training session followed by one on one support
as faculty get into course building and want to do a specific thing. Having someone
available to help implement that specific need would be more efficient and a lot less
frustrating. (Though probably more expensive to support...) I'd guess most needs would
last 5-15 minutes for anyone familiar with an LMS. Perhaps that's hopelessly optimistic
in most cases but that's what I could have used a few times this semester. Instead it took
me the better part of an hour of trial and error to get something set up the way I wanted.”
Second, the respondents pointed out that the LMS needs some improvement, such as:
- making the platform more intuitive and easier to use:
“Blackboard is NOT intuitive to use. It is hard to believe that this software is not
intuitive in an era of $1 mobile apps that you can navigate with ease.”
“BB is really not too bad but it's hopelessly complicated.”
“Everyone is not "menu and mouse driven", keyboard short cuts are important. Coming
from the unix world, there is a lot of scope for running scripts - give us that feature.
Provide a way to mount our course a remote/network drive on the computer.”
- improving the functionality of the system (particularly the Mobile app,
Gradebook, Email, Calendar, and the feature responsible for importing
course materials):
“The gradebook is complicated and clumsy. I don't have room here to describe all the
shortcomings of the grading system. I'd be happy to discuss this with someone over the
phone.”
“Importing items from Angel proved difficult.”
“Fix mobile app. I want to try more tools next time.”
“As an instructor of an online class, the things I need to do the most are communicate
with students and grade. There is no obvious way to organize messages within the
Communicate feature, and, worse, you cannot get messages forwarded to your email. All
you can get is notification email; this is not helpful since an instructor cannot email a
student back from a smart phone, nor can an instructor decide what to do with an email
notification that doesn't contain the content of the message. If I'm not near a computer,
how can I know how quickly to find one if all I get is an alert that a message is waiting
for me? If there's a problem with an exam or some other "emergency," I need to know so
that I can get to a terminal ASAP.”
Third, it might be good to be more flexible in supporting instructors who wish to drop
the pilot and switch back to the old LMS:
“Overall, I was not happy with my blackboard experience. I asked to be taken out of the
pilot and be put back on Angel...it didn't happen.”
Feedback regarding Blackboard Learn as a potential replacement for ANGEL is outlined
here:
“I have enjoyed using blackboard. It will be hard to go back to Angel after using a vastly
superior platform.”
“Bottom line - Bb learn is better than ANGEL but not as good as D2L or Canvas. Both of
those had tools designed in a very intuitive way.”
“My understanding is that PSU has abandoned the pilot due to some new features that
BB has added in their update. I'm happy that PSU has ditched the pilot, but it worries me
that this is why: we should pitch all thoughts of switching to Blackboard because it is a
bad system. I am not being dramatic when I say that BB was an unmitigated disaster for
my course. This terrible experience was capped just today when I entered my final grades:
typically, with ANGEL, when I've finished all my grading, it takes me about 15 minutes
to calculate final grades and get them into eLion. I spent at least an hour on this today.
It's too difficult to find drilled down details of individual student performance and to
figure out the last time a student participated during the term (needed for financial aid
students who don't do well), and it's too complicated to see which students have earned
rounded grades and which haven't. If Penn State chooses to revisit the BB pilot, I will not
participate, not with my classes. I spent too many hours on making my [Course name]
classes so popular to have them subjected to BB again. My reputation among students,
and the quality of my classes will suffer due to this, which I'm not willing to see happen
again.”
The features recommended for improvement by the faculty are mentioned below:
(1) Log-in:
“The Mobile App seems to bump me out often and I have to log in again.”
One of the recommendations for improvement was: “Holding the log-in so I don't
need to log in every time.”
(2) Adequacy of speed:
“Some features are very slow to load.”
(3) Calendar:
“Calendar does not open in the APP but in a browser view instead.”
“Calendar does not open in the APP but in a browser view instead. Same for Bb
Collaborate, dropboxes, email, and messages.”
(4) Browser:
“Some features seem to be in the App and others only open in a browser that needs to be
expanded with my fingers in order to see it.”
(5) Accessibility:
“Could not access quizzes and races on the mobile app.”
(6) Quiz:
Some of the faculty wanted to know more about “how to convert quizzes to be
mobile friendly quizzes”.
(7) Test:
Respondents expressed the need for the “conversion of tests to mobile friendly.”
(8) Other:
“I would like to see a true mobile interface. It was terrible that the mobile app basically
was just a browser for the full site because textboxes would not allow any content typed
from my iPhone, and the recipient selection list never worked. It's not a real mobile app.”
Some of the respondents were not sure about the features to recommend and provide
feedback on (2/3 responses in the mid-term survey; 2/3 - in the end-term survey).
Staff Recommendations and Suggestions
According to the experience shared by the support staff, the following
recommendations might be considered.
First of all, provide the possibility to attend the training and do not cancel it at the
last moment:
“I am disappointed not to be able to attend any training. I wish it has not been canceled,
even if we are pausing. I didn't feel I got enough chance to really get to know how to use
this tool to give you a good survey result.”
Second, try to avoid the time pressure created by a short pilot timeline. One resulting
recommendation is not to conduct the pilot during the summer semester, the shortest
semester of the academic year:
“Overall I really liked the system. The timing of the pilot was obviously too rushed so this
delay will be helpful.”
“Very tight timeline with a lot of last minute changes to the pilot and system
expectations.”
Third, if the decision is made to implement the LMS, it is necessary:
- to improve the navigation and speed of the system:
“For initial setup/conversion, it would be helpful to be able to move folders to the top
level (navigation menu).”
“Maybe it's just because it's a sandbox server, but the whole system was remarkably
slow. Page loads were in the 4-5 second range most of the time.”
and
- to add a social component:
“Would like to see more social components for students; profile, who is online, etc. These
features allow students to have a more realistic relationship with peers.”
Fourth, it is important to provide the participants with access to the resources. One of
the participants complained that he “never got access to sandbox.”
Finally, the participants expressed both “Yes” and “No” views on LMS implementation:
“If you want my opinion: Don't do it. We need a much more flexible solution. I would
love for the University to agree on one solution for our courses (CMS and LMS), but this
LMS is not going to allow us to succeed without the huge cost of staff hours, workarounds, and the inability to adapt and progress.”
“While not perfect, overall it was a very positive experience and seems to be an
improvement over the current system.”
“After a slight learning curve, Blackboard has been spectacular to use. I look forward to
hopefully PSU adopting this new platform soon!”
“Blackboard is OK, though not necessarily an improvement over ANGEL.”
Discussing the possibility of moving to Blackboard Learn after ANGEL, one of the staff
respondents mentioned that it would tie the unit’s success directly to “the success of
Blackboard,” which “from a marketing and strategic perspective” would not allow the unit
to set itself apart, and that the move might be “a giant step backwards in progress” for the
university:
“Ultimately, this LMS is just far too rigid and cuts out many of our unique pieces that
have traditionally made up what a "World Campus" course is. If we had moved to this
LMS, our success would be tied directly to the success of Blackboard and from a
marketing and strategic perspective, this does not allow us to set ourselves apart from the
other Universities in the market. Learn isn't even as advanced as the combination of
ANGEL and our homegrown content management system Evolution is currently. To
move to this LMS is almost a giant step backwards in progress for us.” (Staff)
Student Recommendations and Suggestions
Many students pointed out that it was not easy to use the system; however, they agreed
that this was mostly because they were using Blackboard for the first time. They remarked
that as they gradually became more familiar with the platform, they experienced fewer issues:
“As I use Blackboard more and become more comfortable using and directing around but
in the beginning the learning curve seemed very high.”
As mentioned above, ease of use (and particularly navigation) was one of the main
frustrations of the pilot participants. Some of the recommendations were therefore to
make the system easier to use and navigate:
“If navigation is fixed, I think I would appreciate the experience much more.”
“It's useful that everything's condensed, it just needs to be more organized.”
“Please don't implement this fully until you find ways to simplify it.”
It was also mentioned that the platform “Has the potential to be an effective replacement for
Angel”, but as of right now students would still prefer to use Angel.
“The fact that this exists means that we're heading in the right direction, so keep it up.
Once the UX kinks are worked out I think this will be a great system.”
Another remark was about using Blackboard Learn only in online education:
“Prefer Angel in general but Blackboard may be better for online only courses.”
The respondents pointed out that in the future they would appreciate being informed
which LMS was going to be used in a course, in order to decide whether or not to
take it:
“I didn't know this course was using blackboard. I would have like to have been informed
before hand.”
“If I would have known it was going to be this much of a hassle, I would have chosen
another class. I'd like to thank Penn State for choosing a 6 week course for a pilot.”
Other recommendations related to choosing the participants for the pilot: preference
should be given to students who are not using two different LMSs at the same time,
which would help avoid unnecessary comparisons and bias:
“It might be better to pilot blackboard with students who have not or as not using
another system at the same time to get an unbiased view. I tend to compare features to
what I know vs rating them based on how good they are standing alone.”
Finally, it was recommended to improve the Mobile app, the Discussion Forum, the
submission of assignments/papers (adding an Edit button to avoid “resubmission”),
Collaborate, the Calendar, Email (especially the email forwarding function), the
uploading function, and speed (not referring to Internet speed).
Comparison with ANGEL was unavoidable (students):
“Navigating discussion threads, I like ANGEL better for this.”
“I DO NOT like the set up for the discussion board. It's not very well organized. You
cannot make edits after a post, spell check corrections, or delete a post. Plus it would be
nice if everything is not put in one stream but there can be off shots (like we have in
ANGEL) for discussions.”
“The group discussion format is clumsy, takes a great deal of scrolling to get to the
newest posts. The ANGEL format is much more user friendly in that it is simpler and
easier to find the posts I am looking for. The group discussion board is essential to my
studies and I don't feel comfortable with Blackboards version.”
“I do not like the Message board as much as what we already had in ANGEL. In ANGEL
- you could see every thread and who had responded in one screen without seeing each
response. Blackboard they are either all open, or everything is closed. So, not a good of an
experience.”
“The discussion forums are not as convenient to follow as in ANGEL - the fact that we
cannot subscribe to get email when new messages are posted in really an issue.”
Other student comments included:
“I think it's a bridge between ANGEL and World Campus, and I'm glad there is finally
something in the works.”
“It’s much better than ANGEL. It’s well organized.”
“It was convenient to see all separate files under the content. It's more specified than
ANGEL system.”
Some of the students compared Bb Learn with ANGEL and indicated that, if it were possible to
enhance ANGEL, their preference would be to utilize it: “I enjoyed working in
Angel, it was easier to use. Main menu was easier to get to and you didn't need to go to the main
menu to get to a specific tab in Angel. I like the app for blackboard, but if it were possible to
enhance Angel, my preference would be to utilize it.”
In order to consider Blackboard Learn as a potential replacement for ANGEL, students
recommended improving the LMS with regard to accessibility and navigation:
“If navigation is fixed, I think I would appreciate the experience much more.” (Student)
“It's useful that everything's condensed, it just needs to be more organized.” (Student)
“Please don't implement this fully until you find ways to simplify it.” (Student)
Section Discussion
Overall, the main recommendations related to the effectiveness of the project were (1) to
extend the pilot timeline so that participants have more time to learn how to work with the
LMS, (2) to provide instructors with support if they drop out of the pilot course and switch
back to the old LMS, (3) to provide participants with access to resources (e.g., a sandbox),
(4) to inform students about which platform is used in a course so that they can choose
whether or not to participate in the pilot, and (5) to select student participants who are not
using two different LMSs at the same time, to avoid unnecessary comparisons and bias.
As for whether Blackboard Learn should be considered a potential replacement for ANGEL, the
recommendations were: (1) yes, but only after improving LMS criteria such as ease of use, ease
of learning, functionality, accessibility, course migration, and the Mobile app; (2) no;
and/or (3) investigate other LMSs (such as Canvas or D2L).
Technical Issues
Overall, participants reported a relatively high incidence of technical issues (50-70% in the
middle of the semester/pilot and 10-45% by the end of the project). In the student pilot
group, roughly half of the respondents encountered problems (51% reported no technical issues
at midterm and 45% at end-term). The problems encountered included login failures, system
crashes, and system availability. Thus, some of the problems were due to the learning curve,
but others were due to inefficiencies in the LMS tools.
Section Findings
This section describes in detail the technical issues that the three groups of survey
respondents (faculty, staff, and students) experienced. The data were captured from all five
surveys distributed in the middle and at the end of the semester.
Faculty About the Technical Problems
Only 10-29% of faculty respondents did not encounter technical issues (midterm: 29% (2/7);
end-term: 10% (1/10)). Most of the problems were already described in the previous chapters of
this report; mainly, they relate to troubles with:
(1) Signing out:
“Had some early problems getting signed out constantly.”
“Also, I kept getting signed out early in the semester.”
(2) Some troubles with Grade book:
“Having problems with smaller assignments appearing in grade book.”
“The grade book didn't always work smoothly.”
(3) Difficulties with importing course materials (See Migration Chapter):
(4) Troubles uploading files:
“I have not been able to upload certain zip files I had previously used in Angel.”
(5) Issues with browsers:
“Some students had difficulty with BB and Internet Explorer.”
(6) Email integration:
“Email integration was really bad.”
(7) Mobile app:
“Mostly "technical problems" were isolated to the app, though.”
Other comments are the following:
“A few of my students had problems but WC helpdesk resolved the problems quickly.”
“Red error pages -- I used all the resources listed below for help. “
“Enrolling students that aren't on the course Roster.”
“Although blackboard is newer than Angel, I believe I have more control with Angel. “
Staff About the Technical Problems
Below is the feedback from the IDs on the technical problems they encountered. Out of 17
responses, 4 (25%) indicated that they did not experience any technical issues, and that any
issues that did occur were related “to unfamiliarity with the product.” The rest of the
participants encountered the following troubles:
(1) Troubles getting into the course:
“We had trouble getting in the course, but nothing really after that.”
(2) System bugs:
“I have reported several bugs, one of which they shrugged off and gave me a workaround
as a solution (Change an internal preference setting on my operating system).”
“There were several bugs and issues with Bb Learn prior to moving to the April 2014
release. Various items were captured in the document linked below
https://docs.google.com/spreadsheets/d/137IBqmtc5Y9cL9fzfzpIRuR6gLoKcxtXlhdUtHz8fTo/edit#gid=0
“
“Reported bugs to the WC Help Desk.”
(3) Sessioning issues:
“Many weird sessioning issues, where it will remember I was an instructor in another
course and show me an instructors view in a course where I am a student.”
(4) System slowness;
(5) Inability to delete an item;
(6) Error messages:
“Whenever I created a new forum thread, I got an error message and got kicked out. The
thread always showed up, it just gave me an error.”
(7) Accessibility issues;
(8) Issues with the Assessment Review:
“Students have complained about assessment reviews not opening on time.”
(9) Inability to enroll into the course:
“Could not enroll in another faculty member's course.”
(10) Limitations to group assignments:
“Once groups had been assigned to an assignment, they could not be changed.”
(11) Calendar did not show due dates in advance:
“We didn't use the Blackboard Calendar because due dates for unavailable assignments
are invisible. If an exam is only open for a few days, the due date won't appear until the
exam is actually available to take.”
(12) Issues changing the server:
“Also, the test description and instructions got "locked in," even before any students had
taken them. It seems like this may have happened when I moved the course from one
Blackboard server to another one. Other people reported similar issues.”
Students About the Technical Problems
Below is the feedback from the students on what technical problems were encountered.
Out of 53 midterm responses, 27 (51%) respondents had not experienced any technical problem
with Blackboard by the time the mid-term survey was distributed. By the end of the semester,
20 (45%) respondents out of 40 (end-term survey) remarked that they had had no technical
issues. However, 49-55% of the pilot participants encountered some technical problems, which
are described below.
Most complaints related to loading/uploading. The participants complained that they could not
load “certain items” (such as homework, videos, and documents like PDFs), and often could not
load them “on first attempt.”
“It took a long for the pages to load.”
“Not being able to load the films faster than it would play.”
The Email tool also brought a lot of frustration to the respondents:
“Unable to see sent email/messages when sending a course specific message.”
“Unable to forward course specific messages to my psu.”
“Difficult ….. to check email correspondence inside the blackboard client, [it] is extra
work.”
Another source of frustration was navigation and layout:
“Navigation and layout across the board are convoluted and confusing.”
“I felt like I would cycle through areas of the course that I would not need to see. The
navigation was generally confusing. Sometimes I would not have any issues, but others,
it took me some time to figure out how to get where I wanted to get.”
The complaints also included “frozen” buttons, tabs and links:
“I pressed a button/link and it froze.”
“Sometimes, tabs and links don't work when I click them.”
The participants had a hard time accessing tools such as the Blog, Quiz, and Grade Center
(e.g., log-in issues):
“Sometimes when I navigate to the blog through my lesson it says I am not logged into
the course. I have to click on it a few times in order to finally get into the blog.”
“I had trouble accessing the blog sometimes. It would bring up a screen to log in.”
“I could not access my grade easily.”
“Sometimes I couldn't access ac quizz but if I went into future tasks, there I could.”
Some of the respondents also had difficulties using Notifications (one of the complaints was a
“lack of notifications”), Discussions, the Blog, Collaborate (2/53), Grades, Tests, and
Quizzes:
“Quizzes not working, not grading properly, not letting me see some links.”
“I couldn't find my previous quiz.”
“A test submission errored out on me but it submitted and kept me on the test. I moved
away from the exam and saw it had a grade so it did submit just did not move me. Only
once out of 7 tests though.”
“The feature that shows when assignments are due only worked on occasion. Not a good
feature unless the assignments are only graded occasionally.”
“[I had issues] Just [with] the blog link [that] didn't always work reliably.”
“[The] only [issues I experienced was] not being able to see a mid term but this was later
found out to be due to a test not being graded.”
“[The trouble was] Reviewing the midterm exam.”
“Discussion page editing is poor.”
By the end of the semester, one of the main troubles was entering/connecting to the
Collaborate space:
“Entering the collaborate space to meet with my professor.”
“Issue with Java update and connecting to Collaborate.”
“Only with getting the collaborate to load in the beginning.”
“I got kicked off of collaborate meeting space several times. I couldn't connect to the
meeting space via my iPhone, and then when I could, I couldn't connect to the team 4
meeting space. “
Other comments included:
“It seems Blackboard hates Internet Explorer. The layout is scrambled and the quizzes
don't properly load. I had to ask my professor to undo my tries.”
“Not Apple friendly. Had issues with the audio.”
“The first time video conferencing on Wi-Fi was very slow with video and audio lag, but
I don't know if that's my Internet speed or BB.”
“Does not print all content on “Activities Page” when you select print via short keys
(ctrl+p). Also, did not see a print option when on the “Activities Page.” o Only way to
print everything is to highlight and copy paste into a word processor.”
“BB communicate didn't work although I tested my system before I used it. I could not
see what the professor was going over during the first week and others were having
difficulty as well.”
“Once in the lesson and on pages 1+ of the lesson/lecture notes, it is difficult to get back
to the table of contents. It would be nice if there was a quick link to re-direct to the table
of contents. It also appears to be slow and not very responsive, but that is the same as
most everything from the World Campus.”
“SLOW as hell. I dunno if this is a Penn State problem or a Blackboard problem”.
“Issues submitting a project and filling at a peer assessment.”
Section Discussion
As mentioned at the beginning of this section, participants overall reported a high incidence
of technical issues. The common issues were failure to log in and/or out, failure to
load/upload documents of different formats (PDFs, videos), browser/server compatibility issues
(students reported problems working with Internet Explorer and Safari), email integration
(e.g., forwarding email to the PSU account), system slowness, system crashes/bugs, and
troubles with the Mobile app.
Faculty reported problems with importing course materials, which means that the ability to
move content from the legacy system (in our case, ANGEL) into the new LMS (Blackboard Learn)
was problematic. The IDs pointed out that they experienced issues with the Assessment Review
and were not able to enroll into courses. The main problem that students faced was accessing
the LMS tools (e.g., Grade Center, the Collaborate space, Blog, and Quiz). The Calendar,
Notification, and Announcement tools brought a lot of frustration, since they did not show due
dates in advance and did not provide alerts. The remaining problems included the inability to
delete an item, error messages, and “freezing” links, buttons, and tabs. As mentioned above,
as participants became more familiar with the LMS, the number of issues slightly decreased.
Resources Used to Resolve Technical Problems
This section describes the resources that the pilot participants used to resolve the technical
issues they faced while using Blackboard Learn. In addition, the pilot participants provided
comments on the type of support they preferred.
Section Findings
Faculty About Resources Used to Resolve Problems
Table 40 describes the resources that the instructors used to resolve problems. At the
beginning of the semester, the most popular resource was Help Desk Email (50%). However,
towards the end of the semester, respondents were more often able to figure out the problems
on their own (40%).
Resource                                                    Mid-Term Number   Mid-Term Percent   End-Term Number   End-Term Percent
Help Desk Email                                                           3                 50                 3                 30
Contacting the Listserv                                                   0                  0                 0                  0
Contacting an individual who is associated with the Pilot                 1                 17                 3                 30
I was able to figure it out on my own                                     2                 33                 4                 40
Total # of Responses                                                      6                100                10                100

Table 40: Faculty About the Resources to Resolve Technical Issues
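For clarity, the percentages in Table 40 are simply each resource's share of the total
responses for that survey administration. The short Python sketch below is purely illustrative
(the variable names are ours and are not part of the pilot materials); it reproduces the
mid-term column from the raw counts reported above:

    # Illustrative sketch only: deriving the mid-term percentages in Table 40
    # from the raw response counts (3, 0, 1, 2 out of 6 total mid-term responses).
    midterm_counts = {
        "Help Desk Email": 3,
        "Contacting the Listserv": 0,
        "Contacting an individual who is associated with the Pilot": 1,
        "I was able to figure it out on my own": 2,
    }
    total = sum(midterm_counts.values())  # 6 mid-term responses in all
    midterm_percent = {resource: round(100 * count / total)
                       for resource, count in midterm_counts.items()}
    # midterm_percent -> Help Desk Email: 50, Listserv: 0,
    #                    Pilot contact: 17, On my own: 33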
Below is a graphical representation of the faculty feedback on the resources used to
resolve technical issues encountered.
[Bar chart: percentage of faculty respondents citing each resource (Help Desk Email,
Contacting the Listserv, Contacting an individual who is associated with the Pilot, I was able
to figure it out on my own) in the Mid-Term and End-Term surveys; horizontal axis 0-60
percent.]
Figure 50: Faculty About the Resources to Resolve Technical Issues
Other resources listed by respondents included:
(1) Online Blackboard help:
“I looked at the Bb Help available online…”
“Blackboard online instructions.”
(2) Other universities websites:
“[I looked] …at Bb and also at other University's websites.”
(3) Campus course designer:
“The course designer on my campus.”
(4) A tutor:
“A tutor that could explain how to do simple things that I forgot how to do since the
training.”
(5) Colleagues:
“Louise was wonderful! Very helpful. Other people in pilot were very helpful.”
(6) Relatives:
“My husband who has a lot of experience with Blackboard was able to help me resolve
most of the problems I encountered.”
Faculty About Preferred Support in Resolving Technical Issues
Other kinds of support that participants said they would prefer included:
(1) Phone Support:
“A phone line one could call for basic help.”
“Just being able to get on the phone with someone for 10 minutes the few times I got
stuck on something would be useful.”
(2) FAQ List:
“I cannot attend things like the weekly webinars because my schedule is too packed,
maybe a FAQ list can be developed? perhaps that already exists?”
(3) Colleagues:
“Hearing from other instructors and IDs is extremely helpful.”
(4) Email:
“Going back and forth with someone via email is workable but there are times when I'm
working on a quiz and have a quick question about a setting. Being able to talk to
someone for two minutes, at that moment, would be invaluable. Waiting an hour for an
email response is not useful.”
(5) On your own:
“I'd just dork around with it till I figured it out and then would be frustrated for a hour
afterwards.”
(6) Other:
“Others may want to have a person come to the office to walk them through a feature or
problem.”
Staff About Resources Used to Resolve Problems
Table 41 describes the resources that the staff (IDs) used to resolve problems.
Resource                                                    Number   Percent
Help Desk Email                                                  6        33
Contacting the Listserv                                          0         0
Contacting an individual who is associated with the Pilot        9        50
I was able to figure it out on my own                            3        17
Total # of Responses                                            18       100

Table 41: Staff About the Resources to Resolve Technical Issues
Below is a graphical representation of the staff feedback on the resources used to resolve the
technical issues encountered.
[Bar chart: percentage of staff respondents citing each resource (Help Desk Email, Contacting
the Listserv, Contacting an individual who is associated with the Pilot, I was able to figure
it out on my own); horizontal axis 0-60 percent.]
Figure 51: Staff About the Resources to Resolve Technical Issues
The most popular way of resolving technical issues was “Contacting an individual who is
associated with the Pilot” (50%). Help Desk Email was indicated as a helpful resource for
resolving technical issues by 33% of respondents. Seventeen percent of the IDs preferred to
figure things out on their own.
Other resources included:
(1) Pilot participants:
“Other people involved in the pilot or blackboard resources (not ones provided to us, ones
that we were able to find on our own).”
(2) Peers;
(3) Tech tutors:
(4) Google:
“Any other issues or questions - I googled to find the answers. There are tons of online
resources.”
(5) Previous experience:
“I used my own experiencing supporting and teaching with Blackboard from another
institution.”
(6) Training services;
(7) Bb Help Site (e.g., documentation at https://help.blackboard.com/).
Staff About Preferred Support in Resolving Technical Issues
Out of 6 responses, 3 (50%) indicated that the support they preferred included step-by-step
explanations/guidance through training, or brief videos about the platform tools:
“A Bb basics training program: Step-by-step guidance through the basic process most
faculty would follow with Bb (approx 1 hr).”
“AND several more advanced pre-recorded modules: Further exploration of targeted
topics - each module about 30 minutes long. “
“Quick brief videos about a tool. A "how to" one page guide about a tool.”
“I like step-by-step with examples and exemplar courses.”
Students About Resources Used to Resolve Problems
Out of 29 respondents, 13 from the midterm survey and 18 from the end-term survey did not look
for resources to resolve their technical issues; it may be that they did not experience any
technical issues.

The rest of the students reported that the main resources they used to resolve
their issues were (1) YouTube, (2) Google, (3) the PSU Help Desk, (4) the World Campus
Help Desk, (5) a message board, (6) Blackboard's Help feature (e.g., tutorials for
users), (7) help from classmates, (8) help from the instructor, (9) external
websites (e.g., video tutorials), and (10) patience:
“To solve issues in the past I have talked to other people who are taking the same course.”
“Similar to when I'm trying to troubleshoot other products I use, I usually try to find a
message board. If there was a bug reporting board accessible by PSU accounts, I think
that could help, and also help the support team understand how people are using the
system.”
“I just kept looking until i eventually found it.”
“Help Desk was very helpful when I cam across an issue.”
“World Campus Help Desk Very helpful.”
“Talking with professor and or peers.”
“I would say, BB needs more time. At the moment, there are work arounds but the overall
experience of BB feels less productive because of them.”
“Asking other people who were more familiar with Blackboard for help.”
“Teammates. We all did our own scavenger hunts to find what we needed within the
system.”
“The World Campus Help Desk has been amazing any time I've needed technical
assistance!”
In addition to this, some respondents said that their past experience with Blackboard was of
great help.
Students About Preferred Support in Resolving Technical Issues
The preferred support that the respondents reported included online web chats, Collaborate
video, and an instructor's help.
Section Discussion
The common resources that the pilot participants used to resolve their technical issues were
the Help Desk (including the World Campus Help Desk), external websites (e.g., websites of
other universities), Google search (YouTube was very popular), the Blackboard Help website
(e.g., video tutorials), previous experience with Blackboard or another LMS, peers,
colleagues, instructors, relatives, and even their own patience.
Recommendations For Help & Support
This section offers participants’ recommendations and suggestions regarding Help &
Support services.
Section Findings
Recommendations From Faculty
Suggestions and recommendations for Help & Support provided by the instructors
included:
(1) No Improvement is needed:
“None -- great job!”
“I think the Help and Support has been very good! Louise, in particular, has helped me a
great deal.”
“The one time I contacted them, they were helpful.”
(2) Conduct training on the participant's own campus:
“Would like to see training on my own campus during the Fall or Spring semester, not
during the summer months.”
“In the future having training on my own campus.”
(3) Have a Template:
“Because BB has so many options, I strongly suggest the university consider some kind
of template because otherwise students might get very confused with how different
courses will be set up in wildly different way.”
(4) Do not change the LMS:
“Keep Angel.”
Recommendations From Staff
Out of 10 responses, 4 (40%) of the respondents indicated that they had no recommendations to
provide. One of them remarked that they “did not use help and support.”
Overall, suggestions and recommendations for Help & Support provided by the IDs
related to training and documentation:
“I think trainings that we could have attended earlier in the summer would have been
nice. They may have been there and I just didn't know about them.”
“The weekly webinars have been somewhat helpful. The 3-day training I intended to
attend was canceled. I feel I haven't yet dived in at much depth because I was waiting
until after the training would be over.”
“Better explanation with what the problem is, what caused it, and how it was fixed.”
“The audio on the recording of the instructional designer training workshop is very
poor.”
It was mentioned that “The help and support on the PSU end of this pilot has been great.
Whether it is working with the HelpDesk or with central, everyone seems to be willing to help
others out.”
Other comments included:
“More of the same. I liked that it is divided by role and release number.”
“Communications can be improved.”
Recommendations From Students
Recommendations provided by students included:
(1) Increase the number of phone numbers for Help & Support:
“More numbers to call for support if any questions about the material.”
(2) Consult third-party websites:
“I would suggest looking into 52 Weeks of UX (http://52weeksofux.com/), it's a great
resource and provides great perspective when trying to build a better user experience.
Just a simple look at the user interface and experience, whether a UX Design Engineer
needs to be hired or not, that's obviously not my call.”
(3) Simplify the navigation and interface of the LMS:
“Making tabs less confusing by not making students have to restart at the tab every time
they want to get somewhere.”
“The drop down menu of student status on this survey would not allow me to use the
side bar to pull additional choices. I am in non degree GO60 program.”
“Change the interface slightly making it more straight forward in navigation.”
“Make navigation easier.”
“Make the layout easier to go about.”
(4) Improve the Mobile App:
“Develop the mobile app further. I think this is the best thing about BB so far.”
(5) Keep ANGEL:
(6) Improve communication tools (e.g. Discussion Board, Email and Announcements):
“Change the discussion board format back to Angel's style, simpler is better in this.”
“In the discussion board tab, have all the boards showing, not just the group you are in.”
“Work on streamlining the communication client for email or discussion boards within
the blackboard course.”
“Discussion board/Email text editing sections should have more basic functions, like cut
and paste. Also spell check and doesn't create inconsistent font formatting. “
“Improve message boards and notifications of new messages, assignments, etc.”
“Fix automatic forwarding.”
(7) Improve the Assessment tool:
“Fix peer assessments. I went to the assessment link but nothing was there for me to click
on once there.”
(8) Improve the Collaborate:
“Collaborate meeting space connection issues.”
(9) Set Up a Blog related to Help & Support:
“Setting up a user bug report forum may be worthwhile, as it would allow a place for dialogue
in reporting and working out issues.”
Section Discussion
To sum up this section, many of the respondents' recommendations related to improving the LMS,
making it easier to use and more user-friendly. Most of the respondents indicated that they
would like to see ANGEL improved and kept as the university's LMS. When a decision is made
about the future direction of eLearning at the university, it would be worthwhile to revisit
this section to identify support (training and documentation) opportunities.
Appendixes
Appendix A: Faculty Mid-Term Survey: Blackboard Pilot,
Summer 2014
Please help us evaluate the effectiveness of the pilot to date by completing the brief survey. Your
responses will help us in determining what adjustments can be made to enhance the overall
experience.
Part I: Demographic Information
Please provide us with the following demographic information:
1. For how many years have you been an instructor/faculty member in higher
education?
o
1 year or less
o
2 - 5 years
o
6 - 10 years
o
11 - 20 years
o
21 - 30 years
o
More than 30 years
2. What is your gender?
o
Female
o
Male
o
Other:
3. In which course(s) during this semester are you using Blackboard?
(If you use Blackboard in multiple courses, please indicate them as well).
4. How many students are in your class?
5. Please indicate in what form the course is delivered.
(Choose one BEST answer)
o
Face - to - face
o
In a hybrid format using a blend of face-to-face and online interaction
o
Online with face-to-face interaction only for exams
o
Only online with no face-to-face interaction
6. Please indicate your level of comfort in using different types of technology.
o
Very Uncomfortable
o
Somewhat Uncomfortable
o
Somewhat Comfortable
o
Very Comfortable
o
Other:
7. What type(s) of networked device(s) do you currently use on a regular basis?
(Choose all that apply)
o
Mobile phone
o
Portable media player (e.g., iPod Touch) (e.g., mp3 player)
o
Tablet (e.g., iPad, Nexus, Galaxy)
o
Ebook reader (e.g., Kindle)
o
Laptop/Netbook computer
o
Desktop computer
o
Other:
Part II: Feedback on Blackboard
Please let us know how you feel about specific tools/features of the LMS by
answering the questions below.
8. Please rate the overall ease of use of Blackboard LMS.
o
Difficult to Use
o
Slightly Easy to Use
o
Moderately Easy to Use
o
Very Easy to Use
9. Please rate the overall usefulness of Blackboard for your teaching.
o
Not at all Useful
o
Slightly Useful
o
Moderately Useful
o
Highly Useful
10. Please rate the overall usefulness of Blackboard’s online documentation.
o
Did Not Use
o
Not at all Useful
o
Slightly Useful
o
Moderately Useful
o
Highly Useful
11. Have you been able to test the Blackboard Learn mobile app on either a
smartphone or a tablet?
o
Yes
o
No
12. If yes, please rate the usefulness of the following Blackboard Mobile Learn
products:
(Rating scale for each product: Do Not Use This Feature / Not at all Useful / Slightly Useful / Moderately Useful / Highly Useful)
Products: Customization, Discussions, Grades, Announcements, Content, Blogs, Journals, Dropbox, Tests, Push Notifications, Tasks, Roster
13. If no, please, explain why.
14. What additional features would you like to see in the Mobile app?
15. Please rate your level of satisfaction with the Blackboard tools and features
designed to support the following teaching and course management tasks:
(Rating scale for each item: Do Not Use / Not at all Satisfied / Slightly Satisfied / Moderately Satisfied / Highly Satisfied)
- Creating and publishing the course syllabus (Content)
- Creating a course calendar (Course Calendar)
- Posting course announcements (Announcements)
- Uploading, organizing, and sharing course files (Control Panel >> Content Collection >> Course Name)
- Posting audio/video lectures or other multimedia (Control Panel >> Content Collection >> Course Name)
- Creating course web pages (Content Area > Blank Page)
- Organizing course content, activities, and assessments into a series of modules or lessons (Content Area, Learning Modules)
- Posting assignments (Content > Assignment)
- Assigning individual and collaborative writing tasks (Journals, Wikis, Blogs)
- Assigning peer reviews on student work (Self and Peer Assessment, Wikis, Blogs)
- Creating and administering online quizzes, tests, and/or surveys (Tests, Surveys, and Pools)
- Facilitating graded and ungraded discussions (Discussions)
- Giving feedback on and/or grading student submissions (GradeCenter > Needs Grading)
- Creating and using rubrics to grade student work (Rubrics, Grade Center)
- Setting up and using the gradebook to enter and track student grades (Grade Center)
- Monitoring course activity and student progress (Course Reports, Performance Dashboard, Retention Center)
- Creating and managing groups for group assignments, group discussions, and/or group projects (Groups)
- Conducting online chat sessions (Blackboard Collaborate >> Course Room)
- Keeping track of your course tasks (Calendar, To Do, Needs Attention)
- Importing or exporting course content (Packages and Utilities)
- Integrating an external learning tool or platform with my course, e.g., SoftChalk Cloud, Piazza, etc. (Web Link)
- Customizing the navigation, look, and feel of your course (Quick Setup Guide, Teaching Style)
- Connecting or encouraging students to connect with Blackboard users and groups within or outside of your course (Blackboard Global Learning Network)
- Sending and receiving messages to and from students using Course Messages
- Sending and receiving messages to and from students and groups using Send Email
- Using Turnitin originality checking on assignments (Turnitin Direct Assignment)
- Using SafeAssign originality checking on assignments (SafeAssign Direct Assignment)
16. Please rate your level of agreement with the following statements about
Blackboard.
(Rating scale for each statement: Strongly Disagree / Disagree / Neither Agree nor Disagree / Agree / Strongly Agree / Not Applicable)
- Blackboard enables me to do what I wanted for my course(s).
- Blackboard is easy for my students to learn how to use.
- Blackboard increases my efficiency as a teacher.
- Blackboard increases my effectiveness as a teacher.
- Using Blackboard is beneficial to my students' overall learning.
- Blackboard was a valuable aid to me in my teaching.
17. Please indicate the average number of hours per week using Blackboard.
o
Never
o
Fewer than 5 hours
o
5-10 hours
o
11-15 hours
o
16-20 hours
o
More than 20 hours per week
18. What do you like MOST about Blackboard? Why?
19. What do you like LEAST about Blackboard? Why?
20. Which, if any, features/tools in Blackboard allows you to design your course
and/or teach in a new way?
21. Is there anything else you would like to tell us about your experience using
Blackboard this semester?
Part III: Help & Support
Please let us know how useful the help and support services are by answering
the questions below.
22. What technical problems have you experienced with Blackboard so far?
23. If I encountered a problem with Blackboard, I used the following resources to
help me resolve my issue:
o
Help Desk Email
o
Contacting the Listserv
o
Contacting an individual who is associated with the Pilot
o
I was able to figure it out on my own
24. If there are other resources you would use to help you resolve your issues,
please, specify what they are:
25. Please provide any suggestions/improvements for Help & Support:
26. If there is other kind of support you prefer, please, specify what it is:
Thank You!
We appreciate the time you have spent in providing us with feedback that will
help us make better decisions regarding the future of eLearning at Penn State.
Submission
Please click the "Submit" button to submit your survey responses.
Appendix B: Staff MidTerm Survey: Blackboard Pilot, Summer
2014
Please help us evaluate the effectiveness of the pilot to date by completing the brief survey. Your
responses will help us in determining what adjustments can be made to enhance the overall
experience.
Part I: Demographic Information
Please provide us with the following demographic information:
1. Please indicate your role in the pilot:
o
Instructional Designer
o
Instructional Production Specialist
o
Support Staff
o
Other:
2. How many years have you worked in this role in higher education?
o
1 year or less
o
2 - 5 years
o
6 - 10 years
o
11 - 20 years
o
21 - 30 years
o
More than 30 years
3. What is your gender?
o
Female
o
Male
o
Other:
4. In which course(s) during this semester are you using Blackboard?
(If you use Blackboard in multiple courses, please indicate them as well).
5. Please indicate in what form the course is delivered.
(Choose one BEST answer)
o
Face - to - face
o
In a hybrid format using a blend of face-to-face and online interaction
o
Online with face-to-face interaction only for exams
o
Only online with no face-to-face interaction
6. Please indicate your level of comfort in using different types of technology.
o
Very Uncomfortable
o
Somewhat Uncomfortable
o
Somewhat Comfortable
o
Very Comfortable
o
Other:
7. What type(s) of networked device(s) do you currently use on a regular basis?
(Choose all that apply)
o
Mobile phone
o
Portable media player (e.g., iPod Touch) (e.g., mp3 player)
o
Tablet (e.g., iPad, Nexus, Galaxy)
o
Ebook reader (e.g., Kindle)
o
Laptop/Netbook computer
o
Desktop computer
o
Other:
Part II: Feedback on Blackboard
Please let us know how you feel about specific tools/features of the LMS by answering
the questions below.
8. Please rate the overall ease of use of Blackboard LMS.
o
Difficult to Use
o
Slightly Easy to Use
o
Moderately Easy to Use
o
Very Easy to Use
9. Have you been able to test the Blackboard Learn mobile app on either a smartphone or a
tablet?
o
Yes
o
No
10. If yes, please rate the usefulness of the following Blackboard Mobile Learn products:
(Rating scale for each product: Do Not Use This Feature / Not at all Useful / Slightly Useful / Moderately Useful / Highly Useful)
Products: Customization, Discussions, Grades, Announcements, Content, Blogs, Journals, Dropbox, Tests, Push Notifications, Tasks, Roster
11. If no, please, explain why.
12. What additional features would you like to see in the Mobile app?
13. Please rate your level of satisfaction with the Blackboard tools and features designed to
support the following teaching and course management tasks:
(Rating scale for each item: Do Not Use / Not at all Satisfied / Slightly Satisfied / Moderately Satisfied / Highly Satisfied)
- Creating and publishing the course syllabus (Content)
- Creating a course calendar (Course Calendar)
- Uploading, organizing, and sharing course files (Control Panel >> Content Collection >> Course Name)
- Posting audio/video lectures or other multimedia (Control Panel >> Content Collection >> Course Name)
- Creating course web pages (Content Area > Blank Page)
- Organizing course content, activities, and assessments into a series of modules or lessons (Content Area, Learning Modules)
- Creating assignments (Content > Assignment)
- Assigning individual and collaborative writing tasks (Journals, Wikis, Blogs)
- Creating and administering online quizzes, tests, and/or surveys (Tests, Surveys, and Pools)
- Creating rubrics to grade student work (Rubrics, Grade Center)
- Setting up the gradebook (Grade Center)
- Creating and managing groups for group assignments, group discussions, and/or group projects (Groups)
- Managing course tasks (Calendar, To Do, Needs Attention)
- Importing or exporting course content (Packages and Utilities)
- Integrating an external learning tool or platform with the course, e.g., SoftChalk Cloud, Piazza, etc. (Web Link)
- Customizing the navigation, look, and feel of the course (Quick Setup Guide, Teaching Style)
14. Please indicate the overall number of hours you need to set up the course.
o
Never
o
Fewer than 5 hours
o
5-10 hours
o
11-15 hours
o
16-20 hours
o
More than 20 hours per week
15. What do you like MOST about Blackboard? Why?
16. What do you like LEAST about Blackboard? Why?
17. Which, if any, features/tools in Blackboard allows you to design your course in a new way?
18. Which, if any, features/tools in Blackboard allows you to design your course in a new way?
Part III: Help & Support
Please let us know how useful the help and support services are by answering the
questions below.
19. What technical problems have you experienced with Blackboard so far?
20. If I encountered a problem with Blackboard, I used the following resources to help me
resolve my issue:
o
Help Desk Email
o
Contacting the Listserv
o
Contacting an individual who is associated with the Pilot
o
I was able to figure it out on my own
21. If there are other resources you would use to help you resolve your issues, please, specify
what they are:
22. Please provide any suggestions/improvements for Help & Support:
23. If there is another kind of support you prefer, please, specify what it is:
Thank You!
We appreciate the time you have spent in providing us with feedback that will help us
make better decisions regarding the future of eLearning at Penn State.
Submission
Please click the "Submit" button to submit your survey responses.
Appendix C: Student MidTerm Survey: Blackboard Pilot,
Summer 2014
Please help us evaluate the effectiveness of the pilot to date by completing the brief survey. Your
responses will help us in determining what adjustments can be made to enhance the overall
experience.
Part I: Demographic Information
Please provide us with the following demographic information.
1. What is your current academic level?
o
First-year undergraduate (Freshman)
o
Second-year undergraduate (Sophomore)
o
Third-year undergraduate (Junior)
o
Four or more years undergraduate (Senior)
o
Masters student (MA, MS, MBA, MFA, MSW, MPA, etc.)
o
Doctoral Student (EdD, PhD, etc.)
2. What is your age?
o
Under 24
o
25 - 29
o
30 - 34
o
35 - 39
o
40 - 44
o
45 - 49
o
50 - 54
o
55 - 59
o
60 - 64
o
65 - 70
o
71 & Over
3. What is your gender?
o
Female
o
Male
o
Other:
4. Which course(s) are you currently enrolled that uses Blackboard?
(If you used Blackboard in multiple courses, please indicate them as well).
5. Please indicate in what form the course has been delivered.
(Choose one BEST answer)
o
Face - to - face
o
In a hybrid format using a blend of face-to-face and online interaction
o
Online with face-to-face interaction only for exams
o
Only online with no face-to-face interaction
6. Please indicate your level of comfort in using different types of technology.
o
Very Uncomfortable
o
Somewhat Uncomfortable
o
Somewhat Comfortable
o
Very Comfortable
o
Other:
7. What type(s) of networked device(s) do you currently use on a regular basis?
(Choose all that apply)
o
Mobile phone
o
Portable media player (e.g., iPod Touch) (e.g., mp3 player)
o
Ebook reader (e.g., Kindle)
o
Tablet (e.g., iPad, Nexus, Galaxy)
o
Laptop/Netbook computer
o
Desktop computer
o
Other:
Part II: Feedback on Blackboard
Please let us know how you feel about specific tools/features of the Blackboard
by answering the questions below.
8. Please rate the overall ease of use of Blackboard.
o
Difficult to Use
o
Slightly Easy to Use
o
Moderately Easy to Use
o
Very Easy to Use
9. Please rate the overall usefulness of Blackboard’s online documentation.
o
Do Not Use
o
Not at all Useful
o
Slightly Useful
o
Moderately Useful
o
Highly Useful
10. Please rate the usefulness of the following features of Blackboard in
contributing to your learning in this course.
(Rating scale for each feature: Do Not Use This Feature / Not at all Useful / Slightly Useful / Moderately Useful / Highly Useful)
- Announcements (for reading announcements and other timely news and information posted by your instructor or department)
- Assignments (for submitting individual or group assignments)
- Calendar (for managing your personal calendar and viewing course events and due dates)
- Chat (for live text messaging with classmates and other Blackboard users)
- Course Messages (for sending and receiving messages to and from your instructor and other students)
- Groups (for collaborating with a specific group of students on assignments, discussions, blogs, wikis, or projects)
- Journal (for keeping a learning journal shared with your instructor)
- Content Collection > My Content (for storing personal files related to your course work)
- My Grades (for viewing a list of the graded items in the course and the grades you received)
- Quizzes/Tests (for taking and receiving feedback on online quizzes, tests, and self-assessments)
- Roster (for viewing a list of the other people in the course)
- Rubrics (for understanding how your work will be or was graded)
- Send Email (for sending messages to the external email account of other course members)
- Surveys (for taking online surveys)
- Tasks (for completing a list of tasks prepared by the instructor)
- Discussions/Discussion Board (for participating in online discussions with the entire class)
- Discussions/Discussion Board (for participating in online discussions in small groups)
- Blog (for individual and group writing tasks assigned by your instructor)
- Wikis (for individual and group writing tasks assigned by your instructor)
- Collaborate (for participating in virtual classrooms and meeting spaces (web conferencing))
- Self and Peer Assessment (for providing and receiving feedback from peers)
11. Please rate your level of agreement with the following statements about
Blackboard.
(Rating scale for each statement: Strongly Disagree / Disagree / Neither Agree nor Disagree / Agree / Strongly Agree / Not Applicable)
- Blackboard helps me to learn the course materials/content.
- Blackboard helps me to study for exams/tests.
- Blackboard helps me to complete course assignments.
- Blackboard helps me to take quizzes/exams.
- Blackboard helps me to make efficient use of my time in the course.
- Blackboard helps me to be in control of my own learning in the course.
- Blackboard helps me to communicate with my professor.
- Blackboard expands access to learning materials/resources available to me (e.g., print, audio, video, etc.).
- Blackboard is beneficial to my overall learning in the course.
12. If I encountered a problem with Blackboard, I used the following resources to
help me resolve my issue:
o
Help Desk Email
o
Contacting the Listserv
o
Contacting an individual who is associated with the Pilot
o
I was able to figure it out on my own
13. Have you been able to test the Blackboard Learn mobile app on either a
smartphone or a tablet?
o
Yes
o
No
14. If yes, please rate the usefulness of the following Blackboard Mobile Learn
products:
(Rating scale for each product: Do Not Use This Feature / Not at all Useful / Slightly Useful / Moderately Useful / Highly Useful)
Products: Customization, Discussions, Grades, Announcements, Content, Blogs, Journals, Dropbox, Tests, Push Notifications, Tasks, Roster
15. If no, please, explain why.
16. What additional features would you like to see in the Mobile app?
17. On average, how many hours per week have you been spending in Blackboard
(BlackBoard pages, assignments, discussion forums, etc.) for this course?
o
Never
o
Fewer than 5 hours
o
5-10 hours
o
11-15 hours
o
16-20 hours
o
More than 20 hours per week
18. What do you like MOST about Blackboard ? Why?
19. What do you like LEAST about Blackboard ? Why?
20. Is there anything else you would like to tell us about your experience using
Blackboard this semester?
Part III: Help & Support
Please let us know how useful the help and support services are by answering
the questions below.
21. What technical problems have you experienced with Blackboard so far?
22. If there are other resources you would use to help you resolve your issues,
please, specify what they are:
23. Please provide any suggestions/improvements for Help & Support:
24. If there is other kind of support you prefer, please, specify what it is:
Thank you!
We appreciate the time you have spent in providing us with feedback that will
help us make better decisions regarding the future of eLearning at Penn State.
Submission
Please click the "Submit" button to submit your survey responses.
Appendix D: Faculty End-Term Survey: Blackboard Pilot,
Summer 2014
Please help us evaluate the effectiveness of the pilot to date by completing the brief survey. Your
responses will help us in determining what adjustments can be made to increase the power of the
experience.
Part I: Demographic Information
1. For how many years have you been an instructor/faculty member in
higher education?
o
1 year or less
o
2 - 5 years
o
6 - 10 years
o
11 - 20 years
o
21 - 30 years
o
More than 30 years
2. What is your gender?
o
Female
o
Male
o
Other:
3. At which campus are you a faculty member?
4. In which course during Summer 2014 do you use Blackboard?
(If you use Blackboard in multiple courses, please indicate them as well).
5. Please indicate in what form the course is delivered.
(Choose one BEST answer)
o
Face - to - face
o
In a hybrid format using a blend of face-to-face and online interaction
o
Online with face-to-face interaction only for exams
o
Only online with no face-to-face interaction
Part II: Use of Technology
6. Please indicate the level of comfort in using different types of
technology.
o
Very Uncomfortable
o
Somewhat Uncomfortable
o
Somewhat Comfortable
o
Very Comfortable
o
Other:
7. Which device(s) do you currently use to interact with Blackboard?
(Choose all that apply)
o
Mobile phone without web access
o
Mobile phone with web access
o
Portable media player without web access (e.g., mp3 player)
o
Portable media player with web access (e.g., iPod Touch)
o
Ebook reader (e.g., Kindle)
o
Tablet (e.g., iPad)
o
Laptop/Netbook computer
o
Other:
Part III: Feedback on Blackboard
8. Please rate your level of satisfaction with the Blackboard tools and
features designed to support the following teaching and course management
tasks:
(Rating scale for each item: Did Not Use / Not at all Satisfied / Slightly Satisfied / Moderately Satisfied / Highly Satisfied)
- Creating and publishing the course syllabus (Content)
- Creating a course calendar (Course Calendar)
- Posting course announcements (Announcements)
- Uploading, organizing, and sharing course files (Content Collection > Course Content)
- Posting audio/video lectures or other multimedia (Content Collection > Course Content; Content Area > Video)
- Creating course web pages (Content Area > Blank Page)
- Organizing course content, activities, and assessments into a series of modules or lessons (Content Area, Learning Modules)
- Posting assignments (Content > Assignment)
- Assigning individual and collaborative writing tasks (Journals, Wikis, Blogs)
- Assigning peer reviews on student work (Self and Peer Assessment, Wikis, Blogs)
- Using Turnitin originality checking on assignments (Turnitin Direct Assignment)
- Creating and administering online quizzes, tests, and/or surveys (Tests, Surveys, and Pools)
- Facilitating graded and ungraded discussions (Discussions)
- Giving feedback on and/or grading student submissions (GradeCenter > Needs Grading)
- Creating and using rubrics to grade student work (Rubrics, Grade Center)
- Setting up and using the gradebook to enter and track student grades (Grade Center)
- Monitoring course activity and student progress (Course Reports, Performance Dashboard, Retention Center)
- Sending and receiving messages to and from students and groups (Course Messages, Send Email)
- Creating and managing groups for group assignments, group discussions, and/or group projects (Groups)
- Conducting online chat sessions (Collaborations > Chat)
- Keeping track of your course tasks (Calendar, To Do, Needs Attention)
- Importing or exporting course content (Packages and Utilities)
- Integrating an external learning tool or platform with my course, e.g., SoftChalk Cloud, Piazza, etc. (Web Link)
- Customizing the navigation, look, and feel of your course (Quick Setup Guide, Teaching Style)
- Connecting or encouraging students to connect with Blackboard users and groups within or outside of your course (Blackboard Global Learning Network)
9. Please rate the overall ease of use of Blackboard LMS.
o
Difficult to Use
o
Slightly Easy to Use
o
Moderately Easy to Use
o
Very Easy to Use
10. Please rate the overall usefulness of Blackboard for your teaching.
o
Not at all Useful
o
Slightly Useful
o
Moderately Useful
o
Highly Useful
11. Please rate the overall usefulness of Blackboard’s online
documentation.
o
Did Not Use
o
Not at all Useful
o
Slightly Useful
o
Moderately Useful
o
Highly Useful
12. What did you like MOST about Blackboard? Why?
13. What did you like LEAST about Blackboard? Why?
14. Which, if any, features/tools in Blackboard allowed you to design your
course and/or teach in a new way?
15. Please rate your level of agreement with the following statements about
Blackboard.
(Rating scale for each statement: Strongly Disagree / Disagree / Neither Agree nor Disagree / Agree / Strongly Agree / Not Applicable)
- Blackboard enabled me to do what I wanted for my course(s).
- Blackboard was easy for my students to learn how to use.
- Blackboard increased my efficiency as a teacher.
- Blackboard increased my effectiveness as a teacher.
- Blackboard was a valuable aid to me in my teaching.
- Using Blackboard was beneficial to my students' overall learning.
16. Please indicate the average number of hours per week using Blackboard
LMS.
o
Never
o
Fewer than 5 hours
o
5-10 hours
o
11-15 hours
o
16-20 hours
o
More than 20 hours per week
17. Is there anything else you would like to tell us about your experience
using Blackboard this semester?
Thank You!
We appreciate the time you have spent in providing us with feedback that will
help us make better decisions regarding the future of eLearning at Penn State.
Submission
Please click the "Submit" button to submit your survey responses.
Appendix E: Student End-Term Survey: Blackboard Pilot,
Summer 2014
Please help us evaluate the effectiveness of the pilot to date by completing the brief survey. Your
responses will help us in determining what adjustments can be made to increase the power of the
experience.
Part I: Demographic Information
1. What is your current academic level?
2. At which campus are you enrolled as a student?
3. What is your age?
4. What is your gender?
5. Which course are you enrolled in during summer 2014 that uses
Blackboard?
(If you used Blackboard in multiple courses, please indicate them as well).
6. Please indicate in what form the course was delivered.
(Choose one BEST answer)
o
Face - to - face
o
In a hybrid format using a blend of face-to-face and online interaction
o
Online with face-to-face interaction only for exams
o
Only online with no face-to-face interaction
Part II: Use of Technology
7. Please indicate the level of comfort in using different types of
technology.
o
Very Uncomfortable
o
Somewhat Uncomfortable
o
Somewhat Comfortable
o
Very Comfortable
o
Other:
8. What type(s) of networked device(s) do you currently use on a regular
basis?
(Choose all that apply)
o
Mobile phone
o
Portable media player (e.g., iPod Touch) (e.g., mp3 player)
o
Ebook reader (e.g., Kindle)
o
Tablet (e.g., iPad, Nexus, Galaxy)
o
Laptop/Netbook computer
o
Desktop computer
o
Other:
9. On average, how many hours per week did you spend in Blackboard for
this course?
o
Never
o
Fewer than 5 hours
o
5-10 hours
o
11-15 hours
o
16-20 hours
o
More than 20 hours per week
Part III: Feedback on Blackboard
10. Please rate the usefulness of the following features of Blackboard in
contributing to your learning in this course.
(Rating scale for each feature: Did Not Use This Feature / Not at all Useful / Slightly Useful / Moderately Useful / Highly Useful)
- Announcements (for reading announcements and other timely news and information posted by your instructor or department)
- Assignments (for submitting individual or group assignments)
- Blog and Wikis (for individual and group writing tasks assigned by your instructor)
- Calendar (for managing your personal calendar and viewing course events and due dates)
- Chat (for live text messaging with classmates and other Blackboard users)
- Content (for viewing course materials and completing activities organized into lessons or modules)
- Course Messages (for sending and receiving messages to and from your instructor and other students)
- Discussions/Discussion Board (for participating in online discussions with the entire class or in small groups)
- Groups (for collaborating with a specific group of students on assignments, discussions, blogs, wikis, or projects)
- Journal (for keeping a learning journal shared with your instructor)
- Content > My Content (for storing personal files related to your course work)
- My Grades (for viewing a list of the graded items in the course and the grades you received)
- Quizzes/Tests (for taking and receiving feedback on online quizzes, tests, and self-assessments)
- Roster (for viewing a list of the other people in the course)
- Rubrics (for understanding how your work will be or was graded)
- Send Email (for sending messages to the external email account of other course members)
- Surveys (for taking online surveys)
- Tasks (for completing a list of tasks prepared by the instructor)
11. Please rate the overall ease of use of Blackboard LMS.
o
Difficult to Use
o
Slightly Easy to Use
o
Moderately Easy to Use
o
Very Easy to Use
12. Please rate the overall usefulness of Blackboard’s online
documentation.
o
Did Not Use
o
Not at all Useful
o
Slightly Useful
o
Moderately Useful
o
Highly Useful
13. What did you like MOST about Blackboard ? Why?
14. What did you like LEAST about Blackboard ? Why?
15. Please rate your level of agreement with the following statements about
Blackboard.
(Rating scale for each statement: Strongly Disagree / Disagree / Neither Agree nor Disagree / Agree / Strongly Agree / Not Applicable)
- Blackboard helped me to learn the course materials/content.
- Blackboard helped me to study for exams/tests.
- Blackboard helped me to complete course assignments.
- Blackboard helped me to take quizzes/exams.
- Blackboard helped me to make efficient use of my time in the course.
- Blackboard helped me to be in control of my own learning in the course.
- Blackboard helped me to communicate with my professor.
- Blackboard expanded access to learning materials/resources available to me (e.g., print, audio, video, etc.).
- Blackboard was beneficial to my overall learning in the course.
16. Is there anything else you would like to tell us about your experience
using Blackboard this semester?
Thank you!
We appreciate the time you have spent in providing us with feedback that will
help us make better decisions regarding the future of eLearning at Penn State.
Submission
Please click the "Submit" button to submit your survey responses.
Appendix F: Blackboard Pilot Survey - “Lessons Learned”
Data Source: IDs and IPSs from World Campus Learning Design who worked on
courses in Bb in SU15.
1. From what you can remember, what did the Blackboard LMS do well?
2. From what you can remember, what did the Blackboard LMS do poorly?
3. What method did you use to start to build your course in Bb? (import/export, scratch,
other)
3a. If you used built in import/export options, what worked and what didn’t work?
4. What re-configuration, if any, of activities, tools, settings, gradebook, etc. did you have to
do in order to build your course in Bb?
5. What re-design work, if any, did you have to do in order to build your course in Bb?
[NOTE: Work that typically requires an instructional designer and/or faculty is considered
re-design.]
Appendix G: Focus Group with Support Staff
Blackboard Learn Meeting Minutes “Lessons Learned”
Date: August 13, 2014
Conducted via Adobe Connect by Brett Bixler
Pros Bb Training
- Separate ID & Faculty Trainings
- Ability to ask Bb experts questions
- Hands-on training is critical
- Weekly Webinars
Cons Bb Training
- Need to set up training sooner.
- Need training chunked/split up. Two-day sessions are too much.
- If Bb could provide strong videos and training materials for faculty and IDs to review, then conducting Collaborate sessions would have been better.
- Maybe flip the trainings – allow people to try things out first, then come together for assistance & questions answered.
- Stress to Bb that this has been an ANGEL institution for many years, and their trainers should put more emphasis on comparing the two.
Pros Internal Process
- Good communication – open processes
- Use of Basecamp for project management
- Project web site
- Implementation of sandboxes in training using pre-planned assignments
- Moderators to help people attending from a distance
- Use of Basecamp as a project management tool
- Recording training sessions and posting them on the website
- Trainees were doing instead of just watching
Cons Internal Process
- Needed to start sooner on meetings, planning, and the web site – as soon as the project is approved. Need a clear division of labor.
- Need a streamlined way to ask questions of Bb yet maintain a ticketed system for data analysis after the pilot. We need a go-to person at Bb for direct support.
- Revisit use of Bb Collaborate for meetings – maybe use Adobe Connect instead.
- Should have a daily itinerary for training.
- Should be able to offer Blackboard feedback on their training.