Investigating Perceptions and Uses of Learning Management Systems:
Saving Time or Innovating Practice?
Steven Lonn & Stephanie D. Teasley
University of Michigan, Digital Media Commons and School of Information
2281 Bonisteel Blvd., Ann Arbor, MI 48109-2094
Email: slonn@umich.edu, steasley@umich.edu
Paper presented at the Annual Meeting of the American Educational Research Association, New York, March 26, 2008
Abstract: Learning Management Systems (LMS) are web-based systems that allow instructors
and/or students to share materials, submit and return assignments, and communicate online. In this
study, we explored the uses and perceived benefits of using a LMS as reported by instructors and
students at a large Midwestern university. We examined three years of survey data focusing on
specific uses of the LMS that emphasized either efficient communication or interactive teaching
and learning practices. We matched aggregate user log data with corresponding survey items to
see if system use was consistent with patterns seen in the survey results. Findings suggest that
instructors and students value tools for efficient communication over interactive tools for
innovating existing practices. However, survey item analysis reveals that instructors and students
also highly value teaching and learning tools within the LMS.
Technology-enabled learning is increasingly important and pervasive in higher
education. Web-based systems, called Course Management Systems (CMS) or Learning
Management Systems (LMS), allow instructors and students to share instructional
materials, make class announcements, submit and return course assignments, and
communicate with each other online. A 2005 report showed that over 90% of all
universities and colleges have established one or more LMS-type products for student
and faculty use (Hawkins, Rudy, & Nicolich, 2005). While the adoption rate for these
systems has been rapid, very little is known about when and how these systems benefit
learning (Koszalka & Ganesan, 2004).
Early adopters of LMS in higher education have typically come to these
applications because they promise to make teaching more efficient (Morgan, 2003).
Although most LMS are used for the distribution, management, and retrieval of course
materials, these systems can also incorporate functionality that supports interaction
between students and instructors and among students (West, Waddoups, & Graham,
2007) to provide opportunities for enabling institutional innovations in learning and
education (Dutton, Cheong, & Park, 2003).
Increasingly, LMS provide tools for the kinds of active online engagement
preferred by today’s generation of students, such as discussion tools, chat rooms, wikis,
and blogs. These tools provide opportunities for using LMS that are consistent with
constructivist approaches to learning rather than simple transmission of knowledge
models. Specifically, LMS can facilitate a shift from "the transmission of information
towards the management and facilitation of student learning" (Coaldrake & Stedman,
1999, p. 7). However, for this shift to occur both faculty and students will need to
recognize the opportunities provided by the system and use them to innovate teaching
and learning practices.
In our study, we explored the perceived benefits and actual use of a LMS by
instructors and students from our campus where the system supplements traditional face-to-face classroom instruction. We examined three years of survey responses focusing
specifically on items relating to efficiency versus more interactive teaching and learning
practices. Aggregated log data from the LMS for the most recent academic semester was
also examined to see if system use was consistent with beliefs about use shown in the
survey results.
Prior Survey-Based Studies of Learning Management Systems
Morgan (2003) surveyed 740 faculty and instructional staff across the 15
institutions of the University of Wisconsin system. The survey found that instructors
adopt LMS principally to manage administrative and relatively mundane tasks associated
with teaching, particularly in large lecture courses. Faculty also used the system to achieve
a number of teaching goals, including supplementing lecture materials, increasing
transparency and feedback, and increasing contact with and among students. In the
process of using the various LMS tools, many instructors reported that they began to
rethink and restructure their courses, and ultimately their teaching, resulting in a kind of
"accidental pedagogy." However, others have claimed that as LMS design develops and
interacts with other available technologies, LMS tools can be used deliberately to
positively impact teaching and learning (e.g. McGee, Carmean, & Jafari, 2005).
An online survey of 57 faculty members using WebCT at several universities in
Switzerland also found that communication with students is an important reason to use
LMS (Holm, Röllinghoff, & Ninck, 2003). In an examination of perceived usefulness,
instructors reported that they valued the content module (file management) and threaded
discussion tool the most, and the chat tool the least. This result was somewhat surprising
to the authors, since several instructors had anecdotally reported that asynchronous
communication via this tool was particularly useful for communicating with their
students. The authors conclude that it is not the LMS tool per se that is useful or not
useful, but rather the way the tool is used in a given course and whether the tool helps
instructors and/or students achieve the desired course goals.
In a study designed to ask students about their use of WebCT, Herse and Lee
(2005) surveyed 113 students from the School of Optometry and Vision Science at the
University of New South Wales. Students ranked lecture notes and handouts tools as
more highly useful than oral presentations and written assignments. Although the results
of the survey indicated that students still preferred passive instead of active modes of
learning, the authors note that the "use of WebCT or similar web-based learning tools can
be used as a catalyst for self-reflection and to help facilitate change from passive to active
learning" (p. 51).
In a study of WebCT and Blackboard use at 3 colleges (Williams, Brandeis, and
Wesleyan), researchers sought to determine: 1) if instructors and students perceived a
learning value from using LMS, 2) what web-based processes provided the most learning
benefit, and 3) if views differed between instructors and students (Hanson & Robson,
2004). When asked to select the benefits of LMS, both instructors and students chose
"saves time" more often than "improves learning." Indeed, features that supported making
class information and readings available online were most highly valued. With respect to
learning benefits, instructors highly valued online discussions while students responded
favorably about LMS features that allowed online access to grades, sample quizzes, and
audiovisual materials.
Ansorge and Bendus (2003) surveyed instructors, students, and administrators at
the University of Nebraska-Lincoln about the educational impact of LMS. All
respondents reported that their main objective when using LMS was to post, review, or
print documents for traditional classroom courses. Despite a high overall satisfaction
rating by instructors who used the LMS, many reported a lack of interest, a lack of
recognition by administrators for using online tools, difficulty finding time to prepare for
courses, and difficulty finding time to learn about using LMS tools. Overall, students
viewed the LMS as user-friendly and helpful in their learning experience.
In a 2003 survey study of 172 faculty members at Colorado State University,
significantly more WebCT adopters than non-adopters reported that technology saved
them time on their daily tasks and enabled them to improve their teaching (Yohon,
Zimmerman, & Keeler, 2004). Instructors were found to use content publishing tools the
most while interactive tools such as chat and threaded discussion were seldom used.
These researchers concluded that WebCT was used more to increase instructors'
productivity than to increase opportunities for higher order learning or more active
student-centered teaching strategies.
West, Waddoups, and Graham (2007) conducted interviews and surveys about
how instructors at Brigham Young University implemented Blackboard into their
instruction. Using Rogers' (2003) model for understanding the adoption decision process,
they found that instructors grapple with several small decisions as they weigh the
advantages and disadvantages in each stage of their adoption. Although West and his
colleagues found that most instructors used Blackboard for the distribution, management
and retrieval of course materials, the authors note that LMS are increasingly
incorporating functionality that supports communication between students and instructors
and among students.
While some of the studies described above found a preference for efficiency-oriented
activities over the teaching and learning functions of the LMS, very few asked
instructors and students to rate the value of these activities, and none asked these
questions over multiple years. Additionally, no prior study has linked survey responses
with actual use of the LMS as recorded in user logs. Our study
aims to fill this gap in the literature by examining the perceived benefits and actual use of
a LMS by instructors and students from our campus.
Method
Data Sources
Our study used data from an online survey administered annually in the spring
of 2005, 2006, and 2007 at a large Midwestern research university. All instructional
faculty and a random sample of 25% of university students (stratified by school/college)
were invited to participate in this survey. Table 1 shows our sample size and response
rate for each year.
Table 1: Survey Sample Size and Response Rate by Year

                Instructors                    Students
Year        Sample        Response Rate    Sample        Response Rate
2005        n = 782       14%              n = 2,039     23%
2006        n = 1,357     19%              n = 2,485     27%
2007        n = 1,481     20%              n = 2,281     26%
The LMS examined in this study is based on the Sakai (http://sakaiproject.org)
community-source architecture. This environment is comparable to other popular systems
such as Blackboard, WebCT (http://www.webct.com), and Moodle (http://moodle.org).
The data on system use was generated from the event logs from the most recent winter
semester (January-April, 2007). Event logs capture when a user takes a particular action
with a specific tool, such as downloading a document or posting an announcement. There
were a total of 12,238,089 events during this timeframe.
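
As a rough illustration of how event logs of this kind can be aggregated by tool, the sketch below (in Python, with hypothetical column names such as "timestamp" and "tool" standing in for the actual export format) counts events per tool and expresses each count as a share of all user actions. It is a minimal sketch under those assumptions, not the processing pipeline used for this study.

    import pandas as pd

    # Hypothetical export of the LMS event log; column names are assumptions.
    events = pd.read_csv("lms_event_log.csv", parse_dates=["timestamp"])

    # Restrict to the semester of interest (January-April 2007).
    semester = events[(events["timestamp"] >= "2007-01-01") &
                      (events["timestamp"] < "2007-05-01")]

    # Count events per tool and express each count as a share of all user actions.
    by_tool = semester.groupby("tool").size().sort_values(ascending=False)
    print((by_tool / by_tool.sum() * 100).round(2))
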
Design & Procedure
Our analysis of the survey data began with looking at how respondents answered
general questions about information technology (IT) improving teaching and learning and
a question about their perception of the most valuable benefit from using IT. Based on the
results for this question, we identified other survey questions that could be categorized as
either "efficient communication" or "teaching & learning" items. Our analyses also
looked at the respondents’ role (instructor/student), computer expertise, use/preference
for IT in courses, and the survey year. These results were also compared to system log
data that reported aggregated user actions in the LMS.
Results
Because our survey invitation was not sent only to users of the LMS, we
examined the percentage of respondents that did use our LMS (see Table 2). We found
that the majority of instructor and student respondents used the LMS and usage increased
each year with the biggest increase in use being for instructors between 2005 and 2006.
This increase corresponded to a mandate from the dean of the university’s largest school
to use the LMS for all introductory-level classes.
Table 2: Usage of LMS by Year

Year        % of Instructors Using LMS     % of Students Using LMS
2005        54%                            91%
2006        81%                            98%
2007        85%                            99%
Does Information Technology Improve Teaching and Learning?
Survey respondents were asked to rate their agreement with two statements about
whether the use of IT in courses improved instructors' teaching and students' learning (see
Table 3) using a 5-point Likert scale from 1=strongly disagree to 5=strongly agree. Since
students were asked about instructors' teaching only in the 2006 and 2007 surveys, our
analysis here is limited to those two years. Instructors and students generally agreed that
IT improved teaching and learning, although students were less positive about the effect
of IT on instructors' teaching (t(7,378) = 10.254, p < .001).
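
For readers less familiar with the test reported above, the sketch below shows an independent-samples t-test of the same form in Python. The ratings are simulated for illustration only; they are not the survey data.

    import numpy as np
    from scipy import stats

    # Simulated 5-point Likert ratings (1 = strongly disagree ... 5 = strongly agree).
    rng = np.random.default_rng(0)
    instructor_ratings = rng.choice([1, 2, 3, 4, 5], size=2698,
                                    p=[0.03, 0.06, 0.18, 0.42, 0.31])
    student_ratings = rng.choice([1, 2, 3, 4, 5], size=4682,
                                 p=[0.05, 0.09, 0.24, 0.41, 0.21])

    # Independent-samples t-test comparing the two groups' mean ratings.
    t_stat, p_value = stats.ttest_ind(instructor_ratings, student_ratings)
    df = len(instructor_ratings) + len(student_ratings) - 2
    print(f"t({df}) = {t_stat:.3f}, p = {p_value:.3g}")
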
Table 3: Mean Ratings for Survey Items about Information Technology's Effect on Teaching and Learning

                                       Mean
Survey Item                            Instructors (n=2,698)   Students (n=4,682)   Difference
IT improves instructors' teaching      3.92                    3.70                 .22*
IT improves students' learning         3.92                    3.96                 .04
Note: * p < .001
Most Valuable Benefits of Information Technology
We evaluated a forced choice survey item that asked respondents to specify
"Which one of the following benefits from using information technology in your courses
was the most valuable to you?" (see Table 4). Overall, there was a significant difference
in how instructors and students answered this question (X2(7, N=9,657) = 1354.990, p <
.001).
Table 4: Results of Survey Item about Most Valuable Benefit from Using Information Technology in Courses

Most Valuable Benefit                             Instructors (n=3,155)   Students (n=6,502)
Improved Instructor Teaching                      8%                      1%
Improved Student Learning                         7%                      11%
Efficiency (Saves Time)                           26%                     51%
Helped Manage Course Activities                   11%                     16%
Improved Student to Instructor Communication      4%                      3%
Improved Instructor to Student Communication      40%                     14%
Improved Student to Student Communication         3%                      3%
No Benefits                                       1%                      1%
The most popular response from instructors (40%) was "Improved my communication to
my students" while the most popular response from students (51%) was "Efficiency
(saves time)." However, 26% of instructors choose "Efficiency (saves time)" indicating
that efficiency is important to many instructors as well as their students. Very few
instructors or students chose teaching or learning improvements as the most valuable
benefit from using IT in their courses, nor did many choose the items about student to
instructor communication or student to student communication. This suggests that these
systems are valued most for the ways they improve instructors’ ability to push out
information to students rather than general support for teaching and learning, or for
opening up communication from the students to the instructor or from student to student.
LMS Event Log Data & Corresponding Survey Results
We investigated the aggregated LMS event logs to see if actual system use was
consistent with the preferences expressed in the survey data. The user action logs were
aggregated by tool for the four months (one semester) immediately preceding the most
recent IT/LMS survey. We also compared the use of each tool to the corresponding 2007
survey items about the perceived value of each LMS tool (see Table 5) asking the
respondent if they considered the tool to be not valuable, valuable, or very valuable.
There were significant differences in how instructors and students valued the following
tools: Resources (X2(2, N=2,840) = 27.680, p < .001), Assignments (X2(2, N=2,785) =
38.908, p < .001), Announcements (X2(2, N=3,028) = 107.118, p < .001), Discussion
(X2(2, N=1,678) = 15.465, p < .001), and Syllabus (X2(2, N=2,873) = 29.114, p < .001).
Overall, document management and broadcast-oriented communication tools (Resources,
Assignments, Announcements, Schedule, and Syllabus) are heavily used (95% of all user
actions) and very few instructors or students rated them as not valuable. By contrast,
tools that are more interactive (Chat, Discussion, and Wiki) are not used as much (5% of
all user actions) and, compared with the other tools, more instructors and students rated
them as not valuable.
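
The role-by-rating comparisons reported above are chi-square tests of independence on 2 x 3 contingency tables (role by Not Valuable / Valuable / Very Valuable). The sketch below reconstructs approximate counts for the Resources tool from the percentages in Table 5; because of rounding, it will not reproduce the reported statistic exactly and is intended only to show the form of the test.

    from scipy.stats import chi2_contingency

    # Approximate counts reconstructed from the Table 5 percentages for Resources.
    #             Not Valuable  Valuable  Very Valuable
    resources = [[10,           282,       716],    # Instructors (n = 1,008)
                 [37,           678,      1117]]    # Students    (n = 1,832)

    chi2, p, dof, expected = chi2_contingency(resources)
    n = sum(sum(row) for row in resources)
    print(f"X2({dof}, N={n}) = {chi2:.3f}, p = {p:.3g}")
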
Table 5: Aggregated LMS User Action Log Reports by Tool and Corresponding Tool Value Results from the 2007 User Survey

Tool            % of All   Role          n       % Not      %          % Very      X2
                Actions                          Valuable   Valuable   Valuable
Resources       91.54%     Instructors   1,008   1%         28%        71%         27.680*
                           Students      1,832   2%         37%        61%
Assignments     2.39%      Instructors     883   3%         42%        55%         38.908*
                           Students      1,902   4%         54%        42%
Chat            1.19%      Instructors     293   38%        44%        18%         1.129
                           Students      1,205   38%        46%        16%
Announcements   0.59%      Instructors   1,075   2%         33%        65%         107.118*
                           Students      1,953   3%         52%        45%
Schedule        0.57%      Instructors     520   20%        49%        31%         1.706
                           Students      1,284   22%        50%        28%
Discussion      0.55%      Instructors     390   28%        51%        21%         15.465*
                           Students      1,288   35%        52%        13%
Wiki            0.14%      Instructors     126   33%        48%        18%         .097
                           Students        461   34%        47%        19%
Syllabus        0.08%      Instructors     941   6%         42%        52%         29.114*
                           Students      1,932   2%         45%        53%
Note: * p < .001
Efficient Communication and Teaching & Learning Activities
We categorized several survey items about specific activities within the LMS as
either supporting "efficient communication" or "teaching & learning" (see Table 6).
These items were rated either Not Valuable, Valuable, or Very Valuable. Since 7 of these
11 items were only on the 2006 and 2007 surveys, our analysis here is limited to these
two years.
Table 6: Survey Items by Category

Efficient Communication
• Post/Access a syllabus online
• Send/Receive announcement messages (instructor to student)
• Post/Access online readings & supplementary course materials
• Give/Take exams & quizzes online for grading purposes
• Students turn in assignments online
• Instructors return assignments to students online with comments & grade

Teaching & Learning
• Post/Access sample exams & quizzes for learning purposes
• Post/Access lecture outline or notes after the lecture
• Students post questions before lecture
• Students post questions after lecture
• Students read and/or comment on each other's course-based work
We also used two items to characterize our population: computer expertise
(novice, intermediate, or advanced) and overall use (instructors) or preference for use
(students) of IT in their courses (none, limited, moderate, extensive, exclusive).
Preliminary analysis indicated that there was no significant difference between instructors
and students regarding computer expertise, but there was a significant difference with
regard to use/preference of IT, (t (10405) = 19.747, p < .001) where students reported
preferring a higher level of IT use than did instructors.
We examined independently each of our four respondent characteristics (role,
survey year, computer expertise, use/preference of IT). Our first analysis looked at
differences between instructors and students, aggregated over two years (see Table 7).
There were significant differences between instructors' and students' value ratings for all
of the efficient communication items: Post/Access a syllabus (X2(2, N=6,087) = 64.226, p
< .001), Send/Receive announcements (X2(2, N=6,165) = 152.652, p < .001), Post/Access
online readings (X2(2, N=6,157) = 68.998, p < .001), Students turn in assignments online
(X2(2, N=4,344) = 42.386, p < .001), and Instructors return assignments online (X2(2,
N=2,868) = 49.909, p < .001). Overall, more instructors than students rated these
activities as very valuable, although very few instructors or students rated them as not
valuable.
Table 7: Mean Differences between Instructors and Students for Efficient Communication Survey Items

Survey Item                                 Role          n       % Not      %          % Very      X2
                                                                  Valuable   Valuable   Valuable
Post/Access syllabus                        Instructors   1,945   5%         43%        52%         64.226*
                                            Students      4,142   2%         40%        58%
Send/Receive announcements                  Instructors   2,016   4%         30%        67%         152.652*
                                            Students      4,149   6%         43%        51%
Post/Access online readings                 Instructors   2,019   1%         29%        70%         68.998*
                                            Students      4,138   2%         39%        59%
Give/Take online exams & quizzes            Instructors     346   24%        47%        29%         3.888
                                            Students      1,428   24%        52%        24%
Students turn assignments online            Instructors   1,000   10%        44%        45%         42.386*
                                            Students      3,344   13%        53%        34%
Instructors return assignments online       Instructors     641   15%        40%        45%         49.909*
                                            Students      2,227   18%        52%        33%
Note: * p < .001
There were also significant differences in how instructors and students valued all
but one of the teaching & learning items (see Table 8): Post/Access sample exams &
quizzes (X2(2, N=4,577) = 37.507, p < .001), Post/Access lecture notes/outline after
lecture (X2(2, N=5,329) = 47.946, p < .001), Students post questions before lecture (X2(2,
N=2,045) = 11.185, p = .004), and Students read/comment on each other's work (X2(2,
N=2,016) = 56.283, p < .001). Overall for these activities, the not valuable rating was
used by a higher percentage of instructors and students than for the efficient
communication activities.
Table 8: Mean Differences Between Instructors and Students for Teaching & Learning Survey Items

Survey Item                                   Role          n       % Not      %          % Very      X2
                                                                    Valuable   Valuable   Valuable
Post/Access sample exams & quizzes            Instructors   1,123   4%         44%        52%         37.507**
                                              Students      3,454   3%         34%        63%
Post/Access notes or outline after lecture    Instructors   1,439   2%         53%        45%         47.946**
                                              Students      3,890   3%         42%        55%
Students post questions before lecture        Instructors     484   15%        54%        31%         11.185*
                                              Students      1,561   22%        51%        27%
Students post questions after lecture         Instructors     531   12%        66%        22%         5.291
                                              Students      1,905   16%        62%        22%
Students read/comment on each other's work    Instructors     493   13%        59%        28%         56.283**
                                              Students      1,523   26%        59%        15%
Note: * p < .01  ** p < .001
Further Analyzing Efficient Communication and Teaching & Learning Activities
For each variable analyzed above in Tables 7-8, the two years of survey data were
collapsed. To tease out what was driving the differences between instructors and students
and if their answers changed from year to year, we next employed a statistical procedure
using "frequency ratios." That is, we took the number of respondents for each role who
answered "valuable" or "very valuable" and divided by the number for respondents for
each role who answered "not valuable" (see Tables 9-10). For example, if the number of
respondents who answered "very valuable" was exactly the same number of those who
answered "not valuable," the frequency ratio would be 1.
Table 9: Frequency Ratios for Instructors and Students for Efficient Communication Survey Items

                                                               --------- 2006 ---------       --------- 2007 ---------
Survey Item                              Role          n       Valuable /    Very Valuable /   Valuable /    Very Valuable /
                                                               Not Valuable  Not Valuable      Not Valuable  Not Valuable
Post/Access syllabus                     Instructors   1,945   6.49          7.14              11.78         15.64
                                         Students      4,142   18.89         27.24             27.67         40.79
Send/Receive announcements               Instructors   2,016   6.91          12.98             11.32         30.72
                                         Students      4,149   5.47          5.29              10.68         15.11
Post/Access online readings              Instructors   2,019   17.44         41.94             27.82         66.91
                                         Students      4,138   15.54         23.20             20.92         31.92
Give/Take online exams & quizzes         Instructors     346   1.38          0.96              2.62          1.49
                                         Students      1,428   2.25          1.30              2.12          0.77
Students turn assignments online         Instructors   1,000   3.57          3.80              4.82          4.80
                                         Students      3,344   3.75          2.16              4.44          3.16
Instructors return assignments online    Instructors     641   2.59          2.98              2.78          3.00
                                         Students      2,227   2.64          1.35              3.22          2.03
Table 10: Frequency Ratios for Instructors and Students for Teaching & Learning Survey Items

                                                                    --------- 2006 ---------       --------- 2007 ---------
Survey Item                                   Role          n       Valuable /    Very Valuable /   Valuable /    Very Valuable /
                                                                    Not Valuable  Not Valuable      Not Valuable  Not Valuable
Post/Access sample exams & quizzes            Instructors   1,123   9.68          12.32             13.29         15.10
                                              Students      3,454   14.92         30.42             11.04         17.67
Post/Access notes or outline after lecture    Instructors   1,439   23.87         19.47             23.53         21.00
                                              Students      3,890   12.79         16.18             17.31         23.51
Students post questions before lecture        Instructors     484   3.29          1.97              4.03          2.14
                                              Students      1,561   2.10          1.13              2.68          1.36
Students post questions after lecture         Instructors     531   5.12          1.67              5.52          1.79
                                              Students      1,905   3.53          1.17              4.08          1.50
Students read/comment on each other's work    Instructors     493   4.53          2.13              4.62          2.12
                                              Students      1,523   2.03          0.52              2.67          0.71
Below we present the outcome of the frequency ratios for each survey item,
beginning with the efficient communication items. For posting/accessing a syllabus (see
Figure 1), there are significant differences between instructors and students for
respondents who answered "valuable" (X2(1, N=2,646) = 38.621, p < .001) and those who
answered "very valuable" (X2(1, N=3,614) = 56.646, p < .001). There are also
significant differences between years for respondents who answered "valuable" (X2(1,
N=2,646) = 10.116, p = .002) and those who answered "very valuable" (X2(1, N=3,614) =
13.425, p < .001). Overall, more students rated this activity as "valuable" and "very
valuable" than did instructors, although both instructors' and students' ratings increased
over time.
Figure 1: Frequency Ratios by Survey Year for Post/Access a Syllabus
For sending/receiving an announcement (see Figure 2), there is a significant difference
between instructors and students for respondents who answered "very valuable" (X2(1,
N=3,774) = 35.456, p < .001). There are also significant differences between years for
respondents who answered "valuable" (X2(1, N=2,688) = 24.557, p < .001) and those who
answered "very valuable" (X2(1, N=3,774) = 64.500, p < .001). Overall, more instructors
rated this activity as "very valuable" than did students, although both instructors' and
students' ratings increased over time.
Figure 2: Frequency Ratios by Survey Year for Send/Receive Announcement Messages
For posting/accessing online readings and supplementary course materials (see Figure 3),
there is only a significant difference between instructors and students for respondents
who answered "very valuable" (X2(1, N=3,959) = 8.671, p = .003). Overall, more
instructors rated this activity as "very valuable" than did students, although more students
also rated this activity as "very valuable" than as "valuable." Both instructors' and students'
ratings increased over time.
Figure 3: Frequency Ratios by Survey Year for Post/Access Online Readings
For giving/taking exams and quizzes online for grading purposes (see Figure 4),
there is only a significant difference between survey years for respondents who answered
"very valuable" (X2(1, N=870) = 5.148, p = .023). The proportion of students who rated
this activity as "valuable" or "very valuable" decreased over time, while the proportion of
instructors who rated this activity "valuable" or "very valuable" increased over time. For
both instructors and students, there were more respondents who rated this activity as
"valuable" than "very valuable."
Figure 4: Frequency Ratios by Survey Year for Give/Take Online Exams & Quizzes
For having students turn in assignments online (see Figure 5), there is a significant
difference between instructors and students for respondents who answered "very
valuable" (X2(1, N=2,132) = 15.942, p < .001). There are also significant differences
between years for respondents who answered "valuable" (X2(1, N=2,754) = 4.134, p =
.042) and those who answered "very valuable" (X2(1, N=2,132) = 11.454, p < .001).
About the same proportion of instructors rated this activity as "valuable" and "very
valuable" in both survey years, while there was a lower proportion of students who rated
this activity as "very valuable." Both instructors' and students' ratings increased over
time.
Figure 5: Frequency Ratios by Survey Year for Students Turn In Assignments Online
For having instructors return assignments to students online with comments and grade(s)
(see Figure 6), there is a significant difference between instructors and students for
respondents who answered "very valuable" (X2(1, N=1,448) = 18.963, p < .001). There
is also a significant difference between years for respondents who answered "very
valuable" (X2(1, N=1,448) = 7.797, p = .005). About the same proportion of instructors
rated this activity as "valuable" and "very valuable" in both survey years, while there was
a lower proportion of students who rated this activity as "very valuable." Students' ratings
increased over time while instructors' ratings remained about the same.
Figure 6: Frequency Ratios by Survey Year for Instructors Return Assignments to Students Online
We next present the outcome of the frequency ratios for each of the "teaching & learning"
survey items using the same statistical methods as above. For posting/accessing sample
exams and quizzes for learning purposes (see Figure 7), there is a significant difference
between instructors and students for respondents who answered "very valuable" (X2(1,
N=2,889) = 10.708, p = .001). Overall, about the same proportion of instructors rated this
activity as "valuable" and "very valuable" in both survey years and students' ratings
decreased over time while instructors' ratings slightly increased.
Figure 7: Frequency Ratios by Survey Year for Post/Access Sample Exams & Quizzes
For posting/accessing a lecture outline or notes after the lecture (see Figure 8), there is a
significant difference between instructors and students for respondents who answered
"valuable" (X2(1, N=2,538) = 5.249, p = .022). A greater proportion of students rated this
activity as "very valuable" than did "valuable" and a greater proportion of instructors
rated this activity as "valuable" than did "very valuable. Students' ratings increased over
time while instructors' ratings remained about the same.
Figure 8: Frequency Ratios by Survey Year for Post/Access Outline or Notes After Lecture
For students posting questions before lecture (see Figure 9), there are significant
differences between instructors and students for respondents who answered "valuable"
(X2(1, N=1,479) = 8.309, p = .004) and those who answered "very valuable" (X2(1,
N=977) = 9.796, p = .002). There is also a significant difference between years for
respondents who answered "valuable" (X2(1, N=1,479) = 4.115, p = .043). Overall, more
instructors rated this activity as "valuable" than did students, although both instructors'
and students' ratings increased over time.
Figure 9: Frequency Ratios by Survey Year for Students Posting Questions Before Lecture
For students posting questions after lecture (see Figure 10), there is a significant
difference between instructors and students for respondents who answered "valuable"
(X2(1, N=1,910) = 5.238, p = .022). Overall, more instructors rated this activity as
"valuable" than did students, who also had a greater proportion of respondents that rated
this activity as "valuable" than "very valuable." Both instructors' and students' ratings
remained about the same over time.
Figure 10: Frequency Ratios by Survey Year for Students Posting Questions After Lecture
For students reading and/or commenting on each other's course-based work (see Figure
11), there are significant differences between instructors and students for respondents
who answered "valuable" (X2(1, N=1,645) = 20.133, p < .001) and those who answered
"very valuable" (X2(1, N=825) = 52.312, p < .001). There is also a significant difference
between years for respondents who answered "valuable" (X2(1, N=1,645) = 4.454, p =
.035). Overall, more instructors rated this activity as "valuable" than did students, who
more often rated this activity as "not valuable" than as "very valuable," resulting in a
frequency ratio of less than 1. Both instructors' and students'
ratings remained about the same over time.
Figure 11: Frequency Ratios by Survey Year for Students Reading/Commenting on Each Other's Work
This analysis helped illuminate what determined the highly significant differences
between instructors and students for the "efficient communication" and "teaching &
learning" survey items and how respondents rated the value of these items over two
consecutive years.
Analysis & Discussion
When asked generally if IT improves teaching and learning, both instructors and
students we surveyed responded positively, although the students were less positive about
IT's effect on their instructors' teaching. Our analysis showed that when instructors were
asked to pick the most valuable benefit of using IT, the majority picked efficient
communication, and they were more likely than students to rate this functionality as very
valuable. Fewer instructors and students rated teaching & learning uses, or the
communication uses that did not pertain to instructor-to-student communication, as being
the most valuable. This finding was consistent with the actual tool use within the
system as document and communication management tools (Resources, Announcements,
Assignments, Syllabus) were more often rated as being very valuable than the interactive
tools (Chat, Discussion, Wiki).
The multiyear analysis of efficient communication and teaching and learning
survey items confirms that instructors and students highly rated activities within the LMS
that make the transmission of course materials and announcements more efficient, and
this positive viewpoint increased for both groups over time. By contrast, most of the
teaching and learning activities were not as highly rated by faculty or students. It is also
relevant to note, however, that instructors did rate these activities as valuable and, for
some activities, very valuable, and instructors' ratings increased or at least remained
constant over time. If the goal of LMS is to support interactive teaching & learning
activities as well as efficient communication, then instructors might benefit from training
about how to use LMS tools to facilitate student questions and peer evaluation. Such
training could help more instructors to view those activities as valuable and find them as
easy as posting course materials or sending an announcement.
Our analysis of the uses and perceived benefits of our LMS has demonstrated that
instructors and students more highly value these systems for efficient instructor
communication than for interactive teaching and learning uses. However, both types of
LMS use are highly rated when evaluated as a general goal of the system, replicating
findings from previous LMS survey-based studies (Hanson & Robson, 2004). Overall,
students prefer that a higher level of information technology be employed in their
courses than instructors are currently providing, although instructor use of the LMS
has increased over the past three years on our campus. Similarly, the perceived value of
many of the system's various tools for both efficient communication and teaching and
learning has increased each year.
In recent years, these types of systems have switched their monikers from Course
Management Systems to Learning Management Systems. Embedded in this change is the
notion that learning involves more than providing course content efficiently. Our survey
results indicate that while both instructors and students agree that information
technologies improve learning, students do not agree as strongly as instructors that such
technologies improve teachers' instruction. These ratings suggest that students, in
particular, may be responding not to whether these tools are used, but rather to how they are
used, consistent with the findings from Holm, Röllinghoff, & Ninck's (2003) study of
Swiss instructors. If LMS use is intended to support constructivist-based models of
learning rather than instructivist teaching and consumptive learning models, then using
tools to scaffold more interactive forms of instruction and learning may be required for
success. As instructors and students gain experience using the different tools available in
the LMS, these tools can affect how teaching and learning take place within the online
environment and the classroom. For example, when asked how using the LMS has
changed instructors' in-class time, one instructor in our survey responded that she
"spend[s] a lot less time on administrative details (and) do[es]n't have to make
announcements about such stuff and students know they can find stuff (online)," representing efficient
communication uses of the LMS. By contrast, another instructor who had used several of
the interactive tools responded that the LMS has helped her students "to be better
prepared, and, by allowing the other students to read the responses in advance, has led to
a more effective use of class time for discussion, analysis, and additional activities."
Future Research
In our examination of our LMS, the interactive tools available in the system were
not yet as heavily used or as highly rated as the tools that simply push out
information from instructors to students. In future research, we will investigate how
different instructors are using the more interactive tools to understand how the design and
structure of these tools, or possibly new tools, could better support innovative teaching
and learning activities. By looking more deeply at the various LMS uses in future
research, we hope to be able to make specific recommendations about system design and
instructional practices that will help instructors and students realize the full benefits of
Learning Management Systems.
Acknowledgements
Many thanks to Jim Lepkowski and Chan Zhang from the Institute for Social
Research at the University of Michigan for their review of our proposal version of this
paper and particularly for their statistical guidance and work on the multiyear analysis of
the efficient communication and teaching & learning survey items.
References
Ansorge, C. J., & Bendus, O. (2003). The pedagogical impact of course management
systems on faculty, students, and institution. In R. H. Bruning, C. A. Horn, & L.
M. PytlikZillig (Eds.), Web-based learning: What do we know? Where do we go?
(pp. 169-190). Greenwich, CT: Information Age Publishing.
Bass, R. J. (2000). Technology, evaluation, and the visibility of teaching and learning.
New Directions for Teaching and Learning, 83, 35-50.
Bennett, J. & Bennett, L. (2003). A review of factors that influence the diffusion of
innovation when structuring a faculty training program. Internet and Higher
Education, 6(1), 53-63.
Coaldrake, P. & Stedman, L. (1999). Academic work in the twenty-first century.
Occasional Paper Series, Higher Education Division, DETYA, no. 99H. Canberra:
Australian Government Publishing Service.
Dutton, W. H., Cheong, P. H., & Park, N. (2003). The social shaping of a virtual learning
environment: The case of a university-wide course management system. The
Electronic Journal of e-Learning, 2(1).
Hanson, P., & Robson, R. (2004). Evaluating course management technology: A pilot
study. Boulder, CO: EDUCAUSE Center for Applied Research, Research
Bulletin, Issue 24.
Harrington, C. F., Gordon, S. A., & Schibik, T. J. (2004). Course management system
utilization and implications for practice: A national survey of department
chairpersons. Online Journal of Distance Learning Administration, 7(4).
http://www.westga.edu/~distance/ojdla/winter74/harrington74.htm
Hawkins, B. L., Rudy, J. A., & Nicolich, R. (2005). Educause core data service: Fiscal
year 2004 summary report. Washington D.C.: Educause.
Herse, P., & Lee, A. (2005). Optometry and WebCT: A student survey of the value of
web-based learning environments in optometric education. Clinical and
Experimental Optometry, 88(1), 46-52.
Holm, C., Röllinghoff, A., & Ninck, A. (2003). WebCT and elearning in Switzerland.
Proceedings of the 5th International Conference on New Educational
Environments, Luzerne. 139-143.
Koszalka, T. A., & Ganesan, R. (2004). Designing online courses: A taxonomy to guide
strategic use of features available in course management systems (CMS) in
distance education. Distance Education, 25(2), 243-256.
Malikowski, S. R., Thompson, M. E., & Theis, J. G. (2006). External factors associated
with adopting a CMS in resident college courses. Internet and Higher Education,
9(3), 163-174.
Malikowski, S. R., Thompson, M. E., & Theis, J. G. (2007). A model for research into
course management systems: Bridging technology and learning theory. Journal of
Educational Computing Research, 36(2), 148-173.
McGee, P., Carmean, C., & Jafari, A. (Eds.). (2005). Course management systems for
learning: Beyond accidental pedagogy. Hershey, PA: Information Science
Publishing.
Morgan, G. (2003). Faculty use of course management systems. Boulder, CO:
EDUCAUSE Center for Applied Research.
Osman, M. E. (2005). Students' reaction to WebCT: Implications for designing on-line
learning environments. International Journal of Instructional Media, 32(4), 353-362.
Rogers, E. M. (2003). Diffusion of innovations (5th ed.). New York: The Free Press.
West, R. E., Waddoups, G., & Graham, C. R. (2007). Understanding the experiences of
instructors as they adopt a course management system. Educational Technology
Research and Development, 55(1), 1-26.
Yohon, T., Zimmerman, D., & Keeler, L. (2004). An exploratory study of adoption of
course management software and accompanying instructional changes by faculty
in the liberal arts and sciences. Electronic Journal of e-Learning, 2(2), 313-320.