Saving time or innovating practice: Investigating perceptions
and uses of Learning Management Systems
Steven Lonn a,*, Stephanie D. Teasley b

a School of Education and USE Lab, Digital Media Commons, University of Michigan, 3350 Duderstadt Center, 2281 Bonisteel Blvd., Ann Arbor, MI 48109-2094, USA
b School of Information and USE Lab, Digital Media Commons, University of Michigan, 2252 SI North, 1075 Beal Ave., Ann Arbor, MI 48109-2112, USA

* Corresponding author. E-mail addresses: slonn@umich.edu (S. Lonn), steasley@umich.edu (S.D. Teasley).
Article info
Article history:
Received 25 November 2008
Received in revised form 3 April 2009
Accepted 7 April 2009
Keywords:
Interactive learning environments
Learning Management Systems
Post-secondary education
Teaching/learning strategies
Distributed learning environments
Abstract
Learning Management Systems (LMS) are web-based systems that allow instructors and/or students to
share materials, submit and return assignments, and communicate online. In this study, we explored
the uses and perceived benefits of using a LMS to support traditional classroom teaching as reported
by instructors and students at a large American Midwestern university. We examined two years of survey
data focusing on specific uses of the LMS that emphasized either efficient communication or interactive
teaching and learning practices. We matched aggregate user log data with corresponding survey items to
see if system use was consistent with patterns seen in the survey results. Findings suggest that instructors and students value tools and activities for efficient communication more than interactive tools for
innovating existing practices. However, survey item analysis reveals that instructors and students also
highly value the teaching and learning tools within the LMS.
© 2009 Elsevier Ltd. All rights reserved.
1. Introduction
Technology-enabled learning is increasingly important and pervasive in higher education. Called Course Management Systems (CMS) or Learning Management Systems (LMS), these web-based systems allow instructors and students to share instructional materials, make class
announcements, submit and return course assignments, and communicate with each other online. Recent reports showed that over 90%
of all responding American universities and colleges (Hawkins & Rudy, 2007) and 95% of higher education institutions in the UK (Browne,
Jenkins, & Walker, 2006) have established one or more LMS-type products for student and faculty use. While the adoption rate for these
systems has been rapid, very little is known about when and how these systems benefit learning (Koszalka & Ganesan, 2004) or change
pedagogical practice (Becker & Jokivirta, 2007).
Early adopters of LMS in higher education have typically come to these applications because they promise to make teaching more efficient (Morgan, 2003). Although most LMS are used for the distribution, management, and retrieval of course materials, these systems can
also incorporate functionality that supports interaction between students and instructors and among students (West, Waddoups, & Graham, 2007) to provide opportunities for enabling institutional innovations in learning and education (Dutton, Cheong, & Park, 2003).
Increasingly, LMS are providing tools for the kinds of active online engagement preferred by today's generation of students, such as discussion tools, chat rooms, wikis, and blogs. These tools provide opportunities for using LMS in ways that are consistent with constructivist approaches to learning rather than with simple transmission-of-knowledge models. Specifically, LMS can facilitate a shift from "the transmission of information towards the management and facilitation of student learning" (Coaldrake & Stedman, 1999, p. 7). However, for this shift to occur,
both faculty and students will need to recognize the opportunities provided by the system and use them in order to innovate teaching and
learning practices.
In this study, we explored the perceived benefits and actual use of a LMS by instructors and students where the system supplements traditional face-to-face classroom instruction. We looked at two years of survey data to learn whether the LMS is changing instructors' pedagogical practice and students' preferences for how they learn. Specifically, we investigated survey responses focusing on items relating to efficiency vs. interactive teaching and learning practices. In addition, we examined aggregated log data from the LMS for the most recent academic semester to see if system use was consistent with beliefs shown in the survey results. This study extends the current
LMS-related literature by looking at user attitudes longitudinally and by relating system use with instructor and student perceptions about
teaching and learning.
1.1. Learning Management System-related research
A wide variety of LMS-related research has been conducted as the LMS adoption rate has increased in higher education. Much of this
work has focused on how instructors and students value various features and functions of the technology when used for distance learning
(e.g. Swan, 2001). More relevant to the context of our study, Hanson and Robson (2004) reported that both instructors and students found various features of LMS to have strong learning benefits. Specifically, respondents rated benefits of LMS that save time more favorably than those that improve learning. Similar results were found in subsequent studies where users indicated a preference for LMS tools
time over those that improve learning. Similar results were found in subsequent studies where users indicated a preference for LMS tools
and functions that help manage materials and information for courses (e.g. Parker, Bianchi, & Cheah, 2008; Yohon, Zimmerman, & Keeler,
2004). However, it is not necessarily the LMS tool by itself that is useful or not useful, but rather the way the tool is used in a given course
and whether the tool helps the user achieve the desired course goals (Holm, Röllinghoff, & Ninck, 2003).
Web-based systems like LMS "can be used as a catalyst for self-reflection and to help facilitate change from passive to active learning"
(Herse & Lee, 2005, p. 51). However, there is no prescribed method for how this change occurs, or even certainty that the change can be controlled. Within the 15 institutions of the University of Wisconsin system, many instructors reported that after using LMS they began to rethink and restructure their courses and ultimately their teaching, resulting in a kind of "accidental pedagogy" (Morgan, 2003). Although most instructors initially adopt LMS for the distribution, management, and retrieval of course materials, they begin to use more of the interactive features for teaching and learning as they become more familiar with the technology, which is one way to facilitate active learning (West et al.,
2007).
There are several other examples of instructors changing their teaching practices by utilizing the interactive features of LMS. Topper
(2003) found that the way he managed face-to-face discussions for his graduate-level course did not translate well to an online, asynchronous environment. Specifically, he had to change his practice by avoiding immediate intervention in online discussions, instead allowing students to interact amongst themselves before asserting his control or authority in the conversation.
A similar lesson was reported by Dougiamas and Taylor (2003), who found that without a variety of scaffolds to facilitate reflective dialog, including writing prompts and instructor-supplied models of "good" messages, the students' posts did not engender reflection or engaged discussion with each other. There are a number of similar examples in the LMS-related literature showing that instructors not only had to learn to use the interactive LMS tools, but also had to change their practice in order to facilitate students' learning with those tools (e.g. Bender, 2005; Gaensler, 2004). Ultimately, it is incumbent upon the instructor to use the provided LMS tools in ways that encourage a deeper learning experience in which students are able to "construct their own representations of the new knowledge and share those representations with the instructor or the rest of the class" (Carmean & Haefner, 2002, p. 32).
In the studies described above, the lessons learned about LMS use have not been examined longitudinally and so may not capture whether and how instructors and students come to change their beliefs about the value of LMS and how best to integrate this technology into effective teaching and learning practices. Understanding both instructor and student perceptions is important for evaluating the potential success of the LMS (McGill & Klobas, 2009). Additionally, prior studies have not linked survey responses with actual LMS use, as captured via
user logs. Our study aims to fill this gap in the literature by examining the perceived benefits and actual use of a LMS by instructors and
students. This empirical approach is necessary for making specific recommendations about system design and about instructional practices
to help instructors and students realize the educational potential of these systems.
2. Method
2.1. Setting
We drew our sample from the instructors and students at the main campus of a large American Midwestern university. According to the
Carnegie Classification of Institutions of Higher Education (http://www.carnegiefoundation.org/classifications/), this institution is a large,
public, four-year research university with very high research activity and a majority undergraduate enrollment. The university enrolls
approximately 26,000 undergraduate and 15,000 graduate and professional students. Approximately 5700 faculty members are employed
by the university.
The LMS examined in this study is based on the Sakai (http://sakaiproject.org) community-source architecture and has been in use at
the university since 2004. This environment is comparable to other popular systems such as Blackboard and Moodle (http://moodle.org).
Training in the use of the LMS is not required for faculty or students, although the university library and other departments do offer voluntary training sessions for instructors. Use of the LMS is not uniformly mandated across the institution, although the largest department
on campus has required that all available courses establish a LMS site. Within LMS course sites, instructors may elect to use any of the
following tools:
Announcements: Non-threaded, asynchronous messages for all site participants to read. Email notifications may be sent as well.
Assignments: Allows students to upload and submit assignments and projects, and allows instructors to grade and comment on students' submissions.
Chat: Synchronous conversation tool. All messages are automatically saved and are viewable by all site participants.
Content Sharing: File storage space. Any file type may be stored, URLs to other websites may be added, and shared citation lists may be created. Email notifications of new content may be sent as well.
Discussion: Threaded, asynchronous messages for all site participants to read.
Schedule: A shared calendar used to post deadlines, due dates, etc.
Syllabus: Instructors may use this tool to post their syllabus as HTML or an attachment.
Wiki: A collaborative document-writing tool. Any site participant may add new pages or modify existing ones, and a history of changes is automatically recorded.
2.2. Data sources and participants
Our study used data from an online survey administered in the spring of 2006 and the spring of 2007. All instructional faculty and a random sample of 25% of university students (stratified by school/college) were invited to participate in this survey via email. Table 1 shows
the number of survey respondents and response rate for each year the survey was administered.
The annual online survey was divided into two sections. The first section (11 items) asked participants for demographic information and
their general opinion about using information technologies for their courses. The second section (16 items) focused on participants’ perceptions and opinions about the LMS. If participants indicated that they did not use the LMS, they were not asked to complete the second
section of the survey.
The data on LMS use were generated from the event logs for the winter semester (four months) immediately preceding administration
of the 2007 survey. Event logs capture when a user takes a particular action with a specific tool, such as downloading a document or posting an announcement, and are similar to data from other LMS (e.g. Romero, Ventura, & García, 2008). There were a total of 12,238,089
events during the semester.
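To make this aggregation concrete, the sketch below counts events per tool and expresses each tool's share of all actions. It assumes a hypothetical log schema and file name (the paper does not publish its log format), so the column names are illustrative only.

```python
# A minimal sketch of per-tool event aggregation, assuming a hypothetical
# schema of one row per event with user_id, role, tool, event, timestamp.
import pandas as pd

events = pd.read_csv("lms_event_log.csv", parse_dates=["timestamp"])  # hypothetical file

# Restrict to the four-month winter semester preceding the 2007 survey.
semester = events[(events["timestamp"] >= "2007-01-01") &
                  (events["timestamp"] < "2007-05-01")]

# Count actions per tool and express each tool's share of all actions.
by_tool = semester.groupby("tool").size().sort_values(ascending=False)
share = (100 * by_tool / by_tool.sum()).round(2)
print(share)  # e.g. Content Sharing 91.54, Assignments 2.39, ... (cf. Table 5)
```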
2.3. Design and procedure
Our analysis of the survey data began by looking at how all respondents answered general questions about information technology (IT)
improving teaching and learning, and a question about their perception of the most valuable benefit from using IT. Based on the results from this latter question, we identified questions in the second section of the survey that could be categorized as either "efficient communication" or "teaching and learning" items. Our analyses also looked at the respondents' role (instructor/student), computer expertise, use/
preference for IT in courses, and the survey year. These survey results were then compared to the system log data reporting aggregated user
actions in the LMS over one semester.
Several different statistical methods were employed in our analysis of the survey results. T-tests were used to compare differences between means, and chi-square tests were used to identify statistical differences in categorical data. Finally, multinomial logistic regressions were used to investigate differences in responses between students and instructors across both survey years. In addition to this regression analysis, a descriptive analysis was also performed using frequency ratios.
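As an illustration of the first two methods, the sketch below runs an independent-samples t-test and a chi-square test of independence with SciPy. The arrays are toy data, not the study's responses.

```python
# Toy illustration of the two simpler tests used in the analysis.
import numpy as np
from scipy import stats

# Independent-samples t-test comparing mean Likert ratings (1-5) of two groups.
instructor_ratings = np.array([4, 5, 3, 4, 4, 5, 3])
student_ratings = np.array([3, 4, 4, 3, 5, 3, 4, 3])
t, p_t = stats.ttest_ind(instructor_ratings, student_ratings)

# Chi-square test of independence on a role x rating contingency table
# (rows: instructors, students; columns: Not Valuable, Valuable, Very Valuable).
table = np.array([[12, 55, 33],
                  [30, 90, 80]])
chi2, p_chi, dof, expected = stats.chi2_contingency(table)
print(f"t = {t:.3f} (p = {p_t:.3f}); chi2({dof}) = {chi2:.3f} (p = {p_chi:.3f})")
```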
3. Results
Because the university does not require LMS use for every course, we needed to ascertain within the survey how many instructors and
students used this technology. We found that the majority of instructor and student respondents used the LMS, and usage increased over
time. Table 2 shows that almost every individual student on campus used the LMS, as did most faculty.
3.1. Does information technology improve teaching and learning?
All survey respondents, including users and non-users of the LMS, were asked to rate their agreement with two general statements
about whether the use of IT in courses improved instructors' teaching and students' learning, using a 5-point Likert scale from 1 = strongly disagree to 5 = strongly agree. Table 3 shows that instructors and students believed that IT improved teaching and learning, although students were less positive about the effect of IT on instructors' teaching (t(7378) = 10.254, p < .001).
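The reported statistic can be approximately reproduced from the rounded summary statistics in Table 3; a quick check, with the caveat that rounding of the means and standard deviations shifts the result slightly:

```python
# Recompute the Table 3 t-test from its summary statistics. Because the table
# rounds the means and SDs, this only approximates the reported t(7378) = 10.254.
from scipy.stats import ttest_ind_from_stats

t, p = ttest_ind_from_stats(mean1=3.92, std1=0.880, nobs1=2698,   # instructors
                            mean2=3.70, std2=0.918, nobs2=4682)   # students
print(f"t = {t:.2f}, p = {p:.3g}")  # t comes out near 10.1 on 7378 df
```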
We also used two survey items to characterize our population: computer expertise (novice, intermediate, or advanced) and overall use (instructors) or preference for use (students) of IT in their courses (none, limited, moderate, extensive, exclusive). Preliminary analysis indicated that there was no significant difference between instructors and students regarding computer expertise, but there was a significant difference with regard to use/preference of IT (t(7591) = 19.687, p < .001), where students reported preferring a higher level of IT use (M = 3.29, SD = .779) than did instructors (M = 2.93, SD = .797). Respondents who indicated that they did not use (77 instructors) or prefer using (69 students) IT generally were still included in the analysis of the LMS because they did use the system in at least one of their courses.
Table 1
Number of survey respondents and survey response rate by year.

Year    Instructors                               Students
        Respondents    Response rate (%)          Respondents    Response rate (%)
2006    n = 1357       19                         n = 2485       27
2007    n = 1481       20                         n = 2281       26
Table 2
Usage of LMS by year.

Year    % of instructors using LMS    % of students using LMS
2006    81                            98
2007    85                            99
Table 3
Mean ratings (and standard deviations) for survey items about information technology's effect on teaching and learning.

Survey item                          Instructors (n = 2698)    Students (n = 4682)    Mean difference
IT improves instructors' teaching    3.92 (.880)               3.70 (.918)            .22*
IT improves students' learning       3.92 (.852)               3.96 (.864)            .04

* p < .001.
3.2. Most valuable benefits of information technology
We evaluated a forced choice survey item that asked all respondents to specify "Which one of the following benefits from using information technology in your courses was the most valuable to you?" (see Table 4). Overall, there was a significant difference in how instructors and students answered this question (χ2(7, N = 9657) = 1354.990, p < .001).
The most popular response from instructors (39%) was "improved my communication to my students" while the most popular response from students (45%) was "efficiency (saves time)". However, 26% of instructors chose "efficiency (saves time)", indicating that efficiency is important to many instructors as well as their students. Very few instructors or students chose teaching or learning improvements as the most valuable benefit from using IT in their courses, nor did many respondents choose the items about student-to-instructor or student-to-student communication. This suggests that these systems are valued most by faculty and students for the ways they improve instructors' ability to push out information to students, rather than for general support for teaching and learning or for opening up communication from students to the instructor or among students.
3.3. LMS event log data and corresponding survey results
We investigated the aggregated LMS event logs to see if actual system use was consistent with the preferences expressed in the survey data. The user action logs were aggregated by tool for the winter semester immediately preceding the 2007 survey. We also compared the use of each tool to the corresponding 2007 survey items about the perceived value of each LMS tool, which asked respondents whether they considered the tool to be Not Valuable, Valuable, or Very Valuable. Table 5 shows that there were significant differences in how instructors and students valued the following tools: Content Sharing (χ2(2, N = 2840) = 27.680, p < .001), Assignments (χ2(2, N = 2785) = 38.908, p < .001), Announcements (χ2(2, N = 3028) = 107.118, p < .001), Discussion (χ2(2, N = 1678) = 15.465, p < .001), and Syllabus (χ2(2, N = 2873) = 29.114, p < .001). Overall, document management and broadcast-oriented communication tools (Content Sharing, Assignments, Announcements, Schedule, and Syllabus) are heavily used (95% of all user actions) and very few instructors or students rated them as Not Valuable. By contrast, tools that are more interactive (Chat, Discussion, and Wiki) are not used as much (5% of all user actions) and, compared with the other tools, more instructors and students rated them as Not Valuable.
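As a rough check on these comparisons, counts for any one tool can be reconstructed from the n's and rounded percentages in Table 5 and fed to a standard chi-square test; because the percentages are rounded, the statistic only approximates the reported value (shown here for Content Sharing).

```python
# Approximate reconstruction of the Content Sharing comparison from Table 5.
# Counts are back-calculated from rounded percentages, so the statistic is
# only close to the reported chi2(2, N = 2840) = 27.680, not identical.
import numpy as np
from scipy.stats import chi2_contingency

# Rows: instructors (n = 1008), students (n = 1832).
# Columns: Not Valuable, Valuable, Very Valuable (from 1/28/71% and 2/37/61%).
content_sharing = np.array([[10, 282, 716],
                            [37, 678, 1117]])
chi2, p, dof, _ = chi2_contingency(content_sharing)
print(f"chi2({dof}, N = {content_sharing.sum()}) = {chi2:.3f}, p = {p:.3g}")
```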
3.4. Efficient communication and teaching and learning activities
In addition to asking about specific tools, we also asked respondents to rate the value of specific activities afforded by the LMS and categorized them as supporting "efficient communication" or "teaching and learning" (see Table 6). These items were also rated as Not Valuable, Valuable, or Very Valuable.
We examined independently each of our four respondent characteristics (role, survey year, computer expertise, use/preference of IT). Our first analysis looked at differences between instructors and students, aggregated over two years. Table 7 shows that there were significant differences between instructors' and students' value ratings for all of the "efficient communication" items: Post/Access a syllabus (χ2(2, N = 6087) = 64.226, p < .001), Send/Receive announcements (χ2(2, N = 6165) = 152.652, p < .001), Post/Access online readings (χ2(2, N = 6157) = 68.998, p < .001), Students turn in assignments online (χ2(2, N = 4344) = 42.386, p < .001), and Instructors return assignments online (χ2(2, N = 2868) = 49.909, p < .001). Overall, for almost all of these activities, the Very Valuable rating was used by a higher percentage of instructors than students. Also, very few instructors or students rated these activities as Not Valuable.

There were also significant differences in how instructors and students valued all but one of the "teaching and learning" items (see Table 8): Post/Access sample exams and quizzes (χ2(2, N = 4577) = 37.507, p < .001), Post/Access lecture notes/outline after lecture (χ2(2, N = 5329) = 47.946, p < .001), Students post questions before lecture (χ2(2, N = 2045) = 11.185, p = .004), and Students read/comment on each other's work (χ2(2, N = 2016) = 56.283, p < .001). Overall, both instructor and student respondents rated these activities as Valuable rather than Very Valuable, although, similar to the efficient communication survey items, very few instructors or students rated these activities as Not Valuable.
Table 4
Results of survey item about most valuable benefit from using information technology in courses.

Most valuable benefit                           Instructors (n = 2573) (%)    Students (n = 4537) (%)
Improved instructor teaching                    7                             1
Improved student learning                       8                             14
Efficiency (saves time)                         26                            45
Helped manage course activities                 11                            18
Improved student-to-instructor communication    5                             4
Improved instructor-to-student communication    39                            15
Improved student-to-student communication       3                             2
No benefits                                     1                             1
Table 5
Aggregated LMS log data and corresponding survey results by individual tool.

Tool              % of all actions    Role         n       % Not Valuable    % Valuable    % Very Valuable    χ2
Content Sharing   91.54               Instructors  1008    1                 28            71                 27.680*
                                      Students     1832    2                 37            61
Assignments       2.39                Instructors  883     3                 42            55                 38.908*
                                      Students     1902    4                 54            42
Chat              1.19                Instructors  293     38                44            18                 1.129
                                      Students     1205    38                46            16
Announcements     0.59                Instructors  1075    2                 33            65                 107.118*
                                      Students     1953    3                 52            45
Schedule          0.57                Instructors  520     20                49            31                 1.706
                                      Students     1284    22                50            28
Discussion        0.55                Instructors  390     28                51            21                 15.465*
                                      Students     1288    35                52            13
Wiki              0.14                Instructors  126     33                48            18                 .097
                                      Students     461     34                47            19
Syllabus          0.08                Instructors  941     6                 42            52                 29.114*
                                      Students     1932    2                 45            53

* p < .001.
3.5. Analyzing efficient communication and teaching and learning activities across years
For each variable analyzed above in Tables 7 and 8, the two years of survey data were collapsed. Since continued use of and exposure to
the LMS may have affected instructors’ and students’ perceptions, we were interested to see if there were any changes over time. Multinomial logistic regressions were used to investigate whether (1) the survey responses differed between students and instructors; (2) there
were significant changes in responses between 2006 and 2007; and (3) those changes, with regard to either size or direction, depended on
respondents’ role. To address these questions, the analysis of each survey item started with a main effect model including two binary variables, role (instructors vs. students) and year (2006 vs. 2007). If both of the covariates were significant in the model, an interaction term
(role year) was also checked. The baseline category was Not Valuable in all of the regressions.
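A minimal sketch of this model-fitting procedure, using statsmodels on an assumed long-format response table (one row per respondent); the file and column names are hypothetical:

```python
# Fit the main-effect multinomial model and, if warranted, the interaction
# model, with Not Valuable (coded 0) as the baseline outcome category.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("survey_item_responses.csv")  # hypothetical file
# Assumed coding: rating 0 = Not Valuable, 1 = Valuable, 2 = Very Valuable;
# role 0 = instructor, 1 = student; year 0 = 2006, 1 = 2007.

X = sm.add_constant(df[["role", "year"]])
main = sm.MNLogit(df["rating"], X).fit(disp=False)
print(main.summary())

# If both covariates are significant, also check the role x year interaction.
df["role_x_year"] = df["role"] * df["year"]
X_int = sm.add_constant(df[["role", "year", "role_x_year"]])
print(sm.MNLogit(df["rating"], X_int).fit(disp=False).summary())
```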
In addition to the regression analysis, a descriptive analysis was also performed using frequency ratios. That is, we took the number of respondents for each role who answered Valuable or Very Valuable and divided it by the number of respondents for each role who answered Not Valuable (see Tables 9 and 10). For example, if the number of respondents who answered Very Valuable was exactly the same as the number of those who answered Not Valuable, the frequency ratio would be 1. The frequency ratios are conceptually approximate to odds, and a plot of frequency ratios (see Figs. 1 and 2) provides a straightforward look into the structure explored by the regression analysis.
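The computation itself is simple division; the sketch below uses hypothetical counts chosen to land near the 2007 instructor ratios for Post/Access syllabus in Table 9:

```python
# Frequency ratios for one role on one survey item; the counts are
# hypothetical, back-solved to roughly match Table 9's 11.78 and 15.64.
def frequency_ratios(not_valuable: int, valuable: int, very_valuable: int):
    """Return (Valuable/Not Valuable, Very Valuable/Not Valuable)."""
    return valuable / not_valuable, very_valuable / not_valuable

v_nv, vv_nv = frequency_ratios(not_valuable=50, valuable=589, very_valuable=782)
print(f"V/NV = {v_nv:.2f}, VV/NV = {vv_nv:.2f}")  # V/NV = 11.78, VV/NV = 15.64
```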
Table 6
Survey items by category.

Efficient communication:
Post/Access a syllabus online
Send/Receive announcement messages (instructor to student)
Post/Access online readings and supplementary course materials
Give/Take exams and quizzes online for grading purposes
Students turn in assignments online
Instructors return assignments to students online with comments and grade

Teaching and learning:
Post/Access sample exams and quizzes for learning purposes
Post/Access lecture outline or notes after the lecture
Students post questions before lecture
Students post questions after lecture
Students read and/or comment on each other's course-based work
Table 7
Mean differences between instructors and students for efficient communication survey items.

Survey item                              Role         n       % Not Valuable    % Valuable    % Very Valuable    χ2
Post/Access syllabus                     Instructors  1945    5                 43            52                 64.226*
                                         Students     4142    2                 40            58
Send/Receive announcements               Instructors  2016    4                 30            67                 152.652*
                                         Students     4149    6                 43            51
Post/Access online readings              Instructors  2019    1                 29            70                 68.998*
                                         Students     4138    2                 39            59
Give/Take online exams and quizzes       Instructors  346     24                47            29                 3.888
                                         Students     1428    24                52            24
Students turn in assignments online      Instructors  1000    10                44            45                 42.386*
                                         Students     3344    13                53            34
Instructors return assignments online    Instructors  641     15                40            45                 49.909*
                                         Students     2227    18                52            33

* p < .001.
Table 8
Mean differences between instructors and students for teaching and learning survey items.

Survey item                                   Role         n       % Not Valuable    % Valuable    % Very Valuable    χ2
Post/Access sample exams and quizzes          Instructors  1123    4                 44            52                 37.507**
                                              Students     3454    3                 34            63
Post/Access notes or outline after lecture    Instructors  1439    2                 53            45                 47.946**
                                              Students     3890    3                 42            55
Students post questions before lecture        Instructors  484     15                54            31                 11.185*
                                              Students     1561    22                51            27
Students post questions after lecture         Instructors  531     12                66            22                 5.291
                                              Students     1905    16                62            22
Students read/comment on each other's work    Instructors  493     13                59            28                 56.283**
                                              Students     1523    26                59            15

* p < .01.
** p < .001.
Below we present the analysis results for each survey item, beginning with the "efficient communication" items. We have illustrated the differences between "efficient communication" and "teaching and learning" items by graphing the frequency ratios of four representative items.

For Sending/Receiving an announcement (see Fig. 1, left), there is a significant difference between instructors and students for respondents who answered Very Valuable (OR = 0.44, χ2(1, N = 3774) = 35.456, p < .001). There are also significant differences between years for respondents who answered Valuable (OR = 1.88, χ2(1, N = 2688) = 24.557, p < .001) and those who answered Very Valuable (OR = 2.73, χ2(1, N = 3774) = 64.500, p < .001). Overall, more instructors rated this activity as Very Valuable than did students, and both instructors' and students' ratings increased over time.

For Posting/Accessing online readings and supplementary course materials (see Fig. 1, right), there is only a significant difference between instructors and students for respondents who answered Very Valuable (OR = 0.52, χ2(1, N = 3959) = 8.671, p = .003). Overall, more instructors rated this activity as Very Valuable than did students, although more students also rated this activity as Very Valuable than Valuable. Both instructors' and students' ratings increased over time.

Other "efficient communication" survey items follow patterns similar to the two items discussed above (see Appendix A). The items in this category have fairly high frequency ratios, and the ratings for both instructors and students increased over time for nearly all survey items.

The analysis of the "teaching and learning" survey items used the same statistical methods as above and yielded noticeably different results than the "efficient communication" items. For students posting questions before lecture (see Fig. 2, left), there are significant differences between instructors and students for respondents who answered Valuable (OR = 0.65, χ2(1, N = 1479) = 8.309, p = .004) and those who answered Very Valuable (OR = 0.60, χ2(1, N = 977) = 9.796, p = .002). There is also a significant difference between years for respondents who answered Valuable (OR = 1.27, χ2(1, N = 1479) = 4.115, p = .043). Overall, more instructors rated this activity as Valuable than did students, although both instructors' and students' ratings modestly increased over time.

For students posting questions after lecture (see Fig. 2, right), there is a significant difference between instructors and students for respondents who answered Valuable (OR = 0.71, χ2(1, N = 1910) = 5.238, p = .022). Overall, more instructors rated this activity as Valuable than did students, who also had a greater proportion of respondents rating this activity as Valuable than Very Valuable. Both instructors' and students' ratings remained consistent over time.

Other "teaching and learning" survey items follow patterns similar to the two items discussed above (see Appendix A). The items in this category have fairly low frequency ratios compared to the "efficient communication" items, and the ratings for both instructors and students either stayed relatively flat or only moderately increased over time. Overall, this analysis helped illuminate what determined the highly significant differences between instructors and students for the "efficient communication" and "teaching and learning" survey items and how respondents rated the value of these items over two consecutive years.
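For readers connecting these odds ratios to the regressions described above: each OR is the exponential of a fitted coefficient for that outcome category against the Not Valuable baseline. The sketch below shows the arithmetic with a hypothetical coefficient value.

```python
# An OR quoted in the text corresponds to exp(coefficient) in the fitted
# multinomial model. The coefficient below is hypothetical, chosen so that
# exp(-0.82) is about 0.44, the same magnitude as the Send/Receive
# announcements role effect reported above.
import math

coef_role_very_valuable = -0.82  # hypothetical fitted value
print(round(math.exp(coef_role_very_valuable), 2))  # 0.44
```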
Table 9
Frequency ratios for instructors and students for efficient communication survey items. (V/NV = Valuable divided by Not Valuable; VV/NV = Very Valuable divided by Not Valuable.)

Survey item                              Role         n       2006 V/NV    2006 VV/NV    2007 V/NV    2007 VV/NV
Post/Access syllabus                     Instructors  1945    6.49         7.14          11.78        15.64
                                         Students     4142    18.89        27.24         27.67        40.79
Send/Receive announcements               Instructors  2016    6.91         12.98         11.32        30.72
                                         Students     4149    5.47         5.29          10.68        15.11
Post/Access online readings              Instructors  2019    17.44        41.94         27.82        66.91
                                         Students     4138    15.54        23.20         20.92        31.92
Give/Take online exams and quizzes       Instructors  346     1.38         0.96          2.62         1.49
                                         Students     1428    2.25         1.30          2.12         0.77
Students turn assignments online         Instructors  1000    3.57         3.80          4.82         4.80
                                         Students     3344    3.75         2.16          4.44         3.16
Instructors return assignments online    Instructors  641     2.59         2.98          2.78         3.00
                                         Students     2227    2.64         1.35          3.22         2.03
Table 10
Frequency ratios for instructors and students for teaching and learning survey items. (V/NV = Valuable divided by Not Valuable; VV/NV = Very Valuable divided by Not Valuable.)

Survey item                                   Role         n       2006 V/NV    2006 VV/NV    2007 V/NV    2007 VV/NV
Post/Access sample exams and quizzes          Instructors  1123    9.68         12.32         13.29        15.10
                                              Students     3454    14.92        30.42         11.04        17.67
Post/Access notes or outline after lecture    Instructors  1439    23.87        19.47         23.53        21.00
                                              Students     3890    12.79        16.18         17.31        23.51
Students post questions before lecture        Instructors  484     3.29         1.97          4.03         2.14
                                              Students     1561    2.10         1.13          2.68         1.36
Students post questions after lecture         Instructors  531     5.12         1.67          5.52         1.79
                                              Students     1905    3.53         1.17          4.08         1.50
Students read/comment on each other's work    Instructors  493     4.53         2.13          4.62         2.12
                                              Students     1523    2.03         0.52          2.67         0.71
Fig. 1. Frequency ratios by survey year for Send/Receive Announcements and Post/Access online readings survey items. (Inst. V/NV = instructor role Valuable divided by Not Valuable; Inst. VV/NV = instructor role Very Valuable divided by Not Valuable; Stud. V/NV = student role Valuable divided by Not Valuable; Stud. VV/NV = student role Very Valuable divided by Not Valuable.)

Fig. 2. Frequency ratios by survey year for students posting questions before lecture and students posting questions after lecture survey items. (Legend as in Fig. 1.)
4. Analysis and discussion
When asked generally if information technology (IT) improves teaching and learning, both instructors and students we surveyed responded positively, although the students were less positive about IT's effect on their instructors' teaching. Students reported preferring a higher level of IT use than did instructors, and nearly half of students (45%) chose efficiency (saves time) as the most valuable benefit. By contrast, when instructors were asked to pick the most valuable benefit of using IT, the most common choice (39%) was improved communication to their students. When asked about the LMS specifically, instructors again differed from students by more often indicating a Very Valuable rating for efficient communication activities, although very few instructors or students rated these activities as Not Valuable. For teaching and learning activities, however, both instructor and student respondents rated these activities as Valuable rather than Very Valuable. This finding was consistent with the actual tool use within the system, as document and communication management tools (Content Sharing, Announcements, Assignments, Syllabus) were more often rated as Very Valuable than the interactive tools (Chat, Discussion, Wiki). These findings indicate a gap between instructors and students in their perceived benefits and usage levels of IT and LMS technologies (see also Hornik, Johnson, & Wu, 2007). As long as students fail to see the relevance of interactive tools for their learning or for instructors' teaching, they are likely to continue to view IT as merely a quick and accessible means to retrieve course documents and get messages from instructors.
In recent years, these types of systems have switched their monikers from Course Management Systems to Learning Management Systems. Embedded in this change is the notion that learning involves more than providing course content efficiently. Our survey results indicate that while both instructors and students agree that information technologies improve learning, students do not agree as strongly as instructors that such technologies improve instruction. These ratings suggest that students, in particular, may be responding not to whether these tools are used, but rather to how they are used (see also Holm et al., 2003). If LMS use is intended to support constructivist-based models of learning rather than instructivist teaching and consumptive learning models (Lipponen, Hakkarainen, & Paavola, 2004), then using tools to scaffold more interactive forms of instruction and learning may be required for success.
Our multiyear analysis of efficient communication and teaching and learning survey items demonstrates that both instructors and students highly rated activities within the LMS that make the transmission of course materials and announcements more efficient, and this
positive viewpoint increased for both groups over time. By contrast, most of the teaching and learning activities were not as highly rated by
instructors or students. However, instructors and students did rate these activities as Valuable and, for some activities, Very Valuable, and
ratings modestly increased or at least remained constant over time. Thus, both instructors and students at this university may be just
beginning to understand the value of the interactive LMS features and through continued use may begin to use them to facilitate more
active learning opportunities (Herse & Lee, 2005; West et al., 2007). If the goal of LMS is to support interactive teaching and learning activities as well as efficient communication, then instructors might benefit from receiving training that shows them how to better use LMS
tools to facilitate student questions and peer evaluation. Such training could help more instructors understand how online interactive activities can be valuable and find using them as easy as posting course materials or sending an announcement.
As instructors and students gain experience using the different tools available in the LMS, these tools can affect how teaching and learning take place within the online environment and the classroom. For example, consider two different instructors' responses to an open-ended question on our survey that asked how using the LMS has changed instructors' in-class time. One instructor responded that she
would "spend a lot less time on administrative details (and does not) have to make announcements about such stuff and students know they can find stuff (online)." By contrast, another instructor who has used several of the interactive tools responded that the LMS has helped her students "to be better prepared, and, by allowing the other students to read the responses in advance, has led to a more effective use of class time for discussion, analysis, and additional activities." Both comments reflect a change in the way that class time was used.
However, the first comment does not specify that the additional time changed the way the instructor taught, and indeed log data from this
course site did not show any use of the interactive tools. The latter comment suggests that the efficiency afforded by the LMS allowed this
instructor to engage her students more deeply during class and her use of the interactive tools appears to be one way she extended those
activities outside the classroom as well.
5. Future research
In this examination of a LMS, the interactive tools available in the system were not yet as heavily used, nor as highly rated, as the
tools that simply pushed out information from instructors to students. In future research, we will investigate how specific instructors are
successfully using the more interactive tools to engage their students inside the classroom and out. Case studies can help us better illuminate how the design and structure of LMS tools can support innovative teaching and learning activities to provide better opportunities for
student-centered knowledge construction (Watson & Watson, 2007). By looking more deeply at the various LMS uses, we will be able to
make specific recommendations about system design and about instructional practices to help instructors and students realize the educational potential of these systems. As Learning Management Systems continue to evolve and gain popularity, further research is needed to
help instructors and students identify the most effective ways to use these technologies to improve teaching and learning in higher
education.
Acknowledgements
Many thanks to Jim Lepkowski and Chan Zhang from the Institute for Social Research at the University of Michigan for their review of an
early version of this paper and particularly for their statistical guidance and work on the multiyear analysis of the survey items.
Appendix A. Continued results analyzing efficient communication and teaching and learning activities across years
The analysis of each survey item started with a main effect model including two binary variables, role (instructors vs. students) and year
(2006 vs. 2007). If both of the covariates were significant in the model, an interaction term (role × year) was also checked. The baseline category was 'Not Valuable' in all of the regressions. In addition to the regression analysis, a descriptive analysis was also performed using
category was ‘Not Valuable’ in all of the regressions. In addition to the regression analysis, a descriptive analysis was also performed using
frequency ratios.
For Posting/Accessing a syllabus, there are significant differences between instructors and students for respondents who answered "Valuable" (OR = 2.70, χ2(1, N = 2646) = 38.621, p < .001) and those who answered "Very Valuable" (OR = 3.30, χ2(1, N = 3614) = 56.646, p < .001). There are also significant differences between years for respondents who answered "Valuable" (OR = 1.68, χ2(1, N = 2646) = 10.116, p = .002) and those who answered "Very Valuable" (OR = 1.81, χ2(1, N = 3614) = 13.425, p < .001). Overall, more students rated this activity as "Valuable" and "Very Valuable" than did instructors, although both instructors' and students' ratings increased over time.

For giving/taking exams and quizzes online for grading purposes, there is only a significant difference between survey years for respondents who answered "Very Valuable" (OR = 0.73, χ2(1, N = 870) = 5.148, p = .023). The proportion of students who rated this activity as "Valuable" or "Very Valuable" decreased over time, while the proportion of instructors who rated this activity "Valuable" or "Very Valuable" increased over time. For both instructors and students, there were more respondents who rated this activity as "Valuable" than "Very Valuable."

For having students turn in assignments online, there is a significant difference between instructors and students for respondents who answered "Very Valuable" (OR = 0.62, χ2(1, N = 2132) = 15.942, p < .001). There are also significant differences between years for respondents who answered "Valuable" (OR = 1.22, χ2(1, N = 2754) = 4.134, p = .042) and those who answered "Very Valuable" (OR = 1.40, χ2(1, N = 2132) = 11.454, p < .001). About the same proportion of instructors rated this activity as "Valuable" and "Very Valuable" in both survey years, while there was a lower proportion of students who rated this activity as "Very Valuable." Both instructors' and students' ratings increased over time.

For having instructors return assignments to students online with comments and grade(s), there is a significant difference between instructors and students for respondents who answered "Very Valuable" (OR = 0.56, χ2(1, N = 1448) = 18.963, p < .001). There is also a significant difference between years for respondents who answered "Very Valuable" (OR = 1.37, χ2(1, N = 1448) = 7.797, p = .005). About the same proportion of instructors rated this activity as "Valuable" and "Very Valuable" in both survey years, while there was a lower proportion of students who rated this activity as "Very Valuable." Students' ratings increased over time while instructors' ratings remained about the same.
For Posting/Accessing sample exams and quizzes for learning purposes, there is a significant difference between instructors and students for respondents who answered "Very Valuable" (OR = 2.47, χ2(1, N = 2889) = 10.708, p = .001). Overall, about the same proportion of instructors rated this activity as "Valuable" and "Very Valuable" in both survey years; students' ratings decreased over time while instructors' ratings slightly increased.

For Posting/Accessing a lecture outline or notes after the lecture, there is a significant difference between instructors and students for respondents who answered "Valuable" (OR = 0.63, χ2(1, N = 2538) = 5.249, p = .022). A greater proportion of students rated this activity as "Very Valuable" than "Valuable," and a greater proportion of instructors rated this activity as "Valuable" than "Very Valuable." Students' ratings increased over time while instructors' ratings remained about the same.

For students reading and/or commenting on each other's course-based work, there are significant differences between instructors and students for respondents who answered "Valuable" (OR = 0.51, χ2(1, N = 1645) = 20.133, p < .001) and those who answered "Very Valuable" (OR = 0.29, χ2(1, N = 825) = 52.312, p < .001). There is also a significant difference between years for respondents who answered "Valuable" (OR = 1.27, χ2(1, N = 1645) = 4.454, p = .035). Overall, more instructors rated this activity as "Valuable" than did students, who had a greater proportion of respondents rating this activity as "Not Valuable" than "Very Valuable," resulting in a frequency ratio of less than 1. Both instructors' and students' ratings remained about the same over time.
References
Becker, R., & Jokivirta, L. (2007). Online learning in universities: Selected data from the 2006 Observatory survey – November 2007. The Observatory on Borderless Higher Education (OBHE). [Electronic Version] <http://www.obhe.ac.uk> Retrieved 15.03.09.
Bender, B. (2005). Learner engagement and success in CMS environments. In P. McGee, C. Carmean, & A. Jafari (Eds.), Course management systems for learning: Beyond accidental
pedagogy (pp. 107–113). Hershey, PA: Information Science Publishing.
Browne, T., Jenkins, M., & Walker, R. (2006). A longitudinal perspective regarding the use of VLEs by higher education institutions in the United Kingdom. Interactive Learning
Environments, 14(2), 177–192.
Carmean, C., & Haefner, J. (2002). Mind over matter: Transforming course management systems into effective learning environments. Educause Review, 37(6), 26–34.
Coaldrake, P., & Stedman, L. (1999). Academic work in the twenty-first century. Occasional paper series, higher education division, DETYA, no. 99H. Canberra: Australian
Government Publishing Service.
Dougiamas, M., & Taylor, P. C. (2003). Moodle: Using learning communities to create an open source course management system. In Proceedings of the EDMEDIA 2003
Conference, Honolulu, HI.
Dutton, W. H., Cheong, P. H., & Park, N. (2003). The social shaping of a virtual learning environment: The case of a university-wide course management system. The Electronic
Journal of e-Learning, 2(1).
Gaensler, I. E. (2004). A study of social constructivist learning in a WebCT-based precalculus course. Unpublished Doctoral dissertation, Georgia State University.
Hanson, P., & Robson, R. (2004). Evaluating course management technology: A pilot study. Boulder, CO: EDUCAUSE Center for Applied Research, Research Bulletin (24).
Hawkins, B. L., & Rudy, J. A. (2007). Educause core data service. Fiscal year 2006 summary report. Boulder, CO: Educause.
Herse, P., & Lee, A. (2005). Optometry and WebCT: A student survey of the value of web-based learning environments in optometric education. Clinical and Experimental
Optometry, 88(1), 46–52.
Holm, C., Röllinghoff, A., & Ninck, A. (2003). WebCT and e-learning in Switzerland. In Proceedings of the 5th International Conference on New Educational Environments, Luzerne (pp. 139–143).
Hornik, S., Johnson, R. D., & Wu, Y. (2007). When technology does not support learning: Conflicts between epistemological beliefs and technology support in virtual learning
environments. Journal of Organizational and End User Computing, 19(2), 23–46.
Koszalka, T. A., & Ganesan, R. (2004). Designing online courses: A taxonomy to guide strategic use of features available in course management systems (CMS) in distance
education. Distance Education, 25(2), 243–256.
Lipponen, L., Hakkarainen, K., & Paavola, S. (2004). Practices and orientations of CSCL. In J. Strijbos, P. A. Kirschner, R. L. Martens, & P. Dillenbourg (Eds.), What we know about
CSCL and implementing it in higher education (pp. 31–50). Norwell, MA: Kluwer Academic Publishers.
McGill, T. J., & Klobas, J. E. (2009). A task-technology fit view of learning management system impact. Computers and Education, 52(2), 496–508.
Morgan, G. (2003). Faculty use of course management systems. Boulder, CO: EDUCAUSE Center for Applied Research.
Parker, R. E., Bianchi, A., & Cheah, T. Y. (2008). Perceptions of instructional technology: Factors of influence and anticipated consequences. Educational Technology and Society,
11(2), 274–293.
Romero, C., Ventura, S., & García, E. (2008). Data mining in course management systems: Moodle case study and tutorial. Computers and Education, 51(1), 368–384.
Swan, K. (2001). Virtual interaction: Design factors affecting student satisfaction and perceived learning in asynchronous online courses. Distance Education, 22(2), 306–332.
Topper, A. (2003). Teacher professional development via distance education: Assessing student learning in a web-based learning environment (WBLE). In Proceedings of the
society for information technology and teacher education international conference annual (pp. 2526–2531). Albuquerque, NM.
Watson, W. R., & Watson, S. L. (2007). An argument for clarity: What are learning management systems, what are they not, and what should they become? Tech Trends, 51(2),
28–34.
West, R. E., Waddoups, G., & Graham, C. R. (2007). Understanding the experiences of instructors as they adopt a course management system. Educational Technology Research
and Development, 55(1), 1–26.
Yohon, T., Zimmerman, D., & Keeler, L. (2004). An exploratory study of adoption of course management software and accompanying instructional changes by faculty in the
liberal arts and sciences. Electronic Journal of e-Learning, 2(2), 313–320.