CSECS 2011, pp. 000 - 000
The 7th Annual International Conference on
Computer Science and Education in Computer Science,
July 06-10 2011, Sofia, Bulgaria
DIMENSIONS OF COURSE DESIGN AND DELIVERY
AND THEIR EFFECT ON STUDENT
SATISFACTION/PERCEPTION IN ONLINE LEARNING
Tanya ZLATEVA, Svetlana WILLETT, Suresh KALATHUR,
Robert SCHUDY, Leo BURSTEIN, Lou CHITKUSHEV,
Masatake SAITO, Elizabeth M HAINES
Abstract: Online learning is a disruptive technology that has significantly
transformed the educational landscape in a very short time. It is therefore of
critical importance to understand the major factors that determine the online
learning experience. This paper analyzes student perceptions based on course
evaluations from 53 computer information courses with a total enrollment of
4,089. A multiple regression analysis of factors along four dimensions—course,
instructor, facilitator, and technology—identified as significant factors the
organization of the course material, discussions, assignments, and the
instructor's ability to present. This indicates that student satisfaction is
independent of the delivery medium. A correlation analysis of the relationship
between course size and student satisfaction suggests that courses with more
than 100 students pose a greater challenge, but that this challenge can be
addressed by adding novel multimedia components such as synchronous
video-collaboration sessions and ad-hoc whiteboard discussions.
Keywords: online learning, student perceptions, face-to-face learning, course
design parameters, media-rich, synchronous, asynchronous, correlation and
regression analysis
ACM Classification Keywords: computer science education, computer
information education, multi-media, animation, simulation
1 INTRODUCTION
Online learning continues to expand within a fundamental dichotomy:
lauded for its dynamism, flexibility, and multi-modal technology, its
academic validity continues to be questioned for the (presumed)
disconnect between students and teachers. For the last seven years the
growth of online enrollments has consistently far exceeded the growth of
the overall college population [Allen, 2010]. In the United States more
than 5.6 million students (or 30% of the higher education population)
were taking at least one online class in the fall 2009 semester. This is a
substantial 21% increase, compared to a less than 2% increase of the
higher education population [Allen, 2010]. In contrast, the perception of
the quality of online education improved more modestly: 66% of
academic leaders believe it to be the same as or superior to face-to-face
education, compared to 57% in 2009. Independent of where one stands
in the online education debate, it is clear that we are witnessing a
disruptive innovation in one of the most traditional fields of human
endeavor. It is therefore of critical importance to understand the major
factors that determine the online learning experience.
Considerable work has been devoted to this problem and a growing
number of studies address specific design aspects, courses in different
fields, as well as student and faculty perceptions of online learning in
general, e.g. [Volery, 2000], [Soong et al., 2000], [Sun et al., 2008].
In a previous study [Zlateva et al., 2010] we introduced a parametric
model for online courses that included class size, course content,
assessments, and student satisfaction, and used it to analyze the online
learning experience based on data from 51 online courses delivered in
the MS in Computer Information Program at Boston University’s
Metropolitan College. This paper follows up on the previous results and
takes a more in-depth look at student perceptions as reflected in the
student course evaluations. The data set was expanded with three more
courses and, more importantly, with the full student evaluation survey.
All courses were part of the Master's in Computer Information Systems
that is offered online in a fully asynchronous mode with optional live
webinar and video-conferencing discussion sessions. The courses are
offered in an intensive seven-week format instead of the traditional
14-week semester and are implemented in Blackboard Vista with
media-rich online content, discussion boards, videos, simulations,
self-assessments, virtual laboratories, and online exams. The courses are
developed and delivered almost exclusively by full-time faculty and
capped at 150 students. In addition to the faculty of record for the course,
a facilitator is assigned for every 15 students, with responsibilities to
answer questions, lead discussions, and grade homework assignments.
2 DATA AND METHODOLOGY
The raw data were drawn from 54 online classes delivered over seven
semesters, from Summer 2008 to Summer 2010. A large number of
parameters describing course structure, assessments, class size, and
student perception were collected. In this paper we discuss the major
factors for student satisfaction and follow up on our initial findings
[Zlateva et al., 2010] that pointed to a negative relationship between
class size and student satisfaction. The data used for the present
discussion include, along with the class size, the responses to an online
survey of 30 questions rated on a five-level Likert scale from 1
(negative/strongly disagree) to 5 (positive/strongly agree). The questions
fall into four groups that assess perceptions of the course, instructor,
facilitators, and technical support. The exact wording of the questions is
given in Table 1. The response rate for the survey ranged from 27.78%
to 59.62%, with a mean of 52.38% and a standard deviation of 8.89%. A
95% interval around the mean response rate (34.96% to 69.81%, i.e. the
mean plus or minus 1.96 standard deviations) was computed and
led to the exclusion of one class. The remaining aggregated data from 53
classes with an overall enrollment of 4,089 provided the basis for the
regression analysis.
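To make the screening step concrete, the following minimal Python sketch applies the same exclusion rule to a list of per-class response rates. The sample values are illustrative placeholders, not the study data; the interval logic (mean plus or minus 1.96 standard deviations) reproduces the reported 34.96-69.81 band when applied to the reported mean and standard deviation.

# Minimal sketch of the response-rate screening step. The rates below are
# hypothetical per-class response rates in percent, not the study data.
import statistics

response_rates = [55.2, 48.9, 27.78, 59.62, 51.0, 53.6]

mean = statistics.mean(response_rates)
sd = statistics.stdev(response_rates)  # sample standard deviation

# 95% band around the mean: mean +/- 1.96 * SD. For the reported mean
# (52.38) and SD (8.89) this gives the 34.96-69.81 band cited above.
low, high = mean - 1.96 * sd, mean + 1.96 * sd

kept = [r for r in response_rates if low <= r <= high]
excluded = [r for r in response_rates if not low <= r <= high]
print(f"interval: [{low:.2f}%, {high:.2f}%], kept {len(kept)}, excluded {len(excluded)}")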
The survey questions directly relate to course content and design
parameters such as intellectual challenge, structure, discussion, multiple
modalities; instructor qualifications and ability (subject mastery,
presentation skills, grading, openness to questions); facilitator
contributions (clarity and timeliness of response, encouragement of
discussions, added value); and course technology (navigation, user
manuals, accessibility, student services). Determining the degree to
which the individual parameters contribute to the overall course
perception is at the heart of effective course improvement and
development. Toward this goal we undertook an analysis of the
relationships within the questionnaire as well as between the individual
survey questions and the class size.
The correlation matrix was computed as an exploratory first step for
identifying potential dependencies. It displayed two salient
characteristics: a considerable complexity and range of correlation
levels (correlation coefficients from slightly negative to over 0.9) and a
consistent negative correlation of class size with all survey questions. To
better understand the underlying relationships we analyzed the
significance of the individual survey questions through multiple
regression analysis (section 3) and the relationship of class size with
overall course satisfaction in different time periods (section 4).
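As an illustration, this exploratory step takes only a few lines of Python with pandas and NumPy. The file name and column layout below are hypothetical stand-ins for the class-level aggregates described above: one row per class, one column per survey question (named as in Table 1), plus an enrollment column.

# Exploratory correlation matrix over class-level survey means and class
# size. "class_level_survey_means.csv" and the column names are hypothetical.
import numpy as np
import pandas as pd

df = pd.read_csv("class_level_survey_means.csv")
corr = df.corr(numeric_only=True)

# Correlation of every survey item with class size
# (reported above: consistently negative).
print(corr["enrollment"].drop("enrollment").sort_values())

# Range of off-diagonal coefficients
# (reported above: from slightly negative to over 0.9).
off_diag = corr.values[~np.eye(len(corr), dtype=bool)]
print("off-diagonal r range:", off_diag.min(), off_diag.max())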
Table 1: Student Evaluation Survey

Course
CC01 – I found the class intellectually challenging
CC02 – Course materials were well organized and clearly presented
CC03 – Discussion topics enhanced the learning experience
CC04 – Assignments furthered understanding of course content
CC05 – Textbook/cases/course materials furthered understanding of course content
CC06a – Animations/simulations enhanced understanding of key concepts
CC06b – Videos enhanced understanding of key concepts
CC06c – Webinars/web-meetings (e.g. GoTo meetings) enhanced understanding of key concepts
CC06d – Video-conferencing enhanced understanding of key concepts
CC07 – I would recommend this course to others
CC08 – The overall course experience was:

Instructor
CE09 – The instructor's mastery of the course materials was:
CE09a – The instructor's ability to present course material is:
CE10 – The instructor's grading criteria are fair and clear
CE11 – The instructor was supportive and responsive to my questions
CE12 – Assignments were returned in a timely manner by the instructor
CE13 – I would rate the instructor overall as:

Facilitator
FE14 – Facilitator feedback was informative and clear
FE15 – Assignments were returned in a timely manner by the facilitator
FE16 – The facilitator responded to my questions in a timely manner
FE17 – The facilitator added value to my learning experience
FE18 – The facilitator's ability to encourage questions/discussions is:
FE19 – The facilitator's overall rating is:

Course Technology
CT20 – Navigation allowed easy access to information
CT21 – Instructions as to how to use media technologies (audio, video, CD-ROMs, etc.) were:
CT22 – Access and response time to courseware system was:
CT23 – Technology support was:
CT24 – I was able to resolve course problems with the help of Technical Support in a timely manner
CT25 – The Student Services Representative/Manager was:
CT26 – Overall, I would rate the course technology and support as:
3 MULTIPLE REGRESSION ANALYSIS AND DISCUSSION OF RESULTS
The correlation matrix revealed a number of strong correlations between
survey questions, some predictable (e.g. there is a 0.91 correlation
between the course overall rating and the degree to which the course is
recommended to others), some less obvious (e.g. it is not readily clear
why the instructor's ability to present correlates, with a coefficient of
0.72, with his/her support and responsiveness). Given the complexity of
the learning experience, its multiple aspects, and the large number of
parameters needed for an accurate representation, it is critical to identify
the most significant factors that shape the overall course perception.
Toward this goal we performed multiple regression analysis with the
"overall course experience" (CC08) as the dependent variable. The
independent variables were drawn from the remaining parameters, with
the exception of parameters in close to linear dependence (r > 0.9) with
the dependent variable, such as "I would recommend this course to
others" (CC07), and parameters that address the overall experience
without yielding information about specific aspects of course design,
such as "I would rate the instructor overall as" (CE13), "The facilitator's
overall rating is" (FE19), and "Overall, I would rate the course technology
and support as" (CT26). Additional parameters were excluded to ensure
that the predictor set would be free of strong pairwise correlations, i.e.
that correlation coefficients between predictors would be less than 0.7.
To exclude highly correlated pairs, the regression computation was
performed for all combinations of uncorrelated predictors, and the
combination with the largest R-square was retained for the final
regression computation. The multiple regression was performed in two
stages: first, the statistically significant parameters within each category
were determined; these category parameters were then combined and
considered as predictors in an integrated model including the course,
instructor, facilitator, and educational technology aspects. Regression
analysis of the integrated model identified the strongest predictors for
the overall course experience.
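The selection procedure can be sketched in Python with statsmodels as follows. This is a reconstruction under the thresholds stated above (r > 0.9 with the dependent variable, pairwise r < 0.7, maximum R-square), not the authors' original computation (the Pr > |t| column headings in the tables suggest the analysis was run in a statistical package such as SAS); df is the hypothetical class-level DataFrame from the earlier sketch.

# Sketch of the predictor-selection procedure described above.
from itertools import combinations

import pandas as pd
import statsmodels.api as sm

def best_uncorrelated_model(df: pd.DataFrame, target: str = "CC08"):
    corr = df.corr(numeric_only=True)
    # Drop predictors in near-linear dependence (|r| > 0.9) with the target.
    candidates = [c for c in df.columns
                  if c != target and abs(corr.loc[c, target]) <= 0.9]
    best = None
    for k in range(1, len(candidates) + 1):
        for combo in combinations(candidates, k):
            # Keep only combinations free of strong pairwise correlation (|r| < 0.7).
            if all(abs(corr.loc[a, b]) < 0.7 for a, b in combinations(combo, 2)):
                X = sm.add_constant(df[list(combo)])
                fit = sm.OLS(df[target], X).fit()
                if best is None or fit.rsquared > best.rsquared:
                    best = fit
    return best  # best.pvalues then gives the Pr > |t| column of the tables

For the two-stage procedure, one would run this helper once per category and then once more over the union of the significant category predictors.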
3.1 Course Dimensions
The course category has eleven parameters; after excluding the
dependent variable "overall course experience" (CC08) and the almost
linearly related "recommend course to others" (CC07), we are left with a
pool of nine candidates for the predictor set. Regression including all
nine variables results in an R-square of 0.8656, i.e. the predictors
account for 86.56% of the variance of the dependent variable CC08.
However, two of the nine predictors, "webinars/web-meetings" (CC06c)
and "video-conferencing" (CC06d), are highly correlated (r = 0.91).
Table 2: Course dimensions.
(statistically significant for p < 0.05; 95% confidence level)

Variable                                     Pr > |t| (p-value)
Intercept                                    0.0026
CC01 intellectually challenging              0.6046
CC02 materials organized clearly             <.0001
CC03 discussion enhanced learning            0.0033
CC04 assignments furthered understanding     0.0019
CC05 materials furthered understanding       0.2406
CC06a animations/simulations enhanced        0.0609
CC06b videos enhanced                        0.3302
CC06d video-conferencing enhanced            0.5253
Regression analysis for the two uncorrelated predictor combinations,
without CC06c and without CC06d, yields R-square values of 0.8655
and 0.8648 respectively; thus we exclude CC06c and keep CC06d for
further analysis.
The p-values for the parameters obtained from the regression are shown
in Table 2 and reveal that there are only three statistically significant
predictors: CC02, CC03, and CC04.
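Continuing the hypothetical sketch from above, reading off the course-category result would look like the following; the column names follow Table 1, and the expected output corresponds to Table 2.

# Usage sketch for the course category (hypothetical DataFrame columns).
course_cols = ["CC01", "CC02", "CC03", "CC04", "CC05",
               "CC06a", "CC06b", "CC06c", "CC06d", "CC08"]
fit = best_uncorrelated_model(df[course_cols], target="CC08")
print(f"R-square: {fit.rsquared:.4f}")   # reported above: 0.8655
pvals = fit.pvalues.drop("const")
print(pvals[pvals < 0.05])               # expected per Table 2: CC02, CC03, CC04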
3.2 Instructor Dimensions
The instructor category contains five parameters; when all are used as
predictors, the R-square is 0.8163. Strong correlations exist between
CE09 and CE09a, and between CE11 and CE09a. The combination
excluding CE09 and CE11 has the highest R-square (0.8040) among all
parameter combinations with no strong pairwise correlation and is
retained for further analysis.
Table 3 shows the p-values of the parameters. There are two statistically
significant predictors: the instructor's ability to present (CE09a) and the
fairness and clarity of the grading criteria (CE10).
Table 3: Instructor dimensions.
(statistically significant for p < 0.05; 95% confidence level)

Variable                                     Pr > |t| (p-value)
Intercept                                    0.0028
CE09a ability to present                     <.0001
CE10 grading criteria fair and clear         0.0013
CE12 assignments returned timely             0.2730
3.3 Facilitator Dimensions
The survey questions in the facilitator category show substantial
dependencies (between FE14-FE16, FE14-FE17, FE14-FE18,
FE15-FE16, FE15-FE17, and FE16-FE17). Regression over all variables
yields an R-square of 0.3556. The parameter combination retaining
FE14, FE15, and FE18 has an R-square of 0.3468, which is the largest
of all combinations free of strong correlation. The resulting parameter
estimates and p-values (Table 4) indicate that only one parameter, the
facilitator's ability to encourage questions and discussions (FE18), is
statistically significant.
Table 4: Facilitator dimensions.
(statistically significant for p < 0.05; 95% confidence level)

Variable                                            Pr > |t| (p-value)
Intercept                                           0.6794
FE14 facilitator feedback informative and clear     0.0309
FE15 assignments returned in a timely manner        0.2670
FE18 facilitator encourages questions/discussions   0.0047

3.4 Educational Technology Dimensions
Educational technology parameters were only weakly correlated with
one another and were therefore all included in the regression, which
yielded an R-square of 0.3162 and a single statistically significant
parameter, "navigation allowed easy access to information" (CT20,
Table 5).
Table 5: Educational technology dimensions.
(statistically significant for p < 0.05; 95% confidence level)

Variable                                            Pr > |t| (p-value)
Intercept                                           0.1229
CT20 navigation allowed easy access                 0.0537
CT21 instructions for technology use are clear      0.1746
CT22 access and response time to courseware system  0.1432
CT23 technology support                             0.3415
CT24 technical support timely                       0.4273
CT25 student services representative/manager        0.5790
3.5 Significant Factors for Overall Course Satisfaction
The regression model that integrates the statistically significant factors of
the categories was characterized by an R-square of 0.8988 and yielded
four statistically significant predictors: clear organization of the material
(CC02), discussions (CC03), assignments (CC04), and the instructor's
ability to present (CE09a) (Table 6).
Table 6: Significant factors for overall course experience.
(statistically significant for p < 0.05; 95% confidence level)

Variable                                            Pr > |t| (p-value)
Intercept                                           0.0014
CC02 materials organized clearly                    0.0182
CC03 discussion enhanced learning                   0.0539
CC04 assignments furthered understanding            0.0065
CE09a instructor's ability to present               0.0004
CE10 grading criteria                               0.0765
FE18 facilitator encourages questions/discussions   0.7481
CT20 navigation allowed easy access                 0.2484
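In code, the integrated second stage is simply one more fit over the union of the significant category predictors; a sketch continuing the hypothetical DataFrame and imports from above.

# Integrated-model sketch over the category predictors carried forward
# (hypothetical columns; reported R-square above: 0.8988).
final_cols = ["CC02", "CC03", "CC04", "CE09a", "CE10", "FE18", "CT20"]
X = sm.add_constant(df[final_cols])
final_fit = sm.OLS(df["CC08"], X).fit()
print(final_fit.summary())  # coefficient table with the Pr > |t| column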
In a learning environment distributed through cyberspace and so
crucially dependent on technology for access, communication, and
assessment, it is reasonable to expect that perceptions are substantially
shaped by the nature and quality of the online medium. The surprising
result of our analysis is that none of the statistically significant factors
relates to a technology dimension. This is in contrast to prior studies that
identified technology as critical for success (e.g. [Volery, 2000], [Soong
et al., 2000]).
Instead, the determining factors (class structure, discussion,
presentation ability) have been mainstays of learning theory since its
very beginnings. In theories of distance education, most notably Moore's
theory of transactional distance [Moore, 1993], they are considered the
basic dimensions that define the learning experience.
4 CLASS SIZE AND STUDENT SATISFACTION
Earlier analysis based on data from 2008 to 2009 identified a problem
with satisfaction in higher-enrollment courses. For example, not one of
the six courses with more than 100 enrollments in 2008 and 2009 had
attained a student satisfaction rating of more than 4.0, while more than a
dozen courses with enrollments below 100 had attained ratings above
4.0. During these years the correlation coefficient between overall
student satisfaction and enrollment was -0.4561 [Zlateva et al., 2010].
We revisited these findings in the expanded data set and took a closer
look at the relationship between class size and student satisfaction in
consecutive years. We found that the overall trend was only slightly
negative, with a correlation coefficient of -0.156 (Figure 1). In addition,
several classes had achieved an average rating higher than 4.0. More
interestingly, the scatter plots for the three consecutive years show the
correlation changing from -0.434 and -0.368 in the first two years to a
slightly positive 0.174 in 2010-2011 (Figures 2-4).
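The per-period check is straightforward to reproduce; a sketch over the hypothetical class-level DataFrame from section 2, assuming each row carries a period label (the 'year' column name and its values are illustrative).

# Pearson correlation of class size with overall satisfaction per period.
# The 'year' labels are hypothetical period codes, e.g. "SU2008-SP2009".
for year, sub in df.groupby("year"):
    r = sub["enrollment"].corr(sub["CC08"])  # Pearson by default
    print(f"{year}: r = {r:+.4f} over {len(sub)} classes")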
We hypothesize that this is due to the addition of novel interactive
multimedia components (lecture and problem-solution recordings, live
classroom sessions, additional animations), the expansion of the
self-assessment questions, and the addition of term projects that are
sequenced with the content.
[Scatter plot of overall course rating (2.5-5) against enrollment (0-200).]
Figure 1: Course Overall vs. Enrollments, SU 2008-SP 2011
correlation coefficient = -0.15655, 75 classes with 5,951 overall enrollment
[Scatter plot of overall course rating (2.5-5) against enrollment (0-200).]
Figure 2: Course Overall vs. Enrollments, SU 2008-SP 2009
correlation coefficient = -0.4342, 22 classes with 1,677 overall enrollment
[Scatter plot of overall course rating (2.5-5) against enrollment (0-200).]
Figure 3: Course Overall vs. Enrollments, SU 2009-SP 2010
correlation coefficient = -0.3681, 24 classes with 1,920 overall enrollment
[Scatter plot of overall course rating (2.5-5) against enrollment (0-200).]
Figure 4: Course Overall vs. Enrollments, SU 2010-SP 2011
correlation coefficient = 0.1745, 29 classes with 2,354 overall enrollment
5 CONCLUSION
Our analysis indicates that the main course design and delivery
parameters that determine student satisfaction are independent of the
delivery medium. Students perceive online courses based on the same
course attributes as students perceive face-to-face courses. The key
determinants of satisfaction are the quality and organization of the
content, a clear grading policy, the instructor's ability to present the
material well, assignments that further understanding, and discussions
that enhance learning.
A follow-up analysis of the relationship between class size and student
satisfaction showed that the initially negative correlation has become
slightly positive in the last year. We believe this is due to the increasing
maturity of the courses and the greater experience of the faculty, as well
as to the additional resources allocated to the courses. More specifically,
we added facilitators without an assigned student group who are
responsible for conducting synchronous sessions with the students,
addressing the more difficult aspects of the material, and/or
demonstrating problem solutions. The synchronous sessions were
conducted in a multimedia video-collaboration environment and included
formal lecture presentations, question and answer sessions, ad hoc
whiteboard discussions, chat, and other multimedia content. The
sessions were recorded and made available for review. Further analysis
is needed to better understand how and to what extent each of these
aspects contributes to improved student perceptions.
6 BIBLIOGRAPHY
[Allen, 2010] I. E. Allen, J. Seaman. Class Differences: Online Education in the
United States, 2010. 8th Annual Survey of the Sloan Consortium.
http://www.sloan-c.org/publications/survey/pdf/learningondemand.pdf
[Moore, 1993] M. Moore. Theory of transactional distance. In: Theoretical
Principles of Distance Education, D. Keegan (ed.). Routledge, pp. 22-38.
[Soong et al., 2000] M. H. B. Soong, H. C. Chan, B. C. Chua, K. F. Loh. Critical
success factors for on-line course resources. Computers & Education,
Vol. 36, Issue 2, February 2001, pp. 101-120.
[Sun et al., 2008] P.-C. Sun, R. J. Tsai, G. Finger, Y.-Y. Chen, D. Yeh. What
drives a successful e-Learning? An empirical investigation of the critical
factors influencing learner satisfaction. Computers & Education, Vol. 50,
2008, pp. 1183-1202.
[Volery, 2000] T. Volery, D. Lord. Critical success factors in online education.
International Journal of Educational Management, Vol. 14, Issue 5,
pp. 216-223.
[Zlateva et al., 2010] T. Zlateva, M. Saito, S. Kalathur, R. Schudy, A. Temkin,
L. Chitkushev. A Unified Approach for Designing, Developing, and
Evaluating Online Curricula. 6th International Workshop on Computer
Science and Education in Computer Science, Fulda-Munich, Germany,
June 2010.
7 AUTHORS' INFORMATION
Tanya ZLATEVA, Ph.D., Assoc. Professor, Boston University, Boston, MA, USA,
zlateva@bu.edu
Major Fields of Scientific Research: cyber security, computer science education,
visual recognition
Svetlana WILLETT, BS, Master’s candidate
Major Fields of Scientific Research: statistics
Suresh KALATHUR, Asst. Professor, Boston University, Boston, MA, USA,
Major Fields of Scientific Research: data mining, programming languages, web
technologies, computer science education, visual recognition
Robert SCHUDY, Assoc. Professor, Boston University, Boston, MA, USA,
Major Fields of Scientific Research: data bases, computer science education
Leo BURSTEIN, MS, Senior Architect, Boston University, Boston, MA, USA,
Major Fields of Scientific Research: educational technologies
Lou CHITKUSHEV, Assoc. Professor, Boston University, Boston, MA, USA,
Major Fields of Scientific Research: networking, medical informatics, cyber security,
computer science education
Masatake SAITO, Visiting Asst. Professor, Boston University, Boston, MA, USA,
Major Fields of Scientific Research: computer science education, online teaching
technologies
Elizabeth M HAINES, MS, Instructor, Boston University, Boston, MA, USA,
Major Fields of Scientific Research: computer science education, online teaching
technologies