Andragogy and Online Course Satisfaction: A Correlation Study
Dissertation Manuscript
Submitted to Northcentral University
Graduate Faculty of the School of Education
in Partial Fulfillment of the
Requirements for the Degree of
DOCTOR OF PHILOSOPHY
by
STEPHEN W. WATTS
Prescott Valley, Arizona
September 2014
Abstract
The high rate of online course dropout has instigated several studies focused on learner
satisfaction with online learning. This study seeks to identify whether adult learner
characteristics and instructional process design elements facilitate online learner
satisfaction, thereby providing means of mitigating online dropout. The purpose of this
quantitative correlation study is to investigate relationships between adult learner
characteristics, instructional process design elements, and learner satisfaction among
adult learners in a postsecondary online program. This study evaluates the predictive
value of 14 predictor variables (six adult learner characteristics and eight instructional
process design elements) on the criterion variable of learner satisfaction. The population
consists of online postsecondary students who are over the age of 24, who have taken at
least one online course from an HLC-NCA accredited program with at least one physical
facility in the state of Missouri. Participants were chosen using stratified random
sampling at the school level to ensure a proportional mix of qualifying learners from
public state universities, public universities, and private universities or colleges, and to
ensure that the individuals in the sample represent the population as nearly as possible.
One in three randomly selected qualifying students received an email inviting them to
participate in the study, providing a representative sample of the target population. Based
on a G*Power analysis, a minimum sample size of 194 is required. An online survey
adapted from the Andragogy in Practice Inventory and the Learner Satisfaction and
Transfer of Learning Questionnaire was presented to the study sample and used to collect
demographic data, as well as responses regarding the study’s 14 predictor variables.
Collected data was reviewed to ensure completeness and analyzed in IBM SPSS Statistics
Package Version 21 using hierarchical regression analysis for hypothesis testing, assessing the
relationship of the predictor variables to the criterion variable. By
establishing which learner characteristics and instructional process design elements affect
learner satisfaction, strategies may be developed to mitigate postsecondary online
dropouts.
Table of Contents
Chapter 1: Introduction
Background
Statement of the Problem
Purpose of the Study
Theoretical Framework
Research Questions
Hypotheses
Nature of the Study
Significance of the Study
Definition of Key Terms
Summary
Chapter 2: Literature Review
Documentation
Historical Overview
Online Technological Advances
Purported Benefits of eLearning
Factors that Bring eLearning Success
eLearning and Dropout
Learner Factors of Dropout
Online Course or Program Factors of Dropout
Learner Satisfaction and Online Course Dropout
Factors that Engender eLearner Satisfaction
Summary
Chapter 3: Research Method
Research Method and Design
Population
Sample
Materials/Instruments
Operational Definition of Variables
Data Collection, Processing, and Analysis
Assumptions
Limitations
Delimitations
Ethical Assurances
Summary
References
Appendixes
Appendix A: Higher Learning Commission of the North Central Association of Colleges and Schools Institutions with Physical Facilities in Missouri
Appendix B: Results of Random Selection of Schools
Appendix C: Andragogy in Practice Inventory
Appendix D: Permission to Use API
Appendix E: Learner Satisfaction and Transfer of Learning Survey
Appendix F: Permission to Use LSTQ
Appendix H: Responses from Provosts and Chief Academic Officers
Appendix I: Informed Consent Form
Appendix J: G*Power A Priori Analysis
List of Tables
Table 1. Factors in the API and Learner Satisfaction scale on the LSTQ
Chapter 1: Introduction
Since 2000, technological advances in information and communication
technology have caused profound changes in the way many people communicate,
socialize, work, and receive training or education (Bala, 2010; Bolliger & Halupa, 2012).
Leaders of 65% of institutions of higher education consider online learning critical to
their long-term strategies (Allen & Seaman, 2011). The number of new online students is
outpacing the number of new traditional students by a proportion of 5 to 1, with 32% of
all college-level students taking at least one online class (Allen & Seaman, 2013).
These high numbers represent the abundant benefits that electronic learning
(eLearning) provides to learners. These benefits include the improvement of learning
efficiency (Cabrera-Lozoya, Cerdan, Cano, Garcia-Sanchez, & Lujan, 2012; Huang, Lin,
& Huang, 2012), improvements in learner behavior (Bhuasiri, Xaymoungkhoun, Zo, Rho,
& Ciganek, 2011), enhanced communication (Abrami, Bernard, Bures, Borokhovski, &
& Tamim, 2010; Alshare, Freeze, Lane, & Wen, 2011), convenience (Bolliger & Halupa,
2012), time efficiencies (Pastore, 2012), and improved learning (Ismail, Gunasegaran, &
Idrus, 2010). Despite these benefits, the incidence of dropout or failure in online courses
is higher than for traditional courses. Although the percentages vary between programs,
dropout rates ranging from two to five times those of traditional courses
have been reported (Brown, 2012; Lee & Choi, 2011; Lint, 2013; Wilson & Allen, 2011).
This high rate of dropout in online courses has led to several studies that have focused on
satisfaction with online learning (Bollinger & Halupa, 2012; Gunawardena, LinderVanBerschot, LaPointe, & Rao, 2010); because satisfaction is considered to be the largest
2
determinant in reducing dropout (Chen & Lien, 2011; Kozub, 2010; Martinez-Caro,
2011).
According to the theory of andragogy, adult students learn differently than do
children (Holton, Wilson, & Bates, 2009; Knowles, 1984; McGrath, 2009), and the
successful teaching of adults is optimized when (a) instructors engender an environment
where the learner is properly prepared, (b) the climate is encouraging, (c) there is
coordination between instructor and student in planning learning, (d) diagnosing the
learner’s specific needs, (e) agreeing on learning objectives, and (f) designing the
necessary experience through (g) learning activities and (h) evaluation (Gilbert, Schiff, &
Cunliffe, 2013). When adult students possess an intrinsic motivation to learn, prior
experience, a need to know, and a readiness to learn, are self-directed, and can apply
learning immediately to real-world situations, they may learn better (Cox, 2013).
Knowles (1995, 2005) originally codified these instructional process design
elements and adult learner characteristics, and the theory of andragogy has had a strong
influence on distance and online education (Blaschke, 2012) because the theory addresses
the facilitation of a climate where students can learn (Marques, 2012; McGrath, 2009).
Some dropout factors have been mitigated by specific adult learner characteristics,
including (a) motivation (Omar, Kalulu, & Belmasrour, 2011; Park & Choi, 2009; Travis
& Rutherford, 2012), (b) self-efficacy (Chen & Lien, 2011; Gunawardena et al., 2010),
and (c) increased interaction (learner-to-learner and faculty-to-learner; Ali & Ahmad,
2011; Alshare et al., 2011; Boling, Hough, Krinsky, Saleem, & Stevens, 2011; Donavant,
2009; Morrow & Ackermann, 2012). When these characteristics are emphasized in online
learning, researchers have demonstrated that the performance, participation, and satisfaction of adult learners
increased (Cacciamani, Cesareni, Martini, Ferrini, & Fujita, 2012; Cercone, 2008; Huang,
Lin, & Huang, 2012; Keengwe & Georgina, 2011). This chapter includes the following
sections: (a) background, (b) statement of the problem, (c)
purpose of the study, (d) theoretical framework, (e) research questions, (f) hypotheses, (g)
nature of the study, (h) significance of the study, (i) definition of key terms, and (j) a final
summary of salient points.
Background
In a meta-analysis regarding dropout factors for online courses, Lee and Choi
(2011) classified 44 unique factors that they further organized into three categories. The
emerging field of academic analytics has attracted considerable attention because of the
large disparity in the number of learners who drop out of higher education programs that
are online versus traditional (Cacciamani et al., 2012; Tuquero, 2011). Of the three
categories of dropout factors, a majority of the factors that appear to have the greatest
impact on dropout decisions are learner factors (Lee & Choi, 2011), or elements that a
learner either has or does. Since a majority of online learners are non-traditional and over
the age of 24 (Goddu, 2012), it is thought that by further exploring these adult learner
characteristics, better support or specific programs may be instituted to alleviate these
issues.
While institutions may or may not have control or influence over the
characteristics of their learners, they do have control over their courses and programs.
Lee and Choi (2011) found that the next largest number of factors and greatest
subsequent impact on learner decisions for retention or dropout were course–program
factors. These adult learner characteristics and instructional process design elements fit
nicely into the theoretical framework of this study. Knowles (1973, 1975, 1980, 1984,
1990, 1995) posited that the optimal learning environment would take into account
learner characteristics and would have a strategy for designing the experience around the
learners, instructors, and institutional goals.
The relationship between retention and learner satisfaction (Biasutti, 2011; Chen
& Chih, 2012; DeLotell et al., 2010) is well established, and there is support for the view
that learner characteristics and instructional process design elements contribute to a good
learning environment. By examining these elements (adult learner characteristics,
instructional process design elements, and learner satisfaction) together through
quantitative analysis, a better understanding of whether and how these elements relate to one another may
extend the findings of previous studies and contribute to the theory of adult online
learning.
Statement of the Problem
Dropout rates in online courses often exceed 30%, two to five times the
corresponding rates for traditional courses (Brown, 2012; Lee & Choi, 2011; Lint, 2013;
Wilson & Allen, 2011). The specific problem to be addressed is the low satisfaction
among adults in online postsecondary courses (Donavant, 2009; Huang et al., 2012;
Watkins, 2005) since learner satisfaction has been considered the largest determinant in
reducing online dropout (Chen & Lien, 2011; Kozub, 2010; Martinez-Caro, 2009).
Determining factors that engender learner satisfaction with online courses, which may
reduce dropout, would be a benefit to higher education (Lee & Choi, 2011; Levy, 2007;
Willging & Johnson, 2009). Past researchers have affirmed that when specific learner
characteristics and instructional process design elements are present, learner satisfaction is
increased (Cox, 2013; Gilbert et al., 2013; Knowles, Holton, & Swanson, 2005), which
may reduce the incidence of dropout (Beqiri, Chase, & Bishka, 2010; Deil-Amen, 2011;
Lee & Choi, 2011). Researchers have called for continued research to examine the
learner characteristics and instructional process design elements associated with learner
satisfaction (Abrami & Bernard, 2006; Burke & Hutchins, 2007; Gunawardena et al.,
2010; Holton et al., 2009). Therefore, online dropouts may be mitigated (Ali & Ahmad,
2011; Alshare et al., 2011; Boling et al., 2011; Chen & Lien, 2011; Morrow &
Ackermann, 2012; Omar et al., 2011; Travis & Rutherford, 2012) by establishing which
learner characteristics and instructional process design elements affect learner satisfaction
(Donavant, 2009; Gunawardena et al., 2010; Holton et al., 2009; Huang et al., 2012;
Taylor & Kroth, 2009b).
Purpose of the Study
The purpose of this quantitative correlation study is to investigate relationships
between adult learner characteristics, instructional process design elements, and learner
satisfaction among adult learners in online programs at postsecondary institutions with at least one
physical facility in Missouri. The specific adult learner characteristics and instructional
process design elements were selected based on Knowles’ (1973, 1975, 1980, 1984,
1995) theory of andragogy as the theoretical framework for the study. The 14 predictor
variables include six adult learner characteristics: (a) intrinsic motivation to learn, (b)
prior experience, (c) need to know, (d) readiness to learn, (e) self-directed learning, and
(f) orientation to learn; and eight instructional process design elements: (g) preparing the
learner, (h) climate setting, (i) mutual planning, (j) diagnosis of learning needs, (k) setting
of learning objectives, (l) designing the learning experience, (m) learning activities, and
(n) evaluation (Knowles, 1995, 2005). The criterion
variable is learner satisfaction. The study target population includes adult (over age 24)
online learners attending a postsecondary institution accredited by the Higher Learning
Commission of the North Central Association of Colleges and Schools (HLC-NCA; see
Appendix A) with at least one physical facility in Missouri. According to a G*Power
analysis, a minimum sample size of 194 is required (Faul, Erdfelder, Buchner, & Lang,
2009; see Appendix J). Participants were selected through stratified random sampling by
first selecting 13 schools from Appendix A followed by a random selection of 1 in 3
qualifying students from each of the participating postsecondary institutions. The 14
predictor variables will be measured by the 66-item Andragogy in Practice Inventory
(API; Holton et al., 2009; see Appendix C). The API is a psychometrically pre-validated
instrument that will be used to collect quantitative data for the characteristics and design
elements (Holton et al., 2009), and has been used in other studies with regression analysis
with demonstrated validity and reliability (Holton et al., 2009; Wilson, 2005). The
criterion variable of learner satisfaction was measured by the pre-validated Satisfaction
subscale of the Learner Satisfaction and Transfer-of-Learning Questionnaire (LSTQ) that
also has demonstrated high reliability (Gunawardena et al., 2010). The 14 predictor
variables were grouped into two sets, with six variables constituting learner
characteristics and eight variables constituting instructional process design elements.
Hierarchical regression analysis was used for hypothesis testing to determine whether
either or both of the two sets significantly add to the prediction of the criterion variable
of learner satisfaction. Demographic characteristics, including college major, gender, ethnicity,
level of education, number of online classes, and age for the study sample, were collected,
reported, and used to ensure that the sample was statistically representative of the
population. Study results may offer information for instructors to determine which learner
characteristics or instructional process design elements predict online adult learner
satisfaction.
Theoretical Framework
Andragogy, “the art and science of helping adults learn” (Knowles, 1980, p. 43;
see also 1973, 1975, 1984, & 1995), is a foundational educational theory that has many
supporters and will serve as the theoretical framework for this study. The term was
originally coined by Kapp (1833) and philosophically flowed from Plato’s theory
regarding education (Abela, 2009). Knowles (1973, 1975, 1980, 1984, 1995) was the
leading proponent of andragogy as a theory of adult learning in the United States and
developed a number of tenets describing the adult learner. As the theoretical framework
for the proposed study, andragogy focuses attention on certain salient characteristics for
teaching and learning while ignoring others (Young, 2008). Several other authors
(Karge, Phillips, Dodson, & McCabe, 2011) have used the term andragogy to identify
methods for teaching adult learners, and others (Baran, Correia, & Thompson, 2011) have
argued for the positive influence of andragogy in online learning.
Andragogy has had its critics. Blaschke (2012) noted that andragogy was
outmoded because of new technology and teaching methods, and Cercone (2008) and
McGrath (2009) argued that the theory had done almost nothing to provide clarity or
understanding of how learning occurs. According to Newman (2012), transformative
learning has replaced andragogy as the preeminent theory of adult learning. Other
authors (Guilbaud & Jerome-D’Emelia, 2008; McGrath, 2009; Taylor & Kroth, 2009b)
have argued that andragogy is not a theory but rather a framework or a set of
assumptions. Specific criticisms of andragogy include (a) critiques of self-direction,
(b) critiques regarding motivation, (c) lack of reflection, (d) lack of accounting for
learning context, and (e) lack of empirical evidence. Knowles (1984) stated that adults
became more self-directed as they matured and that this self-direction guided their
learning; however, self-direction has been shown not to be unique to adults (Clapper,
2010; Taylor & Kroth, 2009b). Cercone (2008) noted that not all adults were
automatically self-directed and that many may require assistance to become so. In the
United States, growth towards self-direction was found to be inhibited by a lack of desire
on the part of many learners to accept greater responsibility for learning (Dibiase &
Kidwai, 2010). These arguments were similar to critiques regarding motivation. Both
Abela (2009) and McGrath (2009) noted that andragogy lacked adequate explication
regarding motivation, did not include mention of extrinsic motivation, was inconsistent
regarding intrinsic motivation, and omitted an exploration of the role of instructors as an
important cause of motivation in learners.
Although andragogy is highly regarded as a theory, it has also been widely
criticized for its lack of empirical verification, despite having been presented as early as
1968 (Cercone, 2008; Henschke, 2011; Holton et al., 2009; McGrath, 2009; Taylor &
Kroth, 2009b). Additionally, researchers have argued that the acceptance of andragogy
as the primary theory of adult learning is inappropriate (Holton et al., 2009; Taylor &
Kroth, 2009b). Since 1980, researchers have noted that the field of adult learning is
dominated by descriptive and qualitative research studies, particularly with regard to
andragogy (Brookfield, 1986; Holton et al., 2009; Long, Hiemstra, & Associates, 1980;
Rachal, 2002). Merriam, Caffarella, and Baumgartner (2007) stated that determining
whether the theory of andragogy engendered learner achievement and satisfaction in an
empirical setting was overdue. In the proposed study, the principles of andragogy
provide the theoretical lens for examination of the variables. Knowles (1984, 1995) and
Knowles et al. (2005) propounded that the presence of these adult learner characteristics
and instructional process design elements may provide an optimal learning environment
for adults; accordingly, the study objective is to examine these adult learner
characteristics and the instructional process design elements and their relationship to
learner satisfaction in a postsecondary online environment from the theoretical
perspective of Knowles’ (1973, 1975, 1980, 1984, 1995) andragogy.
Thus, the results of this study may further confirm the validity of the
API as a predictive measure of the effect of andragogical practices on learning. As
noted earlier, the field is replete with descriptive and qualitative research studies and
essays regarding the benefits of andragogy (Brookfield, 1986; Holton et al., 2009; Long,
Hiemstra, & Associates, 1980; Rachal, 2002). By empirically confirming the effect
andragogical principles (adult learning characteristics and instructional process design
elements) have on learner satisfaction, more inferential studies may follow to further
strengthen andragogy’s empirical research base (Cercone, 2008; Henschke, 2011; Holton
et al., 2009; McGrath, 2009; Taylor & Kroth, 2009b), and the proposed study may
establish the path whereby the theory of andragogy may be examined.
Research Questions
This quantitative correlation study was conducted to assess relationships between
learner characteristics, instructional process design elements, and learner satisfaction
within a diverse postsecondary online population. Following are the questions that
guided this inquiry:
Q1. Do adult learner characteristics of (a) intrinsic motivation to learn, (b) prior
experience, (c) need to know, (d) readiness to learn, (e) self-directed learning, and (f)
orientation to learn predict learner satisfaction in a Missouri HLC-NCA accredited
postsecondary online environment?
Q2. Do the instructional process design elements (a) preparing the learner, (b)
climate setting, (c) mutual planning, (d) diagnosis of learning needs, (e) setting of
learning objectives, (f) designing the learning experience, (g) learning activities, and (h)
evaluation predict learner satisfaction in a Missouri HLC-NCA accredited postsecondary
online environment?
Hypotheses
H10. The six learner characteristics of (a) intrinsic motivation to learn, (b) prior
experience, (c) need to know, (d) readiness to learn, (e) self-directed learning, and (f)
orientation to learn, collectively, are not predictors of postsecondary online learner
satisfaction.
H1a. The six learner characteristics of (a) intrinsic motivation to learn, (b) prior
experience, (c) need to know, (d) readiness to learn, (e) self-directed learning, and (f)
orientation to learn, collectively, are significant predictors of postsecondary online
learner satisfaction.
H20. The eight instructional process design elements: (a) preparing the learner,
(b) climate setting, (c) mutual planning, (d) diagnosis of learning needs, (e) setting of
learning objectives, (f) designing the learning experience, (g) learning activities, and (h)
evaluation, collectively, are not predictors of postsecondary online learner satisfaction.
H2a. The eight instructional process design elements: (a) preparing the learner,
(b) climate setting, (c) mutual planning, (d) diagnosis of learning needs, (e) setting of
learning objectives, (f) designing the learning experience, (g) learning activities, and (h)
evaluation, collectively, are significant predictors of postsecondary online learner
satisfaction.
Nature of the Study
A quantitative, correlational design was used to investigate relationships between
adult learning characteristics and instructional design elements as predictor variables and
learner satisfaction as the criterion variable. A correlational design is most appropriate
for determining whether relationships exist between study variables, the strength of those
relationships, and the mechanisms by which they relate (Aiken & West, 1991; Miles &
Shevlin, 2001). This study evaluated the predictive value of 14 predictor variables: six
adult learner characteristics and eight instructional process design elements on the
criterion variable of learner satisfaction (Aiken & West, 1991; Miles & Shevlin, 2001).
The API was used to isolate and measure the presence or absence of the adult learner
characteristics of (a) intrinsic motivation to learn, (b) prior experience, (c) need to know,
(d) readiness to learn, (e) self-directed learning and (f) orientation to learn and the
instructional process design elements of (g) preparing the learner, (h) climate setting, (i)
mutual planning, (j) diagnosis of learning needs, (k) setting of learning objectives, (l)
designing the learning experience, (m) learning activities, and (n) evaluation in the online
classroom (Holton et al., 2009).
A stratified random sample (Brown, 1947; Khowaja, Ghufran, & Ahsan, 2011)
was used for choosing participants for the study. The population for this study consists of
online postsecondary students who are over age 24 and who attend a postsecondary
institution accredited by the HLC-NCA with at least one physical facility in Missouri
(Appendix A). According to a G*Power analysis, a minimum sample size of 194 is
required for this study (Faul et al., 2009; see Appendix J). Based on an assumed
completion rate of 5% (Nulty, 2008), a total of 3,900 students in the target population
need to be solicited for participation. Schools were chosen through stratified random
sampling from the list of all qualifying schools that will serve as the sampling frame,
selecting sufficient schools so that the number of potential subjects is three times larger
than is needed for the sample, with the same proportions as total state enrollments: public
state university (3,750), public university (3,510), and private university or college
(4,440).
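To make the sample-size reasoning above concrete, the following is a minimal sketch of an a priori power analysis for a multiple regression with 14 predictors. The actual G*Power inputs are reported in Appendix J and are not reproduced here; the effect size (f-squared = .15), alpha (.05), and target power (.95) in the sketch are assumptions chosen only to illustrate the form of the computation, which converges near the reported minimum of 194 and, at the assumed 5% completion rate, implies roughly 3,880 invitations (rounded to 3,900 above).

```python
# A minimal sketch of an a priori power analysis for multiple regression.
# The parameters below are illustrative assumptions, not the Appendix J inputs.
from scipy.stats import f as f_dist, ncf


def min_sample_size(f2=0.15, predictors=14, alpha=0.05, target_power=0.95):
    """Smallest N for the F test that R^2 differs from zero."""
    n = predictors + 2                               # smallest N with positive error df
    while True:
        df1, df2 = predictors, n - predictors - 1    # numerator and error degrees of freedom
        crit = f_dist.ppf(1 - alpha, df1, df2)       # critical F under the null hypothesis
        power = ncf.sf(crit, df1, df2, f2 * n)       # power under noncentrality f^2 * N
        if power >= target_power:
            return n
        n += 1


required_n = min_sample_size()
print(required_n)                  # approximately 194 under the assumed inputs
print(round(required_n / 0.05))    # invitations needed at an assumed 5% completion rate
```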
Permission has been obtained to gather invited participants from the collaborating
institutions in the study (see Appendix H). The administration at each institution was requested to send
an electronic link to an online survey to all learners who have taken at least one online
course, either successfully or unsuccessfully, and who are over the age of 24. From each
of the participating colleges, 1 in 3 randomly selected qualifying students received an
email inviting them to participate in the study, with up to two follow-ups (Nulty, 2008).
A representative sample of the target population was sought; demographic information
from the participants was used to ensure that the respondents were representative based
on the stratified categories of schools, gender, ethnicity, college major, and age. Each
participant who had taken at least one online course, either successfully or unsuccessfully, and
was over the age of 24 completed an online survey after confirming acceptance of the
informed consent. The survey was a combination of two pre-validated instruments,
Holton et al.’s (2009) 66-item API, which measures six adult learner characteristics, eight
instructional process design elements, and six demographic questions (see Appendix C),
and the 5-item Satisfaction subscale of the LSTQ to determine learner satisfaction
(Gunawardena et al., 2010) with their most recent online course (see Appendix E). De-identified
quantitative data was retrieved for analysis in encrypted form.
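The selection step described in this paragraph can be sketched as follows. This is an illustration only, not the study's actual procedure: it assumes each participating school supplies a de-identified roster as a pandas DataFrame whose columns (school, age, took_online_course, email) are hypothetical placeholders, and it draws roughly one in three qualifying learners within each school so that the school-level stratification is preserved.

```python
# A minimal sketch of the 1-in-3 random selection of qualifying learners per school.
# The roster and its column names are hypothetical placeholders for illustration.
import pandas as pd


def draw_invitees(roster: pd.DataFrame, fraction: float = 1 / 3, seed: int = 42) -> pd.DataFrame:
    """Randomly select a fraction of qualifying learners within each school."""
    qualifying = roster[(roster["age"] > 24) & roster["took_online_course"]]
    # Sampling within each school group keeps the stratification by institution intact.
    return (qualifying
            .groupby("school", group_keys=False)
            .sample(frac=fraction, random_state=seed))


# Toy roster illustrating the call.
roster = pd.DataFrame({
    "school": ["A"] * 6 + ["B"] * 6,
    "age": [22, 30, 41, 27, 35, 29, 26, 51, 33, 25, 38, 44],
    "took_online_course": [True] * 12,
    "email": [f"student{i}@example.edu" for i in range(12)],
})
print(draw_invitees(roster)[["school", "email"]])
```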
Data was analyzed using hierarchical regression analysis (Aiken & West, 1991;
Miles & Shevlin, 2001) to assess the relationships, if any, between the predictor variables
and the criterion variable (Hair, Black, Babin, & Anderson, 2009). Hierarchical
regression analysis assesses any variance explained in online learner satisfaction by the
adult learner characteristics and instructional process design elements and determines
whether either set is a significant predictor of the criterion variable (Cohen, Cohen, West,
& Aiken, 2003).
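The analysis described above can be sketched in code as follows. The study itself used IBM SPSS; this Python sketch only illustrates the same two-block hierarchical regression logic. The DataFrame and its column names are hypothetical placeholders for the composite scale scores, and the order of entry shown (learner characteristics first, then design elements) is an assumption made for illustration.

```python
# A minimal sketch of the two-block hierarchical regression (illustrative only;
# the study used SPSS, and the column names below are hypothetical placeholders).
import pandas as pd
import statsmodels.api as sm

LEARNER_CHARACTERISTICS = ["motivation", "experience", "need_to_know",
                           "readiness", "self_direction", "orientation"]
DESIGN_ELEMENTS = ["preparing", "climate", "planning", "diagnosis",
                   "objectives", "designing", "activities", "evaluation"]


def fit_block(data: pd.DataFrame, predictors):
    """Ordinary least squares of satisfaction on the given set of predictors."""
    X = sm.add_constant(data[predictors])
    return sm.OLS(data["satisfaction"], X).fit()


def hierarchical_regression(data: pd.DataFrame):
    step1 = fit_block(data, LEARNER_CHARACTERISTICS)                     # Block 1
    step2 = fit_block(data, LEARNER_CHARACTERISTICS + DESIGN_ELEMENTS)   # Blocks 1 and 2
    r2_change = step2.rsquared - step1.rsquared                # variance added by the design elements
    f_change, p_change, df_diff = step2.compare_f_test(step1)  # F test of the R-squared change
    # step1.fvalue and step1.f_pvalue test whether Block 1 predicts satisfaction at all.
    return step1, step2, r2_change, (f_change, p_change, df_diff)
```

In this sketch, the overall F test for the first block speaks to the first hypothesis, while the F test of the R-squared change indicates whether the design elements add predictive value beyond the learner characteristics.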
Significance of the Study
Because of the vast differences in dropout rates for online courses as compared to
traditional courses, past researchers have noted the importance of identifying factors that
may minimize this phenomenon (Brown, 2012; Lee & Choi, 2011; Wilson & Allen,
2011). Lee and Choi (2011) indicated in their meta-analysis of articles on dropout factors
that although learner factors accounted both quantitatively and qualitatively for
contributions to dropout, few studies have sought to identify strategies regarding learner
factors. For this reason, Lee and Choi concluded “there is a need to learn more about
these dropout factors” (p. 616). Six of the variables that this study focuses on are
associated with learner factors and may assist future researchers to create and test
strategies for addressing learner issues related to dropout. Wilson and Allen (2011)
suggested that there is a need to see how interactions and processes within an online class
contribute to learner success. The study measured eight process design elements and
their relationship to one element of learner success, satisfaction. By determining what
works versus what does not, this study contributes to the literature in line with Wilson
and Allen’s suggestion.
One of the most influential determinants for reducing online dropout is learner
satisfaction (Chen & Lien, 2011; Kozub, 2010; Martinez-Caro, 2009), and the link
between retention and learner satisfaction has been established (Chen & Lien, 2011).
The proposed study will seek more specific evidence for learner and process factors that
contribute to and engender learner satisfaction. A better understanding and confirmation
of these learner and process design elements on learner satisfaction may be useful to
reduce or curtail online course dropout rates (Ali & Ahmad, 2011; Alshare et al., 2011;
Boling et al., 2011; Morrow & Ackermann, 2012; Omar et al., 2011; Travis &
Rutherford, 2012). Morrow and Ackermann (2012) acknowledged that much research
remains to be conducted before “the best predictors of retention as well as what
interventions and modifications” (p. 489) are known. The study sought to identify the
predictive value of learner characteristics and specific instructional processes from which
strategies and interventions may be derived (Donavant, 2009; Gunawardena et al., 2010;
Holton et al., 2009; Huang et al., 2012; Taylor & Kroth, 2009b).
Definition of Key Terms
Climate setting. For the purposes of this study, climate setting refers to one of
the eight andragogical instructional process design elements. Climate setting includes (a)
the physical setting (e.g., “temperature, ventilation, easy access to refreshments and
restrooms, comfortable chairs, adequate light, good acoustics;” Knowles, 1995, p. 118);
(b) access to a rich supply of both human and material resources; and (c) a psychological
setting that is “relaxed, trusting, mutually respectful, informal, warm, collaborative,
supportive,” open, and authentic (Knowles et al., 2005, p. 116).
Designing the learning experience. Designing the learning experience refers to
one of the eight instructional process design elements. Designing the learning experience
consists of (a) focusing on areas of challenge identified by learners through self-diagnostic techniques, (b) selecting the most appropriate format for learning, (c)
employing appropriate experiential learning methods and materials, and (d) sequencing
these methods and materials based on the learners’ needs (Knowles et al., 2005).
Diagnosis of learning needs. Diagnosis of learning needs refers to an
instructional process design element. Diagnosing learning needs consists of collaborative
work between the learner and the instructor to create an accurate gap analysis regarding
what is known versus what needs to be known from the learning opportunity. This
assessment of needs can be simple, elaborate, or somewhere in between (Knowles, 1995;
Knowles et al., 2005).
Dropout. A dropout is a postsecondary learner who fails to complete a course
either by earning an incomplete or an “F” on the transcript or by withdrawing voluntarily
from a course after the school drop period, thereby incurring a financial penalty (Lee &
Choi, 2011).
Evaluation. Evaluation refers to an instructional process design element.
Optimally, evaluation occurs in four steps: (a) an ongoing collection of data as learning
occurs, (b) structured pre- and posttests to ascertain learning gains, (c) assessment of
behavior changes consonant with learning, and (d) identifying the result of the new
behavior or learning on the organization (Knowles et al., 2005).
Intrinsic motivation to learn. For the purposes of this study, intrinsic
motivation to learn is an adult learner characteristic. Intrinsic motivation to learn refers
to motivation to learn for its own sake, rather than for the sake of external drives,
rewards, or punishment avoidance (Abela, 2009; Blaschke, 2012; Chan, 2010; Clapper,
2010; Harper & Ross, 2011; Karge, Phillips, Dodson, & McCabe, 2011; Minter, 2011;
Wang & Kania-Gosche, 2011).
Learning activities. Learning activities, an instructional process design element,
are collaborative and based on experiential techniques where the teacher or facilitator
helps learners or students to organize themselves to facilitate mutual inquiry and share
responsibility (Knowles et al., 2005).
Mutual planning. Mutual planning is a process element in an andragogical
classroom whereby the instructor and the learner identify and agree to the learning focus
of the course (Knowles, 1995; Knowles et al., 2005). This process element is based on
engaging the learner and encouraging participation not only for the topic, but also in the
process of learning itself (Knowles, 1995; Knowles et al., 2005).
Need to know. Need to know is an adult learner characteristic and refers to the
evolution from subject-centered learning to problem-centered learning as people mature
(Keengwe & Georgina, 2011; McGrath, 2009) that is life-focused and task-oriented
(Chan, 2010; Kenner & Weinerman, 2011; Moore, 2010), and suggests that the demands
of life and family drive adults to seek learning that is relevant to their home and working
lives (Cheng, Wang, Yang, Kinshuk, & Peng, 2011; Karge et al., 2011; Taylor & Kroth,
2009b).
Online learning. Online learning consists of higher education courses that
typically have no face-to-face meetings between faculty and learners, and where at least
80% of the content is delivered online (Allen & Seaman, 2011).
Online learner satisfaction. Online learner satisfaction is a learner’s perception
of how well eLearning was received, accepted, and esteemed in an online educational
setting (Bolliger & Halupa, 2012; Gunawardena et al., 2010). Satisfaction is a complex
construct that researchers have shown leads to increases in motivation, engagement,
performance, learning, and success (Bolliger & Halupa, 2012; Gunawardena et al.,
2010; Kozub, 2010; Martinez-Caro, 2009; McGlone, 2011).
Orientation to learn. For the purposes of this study, orientation to learn is an
adult learner characteristic. Orientation to learn suggests “more effective learning will
occur when the adult learner can transfer the new knowledge to a real life problem”
(Wilson, 2005, p. 32). Adult learners are more problem-centered than subject-centered in their approach to learning, and are oriented to learn about topics that will
complement their daily lives, rather than seeking knowledge just for the sake of
knowledge (Knowles, 1995; Knowles et al., 2005).
Prepare the learner. Prepare the learner is an instructional process design
element. Knowles (1995) identified that the modern adult needs to learn to become
self-directed, and preparation consists of (a) training regarding proactive versus
reactive learning, (b) an introduction to the available resources for a course, whether
those resources are people or materials, and (c) practice in utilizing the proactive skills learned
(Knowles et al., 2005).
Prior Experience. Prior experience is an adult learner characteristic. Experience
allows three things in mature learners: it can be used as a resource in the learning process
(Green & Ballard, 2011), allows integration of new learning with past experience and
events (Cercone, 2008; Marques, 2012), and may be used to validate and build the self-concept of the learner (Fidishun, 2011; Harper & Ross, 2011).
Readiness to learn. For the purposes of this study, readiness to learn is an adult
learner characteristic. Adults want to learn (Cercone, 2008) and are ready to learn (Clapper,
2010; Kenner & Weinerman, 2011; Marques, 2012), but they want to have a reason for
learning something (Blaschke, 2012; Harper & Ross, 2011; Strang, 2009) and need to
know how it will benefit them (Cercone, 2008; McGrath, 2009; Moore, 2010).
Self-directed learning. Self-direction is an adult learner characteristic meaning
that adults are independent (Kenner & Weinerman, 2011), responsible (Blaschke,
2012; Harper & Ross, 2011; Keengwe & Georgina, 2011; McGlone, 2011; Minter, 2011),
autonomous (Cercone, 2008; Chan, 2010), and expect to have some say in what they will
learn, and oppose learning that is foisted upon them (McGrath, 2009; Moore, 2010;
Taylor & Kroth, 2009b).
Setting of learning objectives. Setting of learning objectives is an instructional
process design element and constitutes the process of mutually formulating activities and
learning based on the needs of the learner, the facilitator, the institution, and society
(Knowles et al., 2005).
Traditional learning. Traditional learning comprises higher education courses
where content is delivered either orally or through writing in a physical setting, and
where little to no online technology is utilized (Allen & Seaman, 2011).
Summary
Electronic learning has numerous advantages (Abrami et al., 2010; Al-Asfour,
2012; Alshare et al., 2011; Bhuasiri et al., 2011; Biasutti, 2011; Bolliger & Halupa,
2012; Cabrera-Lozoya et al., 2012; Huang et al., 2012; Ismail et al., 2010; Pastore, 2012)
and is becoming more and more popular in higher education (Allen & Seaman, 2011;
Dykman & Davis, 2008a; Falloon, 2011; Hsieh & Cho, 2011; Lykourentzou,
Giannoukos, Mpardis, Nikolopoulos, & Loumos, 2009; Wilson & Allen, 2011). Dropout
from eLearning programs, however, is significantly higher than from more traditional
programs (Brown, 2012; Cacciamani et al., 2012; Ekmekci, 2013; Gunawardena et al.,
2010; Henning, 2012; Lee & Choi, 2011; Nandeshwar, Menzies, & Nelson, 2011; Travis
& Rutherford, 2012). A primary determinant of dropout from eLearning programs is
learner satisfaction (Chen & Lien, 2011; Hrastinski & Jaldemark, 2012; Kozub, 2010;
Martinez-Caro, 2009; Revere & Kovach, 2011), and a number of studies have identified
some of the factors that contribute to this satisfaction (Amrein-Beardsley & Haladyna,
2012; Bolliger & Halupa, 2012; Driscoll, Jicha, Hunt, Tichavsky, & Thompson, 2012;
Gonzalez-Gomez, Guardiola, Rodriguez, & Alonso, 2012; Ke & Kwak, 2013). In a study
to validate a theory-based survey of teaching effectiveness, Amrein-Beardsley and
Haladyna (2012) found that a survey based on Chickering and Gamson’s (1987) seven
principles of teaching effectiveness could both be shorter and more specific in its
instructor assessments than traditional surveys based on pedagogy that provided more
summative than formative measures. The problem addressed in this study is the low
satisfaction among adults in online postsecondary courses (Donavant, 2009; Huang et al.,
2012; Watkins, 2005); determining factors that engender learner satisfaction with
online courses (Lee & Choi, 2011; Levy, 2007; Willging & Johnson, 2009) may
reduce dropout and benefit higher education (Beqiri et al., 2010; Deil-Amen, 2011; Lee
& Choi, 2011). The purpose of this quantitative correlation study is to investigate
relationships between adult learner characteristics, instructional process design elements,
and learner satisfaction among adult learners in online programs at postsecondary institutions with
at least one physical facility in Missouri. This study evaluated the predictive value of 14
predictor variables (six adult learner characteristics and eight instructional process design
elements) on the criterion variable of learner satisfaction. The API was used to isolate and
measure the presence or absence of the adult learner characteristics of (a) intrinsic
motivation to learn, (b) prior experience, (c) need to know, (d) readiness to learn, (e) self-directed
learning, and (f) orientation to learn and the instructional process design elements
of (g) preparing the learner, (h) climate setting, (i) mutual planning, (j) diagnosis of
learning needs, (k) setting of learning objectives, (l) designing the learning experience,
(m) learning activities, and (n) evaluation in the online classroom. A stratified random
sample was selected from the population of online postsecondary students who are over
the age of 24 and attend an HLC-NCA accredited institution of higher learning with at
least one physical facility in the state of Missouri (Appendix A) and consisted of a
minimum of 194 learners per a G*Power analysis (see Appendix J). Provosts from
selected schools were enlisted to endorse and send e-mails with an electronic link to an
online survey that combines the 66-item API (Holton et al., 2009; see Appendix C & D),
which measures six adult learner characteristics, eight instructional process design
elements, and demographic details, and the 5-item Satisfaction subscale of the LSTQ for
measuring learner satisfaction (Gunawardena et al., 2010; see Appendix E & F) with their
most recent online course. Hierarchical regression analysis (Aiken & West, 1991; Miles
& Shevlin, 2001) was used to assess the relationships, if any, between the predictor
variables and the criterion variable (Cohen et al., 2003; Hair et al., 2009).
Chapter 2: Literature Review
The purpose of this quantitative correlation study is to investigate relationships
between adult learner characteristics, instructional process design elements, and learner
satisfaction among adult learners in online programs at postsecondary institutions with at least one
physical facility in Missouri. The specific problem addressed is the low satisfaction
among adults in online postsecondary courses (Donavant, 2009; Huang et al., 2012; Watkins,
2005) since learner satisfaction has been considered the largest determinant in reducing
online dropout (Chen & Lien, 2011; Kozub, 2010; Martinez-Caro, 2009). Past
researchers have affirmed that when specific learner characteristics and instructional
process design elements are present learner satisfaction is increased (Cox, 2013; Gilbert
et al., 2013; Knowles et al., 2005), which may reduce the incidence of dropout (Beqiri et
al., 2010; Lee & Choi, 2011). Therefore, online dropouts may be mitigated (Ali &
Ahmad, 2011; Alshare et al., 2011; Boling et al., 2011; Chen & Lien, 2011; Morrow &
Ackermann, 2012; Omar et al., 2011; Travis & Rutherford, 2012) by establishing which
learner characteristics and instructional process design elements affect learner satisfaction
(Donavant, 2009; Gunawardena et al., 2010; Holton et al., 2009; Huang et al., 2012;
Taylor & Kroth, 2009b).
The foundational problem for this study is the higher dropout rate experienced by
providers of eLearning programs. The remaining sections of the literature review will
introduce a historical overview of studies on eLearning prior to 2009 and focus on the
prevalence of dropout in eLearning programs, with associated factors and theories that
appear to contribute to this problem: learner, course or program, and environmental
factors. Finally, a review of what the literature says about learner satisfaction and its
mitigating effect on dropout, along with the factors that appear to produce and stimulate
learner satisfaction in online courses and programs will be conducted.
Documentation
The search for pertinent literature for this research was accomplished in two
stages. Initially, searches were conducted through Northcentral University’s Roadrunner
Search, utilizing the keywords of e-learning, online learning, computer-assisted learning,
web-based learning, and distributed learning, along with andragogy, adult learning,
satisfaction, and dropout within the interval 2009 through 2014. Literature reviewed
from these searches was obtained from the EBSCOhost, ERIC, ProQuest, Sage Journals,
and SpringerLink databases. Articles were carefully reviewed and selected as appropriate to
the subject of adult online learning and learner satisfaction; in the second stage, these articles were
further searched for apposite perspectives.
Historical Overview
One of the original definitions of eLearning was detailed by the Higher Education
Funding Council for England (2005), which stated that eLearning uses “technologies in
learning opportunities, encompassing flexible learning as well as distance learning; and
the use of information and communication technology as a communications and delivery
tool, between individuals and groups, to support students and improve the management of
learning” (as cited in Andrews & Haythornthwaite, 2007, p. 2). In higher education,
eLearning became a predominant form of postsecondary education (Sandlin, 2005).
Adults over 25 are the fastest growing learner demographic (Bye, Pushkar, & Conway,
2007; Wilson, 2005) because of the advantages of eLearning. In 2008, 25% of higher
education students took at least one online course (Shea & Bidjerano, 2010), with 4
million online higher education students in the U.S. (Lee & Choi, 2011). In 2010, the
percentage of online learners in Australia was 43% of all learners (O’Toole & Essex,
2012), and six million higher education students (31%) in the U.S. took at least
one online course (Travis & Rutherford, 2013). Online learning more effectively meets
the needs of many postsecondary students than traditional settings (Fahy, 2008; Gibbons
& Wentworth, 2001; Lam & Bordia, 2008).
Prior to about 2005, online learning research subscribed to the no significant difference
paradigm: that the outcomes from online learning (motivation, achievement,
satisfaction, grades) were functionally equivalent to those of traditional courses (Shea &
Bidjerano, 2010). However, since that time, multiple studies have indicated that learners
fared better online than in a traditional classroom (Lo et al., 2011; London & Hall, 2011;
Means, Toyama, Murphy, Bakia, & Jones, 2009; Revere & Kovach, 2011; Wilson &
Allen, 2011; Zhao, Lei, Yan, Tan, & Lai, 2005). These improvements in learning
outcomes from eLearning will be detailed in the sections that follow. Later studies also
indicated that the difference between earlier studies of no significance and contemporary
studies of marked difference is that technology has advanced to provide affordances for
enhanced interaction and to support metacognitive learning strategies (Ally, 2008; Mehta,
Clayton, & Sankar, 2007; Shea & Bidjerano, 2010). The affordances of the technical
advances indicated by these studies will be discussed in the next section, while their
benefits will be meticulously investigated in the section after that. Some of these
affordances include the ability to learn at a distance, which is helpful for learners who
might otherwise not be able to attend class because of disability (Kaliski et al., 2012;
Moisey & Hughes, 2008), live in remote areas (Donavant, 2009; Michinov, Brunot, Le
Bohec, Juhel, & Delaval, 2011; Travis & Rutherford, 2013), are too far away to travel to
class physically (Bhuasiri et al., 2012; Hsu & Shiue, 2005; Park & Choi, 2009; Wang,
Vogel, & Ran, 2011), are homebound, or who, due to scheduling conflicts, cannot be
in two places at once (Connors, 2005; Gibbons & Wentworth, 2001; Martinez-Caro,
2011; Tallent-Runnels et al., 2006). Further affordances will be discussed in greater
detail in the two sections following.
Many learners have reported the lack of face-to-face interactions with instructors
to be the most unattractive feature of eLearning (Diaz & Entonado, 2009; Donavant,
2009; Yang & Cornelious, 2005). Yang and Cornelious (2005) recommended that
instructors needed to augment interaction and develop a sense of community to ensure
quality despite the lack of face-to-face interaction with an instructor. Diaz and Entonado
(2009) noted that the function of the instructor is the same in both traditional and online
formats, but the amount of planning that goes into activities and interactions in eLearning
is greater because of the lack of visual contact. In both contexts, elevated interactions
between instructor and learner have been shown to be the driving force (Abrami et al.,
2010; Ali & Ahmad, 2011; Bradley, 2009) and most important factor (Chickering &
Gamson, 1987; Lam & Bordia, 2008) impelling learner motivation, while more
controlling and directed environments decrease motivation and performance (Rovai et al.,
2007). Because of the physical separation online, if there is little to no social interaction
learners feel isolated (Pigliapoco & Bogliolo, 2008). Muilenburg and Berge (2005)
found that this lack of interaction is “the single most important barrier to students
learning online” (p. 35). Pigliapoco and Bogliolo (2008) determined, however, that
eLearning does not have to be physically isolating, and that isolation in both traditional
and online settings is affected more by perceived quality of the course and personal
motivations.
Levy (2007), in a study of dropout in eLearning courses, noted “substantial
differences” (p. 186) between the number of learners who drop out of traditional courses
and those who drop out of online courses. A number of persistence models of retention
are specific to online learning. Kember (1989a, 1989b) developed a model based on
Tinto’s (1975, 1993) traditional persistence model, and focused on learner demographics,
learner motivation, academic proficiency, and social determinants. Both Tinto and
Kember posited that academic and social integration triggered either persistence or
dropout (Lint, 2013). Kember identified that online learners may also drop out because of external
attributes or academic incompatibility (Lint, 2013; Stavredes & Herder, 2014). Bean and
Metzner (1985, 1987) also developed a persistence model, and proposed that
nontraditional learners are usually older and less likely to be influenced by the social
environment. Their model was heavily based on academic performance, demographics
and goals, and environmental factors, with less of an emphasis on social integration than
previous models (Alley, 2011; Bean & Metzner, 1985, 1987). In addition, Rovai (2003)
proposed a third persistence model for eLearning that consisted of learner characteristics
and learner skills as prior-to-admission constructs, and internal and external factors as
after-admission constructs. Rovai’s model was further extended by Packham, Jones,
Miller, and Thomas (2004) when they tested the earlier model. This final model consists
of the need for course quality, course flexibility, course design, and quality content as
essential internal factors (Ekstrand, 2013).
Online Technological Advances
The context of this study is online learning, so the technological advances
contributing to this context are essential to understand. This leads to a need to
understand the benefits that derive from eLearning in order to gain an even greater
perspective of the milieu for this study. Studies from the eLearning literature have
sought to identify factors that engender eLearning success; some of these factors
constitute the variables that will be explored.
This study will be conducted online and will assess the processes of online learning,
or eLearning. When the American Society of Training and Development planned their
first Internet training course in 1996, they called the training eLearning, and thus a whole
new educational training medium was born (Chen & Lien, 2011). Christensen (2013)
identified a phenomenon whereby markets are transformed. Most companies produce
products for the high end of their market, due to higher profit margins (Christensen,
2013). This creates an entry point for products and services at the lower end of a market
(Christensen, 2013). Because these products or services have lower profit margins and
tend to serve niche markets, they often fail or remain small (Christensen, 2013). Periodically, a
disruptive innovation comes along that succeeds, completely transforming the market;
replacing the previous product or service (Christensen, 2013). With eLearning’s
exponential growth and increasing support, some researchers see it as a disruptive
innovation to the educational field (Christensen, 2013; Christensen, Johnson, & Horn,
2008; Kim & Frick, 2011). In this section, various definitions of eLearning will be
explored, and the variety of names by which it is known will be identified. This section will
conclude with information regarding the growth in the use of eLearning in higher
education and how eLearning’s prospects are viewed by instructors, administrators, and
researchers.
In the literature, eLearning is known by many different terms and definitions.
Variously called eLearning (London & Hall, 2011; Malik & Khurshed, 2011), online
learning (Gibbons & Wentworth, 2001; Kaliski, Booker, & Shumann, 2012), distributed
learning (Ferguson & DeFelice, 2010; Gunawardena et al., 2010), web-based learning
(Kupczynski et al., 2011; Lee, 2010), distance learning (Bolliger & Halupa, 2012; Cheng
et al., 2011), network learning (Lo, Ramayah, & Hong, 2011), technology-based learning
(Fidishun, 2011), computer-mediated learning (Holbert & Karady, 2009; Hrastinski &
Jaldemark, 2012), technology-mediated learning (Ali & Ahmad, 2011; Gupta & Bostrom,
2009; Kear, Chetwynd, Williams, & Donelan, 2012), distance education (Er, Ozden, &
Arifoglu, 2009; Kim & Frick, 2011), “learning via structured isolation” (Ferratt & Hall,
2009, p. 426), cyber education (Joo et al., 2010), online collaborative learning (Biasutti,
2011; Hoic-Bozic et al., 2009), and virtual learning (Belair, 2012; Deulen, 2013),
eLearning has proven difficult to name and define. Hoic-Bozic et al.’s (2009)
representation of eLearning was “the use of new multimedia technologies and the Internet
to improve the quality of learning by facilitating access to resources and services, as well
as remote exchanges and collaboration” (p. 19). The common features of these two
definitions are the use of technology, improving learning, and the use of technology for
students to interact and collaborate.
Ho and Kuo’s (2010) definition simplified the previous two, but contained the
same elements: “e-learning is the use of technological tools, primarily those that can be
made available over networks such as the Internet, for education” (p. 23). Chen and Lien
(2011) emphasized that eLearning is about interaction between groups and between
individual learners. Ismail, Idrus, Baharum, Rosli, and Ziden (2011) agreed with the use
of technology over networks but emphasized the purpose of eLearning is “increasing the
knowledge, skills, and productive capabilities of the learners” (p. 49) through a
collaborative process of people learning from each other. Bradford and Wyatt (2010)
considered eLearning as “a delivery system of teaching and learning, when the teacher
and student experience separation by physical distance and time, using alternative media
resources” (p. 108). This latest definition is very similar to Oncu and Cakir’s (2011) and
Behar’s (2011) emphasis on the separation of instructor and learner where technology
mediates the communication. Wang et al.’s (2010) definition focused on the learner and
the use of technology “to deliver information and instructions to individuals” (p. 167).
Because these definitions emphasize the technology but show little agreement on how the education or learning is to be accomplished, some have lamented that the
eLearning literature is more technology-based than theory-based (Malik & Khurshed,
2011).
eLearning has become more and more prevalent in developed countries. In the
United States (U.S.), for example, research showed that only 9% of the population is not
connected in some way (Hoskins, 2012; Zickhur, 2011). Even in developing countries
access to mobile technology is increasing rapidly (Bhuasiri et al., 2012; Fahy, 2008).
eLearning is widely accepted (Lykourentzou, Giannoukos, Mpardis et al., 2009), crucial
(Travis & Rutherford, 2013), the fastest growing platform (O’Toole & Essex, 2012), and
has continued to outpace traditional education delivery (Kupczynski et al., 2011).
Because of the success and the increasing number of online learners, Bell (2011) declared
that old learning theories do not apply to the new medium and that there is a need for
deep-seated modifications in instructional methodology (Hoic-Bozic et al., 2009). Others
question the effectiveness of the medium (Joo et al., 2011) and emphasize that course
design and pedagogy (technological pedagogical knowledge) trump technology
(Bradley, 2009; Fahy, 2008; Walther, Gay, & Hancock, 2005) and that the e portion of
eLearning is only a tool for the conveyance of learning (Andrews & Haythornthwaite,
2007; Clapper, 2010), even if that tool “can make the learning process easier and enhance
its outcome” (Hsieh & Cho, 2011, p. 2025).
Because of the introduction of technology into the classroom, and the affordances
of that technology, it is possible for instructors to develop teaching strategies to cater to
the individual differences of learners. In a quantitative survey study (N = 1,811), Jeffrey
(2009) used confirmatory factor analysis to determine ten learning orientations of online
learners. Five of the learning orientations (mastery effort, time-poorness, assessment
focus, competitiveness, and listening) were stand-alone measures of learner differences,
and five (concrete–abstract reasoning, working alone–working collaboratively, textual–
visual, extrinsic–intrinsic motivation, and dependent–independent learner) were
continuum based. A second order factor analysis was conducted on the ten learning
orientations, and three learning pathways were derived: cognitive voyaging (comprising
relativistic reasoning and independent learning), industrious pragmatism (comprising
mastery effort, competitiveness, and assessment focus), and multimedia collaboration
(comprising visual learning, listening, collaborative, extrinsic motivation, and time-poorness). Jeffrey found differences between generational groups and ethnicities in both
the first order and second order factors, supporting her premise that learners have
different approaches to learning that may be identified.
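To make the two-level analysis concrete, the sketch below illustrates the general workflow of extracting first-order factors and then factoring the resulting factor scores. It is only a minimal illustration on synthetic data: the item names, loading structure, and the use of exploratory rather than confirmatory factor analysis are assumptions for the example, not Jeffrey's (2009) instrument or procedure.

```python
# A minimal sketch of a two-level (first-order, then second-order) factor
# analysis workflow on synthetic survey data. Item counts, loadings, and
# variable names are illustrative assumptions only.
import numpy as np
import pandas as pd
from factor_analyzer import FactorAnalyzer

rng = np.random.default_rng(42)
n_respondents, n_orientations, items_per_orientation = 1811, 10, 3

# Simulate survey items with a known ten-factor structure plus noise.
latent = rng.normal(size=(n_respondents, n_orientations))
loadings = np.repeat(np.eye(n_orientations), items_per_orientation, axis=0)
items = latent @ loadings.T + 0.5 * rng.normal(
    size=(n_respondents, n_orientations * items_per_orientation))
items = pd.DataFrame(items, columns=[f"item_{i+1}" for i in range(items.shape[1])])

# First-order analysis: extract ten learning-orientation factors.
first_order = FactorAnalyzer(n_factors=10, rotation="oblimin")
first_order.fit(items)
orientation_scores = pd.DataFrame(first_order.transform(items))

# Second-order analysis: factor the orientation scores into three pathways.
second_order = FactorAnalyzer(n_factors=3, rotation="oblimin")
second_order.fit(orientation_scores)
print(second_order.loadings_.round(2))  # how orientations load on pathways
```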
In the literature, eLearning has gone by many names and has been defined in a
number of differing ways. Different names have been proposed because each researcher
attempted to clarify what it means for a learner to learn outside the physical presence of the instructor. Most names subsumed either learning or education, because that is the intent of the process. Learning or education is generally modified by indicating the method by which the non-proximal learning is to take place: electronically (e or virtual) or through technology (using a computer or the web); by a descriptor of the lack of proximity to the instructor (distance or distributed); or by the position in which the learner is placed by the technology (isolation). Each name conveys something essential about
eLearning but in turn leaves out much. The definitions of eLearning are similar in their
focus on technology and of its use by individuals or groups to convey information and
resources, to facilitate interaction and collaboration, and ultimately to improve learning.
Online learning has experienced remarkable growth over the past decade, especially in higher education, with the spread of mobile technology and increased bandwidth in most countries. This growth is ascribed to the usefulness of these technologies
and the affordances these technological tools grant to education. These benefits and
affordances are enumerated and explored in the next section.
Purported Benefits of eLearning
Almost every research study into eLearning identifies one or more benefits that
accrue to institutions and to learners. Variously seen as a training medium (Baran et al.,
2011; Cheng et al., 2011; Hoic-Bozic et al., 2009), an instructional strategy (Cheng et al.,
2011; Kirschner, Sweller, & Clark, 2006), and a learning environment (Ekmekci, 2013;
Ke & Kwak, 2013; Lee et al., 2011), eLearning offers various useful properties to the institutions that implement it (Dykman & Davis, 2008a; Kiliç-Çakmak, 2010),
and to the learners who use it as a way of connecting to learning (Connors, 2005;
Haythornthwaite et al., 2007). Bhuasiri et al. (2012) conducted a Delphi study with 82
eLearning experts to determine the dimensions and factors that most contribute to the
success of eLearning. After two rounds, the authors concluded that there were six
dimensions to online success, comprising 20 different factors. These dimensions and
factors will be explored in further detail in the appropriate sections and subsections
below. In this section, the benefits to institutions of higher education will be enumerated
and explored, and the corresponding affordances of the technology will be discussed and
placed in context. Then a discussion of the usefulness and benefits of the various
technologies for learners will be identified and described as well as an in-depth
discussion of the affordances attributed to eLearning and its associated technology.
Institutional benefits and affordances. The benefits and affordances of
eLearning to institutions of higher learning distill to three categories. eLearning
promotes an increase in the number of students without many of the associated costs (Boling et al., 2011; Cheng et al., 2011; Desai et al., 2008; Haythornthwaite et al., 2007; Lee,
Redmond, & Dolan, 2008). For example, Cheng et al. (2011) identified technology
innovation as a means for institutions of higher learning to improve their competitive
advantage. Desai et al. (2008) attributed the creation of mega-universities with student populations of 100,000 plus to eLearning. Haythornthwaite et al. (2007) acknowledged
that online learning does not necessitate the building of additional facilities or parking
lots. eLearning creates an environment where the quality of instruction may be
monitored and enhanced (Abrami et al., 2010; Dykman & Davis, 2008a; Lo et al., 2011).
In this section, these categories will be reviewed and supported from the literature.
The growth in the number of postsecondary learners is likely due to the
realization that “postsecondary education has become the threshold requirement for a
middle-class family income” (Carnevale, Smith, & Strohl, 2010, p. 13). For colleges and
universities, eLearning provides a way to meet this growing demand, while rewarding
those postsecondary institutions as well. For institutions of higher education eLearning
affords the opportunity to expand the student body and include large numbers of students
and alleviate crowding on campus (Ferguson & DeFelice, 2010; Kuleshov, 2008;
Lykourentzou, Giannoukos, Mpardis et al., 2009), without the associated building costs
of providing classrooms and parking spaces (Bhuasiri et al., 2012; Brown, 2012). Some
institutions have taken this affordance and opened to a global market (Boling et al., 2011;
Desai et al., 2008; Hsieh & Cho, 2011; Jackson, Jones, & Rodriguez, 2010), with some
institutions reaching hundreds of thousands of students because of a diminishing or
absent need for physical presence (Ali & Ahmad, 2011; Kiliç-Çakmak, 2010;
Lykourentzou, Giannoukos, Nikolopoulos, Mpardis, & Loumos, 2009; Nikolaki &
Koutsouba, 2012).
For many institutions, eLearning has been recognized as a way to reduce
expenditures by decreasing marginal costs (Biasutti, 2011; Ekmekci, 2012; Jackson
et al., 2010; Major, 2010) and as a lower cost alternative to on-campus traditional
learning (Bhuasiri et al., 2012; Bradford & Wyatt, 2010; Caine, 2010; Crawford-Ferre &
Wiest, 2012; Driscoll et al., 2012). Others have determined that eLearning increases
institution profits (Gupta & Bostrom, 2009; Racović-Marković, 2010; Ross-Gordon,
2011) with very promising financial forecasts (Chaves, 2009; Gibbons & Wentworth,
2001). In addition to the fiscal rewards of eLearning to the institutions, many have
determined that it also provides opportunities to meet the diverse needs of contemporary
students and alternative ways to meet degree requirements without requiring students to
attend full time (Er et al., 2009; Fidishun, 2011; Jackson et al., 2010; Hsu & Shiue,
2005). As discussed in a previous section, the majority of online students are adults over
the age of 24 and for many years have been labeled nontraditional (Bye et al., 2007;
Cercone, 2008; Ke & Xie, 2009; Wilson 2005). In recent years traditional, full-time,
single, nonworking students who have entered postsecondary education directly from
secondary education now constitute about 27% of the student population (Ross-Gordon,
2011); so, traditional has become nontraditional (Al-Asfour, 2012; Bolliger & Halupa,
2012; Kaliski et al., 2012; Sulčič & Lesjak, 2009).
Online learning provides institutions with classroom benefits, including the ability
to reuse (Bhuasiri et al., 2012; Chen & Lien, 2011; Fahy, 2008; Ferratt & Hall, 2009) and
better organize course materials (Cercone, 2008; London & Hall, 2011), better quality
oversight of course delivery (Callens, 2011; London & Hall, 2011), nonconventional use
of teaching staff (Bhuasiri et al., 2012; Pigliapoco & Bogliolo, 2008), and the
coordination of learning through the use of learning management systems (Bhuasiri et al.,
2012; Paechter, Maier, & Macher, 2010). Christie and Jurado (2009) cautioned,
however, that mere implementation of a learning management system does not equate to
success. They found in their research on learning platform implementations that a great
deal of preparation for students and instructors is essential to get the most out of the systems and to encourage use. Online learning allows instructors to access their course
room from anywhere, meaning that they do not have to live in proximity to the physical
campus, and could be hired part-time or as an adjunct to teach specific courses or
material (Bolliger & Halupa, 2012; Kuleshov, 2008). It is now possible for supervisors
to review the quality of the communication, assignments, and interactions of the
instructor with his or her students and the quality of the students’ work across classes
(Bhuasiri et al., 2012; Racović-Marković, 2010). This ability to review and compare the
quality of instructors’ work can be motivating to instructors (London & Hall, 2011; Park
& Choi, 2009; Racović-Marković, 2010), increase course quality (Guilbaud & Jerome-D’Emilia, 2008; Hoic-Bozic et al., 2009), and engender a more effective learning
environment (Abrami et al., 2010; DeLotell, Millam, & Reinhardt, 2010; Diaz &
Entonado, 2009; Er et al., 2009; Gonzalez-Gomez et al., 2012). Learning management
systems provide for effective teaching, learning, evaluation, and administration of online
students (Archambault et al., 2010; Bradford & Wyatt, 2010; Hsieh & Cho, 2011; Kear et
al., 2012; Lu & Chiou, 2010; Martinez-Caro, 2011; Oncu & Cakir, 2011). These benefits
to institutions address their motivation to offer online courses, and to improve the quality
of such offerings as they are able.
Learner benefits. Falloon (2011) indicated that the interaction available because
of eLearning “improves attitudes, encourages earlier completion of coursework, improves
performance on tests, allows deep and meaningful learning opportunities, increases
retention rates, and builds learning communities” (p. 188). The remainder of this section
will enumerate many useful features of online learning; some specific to the learner,
some specific to the content, still others attach to the instructor or co-learners, and yet
others improve education as a whole.
eLearning allows individuals to achieve their educational goals (Deil-Amen,
2011; Gibbons & Wentworth, 2001; Kellogg & Smith, 2009). Authors have indicated
that online learning improves learning management (Andrews & Haythornthwaite, 2007),
improves computer and technological literacy (Fahy, 2008; Kiliç-Çakmak, 2010; Schultz,
2012), is more affordable (Al-Fahad, 2010), and accommodates different learning styles
(Al-Fahad, 2010; Diaz & Entonado, 2009; O’Bannon & McFadden, 2008; Schultz, 2012).
Kiliç-Çakmak (2010), for example, found that eLearning improved learners’ control beliefs,
or their perception of access to skills, resources, and opportunities, which then increased
their informational and motivational self-efficacy. Schultz (2012), based on his findings,
recommended that a learning styles inventory be completed by each learner prior to the
start of any online program to better facilitate and accommodate learner needs. Based on
the theoretical frameworks of a number of researchers, eLearning instructors tend to take
a more learner-centered approach to education (Anderson, 2008a; Lu & Chiou, 2010;
Sharples, Taylor, and Vavoula, 2007; Smith, 2005); promote critical thinking (Anderson,
2008a; Driscoll et al., 2012; Oncu & Cakir, 2011; Phelan, 2012), deep learning (DeLotell
et al., 2010; Lear et al., 2010; Yang & Cornelious, 2005), problem-solving skills (Boling
et al., 2011; DeLotell et al., 2010; McGrath, 2009; Paechter et al., 2010), collaborative
learning (Diaz & Entonado, 2009; Paechter et al., 2010; Yen & Abdous, 2011), and
learning pleasure (Holmberg, 1989; Hussain, 2013; Simonson, Schlosser, & Hanson,
1999). The self-paced nature of eLearning has been found to accentuate learner
autonomy (Boling et al., 2011; Nikolaki & Koutsouba, 2012; Rovai, Ponton, Wighting, &
Baker, 2007; Tallent-Runnels et al., 2006) and active involvement (Lam & Bordia, 2008;
Nikolaki & Koutsouba, 2012; O’Bannon & McFadden, 2008); this increase in learner
control engenders self-regulation (Abrami et al., 2010; Paechter et al., 2010; Shea &
Bidjerano, 2010) and self-motivation (Alshare et al., 2011; Lee & Choi, 2011), which
London and Hall (2011) showed was 19% more effective in increasing learning
outcomes.
eLearning enables additional benefits for learners through different
representations and navigability of content (Archambault et al., 2010). Claims about
online content include that eLearning improves learner’s focus on content (Archambault
et al., 2010; London & Hall, 2011), grants timely access to up-to-date, rich content
(Dykman & Davis, 2008a; Er et al., 2009; Kaliski et al., 2012; Lu & Chiou, 2010;
Wang, Vogel, & Ran, 2011), and allows for much more flexibility in navigation and
control of content (Archambault et al., 2010; Beqiri et al., 2010; Lo et al., 2011). In a
qualitative case study, Archambault et al. (2010) sought to assist instructors in melding
their instruction with Web 2.0 tools, and then determine what changed in their
perceptions and teaching, and what needed to be changed in the online implementation.
They determined that the inclusion of social networking tools produced more interactions, in both quantity and quality, between the learners and the instructor and that
these interactions fostered a change in the way instructors taught and in the content they
presented. They further found that the instructor was “more of a ‘partner in learning’
than a facilitator, [and recommended that] instructors view the students as contributors of
knowledge, and thus allow them to participate in the creation of content” (Archambault et
al., 2010, p. 10). Content may be in the form of text, pictures, graphics, discussion
boards, instant messaging, e-mail, audio, or video (Anderson, 2008b; Fahy, 2008; Kear et
al., 2012; Lu & Chiou, 2010), and can be disseminated either synchronously or
asynchronously (Ally, 2008; Er et al., 2009; Malik & Khurshed, 2011; Russ, Mitchell, &
Durham, 2010; Shih, Feng, & Tsai, 2008). Content, when presented well, can be tailored
to present just the right amount of information without information overload (Diaz &
Entonado, 2009; George, 2013) to ensure optimal learning (Bhuasiri et al., 2012;
Nikolaki & Koutsouba, 2012; Rey & Buchwald, 2011). Online learning has been shown
to be especially beneficial for subjects that require abstract conceptualizations or
reflective observations, but less helpful for concrete experiences (Diaz & Entonado,
2009).
The availability and interactivity of the instructor are considered the most
significant contributors to eLearning success (Abdous & Yen, 2010; Baran et al., 2011;
Falloon, 2011; Omar et al., 2011). In a quantitative quasi-experimental survey study at a
public 4-year university in the U.S. (N = 496), Abdous and Yen (2010) found that for
each unit increase of instructor–learner interaction, as perceived by the learner, there was
a corresponding increase (β = .943) in learner satisfaction and learning outcomes,
measured by course grade. Instructors in online courses are encouraged to facilitate
learning (Fidishun, 2011; Guilbaud & Jerome-D’Emilia, 2008; Ke & Xie, 2009) and are
expected to be more interactive than in traditional settings (Abdous & Yen, 2010;
Falloon, 2011). Learners have many more avenues to contact and work with instructors
in eLearning, which can promote relationship building (Fahy, 2008; Ryan, Connolly,
Grummell, & Finnegan, 2009; Simonson et al., 1999) and learner engagement (Abdous &
Yen, 2010; Archambault et al., 2010; Revere & Kovach, 2011; So & Bonk, 2010).
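As an aside on interpretation, a standardized coefficient such as the β = .943 reported by Abdous and Yen (2010) expresses the expected change in the outcome, in standard deviations, for a one-standard-deviation change in the predictor. The sketch below shows one common way such a coefficient can be obtained; the data and variable names are synthetic assumptions for illustration, not the authors' data or model.

```python
# A hedged sketch of producing a standardized regression coefficient (β):
# z-score the predictor and the outcome, then fit ordinary least squares.
# Synthetic data and hypothetical variable names; not Abdous and Yen's model.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 496  # matches the reported sample size
perceived_interaction = rng.normal(size=n)
satisfaction = 0.9 * perceived_interaction + 0.3 * rng.normal(size=n)

df = pd.DataFrame({"interaction": perceived_interaction,
                   "satisfaction": satisfaction})
z = (df - df.mean()) / df.std(ddof=0)  # standardize both variables

model = sm.OLS(z["satisfaction"], sm.add_constant(z["interaction"])).fit()
# With standardized variables, the slope is β: the expected change in
# satisfaction, in standard deviations, per one-SD change in interaction.
print(model.params["interaction"].round(3))
```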
Online learning also allows for physically separated learners to meet, interact, help each
other, form a community of learners (Anderson, 2008a; George, 2013; London & Hall,
2011; Sharples et al., 2007), and overcome the isolation that can be a part of eLearning
(Ferratt & Hall, 2009; Jackson et al., 2010; Mancuso, Chlup, & McWhorter, 2010; Shea
et al., 2006). Some proponents of eLearning identify the ability for learners to
collaborate as a key element to successful online learning (Ismail et al., 2011; Ismail et
al., 2010; Martinez-Torres, Toral, & Barrero, 2011; Sims, 2008). The various forms of
interactions that make “learners in online settings significantly outperform their peers in
traditional classrooms” (Shea & Bidjerano, 2010, p. 1721) are discussed in great detail in
the following section.
Learner affordances. Technology allows connected and contextual learning.
Though technology connects learners with content, with their instructor, and with other
learners it is only the delivery mechanism (Andrews & Haythornthwaite, 2007; Antonis,
Daradoumis, Papadakis, & Simos, 2011; Cheng et al., 2011; Ismail et al., 2011; Kim &
Frick, 2011); what the technology allows learners to do is far more important. As Gupta
and Bostrom (2009) identified, “it is not the technology or the features of technology that
are important, but rather the structural dimensions that the technology provides, which
influence learning effectiveness” (p. 695), and it is these dimensions that will be
discussed in this section.
In the literature, there are six major affordances that accrue to learners because of
the technologies identified above. These affordances are: (a) no boundaries, (b)
flexibility, (c) personalized learning, (d) learner autonomy and control, (e) collaboration
and community, and (f) interaction. In the paragraphs that follow each of these
affordances will be explored; although a discussion of the last affordance will be
postponed to the following section on factors that bring eLearning success.
No physical limitations. Beyond doubt, the most noted affordance of eLearning
is the ability to link learner and instructor who are physically separated, but are connected
by technology. The ability to learn anywhere (Al-Fahad, 2010; Callens, 2011; DeLotell
et al., 2010; Desai et al., 2008; Ekmekci, 2013; Fidishun, 2011; Ismail et al., 2010), at a
distance (Ali & Ahmad, 2011; Boling et al., 2011; Er et al., 2009; Russ et al., 2010; Shih
et al., 2008), when learners and instructor are geographically diverse (Al-Fahad, 2010;
Desai et al., 2008; Guilbaud & Jerome-D’Emilia, 2008; Hsu & Shiue, 2005), affording
wider access to all (Bradford & Wyatt, 2010; Ismail et al., 2010; Jackson et al., 2010;
Major, 2010), is mentioned again and again and is part of the definition of eLearning
(Bradford & Wyatt, 2010; Chen & Lien, 2011; Ho & Kuo, 2010; Hoic-Bozic et al., 2009).
No time limitations. Any time learning is the second most mentioned affordance
of eLearning. The ability to learn at a time of the learner’s choosing is the number one
reason for the growth of online learning (Bradford & Wyatt, 2010; DeLotell et al., 2010;
Ferguson & DeFelice, 2010; Lear et al., 2010). Schools have acceded because learners
have demanded more flexible schedules (Albert & Johnson, 2011; Beqiri et al., 2010;
Donavant, 2009; Lykourentzou, Giannoukos, Nikolopoulos et al., 2009), which provides
access to learning anytime the learner is near a PC, tablet, or smart phone (Ismail et al.,
2011; Russ et al., 2010). Any time provides learners with the flexibility to study more
efficiently, upon demand (Antonis et al., 2011; Lee & Choi, 2011; Sims, 2008);
effectively removing barriers of time (Ally, 2008; Guilbaud & Jerome-D’Emilia, 2008;
Mancuso et al., 2010; Shih et al., 2008). Combined with anywhere learning, flexibility
makes learning extremely convenient (Beqiri et al., 2010; Ismail et al., 2010; Sulčič &
Lesjak, 2009) and time saving (Al-Fahad, 2010; Lam & Bordia, 2008). Anywhere–
anytime, or nomadic learning, affords the learner the ability to save travel time and
costs (Al-Fahad, 2010; Callens, 2011; Park & Choi, 2009; Sims, 2008) and participate in
learning despite work constraints (Cercone, 2008; Crawford-Ferre & Wiest, 2012;
Lykourentzou, Giannoukos, Mpardis et al., 2009; Michinov et al., 2011) or family
constraints (Diaz & Entonado, 2009; Park & Choi, 2009; Rovai et al., 2007; Sulčič &
Lesjak, 2009) that might otherwise make it impossible for learners to enroll in classes.
Personalized learning. eLearning is adaptive, meaning that when implemented
effectively, learning can be customized and individualized to the experiences and input of
the learner (Cheng et al., 2011; Guilbaud & Jerome-D’Emilia, 2008; Ke & Kwak, 2013;
London & Hall, 2011). Doing so makes learning more personally relevant (Holmberg,
1989; Karge et al., 2011; Tallent-Runnels et al., 2006), which has been shown to increase
learner motivation (Bradley, 2009; McGrath, 2009). Karge et al. (2011), for example,
identified five adult learning strategies that increased learner participation and concurrent
motivation. Since education is becoming more and more essential for financial survival
in the modern technological world (Bye et al., 2007; Hoic-Bozic et al., 2009), relevancy
and motivation can engender lifelong or sustained learning (Bass, 2012; Blaschke, 2012;
Chan, 2010; Gill, 2010; Kim & Frick, 2011; Knowles, 1980), allowing the individual to
maintain their adaptability and flexibility in a fast changing world (Bennett, Bishop,
Dalgarno, Waycott, & Kennedy, 2012; Doherty-Restrepo, Hughes, del Rossi, & Pitney,
2009; Reushle & Mitchell, 2009; Smith, 2005), preventing individual economic
marginalization (Ross-Gordon, 2011; Scanlon, 2009), and encouraging personal
development (Donavant, 2009; Haythornthwaite et al., 2007; Potter & Rockinson-Szapkiw, 2012).
Learner autonomy and control. Self-paced learning calls for self-motivated
learners (Dykman & Davis, 2008a; Ferratt & Hall, 2009; Hussain, 2013) who are self-regulating (Ke, 2010; Lee et al., 2011; Savery, 2010; Shea & Bidjerano, 2010). Much
eLearning can be done at the learner’s pace (Al-Fahad, 2010; Chen & Lien, 2011; Shih et
al., 2008; Donavant, 2009) and is under the learner’s control (Anderson, 2008b; Callens,
2011; Tallent-Runnels et al., 2006), and when successful, drives generative or learner-driven learning (Ayers, 2011; Hannay, Kitahara, & Fretwell, 2010; Kiener, 2010; London
& Hall, 2011). Research supports that “what the learner does is more important than
what the teacher does” (Bahr & Rohner, 2004, p. 2). In an essay on critical realism,
Ayers (2011) argued that learner needs are “both real and socially constructed” (p. 354).
By making the learner responsible for their own learning, eLearning encourages learner
autonomy (Boling et al., 2011; Stein, Calvin, & Wanstreet, 2009; Wilson & Allen, 2011),
active involvement (Bradley, 2009; Chickering & Gamson, 1987; Nikolaki & Koutsouba,
2012), and self-direction (Abrami et al., 2010; Conceicao, 2002; DeLotell et al., 2010;
Fidishun, 2011).
Collaboration and community. For some pedagogical theories, collaboration and
community are more important than for others, but the technologies of eLearning do provide
for learners to collaborate with other learners or with their instructor (Ali & Ahmad,
2011; Baran et al., 2011; Martinez-Torres et al., 2011; Ruey, 2010). Online learning
brings people together (Mancuso et al., 2010; Nagel, Maniam, & Leavell, 2011), while
concealing many of the social cues that might inhibit communication (Brown, 2012;
Taran, 2006), engendering a more bias-free environment (Yang & Cornelious, 2005). An
emphasis on interaction can overcome feelings of isolation that can occur in a solitary
learning environment (Ferratt & Hall, 2009; Jackson et al., 2010; Mancuso et al., 2010;
Shea et al., 2006), and can assist in building a community of inquiry (Abdous & Yen,
2010; George, 2013; Sharples et al., 2007) among learners.
Interactions. Another major affordance of eLearning is the ability of the learner
to interact with content (Desai et al., 2008; George, 2013; Guilbaud & Jerome-D’Emilia,
2008; Gupta & Bostrom, 2009), with his or her peers (Abdous & Yen, 2010; Chen &
Lien, 2011; Desai et al., 2008; Sims, 2008), and with the instructor (Ally, 2008; Baran et
al., 2011; Kupczynski et al., 2011; Omar et al., 2011; Rovai et al., 2007). Some
researchers have asserted that eLearning has a higher degree of interaction than does
traditional learning (Abrami et al., 2010; Archambault et al., 2010; Boling et al., 2011;
Falloon, 2011). Boling et al. (2011), for example, conducted a qualitative, descriptive,
case study, and determined that text-based online courses with limited learner–learner
interactions and an individualized learning focus “were less helpful than those courses
and programs that were more interactive” (p. 3) because learners reported feeling
disconnected from the content, instructor, and other learners, while expressing greater
dissatisfaction. Interactions between the learner and the content can include the
integration of learning with experience (Barrett, Higa, & Ellis, 2012; Gill, 2010; Tallent-Runnels et al., 2006), increasing learner relevance (Ally, 2008; Gill, 2010; O’Toole &
Essex, 2012). Technology facilitates access to content (Anderson, 2008b; Hoic-Bozic et
al., 2009; Kaliski et al., 2012; Martinez-Torres et al., 2011), as well as providing more
access to information that can be maintained in real-time (Ally, 2008; Wang et al., 2011),
while providing flexibility in material and topic coverage (Er et al., 2009; George, 2013;
Holbert & Karady, 2009; Tallent-Runnels et al., 2006). Learners have a greater chance
for mutual exchange of ideas or information online with other learners (Barrett et al.,
2012; Dykman & Davis, 2008a; Sharples et al., 2007; Racović-Marković, 2010).
Assisting other learners fosters relationship building (Bradley, 2009; O’Bannon &
McFadden, 2008), feelings of rapport (Adamson & Bailie, 2012; Blanchard et al., 2011;
Gilbert et al., 2013; Holmberg, 1989), and supportive relationships (Deil-Amen, 2011;
Ryan et al., 2009; Sharples et al., 2007; Taran, 2006) that can become a community of
support (Archambault et al., 2010; Russ et al., 2010; Sharples et al., 2007). The learner is
able to engage with the instructor more personally online (Major, 2010; Revere &
Kovach, 2011), enhancing communication (Anderson, 2008b; Pigliapoco & Bogliolo,
2008; Travis & Rutherford, 2013) because of real-time, immediate, individualized, and
multidirectional communication and feedback (Archambault et al., 2010; Boling et al.,
2011; Er et al., 2009; Hoic-Bozic et al., 2009; Russ et al., 2010). This interaction can
easily bridge the transactional distance between the learner and the instructor (Abdous &
Yen, 2010; Desai et al., 2008; Fahy, 2008), while affording the instructor the ability to
provide immediate and timely feedback to the learner (Alshare et al., 2011; Falloon,
2011; Lee, 2010; Lee & Choi, 2011).
eLearning affords learners great freedom in their learning, granting
flexibility (Callens, 2011; Michinov et al., 2011), boundaryless learning (Ekmekci, 2013),
enhanced interactivity and interaction (Martinez-Torres et al., 2011; Nagel et al., 2011),
and increasing learner autonomy (Chen & Lien, 2011; Lee et al., 2011), innovation
(Blaschke, 2012; Ke & Kwak, 2013), and involvement (Boling et al., 2011; Falloon,
2011). eLearning removes time barriers and increases the efficiency of studying because
learning can occur at any time, 24 hours a day, 7 days a week, 365 days a year
(Crawford-Ferre & Wiest, 2012). eLearning removes physical limitations to learning,
making it possible for greater numbers of learners to participate because of the
convenience of learning anywhere (Chen & Lien, 2011). eLearning grants learners more
control over their learning by affording an ability to learn at their own pace (Karge et al.,
2011). Finally, eLearning enhances communication between learner and content,
between learner and learner, and between learner and instructor, providing opportunities
for greater engagement, support, and personalization (London & Hall, 2011). While it is
important to know what technology may provide, it is even more important “to have deep
understandings of how people learn” (So & Bonk, 2010, p. 189) so that educators may
understand what factors engender success in eLearning. It is to this topic that the next
section turns.
Factors that Bring eLearning Success
A number of factors persistently appear in the literature regarding learner success
with online learning. Most of these factors deal with one form of interaction or another; interaction has been determined to have a positive effect on learning (Abrami et al., 2010; Hrastinski & Jaldemark, 2012; Kiliç-Çakmak, 2010; Pigliapoco & Bogliolo, 2008) and to be critical to learner success (Barrett et al., 2012; Ke, 2010; Racović-Marković,
2010). Lack of interaction appears to engender learner dissatisfaction and even
withdrawal (Abdous & Yen, 2010; Chaves, 2009; Savery, 2010). Interaction is a form of
communication with the intent of affecting behavior (Lear et al., 2010), and consistent and timely communication in eLearning increases success and learner persistence (Archambault et al., 2010; Bradford & Wyatt, 2010; Ekmekci, 2013); the quality of the communication and the degree of interaction are predictors of completion (Er et al., 2009; Ferguson & DeFelice, 2010; Shea et al., 2006).
Communication in the sense of interaction does not necessarily represent dialog
but is viewed and expressed as presence (Oncu & Cakir, 2011). The different presences
are learner centered (DeLotell et al., 2010; Ryan et al., 2009), allowing for individualized
paths of development (Ekmekci, 2013; Hoic-Bozic et al., 2009) that focus on learning
instead of teaching (Bradley, 2009; Gonzalez-Gomez et al., 2012; Sims, 2008) and grant
ownership to the learner for their learning (Alewine, 2010; Fidishun, 2011; Ghost Bear,
2012; Martinez-Caro, 2011). In a two-phase quasi-experimental study to determine
differences in online learner satisfaction and performance between a 15-week course and
a 5-week course, the major finding was “that connectedness to the course, either by
participating collaboratively with other students or by interacting with the professor, will
likely impact student satisfaction” (Ferguson & DeFelice, 2010, p. 75) the most. The
researchers indicated that the performance (final grade) of learners in the shorter course
was significantly better than in the longer course. They also found that learners in the shorter course were more satisfied with interactions with fellow students; conversely, learners in the longer course were much more satisfied with
interactions with their instructor. Ferguson and DeFelice (2010) noted a limitation of the
study was that learners in the shorter course were summer students; therefore the
demographics between the two groups could have been different. In a two-group experiment with prisoners attending classes to earn their GEDs, the experimental group, which followed a program emphasizing inmates taking responsibility for their learning, presented fewer negative behaviors and was more active in the classroom than the control group (Alewine, 2010). Fidishun (2011), in an essay on integrating technology
into a curriculum with adult learners, identified that success comes from design that is
learner-centered and interactive, with instructors facilitating learners’ self-direction. The
presences of eLearning will be introduced in this section, but further explicated in the six
subsections following. The three presences that are most essential for eLearning success
are teaching presence, social presence, and cognitive presence (Anderson, 2008a;
George, 2013; Hoskins, 2012).
Teaching presence represents the processes most individuals associate with
education (George, 2013; Wang, 2010). Teaching presence focuses on the instructor–
learner relationship (Ekmekci, 2013; Pelz, 2010), which in eLearning is the main
predictor of learner success and satisfaction (Baran et al., 2011; Joo et al., 2011;
Simonson et al., 1999; Tuquero, 2011), encompasses a reduced didactic role for the
instructor (Bradley, 2009; Chaves, 2009), and a strong connection with learners (Boling
et al., 2011; Ekmekci, 2013; Hannay et al., 2010). For teaching presence, visibility of the
instructor and vertical interactions are key (Anderson, 2008a; Bradley, 2009), along with
the encouragement of discourse and contact (Ekmekci, 2013; Fahy, 2008; Joo et al.,
2011; Ke, 2010) and the promotion of active learning (Cornelius, Gordon, & Ackland,
2011; Hoic-Bozic et al., 2009; O’Bannon & McFadden, 2008).
Social presence comprises a number of eLearning success factors discussed
below, and figures prominently in learner–learner interactions (Guilbaud & Jerome-D’Emilia, 2008; Hoskins, 2012; Savery, 2010), immediate real world application of
learning (Ghost Bear, 2012; Lee et al., 2011; Potter & Rockinson-Szapkiw, 2012), and
learner motivation (Conceicao, 2002; Er et al., 2009; Morrow & Ackermann, 2012; Sims,
2008). Successful learner–learner interactions are based on the idea of supportive
learning (Bradley, 2009; Ismail et al., 2011; Ross-Gordon, 2011; Ryan et al., 2009),
where learners can learn from each other because of their real world experiences
(Ferguson & DeFelice, 2010; Haythornthwaite et al., 2007; Lee et al., 2011; Martinez-Caro, 2011) and build a community of learning (Pelz, 2010; Shea & Bidjerano, 2010;
Travis & Rutherford, 2013) to promote high quality online learning (Cercone, 2008;
Hoic-Bozic et al., 2009; Ke, 2010; Ke & Kwak, 2013; Rhode, 2009). Some educational
theorists have emphasized the social side of eLearning, indicating that it is vital to the
success of learners (Anderson, 2008a; Deil-Amen, 2011; Ke, 2010; Sharples et al., 2007),
in building knowledge and skills, while co-constructing knowledge through horizontal
interactions (Guilbaud & Jerome-D’Emilia, 2008; Ryan et al., 2009; Sinclair, 2009) and
collaboration (Cercone, 2008; Paas & Sweller, 201; Smith, 2005; Tallent-Runnels et al.,
2006). Education transforms previous experience through learning (Anderson, 2008b;
Ryan et al., 2009; Wang & Kania-Gosche, 2011), and does this best if learners can
immediately apply what they have learned to their life (Ally, 2008; Henning, 2012;
Reushle & Mitchell, 2009; Stern & Kauer, 2010), which is considered by some to be
critical to eLearning outcomes (DeLotell et al., 2010; Ghost Bear, 2012; Glassman &
Kang, 2010; Keengwe & Georgina, 2011).
Shea and Bidjerano (2010) conjectured that, while teaching presence and social presence may influence cognitive presence directly, another important construct, learning presence, has a greater influence on learning.
In a quantitative cross-sectional survey study of 42 universities with 3,165 student
participants, the researchers examined the Community of Inquiry Framework and its
relationship to a nascent theoretical construct called online learner self-regulation.
Proposed elements of this construct explored in this study were self-efficacy and effort
regulation. Using structural equation modeling, Shea and Bidjerano found a strong
correlation between teaching, social, and cognitive presence and self-efficacy. The
element of self-efficacy, however, moderated cognitive presence.
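For readers unfamiliar with this analytic approach, the sketch below shows how a structural model of the general kind Shea and Bidjerano describe might be specified and estimated with the open-source semopy package. The indicator items, path structure, and simulated data are illustrative assumptions only, not the authors' instrument, model, or results.

```python
# A hedged sketch of specifying and fitting a structural equation model with
# semopy, using lavaan-style syntax. Indicators, paths, and data are
# illustrative assumptions, not Shea and Bidjerano's (2010) model or results.
import numpy as np
import pandas as pd
import semopy

model_desc = """
# measurement model (three hypothetical indicators per construct)
teaching  =~ t1 + t2 + t3
social    =~ s1 + s2 + s3
cognitive =~ c1 + c2 + c3
efficacy  =~ e1 + e2 + e3
# structural model: presences predict self-efficacy,
# and self-efficacy in turn predicts cognitive presence
efficacy  ~ teaching + social
cognitive ~ teaching + social + efficacy
"""

# Synthetic correlated indicator data so the example runs end to end.
rng = np.random.default_rng(1)
latent = rng.normal(size=(3165, 4)) @ np.linalg.cholesky(
    np.full((4, 4), 0.5) + 0.5 * np.eye(4)).T
cols = ["t1", "t2", "t3", "s1", "s2", "s3", "c1", "c2", "c3", "e1", "e2", "e3"]
data = pd.DataFrame(
    np.repeat(latent, 3, axis=1) + 0.6 * rng.normal(size=(3165, 12)),
    columns=cols)

model = semopy.Model(model_desc)
model.fit(data)
print(model.inspect())           # path estimates
print(semopy.calc_stats(model))  # fit indices such as CFI, TLI, RMSEA
```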
Cognitive presence includes learner–content interaction, reflection, and learner
motivation and engagement (Cacciamani et al., 2012; Hoskins, 2012; Joo et al., 2011;
Wang, 2010). The materials or content of a course provide the catalyst for learning
(George, 2013; Savery, 2010). If a learner takes the time to study (Jeffrey, 2009; Omar et
al., 2011), synthesizes the facts and ideas generated by the content, the instructor, and
other learners (Bransford et al., 2005; Ke, 2010; Potter & Rockinson-Szapkiw, 2012), and
spends an appropriate amount of time on task (Amrein-Beardsley & Haladyna, 2012;
Dziuban & Moskal, 2011; O’Bannon & McFadden, 2008), learning will occur. While the
content provides a “sound foundation of validated knowledge” (Sharples et al., 2007, p.
223), learning requires the participation or engagement of the learner (DeLotell et al.,
2010; Lear et al., 2010; Shea & Bidjerano, 2010) and can be fostered and enhanced
through active learning techniques (Amrein-Beardsley & Haladyna, 2012; Smith, 2005).
Motivation, goals, or a set purpose to learn are necessary for appropriate participation
(Cornelius et al., 2011; Hodge et al., 2011; Kupczynski et al., 2011). Researchers have
identified that deep learning requires learners to reflect on their activities,
interactions, and experiences as well as the content (Canning, 2010; DeLotell et al., 2010;
Galbraith & Fouch, 2007; Yang & Cornelious, 2005).
Cacciamani et al. (2012) sought through a blended (online and on-campus)
learning activity to determine the effect that participation, instructor style, and reflection
had on knowledge building online. The study drew participants from two universities in Italy and consisted of two phases. Learners were placed into groups based on the number of messages they posted during class, instructors were assigned a facilitator style to enact online (oppositional vs. supportive), and learners were asked personally reflective questions that they answered in a shared space. The researchers found that high levels of
participation were also associated with advanced knowledge building, a supportive
facilitator style engendered more advanced knowledge building than did the
oppositional style, and participants who engaged in the reflective questions also tended
to engage in advanced knowledge building (Cacciamani et al., 2012).
Other factors, less frequently mentioned but noted in the literature, are that learners need to be technologically efficacious (Belair, 2012; Bhuasiri et al., 2012; Gupta & Bostrom, 2009), that the learning context must be perceived as easy to use (Alshare et al., 2011; Joo et al., 2011; Lee, 2010; Martinez-Caro, 2011), and that assessments must be matched to the ability of learners (Kiener, 2010; MacLean & Scott, 2011; Savery, 2010). Alshare et al. (2011), in
a quantitative survey design with 674 college students, used structural equation modeling
to determine whether system quality, information quality, comfort with online learning,
self-management of learning, and perceived web self-efficacy had a predictive effect on
learner satisfaction and intention to use online learning. The researchers found that
information quality and comfort with online learning were the most predictive of learner
satisfaction (β = .45 and β = .25, respectively). A more inclusive discussion of the major
factors of eLearning success will be included in the subsections below. In those sections,
the benefits and necessity for (a) a healthy learner–instructor relationship, (b) encouraged
learner–learner interactions, (c) engagement in learner–content interactions and
reflection, (d) collaboration and the development of a sense of community, (e) application
centered real world learning, and (f) the necessity of learner motivation will be
expounded and further delineated.
Learner–instructor relationship. Teaching presence has been defined as “the
design, facilitation, and direction of cognitive and social processes for the purpose of
realizing personally meaningful and educationally worthwhile learning outcomes”
(Anderson, Rourke, Garrison, & Archer, 2001, p. 5). According to the literature,
visibility is crucial to the establishment of teaching presence (Anderson, 2008b; Bradley,
2009; Savery, 2010) with admonitions that the instructor be available and accessible to
students (Ali & Ahmad, 2011; Belair, 2012; Boling et al., 2011; Jackson et al., 2010; Lee,
2010), which drives the learning process (DeLotell et al., 2010; Driscoll et al., 2012).
Teaching presence does so by fashioning and supporting both social and cognitive
presence (Hoskins, 2012; Joo et al., 2011). Teaching presence sparks cognitive presence
and allows learners to perceive and note social presence (Ekmekci, 2013; George, 2013;
Joo et al., 2011). Many of the ancillary factors of teaching presence will be reviewed in subsequent sections; this section considers the environmental and learning relationship factors of teaching presence, along with findings on how teaching presence engages cognitive presence, triggers social presence, and reduces the instructor’s didactic role, while identifying the effects of the instructor–learner relationship on effectiveness, motivation, success, learning, and satisfaction.
According to Knowles (1980) a critical function of the instructor is to create a rich
learning environment that is supportive of learning (Hussain, 2013). This often entails
assembling course content so as to elicit engagement from the learner (Ekmekci, 2013).
Further elucidation of this factor of teaching presence will be provided in the section on course factors of dropout.
Joo et al. (2011) sought to determine the structural relationships between presence
(cognitive, social, and teaching), perceived ease of use and usefulness, and learner
satisfaction and persistence at an online Korean university. Utilizing structural equation modeling, the researchers attempted to verify eight relationships, including whether social and cognitive presence mediate the relationship between teaching presence and learner satisfaction and persistence. The authors presented participants with two surveys: one had been modified
from two other instruments to measure the predictor variables of teaching presence,
social presence, cognitive presence, perceived usefulness, and perceived ease of use and
one had been modified from another instrument to measure the criterion variables of
learner satisfaction and persistence. A confirmatory factor analysis was conducted on the
results to ensure internal validity within the constructs. The fit of the initial model was also computed and determined to be good (TLI = .961, CFI = .976, RMSEA = .069). Direct effects that were not significant on persistence (teaching
presence, social presence, cognitive presence, perceived usefulness, and perceived ease of
use) and not significant on learner satisfaction (social presence) were removed from the
initial model. The modified model was compared using structural equation modeling to
the initial model, and no significant difference was found (TLI = .966, CFI = .976,
RMSEA = .065), prompting the acceptance of the modified model.
The findings from Joo et al.’s (2011) study indicated that perceived usefulness
and perceived ease of use contributed directly to learner satisfaction (β = .262). Teaching presence contributed directly to social presence (β = .811), cognitive presence (β = .648), and learner satisfaction (β = .238) and indirectly to cognitive presence (β = .241), learner satisfaction (β = .234), and persistence (β = .329). Social presence contributed directly to cognitive presence (β = .297), while cognitive presence contributed directly to learner satisfaction (β = .263) and indirectly to persistence (β = .184). Finally, learner satisfaction was the only variable that contributed directly to persistence (β = .697). While several
implications may be derived from Joo et al.’s study, it confirmed the importance of the
learner–instructor relationship on learner satisfaction, and ultimately persistence.
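The reported indirect coefficients appear consistent with the standard path-analytic decomposition, in which an indirect effect is the product of the direct paths along each route, summed over parallel routes. A worked check using the direct coefficients above (TP = teaching presence, SP = social presence, CP = cognitive presence, LS = learner satisfaction) is shown below.

```latex
% Worked check of the reported indirect effects, assuming the standard
% path-analytic decomposition (indirect effect = product of direct paths,
% summed over parallel routes).
\begin{align*}
\text{TP} \rightarrow \text{CP (indirect)} &= .811 \times .297 \approx .241\\
\text{CP} \rightarrow \text{Persistence (indirect)} &= .263 \times .697 \approx .184\\
\text{TP} \rightarrow \text{LS (indirect)} &= (.648 \times .263) + (.811 \times .297 \times .263) \approx .234\\
\text{TP} \rightarrow \text{Persistence (indirect)} &= (.238 + .234) \times .697 \approx .329
\end{align*}
```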
Martinez-Caro (2011) conducted a study to determine factors that make eLearning
both effective and satisfying to the learner. Fifteen graduate and postgraduate class
sections were surveyed (N = 425) regarding eight independent variables: age, gender,
employment, prior experience with eLearning, online flexibility, instructor–learner
interaction, learner–learner interaction, and method of class delivery (wholly online,
various blended modes). Using structural equation modeling, Martinez-Caro determined
the effect of the independent variables on the dependent variables: learning and
satisfaction. She found that learning was a significant moderator variable for learner
satisfaction (ɣ = 0.73) and that the most predictive, significant independent variable on learning was instructor–learner interaction (ɣ = 1.77). The next most predictive, significant variables to directly affect learning were learner–learner interaction (ɣ = 0.53)
and eLearning flexibility (ɣ = 0.52). Martinez-Caro concluded “according to the model
tested; interaction is key to effective eLearning” (p. 578).
Learning relationship. The relationship between the instructor and the learner
comprises many factors in a successful educational experience. Since learning is always
contingent on a relationship with an instructor (Wang & Kania-Gosche, 2011), there are
certain elements that the instructor is responsible for. In online classes, it is expected that
the instructor must be substantially involved (Crawford-Ferre & Wiest, 2012; Wilson,
2005), communicate high expectations of learners (Hannay et al., 2010; O’Bannon &
McFadden, 2008; Smith, 2005), use humor (Baran et al., 2011; Bye et al., 2007; Chaves,
2009; Jackson et al., 2010), and ask thought-provoking questions (DeLotell et al., 2010;
Martinez-Caro, 2011; Taran, 2006). Other expected competencies of eLearning
instructors are taking responsibility for learners’ learning (Hussain, 2013), fostering
learner centeredness (Smith, 2005) and learner independence (Falloon, 2011; Hoic-Bozic
et al., 2009; Hussain, 2013; Moisey & Hughes, 2008), and providing structure to the
course materials (Diaz & Entonado, 2009; Paechter et al., 2010). Other primary factors
noted in the literature include precipitating a strong personal connection between
instructor and learner, fairness, and engendering motivation; which will be explored next.
Teacher presence is highly learner centered, and instructors are expected to have
an interest in and respect for their students (Chen & Chih, 2012; Connors, 2005;
Knowles, 1980, 1994; Savery, 2010); demonstrating concern and care for them (Bhuasiri
et al., 2012; Chickering & Gamson, 1987; Driscoll et al., 2012; Wilson, 2005). Some
techniques for establishing this strong and expressive connection with learners (Baran et al., 2011; Boling et al., 2011; Ekmekci, 2013; Hannay et al., 2010) are to call learners by name (Chaves, 2009; Jackson et al., 2010; Knowles, 1980), to treat them as unique individuals (Cercone, 2008; Knowles, 1980; Wilson, 2005), and to really listen to what they say (Knowles, 1980; O’Bannon & McFadden, 2008; Scanlon, 2009). Researchers
have suggested that instructors build relationships with individual learners (Barber, 2012;
Dykman & Davis, 2008b; Wilson, 2005) and cultivate and sustain them (Deil-Amen,
2011; Dykman & Davis, 2008b) by having authentic conversations that encourage
reflection and promote integration of learning (Barber, 2012; Belzer, 2004; DeLotell et
al., 2010). Others have suggested the personalization of eLearning by helping learners
identify strengths and areas of potential growth (Crawford-Ferre & Wiest, 2012; Eneau &
Develotte, 2012; Smith, 2005), helping learners define their learning needs (Hussain,
2013; Ismail et al., 2010; Reushle & Mitchell, 2009; Ruey, 2010), and tailoring
instructional materials and methods to suit the needs of each learner (Chan, 2010; Holton
et al., 2009; Kobsiripat, Kidrakarn, & Ruangsuwan, 2011); contacting those who are disruptive or not participating (Adamson & Bailie, 2012; Smith, 2005) is also fundamental.
Part of creating a connection with learners is the need for integrity by the
instructor that features respect (Blaschke, 2012; Bradley, 2009; Cercone, 2008; Knowles,
1980), openness (Sandlin, 2005; Williams, Karousou, & Mackness, 2011; Wilson, 2005),
and fairness (Bhuasiri et al., 2012; Hannay et al., 2010; Savery, 2010) in their dealings
with all learners. In part, this means creating activities and materials that challenge
without overwhelming learners; as Caine (2010) said “challenging but still manageable”
(p. 5). It is this fair but challenging atmosphere that elicits engagement from learners
(Ekmekci, 2013; Finn, 2011; Rey & Buchwald, 2011).
The concept of motivation will be explored later as a factor that brings eLearning success and as an adult eLearning characteristic that can mitigate dropout, but here the focus is on actions that researchers have identified instructors can take. DeLotell et al. (2010) suggested that instructors need to
approach their materials, learning objectives, and teaching with enthusiasm; ensuring that
the interaction is interesting (Scanlon, 2009), while embedding motivational units
throughout (Caine, 2010; Deil-Amen, 2011; DeLotell et al., 2010; MacLean & Scott,
2011).
The learning relationship stimulates and spawns learning in an eLearning
environment. To succeed the instructor must be considerably invested in enhancing the
relationships with learners through personal and individualized interest, while holding
high standards, maintaining integrity, and seeking to enhance motivation. As Joo et al.’s
(2011) study showed, teacher presence engages cognitive presence, and it is the
instructor’s role in facilitating that topic that is probed next.
Engaging cognitive presence. Although cognitive presence has four phases, this
section examines the first: the triggering event (Darabi, Arrastia, Nelson, Cornille, &
Liang, 2011; Hoskins, 2012). In the triggering event, the learner comes in contact with
content and attempts to make sense of it, explore it, and identify applications (Darabi et
al., 2011). One of the most important elements of learning is time on task (Amrein-Beardsley & Haladyna, 2012; Dibiase & Kidwai, 2010; Dziuban & Moskal, 2011;
O’Bannon & McFadden, 2008). Simply stated, if one does not spend time attempting to
learn, then one will not learn. While an instructor cannot effectively make a learner
learn, they can engender a desire to spend more time on task by including certain
elements in their teaching. Some of the most important elements of instructor
contribution to engaging cognitive presence and time on task are providing feedback,
interactive communication, prompt and timely responses to learner inquiries, assignment
feedback, encouraging reflection and application, and evaluating. Each of these elements
will be scrutinized below.
Michinov et al. (2011) investigated the mediating effect that participation has on the relationship between procrastination and learning success, as well as the effect that procrastination has on motivation and desire to drop out. They postulated that procrastination would be negatively correlated with both participation and performance. Procrastination was measured using a self-report scale, participation was
measured by counting the number of posts to a discussion forum, and performance was
measured by scores on a case study report. Michinov et al. found that procrastination had
a direct and an indirect effect on performance. Low participation was found to be a direct
predictor of low performance. The researchers also found that low procrastinators
maintained their motivation throughout the course and were less likely to drop out, while
high procrastinators showed a quadratic trend; motivation decreased over time and then
increased at the end of the course, while desire to drop out increased through the midpoint of the course and then decreased. The implications of the study were that low
participation has a deleterious effect on performance; therefore instructors should apply
motivational strategies or scheduled deadlines to improve the participation of all learners.
One specific strategy the researchers proposed was to “provide learners with feedback to
enable them to compare their level of participation with that of others and particularly
with higher achieving learners” (Michinov et al., 2011, p. 249).
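As an illustration of the mediation logic underlying this study, the sketch below estimates an indirect effect with the common product-of-coefficients approach (path a from the predictor to the mediator, multiplied by path b from the mediator to the outcome). The data and variable names are synthetic assumptions; Michinov et al.'s (2011) actual estimation procedure may have differed.

```python
# A hedged sketch of a product-of-coefficients mediation estimate
# (procrastination -> participation -> performance). Synthetic data and
# hypothetical variable names; not Michinov et al.'s (2011) analysis.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n = 300
procrastination = rng.normal(size=n)
participation = -0.6 * procrastination + rng.normal(size=n)               # path a
performance = 0.5 * participation - 0.2 * procrastination + rng.normal(size=n)

df = pd.DataFrame({"procrastination": procrastination,
                   "participation": participation,
                   "performance": performance})

# Path a: effect of procrastination on the mediator (participation).
a = smf.ols("participation ~ procrastination", df).fit().params["procrastination"]
# Path b: effect of participation on performance, controlling for procrastination.
model_b = smf.ols("performance ~ participation + procrastination", df).fit()
b = model_b.params["participation"]
direct = model_b.params["procrastination"]  # direct effect c'

indirect = a * b                            # mediated (indirect) effect
print(f"indirect effect = {indirect:.3f}, direct effect = {direct:.3f}")
```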
Providing feedback is a form of interaction between instructor and learner
(Driscoll et al., 2012; Ruey, 2010) that has been shown to lead to learner satisfaction
(Jackson et al., 2010). Instructors are variously counseled to provide prompt (Amrein-Beardsley & Haladyna, 2012; Lear et al., 2010; O’Bannon & McFadden, 2008; Tallent-Runnels et al., 2006), timely (Ekmekci, 2013; Jackson et al., 2010; Kiener, 2010; Lee,
2010; Savery, 2010), regular (Michinov et al., 2011), relevant (Chaves, 2009), corrective
(Bradley, 2009; Pelz, 2010), and high-quality (Falloon, 2011; Kellogg & Smith, 2009)
feedback. The immediacy of feedback to the learner assists learning and other learning
outcomes (Jackson et al., 2010; Lee et al., 2011; Sinclair, 2009).
For successful facilitation of eLearning, an increased personal level of
communication with learners is required (Travis & Rutherford, 2013). Online learning
provides increased facility for two-way interactions, which is a crucial feature of the
learning process (Archambault et al., 2010; Desai et al., 2008). Researchers have
indicated that instructors can utilize this increased capability for interaction (Er et al.,
2009; Lee et al., 2011; Nagel et al., 2011; Omar et al., 2011) to provide timely comments,
showing their awareness of what learners have said or are doing (Bradford & Wyatt,
2010; Savery, 2010) and increasing dialog between instructor and learner (Bradley, 2009;
Schultz, 2012).
Online learners will query instructors and other learners for answers, feedback,
encouragement, and clarification because they have questions (Ferratt & Hall, 2009).
Phelan (2012) noted the importance of immediacy of the instructor’s responses to these
queries, while Lee et al., (2011) identified that immediacy influenced both learning
outcomes and satisfaction for eLearning learners. Ekmekci (2013) encouraged
instructors to model “good online communication and interactions” (p. 34), meaning to
provide prompt (Ali & Ahmad, 2011; Joo et al., 2011) and timely (Lee et al., 2011; Lee
& Choi, 2011; Ruey, 2010) responses to others online.
Assignments provide a way to assess learning. Researchers have proposed
that instructors handle the feedback to learners regarding assignments in encouraging
ways. Since feedback is designed to improve learning by apprising the learner of their
current level of achievement (Paechter et al., 2010; Savery, 2010), it acts as “an ‘inherent
catalyst’ for all self-regulated activities” (Lee et al., 2011, p. 161). As such, the instructor
should give specific (Lee et al., 2011; Ruey, 2010), constructive (Boling et al., 2011;
DeLotell et al., 2010; Lee et al., 2011; Smith, 2005), individualized (Boling et al., 2011;
Korr, Derwin, Greene, & Sokoloff, 2012), timely (Ekmekci, 2013; Savery, 2010; Smith,
2005), unbiased (George, 2013; Lee et al., 2011), and positive (Belair, 2012; Martinez-Caro, 2011) feedback. Lee et al. (2011) also specified that feedback spotlight the
learning objective and not the individual.
Researchers have emphasized encouraging learners to reflect and tie their
personal experience into their learning (Blaschke, 2012; Hussain, 2013). This application
of new knowledge and integrating it with prior experience is a primary factor of
eLearning success and will be discussed in a succeeding section. Instructor promoted
reflection is one focus of eLearning (Smith, 2005). Focusing on problem solving in real
world situations establishes learning relevance and deep learning connections (Blaschke,
2012; Hannay et al., 2010) for the learner. Finally, the online instructor provides
evaluation and assessment of learning (Bradley, 2009; Chaves, 2009; Pelz, 2010).
The instructor–learner relationship enlists cognitive presence for the learner
(Darabi et al., 2011; Hoskins, 2012). To optimize learning the instructor should provide
prompt and timely feedback (Amrein-Beardsley & Haladyna, 2012; Driscoll et al., 2012;
Ruey, 2010), interactive communication (Archambault et al., 2010; Bradford & Wyatt,
2010; Travis & Rutherford, 2013), immediacy in responses and communication
(Ekmekci, 2013; Lee et al., 2011; Phelan, 2012), specific and constructive feedback
(Falloon, 2011; Kellogg & Smith, 2009), encouragement of reflection and the bringing of real life examples into the classroom (Blaschke, 2012; Hussain, 2013), and an
assessment of learning (Bradley, 2009; Chaves, 2009; Pelz, 2010). In a mixed method
design incorporating a survey, personal interviews, and a focus group Hussain (2013)
determined that learner satisfaction is helped by instructor encouragement to reflect,
technical and social skills, and evaluation and assessment skills. These elements are
possible online because of a reduction in the didactic role of the instructor (Bradley,
2009; Chaves, 2009) and an emphasis on facilitation and interactivity, which will be
examined next.
Reduced didactic role. The affordances of eLearning and the learning needs of
learners call for a less directive role for instructors (Bradley, 2009; Chaves, 2009). The
primary elements that constitute an instructor’s reduced didactic role noted in the
literature are sharing of knowledge (Boling et al., 2011; Scanlon, 2009), a pedagogical
role (Archambault et al., 2010; Henning, 2012; Wang et al., 2011), interaction style (Ali
& Ahmad, 2011; Cacciamani et al., 2012), with a major emphasis on the facilitator role
(Bradley, 2009; Rodrigues, 2012), and the implementation of scaffolding (Caine, 2010;
Tsai, 2011). Inclusion of these elements presents a learner-centered experience with a
greater chance of learning and satisfaction (Archambault et al., 2010; Fidishun, 2011).
The acquisition of knowledge and its application to life events is still the purpose
of education in an online setting, but the focus of many researchers is on engendering in
the learner the capability and interest for independent learning (Cornelius et al., 2011;
Hoic-Bozic et al., 2009; Hussain, 2013; Jackson et al., 2010) where learner and instructor
are “co-learners and co-decision makers in the teaching-learning process” (Wilson, 2005,
p. 62). In this environment, it is still necessary for the instructor to share his or her
expertise (Boling et al., 2011; Connors, 2005; Scanlon, 2009) and be an expert in the
field of study (Anderson, 2008a; Connors, 2005; Hannay et al., 2010; Paechter et al.,
2010), while also acting more like an equal and consultant (Connors, 2005; Paechter et
al., 2010; Wang & Kania-Gosche, 2011).
In addition to subject-matter expertise, instructors require pedagogical expertise in
eLearning environments (Archambault et al., 2010; Henning, 2012; Wang et al., 2011),
more so than in a traditional classroom (Scanlon, 2009; Shea & Bidjerano, 2010).
Because of the interactivity of eLearning many researchers have encouraged a shift from
a teaching focus to a learning focus (Cercone, 2008; DeLotell et al., 2010; Ghost Bear,
2012; Knowles, 1980); others have proposed a learner and learning process focus rather
than a subject matter focus (Cox, 2013; McGrath, 2009; Oncu & Cakir, 2011; Wilson,
2005). Cox (2013) investigated the teaching orientation of a group of language
instructors (N = 25) in Peru using a mixed methods design, and found that instructors
with over 10 years of experience were more learner focused than those with less
experience. The qualitative portion of the study determined that experienced instructors
felt that adults learned better in an atmosphere of respect and expectation, rather than
with a focus on the instructor. Other pedagogical elements that online instructors need
are emphasis on time on task (Alewine, 2010; O’Bannon & McFadden, 2008; Pelz, 2010;
Smith, 2005), clarity in setting expectations (Driscoll et al., 2012; Ekmekci, 2013;
Savery, 2010; Smith, 2005) and in expression (Chyung & Vachon, 2005; Jackson et al.,
2010; Wilson, 2005), while working to stimulate the learners’ interest (Abrami et al.,
2010; Pigliapoco & Bogliolo, 2008), motivation (Abdous & Yen, 2010; Chen & Chih,
2012; Marschall & Davis, 2012; Paechter et al., 2010), interactivity (Ali & Ahmad, 2011;
Cacciamani et al., 2012; Muirhead, 2004; Tolutiene & Domarkiene, 2010), and reflection
on content (Ross-Gordon, 2011; Savery, 2010). Helpful strategies for this inspiration are
affirming the personal dimensions of the instructor–learner relationship (Muirhead, 2004;
Savery, 2010), establishing psychological intimacy with learners (Chaves, 2009), and
development of an effective online syllabus (Savery, 2010; Smith, 2005). In addition,
instructors need frequent contact with students (Chaves, 2009; Travis & Rutherford, 2013), should promote active learning (Cornelius et al., 2011; Hoic-Bozic et al., 2009; O’Bannon & McFadden, 2008; Smith, 2005), and should encourage participation in every
activity (O’Bannon & McFadden, 2008). Huang et al. (2012) conducted a quantitative
study to test the mediating effect of prior knowledge on learning style and online learning
performance using 219 undergraduates. The researchers found support for (a) online
participation being a mediating construct between learning style and performance, (b)
that learners with a sensory learning style tend to participate more frequently and for
longer durations, and (c) that prior knowledge had a moderating relationship between
participation and learning performance in terms of passive participation only.
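Because Huang et al. (2012) tested moderation, a brief illustration may clarify how such a test is commonly operationalized. The following Python sketch fits a regression with an interaction term between participation and prior knowledge; the data, variable names, and effect values are hypothetical and do not reproduce Huang et al.'s model or measures.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 219
participation = rng.normal(size=n)        # e.g., standardized frequency/duration of online activity
prior_knowledge = rng.normal(size=n)      # e.g., standardized pretest score
performance = (0.4 * participation + 0.2 * prior_knowledge
               + 0.3 * participation * prior_knowledge + rng.normal(size=n))

# Moderation is evaluated by the significance of the interaction term.
X = np.column_stack([participation, prior_knowledge, participation * prior_knowledge])
model = sm.OLS(performance, sm.add_constant(X)).fit()
print(model.summary(xname=["const", "participation", "prior_knowledge", "interaction"]))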
Reduction in direct instruction and teaching requires equal portions of knowledge
and expertise, pedagogical strategies and techniques, and interaction styles (Bradley,
2009; Rodrigues, 2012). The primary role change noted in the literature for instructors is
the assumption of a facilitating mode in eLearning that utilizes scaffolding (Anderson,
2008a; Bradley, 2009; Michinov et al., 2011). Scaffolding provides learners with support
and guidance without directing (Caine, 2010; Chaves, 2009; Lee et al., 2011; Tallent-Runnels et al., 2006; Tsai, 2011). Scaffolding fosters self-directed learning by
transitioning the responsibility of learning to the learner (Bradley, 2009; Cercone, 2008;
Rodrigues, 2012; Wilson, 2005).
Though scaffolding is a technique for facilitating learning, adult eLearning
theorists’ cynosure is the facilitator role played by instructors (Archambault et al., 2010;
Boling et al., 2011; Bradley, 2009; Dykman & Davis, 2008a; Fidishun, 2011; Henning,
2012; Knowles, 1995; Martinez-Caro, 2011; McGrath, 2009; Wilson, 2005). As a
facilitator of learning the instructor acts as a resource for self-directed learners (Knowles,
1984), expedites engagement in activities (Paechter et al., 2010), promotes the
interaction, communication, and discourse between learners (Bradley, 2009; Fahy, 2008),
cultivates learner decision making (Hussain, 2013), guides toward content without
managing (Ali & Ahmad, 2011; Blaschke, 2012; Connors, 2005; McGrath, 2009), and
nurtures learner problem-solving abilities (Hussain, 2013). Goddu (2012) concluded that
facilitation encouraged self-direction and internal motivation for adult learners, while
Chaves (2009), DeLotell et al. (2010), and Jackson et al. (2010) indicated that learning
grounded in personal examples contributed to deep learning.
As discussed, the reduction in direct teaching and the emphasis on guidance and
facilitation in eLearning changes the way that instructors establish relationships with
learners. Another way that learning changes online because of the wherewithal of online
interactions is the potential for greater participation of learners and instructors in learning
(Bhuasiri et al., 2012; Driscoll et al., 2012; Ismail et al., 2011), and it is the instructor’s
job to encourage dynamic interrelationships (Kellogg & Smith, 2009; Wilson, 2005),
which triggers social presence. This subject is perused next.
Triggering social presence. Contemporary theories of learning and the
affordances of eLearning encourage a participatory experience online. It is the
responsibility of the instructor to trigger social presence to facilitate participation,
collaboration, and interactivity among and between learners.
DeLotell et al. (2010) identified a relationship between deep learning, social
interaction, and learner retention. They determined that while social presence fostered
learning that held the learner’s interest, engendered application of the learning, and
heightened understanding, the process of social presence began with the instructor. The
authors also confirmed the link between learner engagement, learner satisfaction, and
learner retention. From these findings, DeLotell et al. suggested that instructors in online
courses include the following actions to promote deep learning and social interactions: (a)
solve problems collaboratively, (b) use and encourage the use of examples, (c) give and
encourage the giving of appropriate feedback, (d) utilize motivational strategies to boost
self-esteem, and (e) require active participation and inclusion of rationale, explanations,
and justification when interacting with others in the classroom.
Importance of instructor–learner interaction. Implementing the instructor–
learner strategies discussed above has certain consequences according to researchers.
The results of teaching presence and the successful engagement of cognitive presence,
reduced didactic role, and social presence on learning effectiveness, motivation, success,
learning, and satisfaction are enumerated below. Travis and Rutherford (2013) identified
that learner–instructor interaction is twice as important as learner–learner interaction.
Belair (2012) found that communication with instructors was consistently one of the top five contributing factors for online success and that teacher-initiated interaction
had a more positive impact on learning than did learner-initiated interactions. The
consequences of the instructor–learner relationship on the learner are discussed next.
Martinez-Caro (2011) found a positive relationship between teacher involvement
and learning efficiency, while others indicated this relationship was a crucial element of
online teaching effectiveness (Anderson, 2008a; Dykman & Davis, 2008b; Ferguson &
DeFelice, 2010). Jackson et al. (2010) determined that the level of interaction of the
instructor with his or her students was the paramount factor in online learning. Improved
instructor–learner relationships decidedly have a positive impact on learning
effectiveness (Belair, 2012; Travis & Rutherford, 2013).
Yen and Abdous (2011) investigated the relationship between faculty engagement
and learner satisfaction and achievement. By using a quantitative survey design with 482
participants, the researchers measured two research variables, faculty engagement and learner satisfaction, and measured the final variable, achievement, using grades received
from the registrar. There was a predictive relationship between faculty engagement and
learner satisfaction, F(1, 480) = 195.06, p < .05, R2 = .289. The authors also found a
predictive relationship between faculty engagement and final grade, χ2 (1, 482) = 35.54, p
< .05. From this, Yen and Abdous corroborated other findings that the level of
engagement by the instructor tended to increase learner satisfaction and learning (Belair,
2012; Martinez-Caro, 2011; Travis & Rutherford, 2013).
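To clarify how statistics such as F(1, 480) and R2 arise from a simple predictive model of this kind, the following Python sketch regresses a simulated satisfaction score on a simulated faculty-engagement score. The data and effect size are fabricated for illustration and are unrelated to Yen and Abdous's (2011) dataset.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 482
engagement = rng.normal(size=n)                          # hypothetical faculty-engagement score
satisfaction = 0.5 * engagement + rng.normal(size=n)     # hypothetical satisfaction score

model = sm.OLS(satisfaction, sm.add_constant(engagement)).fit()
print(f"F({int(model.df_model)}, {int(model.df_resid)}) = {model.fvalue:.2f}, "
      f"p = {model.f_pvalue:.3g}, R^2 = {model.rsquared:.3f}")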
Not only do high-quality instructor–learner relationships inculcate motivation, but
they also affect (Ali & Ahmad, 2011), serve as a key predictor of (Baran et al., 2011; Joo et al., 2011; Simonson et al., 1999; Tuquero, 2011), and constitute a critical dimension of (Yen & Abdous, 2011) learner success, having a moderate (0.32) effect size on achievement (Abrami et al.,
2010). The instructor–learner relationship has also been shown to contribute to better
learning outcomes (Shea & Bidjerano, 2010) because this relationship constitutes a
fundamental need for learning (Ali & Ahmad, 2011). In a correlational study (N = 245)
to determine factors of eLearning students’ satisfaction, Ali and Ahmad (2011) found that
both learner–instructor interactions (β = .583) and the instructor’s performance (β = .721)
were positively and significantly correlated to learner satisfaction. Several researchers
have determined that the quality of this relationship is the strongest predictor of
learning in eLearning (Abdous & Yen, 2010; Martinez-Caro, 2011; Yen & Abdous, 2011) and of subjective perceptions of learning (Lo, 2010).
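Because standardized coefficients (β) such as those reported by Ali and Ahmad (2011) are simply regression slopes computed on z-scored variables, the following Python sketch shows that computation with hypothetical data; the variable names and values are illustrative assumptions, not the authors' data.

import numpy as np
import statsmodels.api as sm

def zscore(x):
    return (x - x.mean()) / x.std(ddof=1)

rng = np.random.default_rng(3)
n = 245
interaction = rng.normal(size=n)          # learner-instructor interaction score (hypothetical)
instructor_perf = rng.normal(size=n)      # instructor performance score (hypothetical)
satisfaction = 0.4 * interaction + 0.5 * instructor_perf + rng.normal(size=n)

# Standardized betas: fit ordinary least squares after z-scoring every variable.
X = np.column_stack([zscore(interaction), zscore(instructor_perf)])
betas = sm.OLS(zscore(satisfaction), sm.add_constant(X)).fit().params[1:]
print("standardized betas:", np.round(betas, 3))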
Learner satisfaction in an online environment will be explored in detail in a later
section, but as was reported above, Yen and Abdous (2010) found a predictive
relationship between satisfaction and instructor engagement. Jackson et al. (2010)
revealed that learner satisfaction was directly affected by the quality of the interactions
between learner and instructor. Abdous and Yen (2010) identified the instructor–learner
relationship as one of three significant factors that contributed to learner satisfaction and
other learning outcomes. Finally, other research revealed this relationship as a key
predictor of learner satisfaction (Ferguson & DeFelice, 2010; Shea et al., 2006; Yen &
Abdous, 2010).
In this section, the elements for a good instructor–learner relationship were
discussed in terms of teaching presence, cognitive presence, reduced didactic role of
online instructors, and social presence. The consequences of this relationship were also
enumerated. While the instructor–learner relationship has been demonstrated as a key
predictor of success and satisfaction in education, there are other relationships that may
engender the motivation, and desire to learn within the learner. In the next section, the
learner–learner relationship will be analyzed and explicated.
Learner-learner interactions. Unlike the ubiquity of acclaim for the necessity
of the instructor–learner relationship, agreement about the learner–learner relationship is
less pervasive. To properly place learner–learner interactions within the six primary
factors that bring eLearning success, it is necessary to explore seven topics: social
presence, the meeting of social needs, constructivist approaches, the ways and byplays of
interaction, collaboration and group cohesiveness, communication tools, and the benefits
that have been found to accrue from these relationships. Educators designate the
importance of the learner–learner relationship in eLearning on a continuum from being
more important for learning than the instructor–learner relationship to having little to no
significance to online learning.
Social presence. According to Joo et al. (2011) social presence eliminates
affective gaps between learners through open communication, truthful emotional
expressions, and group cohesion and is often mistaken for interaction. While social
presence may include interactions, its focus accentuates communication with others
through the use of language (Bahr & Rohner, 2004; Joo et al., 2011) and discourse
(Anderson, 2008a; Jackson et al., 2010; Nummenmaa & Nummenmaa, 2008). Social
presence can boost learning, facilitation of material, and interaction but is only required
to a limited degree between learners (Oncu & Cakir, 2011; Wang, 2010).
Meeting social needs. Education is traditionally a group activity. Online
learners, however, while being members of a class are not physically in the presence of
others. Boling et al. (2011) indicated that feelings of isolation tended to manifest as
disconnection from not only other learners, but also the instructor and the content.
Dialog and interaction diminish perceived isolation (Fahy, 2008; Henning, 2012; Lee et
al., 2011; Omar et al., 2011). In a quantitative survey study with postgraduate learners,
Omar et al. (2011) found that as learner–learner interactions increased, feelings of
isolation decreased from the learner’s perspective. Kellogg and Smith (2009), on the
other hand, found that if learners had stable, independent supportive relationships, feelings of isolation were not manifest, nor were learners as motivated to engage in in-class relationships. This indicated that interactions within an eLearning classroom fulfill
a peripheral social or cultural need (Cheng et al., 2011; Conceicao, 2002; Kellogg &
Smith, 2009), while also inducing deep learning (DeLotell et al., 2010).
Constructivist approaches to learning. Theory is the heart of education. The
most generally acknowledged theory of educational learning, especially online, is
constructivism (Hoic-Bozic et al., 2009). Chickering and Gamson (1987) detailed
principles of optimal teaching, and suggested that learning is performed better as a team,
than individually. According to constructivist theory “learners work together in a
collaborative space to create shared meaning and to reflect and think about how they
learned and how to apply it in practice” (Blaschke, 2012, p. 66). Learners construct
knowledge and personal meaning (Ally, 2008; Bradley, 2009; Jeffrey, 2009;
Nummenmaa & Nummenmaa, 2008; Oncu & Cakir, 2011) socially (Deulen, 2013;
Sinclair, 2009; Vygotsky, 1978) through interaction, problem solving (Kirschner et al.,
2006; Ruey, 2010; Shih et al., 2008; Williams et al., 2011) and discussion (Crawford-Ferre & Wiest, 2012; Jeffrey, 2009). This form of environment “keeps students active,
constructive, collaborative, intentional, complex, contextual, conversational, and
reflective” (Bradley, 2009, p. 22), which is what is explored next.
Interaction and interactivity. Much of the literature on constructivist techniques
and practice appears to be more theoretical than empirically based. In the literature, there
are many articles that explain what should occur when learners interact, but few identify
the benefits or disadvantages of putting constructivism into practice. In this subsection,
the theoretical implications of interaction and discourse will be explored, followed by a
report on the empirical studies both pro and con for interactivity in the eLearning
classroom. After this, different communication tools will be enumerated, with a
conclusion of the advantages reported for interaction in the literature.
Theoretically, learning is about making connections (Anderson, 2008b), informal
learning with others (Cercone, 2008), communication (Sharples et al., 2007), interaction
between learners (Abrami et al., 2010; Driscoll et al., 2012; Ismail et al., 2011; Wilson,
2005), participation (Hrastinski & Jaldemark, 2012), and peer support (Lee et al., 2011).
Because of the prevalence of constructivist thought, designers of online courses are
encouraged to make them interactive (Bradley, 2009; Kellogg & Smith, 2009), because
this increases social presence (Joo et al., 2011), increases online participation
(Archambault et al., 2010; Omar et al., 2011), and is critical for quality education
(Martinez-Caro, 2011) because of its implications for collaboration and the engagement
of learners (Oncu & Cakir, 2011). According to the theorists, just being together and
fully engaged foments learning (Ferratt & Hall, 2009; Hoic-Bozic et al., 2009); indeed
some claim that “the student may learn as much, or more, from each other as they do
from the professor and the textbook” (Martinez-Caro, 2011, p. 574). To some, however,
interaction was not sufficient (Fahy, 2008), and discourse was required (Jeffrey, 2009).
It is through the actions of articulating thoughts, opinions, and experiences, of examining the world views and thinking patterns of others and providing feedback and critique, and of refining and reflecting upon one’s own ideas that knowledge is constructed and transient
reality is known (Archambault et al., 2010; Crawford-Ferre & Wiest, 2012; Ferratt &
Hall, 2009; Jeffrey, 2009; Martinez-Caro, 2009; Sharples et al., 2007). Boling et al.
(2011) posited that lectures and the reading of texts and information do not provide
sufficient proficiency for becoming an expert. Ali and Ahmad (2011) identified that
interactive discourse was best for topics that require reflection, brainstorming, and
discussion and proposed that this appears “to be one of the most important features of
distance courses” (p. 122). Other authors concurred that learning was more effective
when discourse between learners and with the instructor are part of education (Blaschke,
2012; Henning, 2012; Oncu & Cakir, 2011; Revere & Kovach, 2011) because discourse
engendered critical thinking (Driscoll et al., 2012) and fostered problem-solving skills
and cognitive growth (Jeffrey, 2009). In a mixed methods study seeking to codify a
formula to implement an effective eLearning system, Cabrera-Lozoya et al. (2012)
implemented an online system to improve learner grades, scores on short questions,
scores for problem solutions, and their final scores by providing an additional
communication channel. The results showed a significantly positive
effect on grades when learners were able to communicate questions electronically in real time during lectures, without slowing the pace of the course at all.
Despite the legions of articles espousing constructivism (Shih et al., 2008),
empirical findings regarding the efficacy of interaction and discussion in eLearning
classes are mixed, as noted below. On one hand, some qualitative studies found
positive benefits from interactive learning activities. Hrastinski (2008) stated that
eLearning course success derived from interactivity, but did not elaborate, while
Martinez-Caro (2011) indicated that the benefits were well documented, but did not
expound on that documentation. Scanlon (2009) proposed that in a higher education
setting interaction between learners was as significant to learning as interaction with
teachers, but concluded “ultimately it is teachers who determine the learning environment
in the classroom” (p. 41). Chaves (2009) identified learner–learner interaction as a
beneficial strategy for learning because more knowledgeable and experienced students
could scaffold learners with less experience or knowledge. Ferguson and DeFelice
(2010) reported that those who participated in a course had more positive emotional
experiences than did non-participators. Blaschke (2012) identified a positive correlation
between interactivity, reflection, and the use of mobile learning, while Martinez-Caro
concluded that learner–learner interactions were important, even if they were not as
significant as instructor–learner interactions.
Vogel-Walcutt et al. (2011) compared two instructional strategies, cognitive load theory and constructivism, to determine which best optimized learning when teaching complex
skills in an applied problem-based scenario. Participants were divided into two online
training groups who were introduced to a task using two instructors, engaged in two
activities, had further training with a third instructor, and then engaged in four additional
activities over a 7 to 11 day period. The only difference between the groups was the
manner in which instruction was given, with the cognitive load group receiving directed
instruction on the task and the constructivist group being scaffolded on the task.
The findings from the Vogel-Walcutt et al.’s (2011) study showed that there was
no significant difference between cognitive load and constructivist instructional strategies
for tasks that were conceptual, procedural, declarative, or involved decision-making
skills. Skills that required the integrating of knowledge, however, were retained better by
learners receiving directed instruction. While the results were statistically significant,
the effect size was very small (ES = 0.04), leading the researchers to conclude, “Such
results encourage consideration of cost, ease of delivery, and teaching time required. The
resource burden imposed by constructivist approaches, coupled with the lack of empirical
support, makes it difficult to recommend its use” (Vogel-Walcutt et al., 2011, p. 142).
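The very small effect size reported above is a standardized mean difference. The following Python sketch shows how such an index (Cohen's d) is computed from two groups' scores; the group sizes, means, and scores below are hypothetical illustrations, not Vogel-Walcutt et al.'s (2011) data.

import numpy as np

def cohens_d(group_a, group_b):
    # Standardized mean difference using the pooled standard deviation.
    na, nb = len(group_a), len(group_b)
    pooled_sd = np.sqrt(((na - 1) * np.var(group_a, ddof=1)
                         + (nb - 1) * np.var(group_b, ddof=1)) / (na + nb - 2))
    return (np.mean(group_a) - np.mean(group_b)) / pooled_sd

rng = np.random.default_rng(4)
directed = rng.normal(loc=75.4, scale=10.0, size=60)      # directed-instruction group (hypothetical scores)
scaffolded = rng.normal(loc=75.0, scale=10.0, size=60)    # constructivist/scaffolded group (hypothetical scores)
print("Cohen's d:", round(cohens_d(directed, scaffolded), 3))

A d near zero, as in the study's reported 0.04, indicates a negligible practical difference even when a significance test rejects the null hypothesis.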
In empirical studies, both Rhode (2009) and Ke (2010) had findings that
contradicted the need for social learning and the idea that collaboration compensated for
active instructor involvement or well-designed content. In a mixed methods exploratory
study, Rhode (2009) sought to determine preferences and capture experiences of learners
regarding different forms of interaction. He found “the top ten elements ranked highest
by participants all involved interactions either with the content or instructor,” and
“consistently ranked elements involving interactions with other learners as lowest in
comparison to all the possible choices” (Rhode, 2009, p. 6). Vogel-Walcutt et al.’s
(2010) study found that problem-based strategies were less efficient for the integration of
learning in applied domains. Gunawardena et al. (2010) in a mixed method design
exploring learner satisfaction predictors in a multinational’s online educational program,
concluded that “learners preferred learner–instructor and learner–content interaction
rather than learner–learner interaction” (p. 210). Finally, Kellogg and Smith (2009)
determined from a student survey that the element that learners learned the least amount
from was interactions with other students, with learners citing three consistent themes: flexibility intrusion, interaction dysfunction, and time inefficiency. So, despite almost universal acclaim, researchers and students are not completely convinced of the necessity of socially negotiated spaces (Barros, 2013; Ke & Carr-Chellman, 2006). Though researchers are not convinced of the efficacy of learner–learner interactions, there are
numerous mentions of the efficacy of small group work and collaboration, which is
covered next.
Collaboration and group cohesiveness. Having learners work together in group projects, peer tutoring or teaching, and role-plays supports learner engagement
(Bradley, 2009; Ferratt & Hall, 2009; Lee et al., 2011; Phelan, 2012) and has confirmed
learning benefits (Crawford-Ferre & Wiest, 2012; Joo et al., 2011; Paechter et al., 2010).
Kellogg and Smith (2009), on the other hand, found no significant correlation between
peer teaching and learners’ perceived learning. The remarks of other learners have some effect (Revere & Kovach, 2011) in shaping how ideas and concepts are perceived and
understood (Caine, 2010). If learners are instructed in cooperative strategies, have
previous experience with group work, the number of learners interacting is small, and
learners are committed to their peers’ success, collaboration is enhanced (Abrami et al.,
2010; Ismail et al., 2011; Martinez-Caro, 2011). These factors constitute group cohesion
and encourage the assisting of fellow learners (Chickering & Gamson, 1987; Cornelius et
al., 2011; Williams et al., 2011).
Kellogg and Smith (2009) conducted a mixed methods study on the efficacy of
learner–learner interactions in eLearning. Their study consisted of an extensive literature
review, a qualitative survey of online learners pursuing an MBA, then a quantitative data
analysis, and finally an in-depth qualitative analysis of a single course. For the initial
survey, the authors received a 12% response rate, yielding 110 responses to open-ended
questions. For the final in-depth analysis, 208 learners participated. In this particular
online course, 80.4% reported learning most from learner–content interactions (independent
study), while 14.4% identified learning most from learner–learner interactions (group
work). In response to what learners learned the least from, 64.5% identified group work,
while 30.3% noted independent study. In a final qualitative data analysis the researchers
found three reasons for the lack of enthusiasm for learner–learner interactions: 37% of
the responses indicated that group work was less time efficient than independent study,
37% indicated problems working with group members such as confusion, free-riders, and
the lack of consensus, and 22% indicated difficulties in coordinating group meeting
times. Kellogg and Smith concluded that the type of individuals who choose online
courses often do so because of the benefits of convenience and flexibility, which are the
features that group work takes away.
Communication tools. Online learning may be presented asynchronously through
the exchange of printed or electronic media, prerecorded audio or video, discussion
boards, or e-mail (Abdous & Yen, 2010; Abrami et al., 2010; Er et al., 2009; Ferratt &
Hall, 2009; Shih et al., 2008). Online learning may be presented synchronously through
podcast, video-feed, audio-conference, website viewing, chat rooms, or webcam
conversations (Crawford-Ferre & Wiest, 2012; Deulen, 2013; Falloon, 2011; Kear et al.,
2012). Online learning may be presented in some combination of asynchronous and
synchronous mechanisms (Crawford-Ferre & Wiest, 2012; Kellogg & Smith, 2009;
Muniz-Solari & Coats, 2009; Travis & Rutherford, 2013). Some writers have argued that
synchronous technologies promoted learning better than asynchronous technologies
(Blaschke, 2012; Boling et al., 2011; Ferratt & Hall, 2009; Kear et al., 2012), while
others have argued the opposite (Bradley, 2009; Pelz, 2010). Er et al. (2009) enumerated
the benefits of both synchronous and asynchronous technologies. The benefits of
asynchronous technologies were (a) accessing content independent of time and place, (b)
an ability to prepare answers before answering, (c) uninterrupted expression of thoughts
and knowledge, and (d) materials can be reaccessed as necessary. The benefits noted
with synchronous technologies were (a) high motivation of learners because of enhanced
interactivity, (b) immediate feedback and guidance, (c) instructor flexibility to shape
content as needed, and (d) the external motivation provided by meeting at a specific time.
Both sets of technologies provided benefits to learners, and these benefits will be
developed next.
Benefits of the learner–learner relationship. There are certain reported benefits
of the learner–learner relationship in the literature. Many seem to be assumptions based
on theory lacking rigorous empirical support (Kellogg & Smith, 2009), but all noted
benefits will be reported in this subsection. The most reported benefit of learner–learner
interaction is learner satisfaction. Ali and Ahmad (2011) determined that the richness
and level of communication affected learner satisfaction with a course. In the main, the
literature connotes a positive relationship between the amount and quality of interactions
and the positive satisfaction of learners (Anderson, 2008a; Biasutti, 2011; Ferguson &
DeFelice, 2010; Pelz, 2010). The next most reported benefit of collaboration between
learners was in the development of personal skills. The skills noted were real-life
experience (Ally, 2008; Cox, 2010), self-esteem (Fidishun, 2011; Omar et al., 2011;
Wang & Kania-Gosche, 2011), sharper thinking (Chickering & Gamson, 1987; Racović-Marković, 2010), intellectual agility (Kellogg & Smith, 2009), metacognitive skills (Ally,
2008; Shea & Bidjerano, 2010), deeper understanding (Chickering & Gamson, 1987;
Desai et al., 2008), and a greater capacity for synthesis and integrating thought (Bradley,
2009; Ke, 2010; Kellogg & Smith, 2009). Improved performance and achievement were
listed as benefits of this relationship (Ali & Ahmad, 2011; Biasutti, 2011), with Abrami
et al. (2010) calculating an average effect size of +0.49 from their meta-analysis of 10
quantitative studies. Other benefits reported from the learner–learner relationship were
increased motivation (Ali & Ahmad, 2011; Anderson, 2008a; Omar et al., 2011),
participation (Anderson, 2008a; Kiener, 2010), and socio-emotional connections (Deil-Amen, 2011; Kellogg & Smith, 2009). Kim and Frick (2011) determined that the amount
of interaction needed by individual learners for motivation varies depending on
idiosyncratic preferences.
While the learner–learner relationship does not have as ubiquitous a claim on
learning as does the instructor–learner relationship, this section has demonstrated that it
has its benefits and properties. The learner–learner relationship engenders social
presence (Jackson et al., 2010; Joo et al., 2011) and often meets the social needs of
learners in eLearning (Cheng et al., 2011; DeLotell et al., 2010; Kellogg & Smith, 2009).
Some learning theories, like constructivism, embrace learner–learner collaboration and
communication as necessary for learning (Bradley, 2009; Hoic-Bozic et al., 2009).
Online learning has a variety of tools and techniques that empower these relationships,
and the literature reports personal, motivational, achievement, and satisfaction benefits
from engaging in interrelationships in the eLearning classroom (Blaschke, 2012;
Henning, 2012; Oncu & Cakir, 2011; Revere & Kovach, 2011). The necessity for an
instructor and interactions with other learners is apparently sine qua non for higher
education (Driscoll et al., 2012; Jeffrey, 2009). However, the only relationship that is
absolutely indispensable for learning to take place is that between the learner and what is to be learned.
This relationship and the importance of reflection on this relationship will be considered
in the next section.
Learner-content interaction and reflection. Although most can be taught, very
few learners truly know how to learn (Knowles, 1975; Ghost Bear, 2012). Abrami et al.
(2010) indicated that learner–content interaction had three parts: studying the material,
relating it to one’s own experience and knowledge, and applying it to current problems.
Learner–content interaction results in certain boons that will be explored in this section,
along with the process of this interaction, and the means by which it may be made
effective. The necessity for reflection to engender deep learning, which is the purpose of
learning, will then be reviewed.
A learner’s “interaction with the content, or subject matter, is what makes
learning possible” (Travis & Rutherford, 2013, p. 32) and is the preeminent characteristic
of education (Abdous & Yen, 2010; Ali & Ahmad, 2011; Chaves, 2009). It is through
this relationship that learners garner intellectual and cognitive intelligence from the
material perused (Bradley, 2009; Lear et al., 2010). It was Moore (1989) who originally
posited that there were three primary types of interaction in effective distance learning.
He proposed learner–content as the most important because, without it, there is no
education (Moore, 1989).
A learner’s relationship with the material of a course may take the form of
brainstorming (Joo et al., 2011; Kear et al., 2012), information exchanges (Hrastinski &
Jaldemark, 2012; Nagel et al., 2011), reading textual information (Abrami et al., 2010;
Ally, 2008), well-structured material (Abrami et al., 2010; Lee & Choi, 2011), clarification (Ferratt & Hall, 2009; Joo et al., 2011), simulations (Abrami et al., 2010; Ali & Ahmad, 2011; Clapper, 2010), and relevant tasks (Abrami et al., 2010; Tallent-Runnels et al., 2006) or content (Bradley, 2009; Ke & Kwak, 2013). This relationship is
often seen as self-directed (Chan, 2010; Chaves, 2009; Dibiase & Kidwai, 2010) or
requiring a self-motivated learner (Dykman & Davis, 2008a; Gibbons & Wentworth,
2001; Hussain, 2013), while others proposed introducing material that continually
challenged the learner to greater development of cognitive skills (Jeffrey, 2009; Mehta et
al., 2007; Wilson, 2005).
Abrami et al. (2010) conducted a meta-analysis of 74 empirical studies on the
interactions between learners, learners and their instructors, and learners and content.
They found that interaction positively affects achievement with an average effect size of
0.38. Each of the types of interactions had a positive correlation with learner
achievement; learner–learner interactions had an average effect size of 0.49, learner–
content 0.46, and learner–instructor 0.32. The researchers’ findings corroborated the
importance of interaction, and of learner–content interaction. The researchers concluded
with a number of suggestions for the design of eLearning, including three self-regulation,
three multimedia, and four collaborative learning principles, along with six motivational
design principles codified from the various evidence-based approaches they discovered in
their analysis.
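Average effect sizes of the kind Abrami et al. (2010) report are typically obtained by pooling per-study estimates. The following Python sketch shows a simple inverse-variance (fixed-effect) pooling of made-up study effect sizes; it is a minimal illustration of the general technique, not a reproduction of Abrami et al.'s meta-analytic procedure.

import numpy as np

effect_sizes = np.array([0.55, 0.30, 0.42, 0.61, 0.25])   # hypothetical per-study effect sizes
variances = np.array([0.04, 0.02, 0.05, 0.06, 0.03])      # hypothetical sampling variances

weights = 1.0 / variances                                   # inverse-variance weights
pooled = np.sum(weights * effect_sizes) / np.sum(weights)   # fixed-effect weighted mean
pooled_se = np.sqrt(1.0 / np.sum(weights))
print(f"weighted mean effect size = {pooled:.2f} (SE = {pooled_se:.2f})")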
In an experimental pretest–posttest survey using a Mann-Whitney test for
analysis, Mahle (2011) sought to determine motivation levels for learners randomly
assigned to three interaction groups: low-level, reactive, and proactive. The content and
objectives for the course were identical for the three groups, but the activities differed in
the amount of interactivity. The low-level group received print media and clicked on
hyperlinks to assess understanding, with no feedback. The reactive group typed their
answers to the reading material and was provided with immediate and effusive feedback so that
they could evaluate their own responses. The proactive group completed a creative
scenario-based activity based on the reading material. Mahle found that there was a
significant difference in satisfaction and in learning gain between the low-level group and
both the reactive and proactive groups, but there was no significant difference between
the reactive and proactive groups. The findings confirmed that increases in interactivity
affect learner outcomes and satisfaction, while the reactive group was also positively
associated with retention; the other two groups were not.
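Because Mahle (2011) relied on Mann-Whitney tests, the following Python sketch illustrates pairwise Mann-Whitney U comparisons of satisfaction ratings across three interactivity conditions using scipy; the simulated ratings and group sizes are assumptions for illustration, not Mahle's data.

import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(5)
groups = {
    "low_level": rng.normal(loc=3.2, scale=0.8, size=40),   # hypothetical satisfaction ratings
    "reactive": rng.normal(loc=3.9, scale=0.8, size=40),
    "proactive": rng.normal(loc=4.0, scale=0.8, size=40),
}

# Compare each pair of conditions with a two-sided Mann-Whitney U test.
names = list(groups)
for i in range(len(names)):
    for j in range(i + 1, len(names)):
        stat, p = mannwhitneyu(groups[names[i]], groups[names[j]], alternative="two-sided")
        print(f"{names[i]} vs {names[j]}: U = {stat:.1f}, p = {p:.4f}")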
The interaction between learner and content engages cognitive presence (Hoskins,
2012; Joo et al., 2011; Oncu & Cakir, 2011), which can generate a sense of community
(George, 2013; Joo et al., 2011; Oncu & Cakir, 2011); comprising the exploration,
construction, and integration of new knowledge (Hoskins, 2012; Joo et al., 2011; Sinclair,
2009), the proposal and weighing of countering solutions (Ghost Bear, 2012; Sinclair,
2009; Tallent-Runnels et al., 2006), and finally the resolution of cognitive dissonance
(Abrami et al., 2010; Joo et al., 2011; Sinclair, 2009). In a qualitative case study
exploring the experience of practicing instructors in an online course, Sinclair (2009)
derived three main themes: true learning is brought about through cognitive dissonance
that results in reflection and new shared knowledge, the asynchronous nature of the class
was beneficial for engendering reflection and deeper consideration, and the development
of community provided a greater emphasis on both the process and the experiences of
learning. It is cognitive presence, fomented by learner–content interaction, that
engenders critical thinking (Anderson, 2008a; Oncu & Cakir, 2011), metacognition (Ally,
2008; Nikolaki & Koutsouba, 2012; Phelan, 2012), changes in a learner’s understanding
(Abrami et al., 2010; Chaves, 2009; Moore, 1989), deep learning (Caine, 2010; DeLotell
et al., 2010; Yang & Cornelious, 2005), changes in perspective (Abrami et al., 2010;
Moore, 1989), autonomous thinking (Cercone, 2008; Ferguson & DeFelice, 2010;
Mezirow, 1997), increases in decision-making skills (Hussain, 2013; Stern & Kauer,
2010; Vogel-Walcutt et al., 2010), higher order thinking (Anderson, 2008a; Joo et al.,
2011; London & Hall, 2011), and changes in the learner’s cognitive structure (Abrami et
al., 2010; Moore, 1989); the avowed targets of higher education. Barber (2012), in a
qualitative grounded theory approach, sought to empirically investigate integration of
learning and how “undergraduates bring knowledge and experiences together” (p. 590).
After 194 interviews, Barber determined that there were three categories of integration
and that these three categories increased in cognitive involvement: connection involves
noting similarities between two concepts, application uses knowledge from a different
context in present circumstances, and synthesis comprises new insight. The author made
recommendations for improving learning by intentionally designing learning
opportunities that utilize and encourage these experiences and increase interaction with
content. Interaction with content, mixed with previous experience, produces
transformative learning (Conceicao, 2002; Green & Ballard, 2011; Ke, 2010), and this
melding of prior experience and knowledge with new knowledge is enhanced through
reflection (Blaschke, 2012; Galbraith & Fouch, 2007; O’Bannon & McFadden, 2008).
Among the factors assumed to promote successful eLearning, critical reflection is
said to be essential to the education process (Blaschke, 2012; Ghost Bear, 2012), helps
with the integration of concepts (Barber, 2012; Joo et al., 2011), is needed (Ally, 2008; Anderson, 2008b; Chickering & Gamson, 1987), promotes learning (Cox, 2010; O’Bannon & McFadden, 2008), meta-learning (Baskas, 2011; Strang, 2009), and higher order thinking (Bradley, 2009; Sinclair, 2009), and boosts the desire to learn (Abela, 2009;
Baskas, 2011; Dykman & Davis, 2008c). Critical reflection, or double-loop learning, is a
process of questioning assumptions (Adamson & Bailie, 2012; Bass, 2010; Blaschke,
2012; Mezirow, 2000) and encouraging the inclusion of a learner’s past experiences into
the learning experience to stimulate transformative learning (Cercone, 2008; Ross-Gordon, 2011; Varmecky, 2012). In a qualitative phenomenological study that
interviewed 25 students regarding ways to encourage “students to take responsibility and
direct their own learning” (Canning, 2010, p. 59), the researcher found that reflection not
only encouraged learning but that it was motivated by a desire to increase skill
development and contribute to best practices. Cox (2010) quoted Merriam et al. (2007),
who paraphrased Dewey by affirming that “experiences that provide learning are never
just isolated events in time. Rather, learners must connect what they have learned from
current experiences to those in the past as well as see possible future implications” (p.
65). This melding of past experience and knowledge with new knowledge allows the
learner to reevaluate his or her beliefs, values, and understanding, while acknowledging
uncertainty or conflicts between these beliefs and values (Baran et al., 2011; Reushle &
Mitchell, 2009; Sinclair, 2009) to make sense of the world (Amrein-Beardsley &
Haladyna, 2012; Cercone, 2008; Wang et al., 2011), establish relevant connections
(Barber, 2012; Hannay et al., 2010; Henning, 2012), and bring practical solutions to real
world situations (Ally, 2008; Blaschke, 2012; Ghost Bear, 2012; Ross-Gordon, 2011). A
beneficial exercise in reflection is to pay attention to what one says he or she does versus
what one actually does (Ekmekci, 2013). Two types of learning reflection are mentioned:
Praxis, or reflection in action, is a part of active learning (Bradley, 2009; Galbraith &
Fouch, 2007; Hughes & Berry, 2011; Wilson, 2005), where a part of the learner is
observing and reflecting on the strategies being used in problem solving and experiential
methods (Caine, 2010; Green & Ballard, 2011; London & Hall, 2011); while Phronesis,
or reflection on action (Caine, 2010; Muth, 2008; Ruey, 2010; Ryan et al., 2009; Shea &
Bidjerano, 2010), is a part of inductive learning, where the learner gains insights into
previously experienced activities and feelings (Clapper, 2010; Galbraith & Fouch, 2007;
Kiener, 2010) and revises idiosyncratic practices based on his or her extraction of
meaning, thereby transferring learning into new contexts (Barrett et al., 2012; Ho & Kuo,
2010; O’Bannon & McFadden, 2008). Thus, through praxis the learner can learn while doing, and through phronesis the learner can, in quiet contemplation (O’Bannon & McFadden, 2008), through incubation (Caine, 2010), or through journaling (Ally, 2008; Blaschke, 2012; Muirhead, 2004), come to an understanding and synergy (Bass, 2010; Ekmekci,
2013). Bass (2010), in his literature review, noted that all major adult learning theories
expect adults to utilize their past experiences and incorporate that knowledge into
acquiring new knowledge. Interestingly, all philosophical traditions, Humanist, Progressive, and Radical, emphasize the usefulness and salience of reflection in the
understanding of learning (Green & Ballard, 2011; Wang & Kania-Gosche, 2011).
Learner–content interactions and reflection are both considered essential for
learning and for education. Without content or material, learning is not possible, and only
surface learning can occur without reflection. Each of the three previous sections on
eLearning success factors originated with Moore (1989), and although they were presented in separate sections, it was demonstrated that there is a degree of
interaction between them. The same is true of the remaining sections of eLearning
success factors; collaboration and development of a sense of community was introduced
in the learner–learner interaction section, the need for real world applications of learning
were touched on in the learner–content section, and it has been shown that learner
motivation is engendered to some extent by each of the interactions discussed.
Researchers and authors have acknowledged their benefits sufficiently that it makes sense to explore each factor more thoroughly on its own.
Collaboration and development of a sense of community. As was noted in
both the benefits of eLearning and learner–learner interaction sections, collaboration is
considered by many as indispensable for eLearning. A derivative of the tenet that
collaboration is essential to eLearning is that this interaction and collaboration can
develop in the learner a sense of community with the instructor and other learners and
that this sense of community brings even greater benefits to the learner. The concepts of
collaboration (what it is, the environment that fosters it, its essentiality, and its corresponding benefits and impediments) and sense of community (its theoretical base, how it should be facilitated, its mandatory conditions, and its product) will be
investigated in detail. Shih et al. (2008) determined that collaborative learning and
interactive learning environments were the two most prominent topics researched in the
five Social Sciences Citation Index journals they reviewed. This is why these topics are
featured as one of the six factors that bring eLearning success.
Collaboration. Ally (2008) declared that to facilitate constructivist learning
instructors should use cooperative and collaborative learning. According to this
paradigm, learning is not individualistic but requires interaction (Dewey, 1938/1997;
Hoic-Bozic et al., 2009; Jeffrey, 2009; Ruey, 2010), sustained communication (DeLotell
et al., 2010; George, 2013; Ke, 2010; Oncu & Cakir, 2011) or discourse (Anderson,
2008a; Bradley, 2009; Pelz, 2010), and active participation (Adamson & Bailie, 2012;
Archambault et al., 2010; Hoic-Bozic et al., 2009) between learners and with the
instructor (Hoic-Bozic et al., 2009; Kim & Frick, 2011; Park & Choi, 2009) to construct
meaning (Bradley, 2009; George, 2013; Joo et al., 2011; Oncu & Cakir, 2011), solve
problems (Bradley, 2009; Hoic-Bozic et al., 2009), complete projects (Henschke, 2008;
Hoic-Bozic et al., 2009; Lee et al., 2011), and is expected by Millennials (Werth & Werth,
2011). Proper collaboration involves the exchange of information or intellectual assets
(Bhuasiri et al., 2012; Cheng et al., 2011; Hrastinski & Jaldemark, 2012), effective
management of tasks (Dibiase & Kidwai, 2010; Joo et al., 2011; Michinov et al., 2011),
along with encouraging social support (Bye et al., 2007; Hrastinski & Jaldemark, 2012;
Schultz, 2012). Proper information exchange is viewed as cognitive presence in the
community of inquiry framework, while encouraging social support and diminishing
transactional distance is a part of social presence (Fahy, 2008; Hrastinski & Jaldemark,
2012), and the management of tasks is a part of teaching presence. Some researchers
have determined, however, that not every learner is as interested in or helped by
collaboration, and “interaction should be adjusted to individual needs and preferences”
(Fahy, 2008, p. 170), while some learners are disinterested in the social elements of the
classroom but can be extremely active in their interactions with the content (Hrastinski &
Jaldemark, 2012). Vogel-Walcutt et al. (2010) determined that while learning was
equivalent between a cognitive load theory experience and a constructivist experience,
“the resource burden imposed by constructivist approaches, coupled with the lack of
empirical support, makes it difficult to recommend its use with [the military], at least
when time is held constant, and the learners are novices” (p. 142).
Rich collaboration requires a specific environment. Learners must feel safe and comfortable expressing themselves (Anderson, 2008a; Cacciamani et al., 2012; Er et
al., 2009), there must be an authentic and meaningful context (Bradley, 2009; Caine,
2010; Goddu, 2012), with resonant dialog (Boling et al., 2011; McGrath, 2009; Stein et
al., 2009) and helpful feedback from peers and the instructor (Boling et al., 2011;
DeLotell et al., 2010; Sinclair, 2009) that allow learners to take ownership of their own
learning (Alewine, 2010; Fidishun, 2011; Martinez-Caro, 2011; Werth & Werth, 2011).
These conditions facilitate social presence among learners, and without them the social
climate essential for learning may not exist (Deulen, 2013) and learners may be unwilling
to express differences, share their views and thoughts, or disclose their disagreements
(Anderson, 2008a; McGrath, 2009; Pelz, 2010). Often mentioned in the construction of
social presence is the need to have learners who have more experience than other learners
(Boling et al., 2011; Bransford et al., 2006; Cheng et al., 2011; Connors, 2005).
Researchers and authors have ceaselessly lauded the necessity of collaboration.
Smith (2005) stated that learner–learner coaction is imperative, and Cercone (2008)
opined that an informal, respectful, and collaborative environment is a requirement for
learning to occur. Pigliapoco and Bogliolo (2008) concluded that without collaboration
that decreases transactional distance, a learner’s psychological sense of community could degrade and feelings of isolation could grow to the point of social estrangement and alienation, which increases academic failure, absenteeism, and dropout. In his literature review
of educational theorists, Anderson (2008b) reviewed the necessity of creating a
community of learning, which Bradley (2009), Milheim (2011), and Boling et al. (2011)
echoed. The view that multi-directional communication (Boling et al., 2011; Russ et al., 2010) and bonding between learners (Bradley, 2009) in the form of collaboration are a sine qua non is almost universal.
Some benefits of collaboration include increased completion rates (Anderson,
2008b) and higher quality teaching presence (Sims, 2008), but the most oft-repeated boons from collaboration are better learning outcomes (Boling et al., 2011; Chaves, 2009; Hoic-Bozic et al., 2009; Jeffrey, 2009; Kupczynski et al., 2011) and the development of critical
social skills. Better learning outcomes include increases in critical thinking (Anderson,
2008b), escalated insights (O’Bannon & McFadden, 2008) and cognitive presence (Shea
& Bidjerano, 2010), boosted learning effectiveness (Pigliapoco & Bogliolo, 2008), augmented learner perceptions of learning (Kellogg & Smith, 2009), and fructified
interest and motivation (Boling et al., 2011; Jeffrey, 2009; Phelan, 2012).
The greatest noted benefit of collaboration in the literature is the establishment of
social presence (Bradley, 2009; Desai et al., 2008; London & Hall, 2011) and
development of social skills (Anderson, 2008b; Bradley, 2009; O’Bannon & McFadden,
2008). In a mixed method naturalistic case study approach, Ke (2010) documented both
learners’ online learning experiences and instructors’ teaching practices to examine “the nature and
interactions of teaching, cognitive, and social presence” (p. 808). Ke found that effective
teaching presence was required to create community and that the stronger a learner’s
sense of community the higher their level of satisfaction. This benefit to learners was
described by Pigliapoco and Bogliolo (2008) as “a feeling that members have of
belonging, a feeling that members matter to one another and to the group, and a shared
faith that members’ needs will be met through their commitment to be together” (p. 61),
while decreasing feelings of isolation (Kellogg & Smith, 2009; Travis & Rutherford,
2013). Others note a synergy in which the output of the group is greater than what could be achieved individually (George, 2013; Kroth, Taylor, Lindner, & Yopp, 2009), granting
increased power, performance, and value. Several authors have remarked that the group
skills gained in collaborative educational environments are transferable to occupational
settings (Allen, Crosky, McAlpine, Hoffman, & Munroe, 2009; Chaves, 2009; Jeffrey,
2009). Allen et al. (2009) reported on the introduction of an online group project for
first-year engineers to ground the learners in a real-world, authentic engineering problem.
Among other findings, they reported that the collaboration successfully developed
generic skills that could be used in learners’ careers: communication, conducting online
research, problem solving, and working effectively in teams. Chaves (2009) identified
that interaction and collaboration are skills that may be learned and are useful in
occupational settings. Jeffrey (2009) proposed that advanced collaboration skills prepare
the learner for the workplace.
The primary disadvantages of active collaboration identified in the literature are
loss of flexibility (Ke & Hoadley, 2009; Kellogg & Smith, 2009), loss of independence
(Cornelius et al., 2011; Phelan, 2012), and involvement in group projects with less-than-dedicated
peers (Nagel et al., 2011). Because of the need to coordinate schedules for
team members to work together, the major advantage of working at any time in eLearning
disappears (Desai et al., 2008; Kellogg & Smith, 2009). A sizable minority of learners in
eLearning environments despise activities that minimize their flexibility (Kellogg &
Smith, 2009; Kim & Frick, 2011), especially since “several studies suggest that the
convenience and flexibility of online learning have a paramount influence on the
learner’s motivation for online learning” (p. 4). Michinov et al. (2011) remarked
“learners put off collaborative parts until the end of the semester when the assignment
was compulsory and did not complete it at all when it was voluntary” (p. 249). Another
affordance of eLearning is the engendering of a self-directed, autonomous learner
(DeLotell et al., 2010; Ferratt & Hall, 2009; Ke & Hoadley, 2009; Michinov et al., 2011).
Chu, Chu, Weng, Tsai, and Lin (2012), in a quantitative survey study with 593
participants in Taiwan, used structural equation modeling to determine the relationship of
self-directed learning readiness to learning or thinking styles based on Mezirow’s
transformative learning theory. The researchers found that self-directed learning readiness acts
as a moderator to online learning, and significantly influences technical learning interest
(interpretation and challenge, relevance to life, ease of use, and multiple sources),
dialectical learning interest (online preference, inquiry learning, and learner negotiation),
and emancipatory learning interest (reflection and critical judgment). They found that
self-directed learning readiness affects technical learning interest (β = .60) much more
than dialectical (β = .20) or emancipatory (β = .15) learning interests. Collaboration
tends to constrain independence (Cornelius et al., 2011). Collaboration requires learners
who are willing to work together and are willing to participate (Alewine, 2010; Alshare et
al., 2011; Chickering & Gamson, 1987; Cornelius et al., 2011; Yen & Abdous, 2011);
learners who are unwilling to participate can be calamitous for other learners in their
group (Cornelius et al., 2011; Ke, 2010; Nagel et al., 2011).
Sense of community. Collaboration contributes to a sense of community, which
is characterized by learners who are committed to learning, participating, and
contributing to the community by providing a sense of belonging and trust (Anderson,
2008b). Pigliapoco and Bogliolo (2008), citing Rovai, noted four dimensions to
community: spirit, or the bond between learners; trust, or the confidence learners have in
each other; interaction, which encompasses many of the features discussed above
enhanced with a feeling of safety; and common expectations, or a commonality in
purpose. These components have also been noted and expanded upon by other authors.
The camaraderie and bond between learners allow social presence (Anderson, 2008b;
Ferratt & Hall, 2009; Ke & Hoadley, 2009; Lear et al., 2010; Lee et al., 2011) and
effective learning (Kellogg & Smith, 2009; Shea & Bidjerano, 2010). To interact freely,
learners must feel safe and confident that they will be sustained rather than derogated in
their learning (Ke & Hoadley, 2009; Moisey & Hughes, 2008; Pelz, 2010). Thus,
learners are able to “share their ideas, experiences, knowledge, and insights to support
and challenge one another” (London & Hall, 2011, p. 761). The need for interaction has
been discussed at length in the previous sections and shown to be essential to the creation
of community. Finally, learners come together in the spirit of learning to gain something,
and these common expectations and purpose allow deeper learning (Cornelius et al.,
2011; Hrastinski & Jaldemark, 2012; Ke & Hoadley, 2009; Moisey & Hughes, 2008;
Shea & Bidjerano, 2010). These dimensions are reflected slightly differently in the
community of inquiry framework with its focus on teaching presence, cognitive presence,
and social presence, but the creation and maintaining of a community to extend and
enhance learning is equivalent (Ke, 2010; Kellogg & Smith, 2009; Shea & Bidjerano,
2010).
Pigliapoco and Bogliolo (2008) conducted a case study regarding the influence of the
online teaching environment on psychological sense of community with 107 participants
from two programs; one online and the other traditional. The authors used comparative
analysis and regression analysis to investigate the effects of psychological sense of
community on learning outcomes. Through the use of partial correlation coefficients,
Pigliapoco and Bogliolo were able to derive the relative contribution of each factor on the
outcomes. The researchers determined that the differences between persistence in online
programs versus traditional programs come down to learner demographics and that
eLearning does not necessarily impair psychological sense of community.
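The partial correlation approach Pigliapoco and Bogliolo describe can be illustrated with a brief sketch. The following Python code uses simulated data and hypothetical variable names rather than the study's data; it obtains a partial correlation by regressing each variable of interest on the control variables and then correlating the residuals:

```python
import numpy as np
import pandas as pd

def partial_corr(df, x, y, controls):
    """Partial correlation of x and y, controlling for the listed variables.

    Each variable is regressed on the controls (ordinary least squares), and
    the residuals are then correlated with Pearson's r.
    """
    Z = np.column_stack([np.ones(len(df))] + [df[c].to_numpy(float) for c in controls])
    residuals = {}
    for var in (x, y):
        v = df[var].to_numpy(float)
        beta, *_ = np.linalg.lstsq(Z, v, rcond=None)
        residuals[var] = v - Z @ beta
    return np.corrcoef(residuals[x], residuals[y])[0, 1]

# Simulated survey scores for 107 respondents (illustrative only).
rng = np.random.default_rng(0)
data = pd.DataFrame({"sense_of_community": rng.normal(size=107), "age": rng.normal(size=107)})
data["satisfaction"] = 0.5 * data["sense_of_community"] + 0.2 * data["age"] + rng.normal(size=107)

print(partial_corr(data, "sense_of_community", "satisfaction", controls=["age"]))
```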
Whether the community is called a community of inquiry, a community of
learning, or a learning community, the literature emphasizes that it is one of the roles of
the instructor to facilitate this sense of community within the learner (Baran et al., 2011;
Lear et al., 2010; Smith, 2005). Several authors have also noted the role that media plays
in the facilitation of community. Hrastinski and Jaldemark (2012), for instance, noted
that synchronous communication has several advantages over asynchronous
communication; among them, more rapid flow of information between learners and
teams, more sociability, and promotion of this sense of community. Falloon (2011) was
also intrigued by the affordances of synchronous communication, noting enrichment of
learner engagement, motivation, and feedback, while supporting community development
and learner identification with the community. Russ et al. (2010) concurred that
synchronicity can reduce social distance in the online classroom between participants.
Ferratt and Hall (2009) bemoaned that, in an asynchronous eLearning classroom, many of
the benefits of community in a face-to-face environment are absent or lost. Ke and
Hoadley (2009) and Fahy (2008) agreed that the appropriate technology was important
for group related interactivity to be worthwhile. Indeed, Fahy concluded “community
becomes a process, not merely a place (Cannell, 1999) in which ‘structured and
systematic’ social interaction, using media, is essential to significant learning” (p. 170).
Not only are researchers in eLearning near unanimous in their demand for
community, but they have been very helpful in ferreting out the actions and behaviors
that inculcate this sense of community that produces an effective eLearning environment
(Jackson et al., 2010). Chaves (2009) reiterated that the learning community must be the
nucleus of eLearning and that this community must be instilled at the classroom, the
program, and the institution levels thereby expanding learning transfer, learner
interactions, and learner retention. Smith (2005) and Ke (2010) identified that the
catalyst for community is teaching presence. Teaching presence in turn engenders
cognitive presence through collaboration (Cornelius et al., 2011; Gupta & Bostrom, 2009;
Smith, 2005) and social presence through social support and learner reciprocity and
cooperation (Cornelius et al., 2011; Hrastinski & Jaldemark, 2012; Smith, 2005). Online
learning makes all of this possible because it can be fostered by the technology (Ferratt &
Hall, 2009; Ke & Hoadley, 2009; Stein et al., 2009), enhances accountability (Driscoll et
al., 2012; Gupta & Bostrom, 2009; Schultz, 2012), and facilitates online identity and
clear roles (Abrami et al., 2010; Gupta & Bostrom, 2009; Shea & Bidjerano, 2010).
Collaborative communities require mutual engagement to foster cognitive presence
through peer interaction (Ally, 2008; Anderson, 2008b; Hodge et al., 2011; Phelan,
2012), interdependence (Gupta & Bostrom, 2009; Hodge et al., 2011), information
exchange (Eneau & Develotte, 2012; Hodge et al., 2011; Hrastinski & Jaldemark, 2012),
engaging activities (Holmberg, 1989; Hrastinski & Jaldemark, 2012; Rovai et al., 2007),
as well as instructor–learner interaction (Ally, 2008; Kupczynski et al., 2011; Phelan,
2012). Collaborative communities also require social support through team feedback
(Cornelius et al., 2011; Gupta & Bostrom, 2009), cohesion (Boling et al., 2011; Hodge et
al., 2011; Shea & Bidjerano, 2010), and a goal emphasis of joint enterprise with common
norms and goals (Gupta & Bostrom, 2009; Hodge et al., 2011; Pelz, 2010).
Researchers have identified three major returns to learners from community: (a)
learning outcomes and cognitive presence, (b) interpersonal skills and social presence,
and (c) retention and satisfaction. While each of these has been touched on in previous
sections, community is specifically named as the genesis of these boons in many articles.
Studies in eLearning have identified that community produces “the largest effect size on
learning outcomes” (Gupta & Bostrom, 2009, p. 697) based on a meta-analysis; though
the effect sizes showed high variance. Since community fosters cognitive presence, it has
been shown to produce significant learning (Shea & Bidjerano, 2010) and goal
achievement (Phelan, 2012). Phelan (2012) also listed critical thinking skills and nonspecific learning outcomes as cognitive benefits derived from community.
Socially, several benefits have been reported from a sense of community;
including noteworthy collective ubiety and decreased burnout (Anderson, 2008b;
Pigliapoco & Bogliolo, 2008), the development of interpersonal skills (Anderson, 2008b;
Cornelius et al., 2011), team-based skills (Gupta & Bostrom, 2009), and creating
collaborative networks that can help learners professionally (Pigliapoco & Bogliolo,
2008). The most prominent benefit mentioned is in the prevention of isolation and
increased learner support. Because eLearning provides mechanisms for learners to make
a personal connection with the instructor and other learners (some would say better
mechanisms than are available in a traditional setting), a reduction in social isolation
(Dorin, 2007; Falloon, 2011; Kear et al., 2012; Revere & Kovach, 2011) or transactional
distance (Fahy, 2008; Hrastinski & Jaldemark, 2012; Pigliapoco & Bogliolo, 2008) is
possible but requires facilitation because of the lack of physical interactions (Pigliapoco
& Bogliolo, 2008). This facilitation should take the form of encouraging learners to help
each other as discussed in the preceding subsections (Pigliapoco & Bogliolo, 2008;
Revere & Kovach, 2011) and can result in “a virtuous spiral” (Phelan, 2012, p. 34) where
learner helping and engagement motivates even more helping and engagement (Lear et
al., 2010).
Attrition and retention of learners in online programs will be explored in a
subsequent section, but the literature identifies that in the development of community the
likelihood of attrition decreases (Karge et al., 2011; Phelan, 2012; Smith, 2005). Learner
satisfaction will also be probed in an ensuing section, but the correlation between
satisfaction and retention is attested in relationship with sense of community; as the sense
of community increases among learners so does their satisfaction with the course,
program, and institution (Baran et al., 2011; Biasutti, 2011; DeLotell et al., 2010; Drouin,
2008; Phelan, 2012). Thus, it is important to ensure that learners are engaged in the
learning and feel part of the group, collaborating and sharing in a sense of togetherness.
In the next section, an additional key to learners’ success with eLearning will be
introduced; the need for learning to be applicable and germane to the learner’s life.
Immediate real world application of learning. The more realistic and relevant
the learning, the more likely learners have been found to grasp the concepts and apply them
in new ways (Ally, 2008; Chaves, 2009; Dorin, 2007; Lam & Bordia, 2008; Stern &
Kauer, 2010). Ally (2008) and Wang and Kania-Gosche (2011) encouraged the use of
case studies, real to life situations and examples, practical simulations, and meaningful
activities to extend and deepen learning. In this regard, other authors have suggested that
learning should focus on performance (Donavant, 2009; Lee et al., 2011), meta-learning
(Bransford et al., 2005), real-world problems (Chan, 2010; Gill, 2010), be related to one’s
own experience (Behar, 2011; DeLotell et al., 2010; Glassman & Kang, 2010; Knowles,
1980) and that of others (Ally, 2008), and most importantly, seek ways to apply it to
one’s daily life (Chickering & Gamson, 1987; Galbraith & Fouch, 2007; Ghost Bear,
2012; Knowles, 1980; Lee et al., 2011; O’Toole & Essex, 2012).
By becoming involved in learning and seeking for ways to apply learning
immediately into their daily experience, learning is more easily assimilated and employed
by learners (Chaves, 2009; Stern & Kauer, 2010). By encouraging the inclusion of the
learner’s beliefs, ideas, and experiences in learning that involves true to life scenarios, the
learner cultivates the ability to solve imminent, pertinent conundrums (Chan, 2010;
Chaves, 2009; Glassman & Kang, 2010; Wang & Kania-Gosche, 2011). Immediate, real
world application of learning allows learners to practice (DeLotell et al., 2010; Lee et al.,
2011) and rehearse ways of applying learning (Ghost Bear, 2012; Knowles, 1980; Potter
& Rockinson-Szapkiw, 2012) and the online environment provides infinite
opportunities for learners to plunge ever deeper into knowledge resources,
providing a near limitless means for them to grow their knowledge and find their
own way around the knowledge of the discipline, benefiting from its expression in
thousands of formats and contexts. (Anderson, 2008b, p. 49)
The locus of real-world learning is on the learner and the expansion of his or her knowledge,
understanding, and skills rather than on the abstract learning models of the institution or the
instructor (Ghost Bear, 2012). Donavant (2009), in a three-phase quasi-experimental
quantitative study, found that there is very little eLearning research “involving training
relative to the current occupation of the adult learner” (p. 227). In his study, however, he
found that professional development learning took place as well online as in a traditional
setting. Real world application of learning also tends to strengthen learner motivation
(Dorin, 2007; Potter & Rockinson-Szapkiw, 2012). Learner motivation is the final key
for eLearning success and will be explored in the next section.
Learner motivation. The last factor repeatedly identified as a key factor of
successful learning online in the literature is learner motivation. Motivation was defined
as the “perceived value and anticipated success of learning goals at the time learning is
initiated and mediated between context (control) and cognition (responsibility) during the
learning process” (Garrison, 1997, p. 26). In this subsection, the topic of motivation will
be broken into three discussions; the factors that contribute to it, the reason it is requisite
for learning success, and the ramifications from it.
According to expectancy theory, motivation is based on dual perceptions (Abela,
2009; Gorges & Kandler, 2011; Vroom, 1994). The first perception is that one is
motivated because the completion of a task conveys a desired reward, or is extrinsic
(Bhuasiri et al., 2012; Hoic-Bozic et al., 2009), and that the reward exceeds the effort that
is expended (Abrami et al., 2010). Gorges and Kandler (2011) in an experiment with
language learning in Germany, found that either of the components, expectation of
success or value from the learning experience, could provide enough incentive to make a
decision. In some cases, however, the reward may be intrinsic, based on affective
involvement, or enjoyment (Bhuasiri et al., 2012; Bye et al., 2007; Nummenmaa &
Nummenmaa, 2008), which is driven by interest (Abrami et al., 2010). In a study of
motivational components between traditional and nontraditional learners, Bye et al.
(2007) determined that both age and interest were predictors of intrinsic motivation.
They also found that nontraditional learners (those over age 27) had higher mean scores
in intrinsic motivation than did traditional learners. Bye et al. concluded that there is “a
greater need among nontraditional students to simply enjoy the process of mastering new
skills in the classroom” (p. 155). Interest has been shown to be an intrinsic motivator
consisting of persistence, escalated intellectual activity, and concentrated thought
(Nummenmaa & Nummenmaa, 2008) that is sustaining (Abrami et al., 2010). Alewine
(2010) identified that the largest barriers to adult motivation in the classroom are learners’
attitudes, self-perceptions, and dispositions. In line with this, the second perception is
that one is motivated because the learner feels that he or she is capable of performing the
necessary tasks, or has sufficient self-efficacy in his or her competency and abilities
(Abela, 2009; Abrami et al., 2010). As Garrison (1997) indicated, this element could be
because the learner feels more in control of his or her learning, or is taking added
responsibility (Abrami et al., 2010) through goal setting or goal accepting (Abrami et al.,
2010; Nummenmaa & Nummenmaa, 2008).
Contributors to motivation. Learner motivation can be augmented by instructor
expectations; through the support of other learners, instructors, and the institution; learner
involvement in learning; real world practical applications and reflection on learning; the
learner’s emotional state and perceptions; and the class environment. Each of these
elements will be discussed in more detail below. Chickering and Gamson (1987) and
Hannay et al. (2010) both identified that setting high standards for learners and expecting
them to meet those standards is an exemplary strategy for motivating learners to greater
achievement. Interspersed in the subsections above were statements of the benefits and
necessity for learners to interact and strive to build a sense of community within an
online classroom and program. Among the benefits listed as a consequence of successful
community building was motivation. Omar et al. (2011) concluded that higher learner
interaction resulted in higher levels of motivation. Kim and Frick (2011) stated that
motivation is enhanced through social presence, while Karge et al. (2011) emphasized
that community enhanced academic motivation. Styer (2007) found that motivation may
or may not be affected by social presence depending on the preferences and social needs
of the individual. Others, however, affirmed the contributory nature of collaboration and
community to learner motivation (Boling et al., 2011; Cheng et al., 2011; Paechter et al.,
2010; Pigliapoco & Bogliolo, 2008). Park and Choi (2009) encouraged the instructor to
praise and commend learners as a way to enhance their motivation, while Holmberg
(1989) identified multiple ways that distance learning can support learner motivation
through the creation of feelings of rapport, facilitation, active learning assignments, and
effective and effusive communication.
Consistent with Holmberg’s (1989) view of the need for learner involvement to
engender motivation, several researchers have identified the need for involvement in the
classroom (Kiener, 2010); suggesting that the more energy the learner devotes both
physically and psychologically to the class material the more engaged and motivated he
or she will be (Astin, 1984; Greener, 2010; Yen & Abdous, 2011). Others have
suggested that motivation increases as the learner is not only actively involved in
activities, but also in determining how the course objectives will be met (Wang & Kania-Gosche, 2011; Werth & Werth, 2011), or even in delivering them (Anderson,
2008a). In previous subsections, it was demonstrated that reflection upon and immediate
application of learning has dividends for eLearning and learners; among these dividends
is motivation (Abela, 2009; Baskas, 2011; Knowles, 1980; Park & Choi, 2009; Potter &
Rockinson-Szapkiw, 2012). Thus, motivation is a by-product of learner–learner
interactions, learner–content interactions, reflections, collaboration, a sense of
community, and immediate real world applications of learning—the previously discussed
keys to success in eLearning—and those are the keys to success because of their
influence on motivating the learner to persist and increase achievement. According to the
literature, the instructor–learner relationship is even more motivating.
According to Lam and Bordia (2008), “student-teacher contact in eLearning
classes is the most important factor in student motivation and involvement” (p. 135; see
also Chickering & Gamson, 1987; Dykman & Davis, 2008c; O’Bannon & McFadden,
2008; Travis & Rutherford, 2013). Without this interaction and encouragement from the
instructor learners tend to minimize their efforts, with subsequent degradation of learning
quality and motivation (Dykman & Davis, 2008c). The instructor is identified as the
party within the classroom with an obligation to embed motivational elements into his or
her teaching (DeLotell et al., 2010), to give relevant feedback and communication
(Paechter et al., 2010; Travis & Rutherford, 2013), “to enhance and maintain the learner’s
interest, including self-direction and self-motivation” (Abrami et al., 2010, p. 8), and
encourage learner engagement (Hrastinski & Jaldemark, 2012; Omar et al., 2011;
Paechter et al., 2010). These actions by the instructor are “recognized as a driving force
for persuading student’s motivation and the achievement of learning outcomes” (Ali &
Ahmad, 2011, p. 121).
Besides the factors that engender eLearning success, there are two additional
factors that have been reported as contributors to learner motivation. These factors
include learner perceptions and emotional state, and the environment of the class.
Learner perceptions can be either positive or negative about a course or program, and
include attitudes (Albert & Johnson, 2011; Alewine, 2010; Marschall & Davis, 2012;
Michinov et al., 2011), needs (Bhuasiri et al., 2012; Kellogg & Smith, 2009; Marschall &
Davis, 2012), and affect (Ho & Kuo, 2010; Joo et al., 2011; Kim & Frick, 2011;
Marschall & Davis, 2012; Nummenmaa & Nummenmaa, 2008). Negative perceptions
negatively influence motivation (Muilenburg & Berge, 2005) and, if these attitudes stem
from the learner’s perception of course difficulty, may also increase anxiety (Kim &
Frick, 2011). On the other hand, positive perceptions of the learner may contribute to
heightened interest (Alewine, 2010; Bhuasiri et al., 2012), increased participation
(Alewine, 2010), elevated motivation (Albert & Johnson, 2011; Joo et al., 2011;
Nummenmaa & Nummenmaa, 2008), and even “a state of complete absorption or
engagement in an activity [that] refers to optimal experience” (Ho & Kuo, 2010, p. 25)
known as flow. Thus, attitudes and perceptions greatly influence motivation (Alewine,
2010).
Finally, the learning environment has been shown to affect levels of motivation in
learners (Bradley, 2009; Falloon, 2011; Kim & Frick, 2011; Park & Choi, 2009).
Reushle (2006) identified that motivation is sustained in an environment that is “positive,
supportive, safe, tolerant, respectful, nurturing, and participatory” (p. 5; see also Bradley,
2009; Shinsky & Stevens, 2011). It appears that the more control is ceded to the learner
over what is learned online and how, the greater the feelings of temporal freedom and
motivation (Anderson, 2008b; Ho & Kuo, 2010; Kim & Frick, 2011), while the more
control is exerted by the instructor, the poorer learner performance and attitudes become and
the more motivation is dwarfed (Park & Choi, 2009; Rovai et al., 2007). Other environmental
factors that have been noted to affect motivation were the use of synchronous eLearning
technologies (Er et al., 2009; Falloon, 2011), which was shown to enhance motivation;
and cognitive overload (Ally, 2008; Kalyuga, 2011; Kim & Frick, 2011; Wang et al.,
2011), which has been shown to diminish motivation.
Motivation is requisite for learning success. Bhuasiri et al. (2012) sought to
identify the essential factors for accepting eLearning in developing countries utilizing the
Delphi method with instructors and technology experts. The authors found six
dimensions and 20 factors that were critical to the success of eLearning acceptance. In
developing countries, Bhuasiri et al. found that the extrinsic motivation of external
rewards was more important for adoption than intrinsic motivation.
“Learning occurs as a result of motivation, opportunities, an active process,
interaction with others, and the ability to transfer learning to a real-world situation.
Whether or not technology makes a difference depends on how it is used for motivation”
(Oblinger & Hawkins, 2006, p. 14). Regardless of other factors in a classroom, without a
motivated student, no learning can occur (Ally, 2008; Conceicao, 2002; Muilenburg &
Berge, 2005). Scholars in cognitive load have determined that motivation, intellectual
effort, and learning are closely associated because the more motivated the learner, the
greater will be the investment in intellectual effort, which precipitates scholarship
(Fulmer & Frijters, 2009; Orvis, Horn, & Belanich, 2008; Rey & Buchwald, 2011;
Schnotz, Fries, & Horz, 2009). Both extrinsic and intrinsic motivation have been found to
have an inarguable and affirmative effect on learning and other positive learner
characteristics (Bhuasiri et al., 2012; Law, Lee, & Yu, 2010). Attempting to determine
what factors are more motivating in an eLearning environment, Law et al. (2010) surveyed 365
computer programming students regarding items that could have a positive effect on
motivation. Through factor analysis, these items were categorized into eight factors;
individual attitude, challenging goals, clear direction, reward and recognition,
punishment, social pressure, effect of the management system, and efficacy. All of the
factor loadings were found to be adequate except for the punishment factor. Through t-test analysis, all factors were found to provide a positive learning effect. Through a
stepwise regression challenging goals (β = 0.429), social pressure (β = 0.262), and
individual attitude (β = 0.122) were the factors that significantly contributed to efficacy,
accounting for over 50% of the variance. On the other hand, the most motivating factors
on learning were individual attitude, clear direction, and reward and recognition.
Motivation is more important in eLearning because of the lack of physical presence
(Conceicao, 2002), and, because of the self-directed nature of many older adults,
motivation is essential for their learning (Dorin, 2007).
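The kind of standardized regression coefficients (β) and variance-explained figures reported by Law et al. (2010) and similar studies can be illustrated with a minimal sketch. The following Python code uses simulated survey scores and hypothetical column names; it is not a reproduction of Law et al.'s stepwise analysis, but it shows how standardized betas and an R² value are obtained from an ordinary least squares regression:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Simulated survey scores for 365 respondents; the column names are
# illustrative and the data are not Law et al.'s.
rng = np.random.default_rng(1)
n = 365
df = pd.DataFrame({
    "challenging_goals": rng.normal(size=n),
    "social_pressure": rng.normal(size=n),
    "individual_attitude": rng.normal(size=n),
})
df["efficacy"] = (0.4 * df["challenging_goals"]
                  + 0.3 * df["social_pressure"]
                  + 0.1 * df["individual_attitude"]
                  + rng.normal(scale=0.8, size=n))

# Standardize every variable so the fitted coefficients are standardized betas.
z = (df - df.mean()) / df.std(ddof=0)
X = sm.add_constant(z[["challenging_goals", "social_pressure", "individual_attitude"]])
fit = sm.OLS(z["efficacy"], X).fit()

print(fit.params.round(3))     # standardized beta for each predictor
print(round(fit.rsquared, 3))  # proportion of variance explained
```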
An enormous amount of literature attests to the necessity of self-motivation
within the learner for him or her to be successful in the online environment. Ally (2008)
posited that designers of eLearning materials and environments should incorporate both
intrinsic and extrinsic motivation strategies because not all learners are equally driven by
each. Others noted that online environments tend to require more self-motivation from
the learner to be successful (Lam & Bordia, 2008; O’Bannon & McFadden, 2008; Omar
et al., 2011; Paechter et al., 2010) or observed that this is simply how eLearning “is”
(Bradley, 2009; DeLotell et al., 2010; Dykman & Davis 2008a; Ekmekci, 2013; Ferratt &
Hall, 2009; Kim & Frick, 2011; Shih et al., 2008). Finn (2011) encouraged the use of
useful and relevant experiences in the online classroom to enhance and enrich learners’
motivation and environment. Without self-motivation the attempts of the instructor to
foster participation, motivation, and learning may be unsuccessful, but with self-motivation it is likely that the learner will have the necessary interest to participate, ask
questions, and invest the necessary mental capital to succeed and learn (Goddu, 2012;
Omar et al., 2011). Consequently, Morrow and Ackermann (2012) determined that
learners with self-motivation and goals were also more likely to persist in their education
programs.
Ramifications of motivation. A lack of motivation in the learner has a number of
consequences, just as heightened motivation in the learner has consequences. A shortage
of learner motivation has been tied to increased attrition and dropout (Kim & Frick, 2011;
Morrow & Ackermann, 2012; Rovai et al., 2007), and decreased motivation has been
correlated with compulsory adult instruction or training (Donavant, 2009) and loss of
interest (Er et al., 2009). Contrarily, plentiful motivation corresponds to satisfaction with
courses and programs (Belair, 2012), deep learning (DeLotell et al., 2010), increased
energy, efficiency, and intensity while learning (Alewine, 2010; Jeffrey, 2009), and
learner persistence and retention (Morrow & Ackermann, 2012).
Kim and Frick (2011) studied 368 adult learners in both corporate and higher
education settings, and attempted to determine the factors for three types of motivation;
motivation to begin, motivation during the course, and positive change in motivation.
They identified that the primary factor for motivation to begin was relevance, although
age (older learners were more motivated to begin) and technical competence (lessened
cognitive load) were other positive factors. They further determined that two factors
were associated with increasing motivation during the course; initial motivation to start
and the belief that “e-learning is right for me” (p. 16). Finally, only one factor predicted
positive changes in motivation, which was the motivation during the course, and a
positive change in motivation was positively correlated with course satisfaction.
According to the literature, online success is far more likely when the factors for eLearning
success are present. Learning in higher education requires a relationship with an
instructor (Wang & Kania-Gosche, 2011); one where the instructor expresses interest and
respect (Chen & Chih, 2012), concern and care (Bhuasiri et al., 2012; Driscoll et al.,
2012), and enthusiasm (DeLotell et al., 2010) for the learner. This relationship must
engender cognitive presence through timely feedback (Amrein-Beardsley & Haladyna,
2012; Ekmekci, 2013; Michinov et al., 2011), enhanced communication (Schultz, 2012;
Travis & Rutherford, 2013), and fair evaluation (Blaschke, 2012; Pelz, 2010). This
relationship online differs from that in traditional settings in that the instructor is expected to be
a facilitator of learning (Abrami et al., 2010; Boling et al., 2011; Hussain, 2013; Lee et
al., 2011). Contemporary educational literature accentuates the need for instructors to
trigger, encourage, and maintain social presence in the classroom (Crawford-Ferre &
Wiest, 2012; DeLotell et al., 2010; Ke, 2010). The instructor–learner relationship and the
interactions associated with it have consistently been found to be essential to eLearning
success (Baran et al., 2011; Belair, 2012; Ferguson & DeFelice, 2010; Joo et al., 2011;
Martinez-Caro, 2011; Travis & Rutherford, 2012; Yen & Abdous, 2011). The learner
should also be encouraged to form relationships with other learners. This has the benefit
of meeting his or her social needs (Boling et al., 2011; Henning, 2012; Omar et al., 2011),
and is extolled by a large number of supporters as essential to eLearning success because
of the constructivist approach (Crawford-Ferre & Wiest, 2012; Deulen, 2013; Hoic-Bozic
et al., 2009; Oncu & Cakir, 2011). Constructivism, as the most widely acknowledged theory of
learning, emphasizes interactivity, interaction, and collaboration for learning
to occur (Hrastinski & Jaldemark, 2012; Ismail et al., 2011; Lee et al., 2011; Martinez-Caro, 2011; Phelan, 2012; Revere & Kovach, 2011). A large amount of literature
disputes the essentiality of the learner–learner relationship, but often finds that it can
still contribute to the learner’s experience (Barros, 2013; Gunawardena et al., 2010; Ke &
Carr-Chellman, 2006; Kellogg & Smith, 2009; Vogel-Wallcutt et al., 2010). Not all online
learning classrooms, programs, and institutions promote these six success factors. A
learner’s interaction with content and reflection upon that content were explored as
success factors in eLearning. This relationship is essential to learning (Abdous & Yen,
2010; Abrami et al., 2010; Ali & Ahmad, 2011; Lear et al., 2010) and triggers cognitive
presence (Hoskins, 2012; Joo et al., 2011; Oncu & Cakir, 2011). Reflection was found to
be important to tie a learner’s previous experience and knowledge into his or her new
learning (Baran et al., 2011; Reushle & Mitchell, 2009; Sinclair, 2009). In conjunction
with the learner–learner relationship, the benefits of collaboration and development of a
community of learning were also explored in relationship to the online classroom.
Though there is contention regarding the emphasis that should be placed on these factors,
there is general consensus that they can be beneficial to learning (Boling et al., 2011;
Bradley, 2009; Deulen, 2013; Milheim, 2011). The fifth factor contributing to eLearning
success is the need for exercises and learning to be relevant and immediately applicable,
which yields immediate benefits (Chan, 2010; Ghost Bear, 2012; Glassman & Kang, 2010; Potter &
Rockinson-Szapkiw, 2012). Finally, both encouraging motivation and the learner's possession of
existing motivation were shown to be necessary for eLearning success (Alewine, 2010; Kim
& Frick, 2011; Morrow & Ackermann, 2012). Online learning retention remains much
lower than in traditional classrooms (Allen & Seaman, 2011, 2013; Brown, 2012; Lee &
Choi, 2011; Wilson & Allen, 2011). In the next section, the factors that contribute to
failure in eLearning for the learner will be explored.
eLearning and Dropout
Much of the literature reporting the high levels of online dropout has remained
anecdotal (Al-Fahad, 2010; Joo et al., 2011; Levy, 2007; Lykourentzou, Giannoukos,
Nikolopoulos et al., 2009; Park & Choi, 2009; Patterson & McFadden, 2009; Sulčič &
Lesjak, 2009; Willging & Johnson, 2009), but the problem has been comprehensively and consistently
reported over the past decade. Levy (2007) noted that he found rates of online dropout
between 25% and 60%. Lykourentzou, Giannoukos, Nikolopoulos et al. (2009) found
that online rates were approximately 25-40%, as compared to on-campus rates of 10-20%. Park and Choi (2009) noted that dropout rates were higher for online learners,
citing one study that showed a 70% dropout rate for a corporate training program, but
otherwise cited no specific numbers. The 70% dropout figure was also cited by Sulčič
and Lesjak (2009) for company online training. Willging and Johnson (2009) determined
“that online programs may be less desirable for certain students than the more traditional
face-to-face type of instruction” (p. 115) because of lower completion rates for learners in
online classes. They cited a study where the completion rates were 90.3% for a face-to-face course and 72.2% for a comparable online course. Patterson and McFadden (2009)
also determined that dropout occurs more frequently online than on-campus, with rates
up to 50% in eLearning; though the authors determined that other factors were probably
operating other than lack of quality (see also Nichols & Levy, 2009; Nistor & Neubauer,
2010). Al-Fahad (2010) noted that high dropout is cause for skepticism regarding
eLearning, citing a dropout comparison of 32% online versus 4% in a face-to-face
classroom. In a Korean study, Joo et al. (2011) found that on-campus learners dropped
out at a 1-3% rate, while online learners did so at an 18% rate. Antonis et al. (2011)
found in their study of 22 sections of an online computer science course that 382 learners
(48.6%) dropped the course and did not complete it. Other researchers have
mentioned the higher than traditional dropout rate of online learners, but cited no specific
numbers (Cacciamani et al., 2012; Ekmekci, 2013; Ismail et al., 2010; Kim & Frick,
2011; Picciano, Seaman, & Allen, 2010; So & Bonk, 2010).
While these numbers seem alarming, the problem is that quite often they measure
different things. Many of the higher numbers reflect individual classes or specific
programs, but do not reflect whether the learner returned to the program, class, or even
another program or school at a later time. The problem of attrition is ubiquitous in
education since most institutions of higher education only have a 50-60% 4-year retention
rate overall (Nandeshwar et al., 2011) and theories have been proposed to seek ways to
increase retention. Further, because the lost revenue and incurred costs from even a
single dropout cost the institution tens of thousands of dollars (Nandeshwar et al., 2011),
higher education is interested in determining ways to minimize this scourge. The call of
researchers to address the problem of attrition is discussed next.
While online dropout has been called “a difficult and perplexing phenomenon”
(Levy, 2007, p. 187), the nearly unanimous clarion call to determine its causes and
factors also resonates within the literature (Cacciamani et al., 2012; Oncu & Cakir, 2011;
Travis & Rutherford, 2013; Tuquero, 2011). It is expected that by determining the causes
of attrition, models may be constructed to increase eLearning completion rates (Levy,
2007), while better using institutional resources and curtailing waste (Lee & Choi, 2011).
Research has shown that a majority (75%) of learners who drop out expect to return to
school in the future, but rarely return to the same school (Smith, 2005), while about two-thirds of dropouts leave for non-academic reasons (Morrow & Ackermann, 2012). Levy
(2007) noted that while attrition has been a regular concern for institutions of higher
learning, there has been meager consideration of the disparity in online attrition rates.
Park and Choi (2009) concurred with Levy, noting “only a dozen research studies have
empirically explored this issue and no consensus have been reached for which factors
have definite influences on the decision” (p. 209) to leave. Lee and Choi (2011)
determined that the generalizability of most studies is limited because they focus only on
a single program or course. Lee (2010) identified that there are a number of persistence
models but that none had been extensively validated online so that the purported factors
could be validated, or new factors identified to fill gaps in the models. Park and Choi
(2009) found that the most robust and comprehensive persistence model for online
learning was Tinto’s but that Bean and Metzner’s model explained more retention
variance.
Lykourentzou, Giannoukos, Nikolopoulos et al. (2009) proposed a prediction
model for dropout and retention based on machine learning techniques. They noted that
prediction models tended to be based on two types of data; time-invariant characteristics
and time-varying learner attributes. The researchers identified that machine learning
techniques “have been successfully applied to solve various classification problems” (p.
953), and incorporated three in their prediction model; probabilistic ensemble simplified
fuzzy adaptive resonance theory map, support vector machines, and feed-forward neural
networks. These techniques are used to train software to learn and generally involve a
training phase, in which data is analyzed to produce a predictive model, which is then
used during a testing phase to corroborate the model. For this experiment, six predictive
models were utilized; the three independent machine learning techniques and three
combinations of the techniques. The predictive models were tested over eight points.
The first point was based on time-invariant demographical data, and the other seven
points were applied at equal time points during the conducting of two courses. The
researchers found that demographic data is less accurate as a predictor (41-63%) than
time-variant data collected during the courses (70-88% early in the
courses and 95-100% later in the courses).
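The general training-and-testing workflow that Lykourentzou, Giannoukos, Nikolopoulos et al. describe can be sketched as follows. This Python example is not a reproduction of the authors' models (which included a fuzzy ARTMAP ensemble); it substitutes scikit-learn's support vector machine and feed-forward neural network combined in a simple soft-voting ensemble, and it uses simulated learner features rather than real course data:

```python
import numpy as np
from sklearn.ensemble import VotingClassifier
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Simulated learner records: a mix of time-invariant demographics and
# time-varying activity features, with 1 indicating dropout (illustrative only).
rng = np.random.default_rng(42)
X = rng.normal(size=(500, 6))
y = (X[:, 3] + X[:, 4] + rng.normal(scale=0.5, size=500) > 0).astype(int)

# Training phase: fit the classifiers on a portion of the data.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
svm = make_pipeline(StandardScaler(), SVC(probability=True))
nnet = make_pipeline(StandardScaler(), MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000))
ensemble = VotingClassifier([("svm", svm), ("nnet", nnet)], voting="soft")
ensemble.fit(X_train, y_train)

# Testing phase: corroborate the model on held-out learners.
print(f"Dropout-prediction accuracy: {ensemble.score(X_test, y_test):.2f}")
```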
In a meta-analysis of attrition studies, Lee and Choi (2011) determined that there
were three main categories of dropout, consisting of nine groups of factors. In the
succeeding subsections, two of these main categories and seven of the groups of factors
of learner dropout will be scrutinized because they accord with the variables that will be
tested in the study. In the remainder of this section, all of the categories and the groups
of factors will be explored as an overview of the suspected and known elements
contributing to dropout in online learning. Using the constant comparative method, Lee
and Choi randomly selected factors to determine their similarity or dissimilarity with one
another, which produced 44 nonoverlapping and unique factors that researchers have
identified as contributing to dropout. From these 44 factors were extracted nine groups of
similar factors, which were further distilled into three categories. The three categories
were labeled (a) Student factors, (b) Course or Program factors, and (c) Environmental
factors (Lee & Choi, 2011) and will be expanded on in the paragraphs to follow.
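For reference, Lee and Choi's (2011) three categories and nine factor groups, as described in the paragraphs of this section, can be summarized compactly as a simple data structure:

```python
# Lee and Choi's (2011) dropout-factor taxonomy as summarized in this section:
# three categories containing nine groups of factors.
DROPOUT_FACTOR_TAXONOMY = {
    "Student factors": [
        "academic background", "relevant experiences", "skills", "psychological attributes",
    ],
    "Course or Program factors": [
        "course design", "institutional support", "interactions",
    ],
    "Environmental factors": [
        "work factors", "supportive environment",
    ],
}

for category, groups in DROPOUT_FACTOR_TAXONOMY.items():
    print(f"{category}: {', '.join(groups)}")
```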
The student factors category subsumed the groups of (a) academic background,
(b) relevant experiences, (c) skills, and (d) psychological attributes (Lee & Choi, 2011).
This category contains demographic factors like age, gender, ethnicity, marital status
(Lee & Choi, 2011; Lykourentzou, Giannoukos, Nikolopoulos et al., 2009; Park & Choi,
2009); as well as academic factors such as GPA, previous academic performance, SAT
and other standard tests (Levy, 2007; Lykourentzou, Giannoukos, Nikolopoulos et al.,
2009). It looks at a learner’s experience in terms of educational level, the number of
previous eLearning classes, relevant experience within the field of study, and inclination
to take online classes (Beqiri et al., 2010; Levy, 2007; Pigliapoco & Bogliolo, 2008).
Skills that appear to affect persistence include time management, an ability to balance
different life demands, flexibility, and technology or computer experience and confidence
(Henning, 2012; Kim & Frick, 2011; Michinov et al., 2011; Park & Choi, 2009; Willging
& Johnson, 2009). Finally, psychological attributes that affect learner persistence include
a learner’s locus of control, motivation level and type, commitment to goals, self-efficacy, and satisfaction and love of learning (Bhuasiri et al., 2012; Kellogg & Smith,
2009; Levy, 2007; Paechter et al., 2010; Rovai et al., 2007).
The course or program factors category subsumed the groups of (a) course design,
(b) institutional support, and (c) interactions (Lee & Choi, 2011). Course design included
factors like team building activities and program quality (Deil-Amen, 2011; Kuleshov,
2008; Pigliapoco & Bogliolo, 2008; Travis & Rutherford 2013; Tuquero, 2011), while
institutional support included administrative support, program orientation, tutorial
assistance, and the support infrastructure afforded learners (Dykman & Davis, 2008c;
Ferratt & Hall, 2009; Hoskins, 2012; Kellogg & Smith, 2009; Moisey & Hughes, 2008).
The advantages of interactions have been previously discussed, but the lack of inter-student and faculty interactions and relationships can contribute to attrition, along with a
lack of learner participation (Deil-Amen, 2011; DeLotell et al., 2010; Ekmekci, 2013;
Hoskins, 2012; Rovai et al., 2007; Sulčič & Lesjak, 2009).
The environmental factors category consisted of two groups; work factors and
supportive environment (Lee & Choi, 2011). While these topics will not be further
discussed following this section, it is important for understanding the concept of
retention to recognize that there are factors outside of the school environment, at
times completely outside of the control of the learner, that can affect learner persistence.
Levy (2007) identified that older learners who are employed and work more hours are
less likely to complete. He also identified a lack of time as a factor in dropout. Park and
Choi (2009) noted that adult learners often attribute their dropout to changes in their job
or increased workload that occurs before or during a course. They also observed that the
presence or absence of peer or family support can make a huge difference in whether a
learner persists. For this reason, Park and Choi (2009) determined that eLearners are
more influenced by environmental factors than are traditional learners. Other external
factors of attrition were recognized by Rovai et al. (2007), who indicated that “financial
cost, disruption to family life, and a lack of employer support” (p. 414) are contributors.
Several authors suggested that the availability of financial aid influences dropout
(Lykourentzou, Giannoukos, Nikolopoulos et al., 2009; Rovai et al., 2007; Willging &
Johnson, 2009). Thus, the environmental factors of work commitment might include
changes in workload or pressures at work, changes in financial or familial status, while
emotional support may include life circumstances or challenges, and support from
friends, work, and family.
Learner Factors of Dropout
Prior to this more specific exploration of the previously mentioned factors of
dropout and supposed eLearner characteristics, it should be noted that the literature consistently identifies age as
a factor in attrition. Lassibille (2011) noted that the opportunity costs of older learners
are higher, while the time that they can enjoy the benefits of their education is shorter,
therefore, based on cost-benefit theory, older learners would be more likely to drop out.
He recognized that findings in other research to support this theory are inconsistent, but
Lassibille and Navarro Gómez (2008) found that each year of age in a technology
program resulted in a 17% increase in dropout probability for online learners as opposed
to only 8% for traditional learners. Martinez-Caro (2011) enumerated several studies that
showed that age had no significant relationship with learning outcomes in eLearning. In
a study of 89,473 online learners, Newell (2007) found that age, gender, ethnicity, and
financial aid eligibility were predictors of retention; however, while many earlier studies in
eLearning found age to be a significant predictor, none in the past 5 years did so (Dibiase
& Kidwai, 2010; Lee & Choi, 2011; Lykourentzou, Giannoukos, Nikolopoulos et al.,
2009; Park & Choi, 2009).
Lassibille (2011) posited that, since females earn their degrees faster and are more
likely to graduate, gender should be a factor of retention and attrition. He identified
that the findings of various studies are inconsistent and that most often no gender effect is
detected between learners. Martinez-Caro (2011) also reported conflicts in study
findings, some arguing “females encounter more difficulties than males in e-learning,”
and others determining “females had a more positive attitude toward e-learning courses
than did males” (p. 573). Gonzalez-Gomez et al. (2012) tested the effect of gender on
eLearning satisfaction and concluded that instructors, in planning their lessons, should
consider the differences between genders. They especially noted, among their sample of
1,185 learners, female learners were significantly more satisfied with online courses, and
were more influenced by (a) teaching methods and planning, (b) active participation, and
(c) teacher participation. In a quantitative survey study with 1,325 participants, Bradford
and Wyatt (2010) sought to identify whether demographic differences correlated with
learner satisfaction. They found that neither ethnicity nor gender was a significant
indicator of satisfaction, and “error values account for almost all of the effect. [Which]
implies that something else is at work: the demographics have little to do with what
influences” satisfaction (Bradford & Wyatt, 2010, p. 113).
Several researchers have determined that academic factors of the learner may
enhance persistence or promote dropout. Many researchers, for example, have noted the
strong positive correlation between a learner’s GPA and their persistence in eLearning
(Driscoll et al., 2012; Dziuban & Moskal, 2011; Jackson et al., 2010; Lapsley, Kulik,
Moody, & Arbaugh, 2008; Lassibille, 2011). Others have demonstrated the predictive
capacity of a learner’s previous academic performance on retention and future academic
performance (Dziuban & Moskal, 2011; Lassibille, 2011; Lee & Choi, 2011;
Lykourentzou, Giannoukos, Nikolopoulos et al., 2009). Recent research supports that
academically weaker learners are more likely to choose online courses, while academically
stronger learners are more likely to choose face-to-face learning (Driscoll et al.,
2012; Lee & Choi, 2011).
Other learner characteristics that have been shown to affect persistence in
eLearning include technical efficacy (Bennett et al., 2012; Kiliç-Çakmak, 2010; London
& Hall, 2011; Travis & Rutherford, 2013), self-regulation (Cox, 2013; Driscoll et al.,
2012; Rodrigues, 2012; Shea & Bidjerano, 2010; Wilson & Allen, 2011), and relevant
experience (Barber, 2012; Hodge et al., 2011; Lee & Choi, 2011). Kiliç-Çakmak (2010)
found that successful eLearners use metacognitive learning strategies that enhance their
“information literacy self-efficacy perception and self-efficacy belief” (p. 201). In
another study, using a mixed methods design to explore predictors of learner satisfaction
and transfer of learning, Gunawardena et al. (2010) surveyed 19 learners in a
multinational corporate online educational program and found that the largest predictor of
satisfaction was self-efficacy. Though they acknowledged the limitation of such a small
sample, they also noted the very high R² (0.884) of this observation. Rakap (2010)
sought to determine the impact of computer skills on eLearning success, and in a survey
of 46 adult learners determined that learners with more advanced computer skills did
significantly better in an online course than did those with some or no computer skills (r
= .462, p = .01). In a mixed methods, cross-sectional study of 84 doctoral students,
Bolliger and Halupa (2012) found that anxiety with the use of the Internet or computers
had a significant negative correlation (r = -.50, p < .001) with learner satisfaction.
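Correlation coefficients of the kind reported by Rakap (2010) and Bolliger and Halupa (2012) can be computed in a few lines. The following sketch uses simulated data and illustrative variable names, not the studies' data:

```python
import numpy as np
from scipy.stats import pearsonr

# Simulated scores for 46 adult learners; the variable names are illustrative.
rng = np.random.default_rng(7)
computer_skills = rng.normal(size=46)
course_performance = 0.5 * computer_skills + rng.normal(size=46)

r, p = pearsonr(computer_skills, course_performance)
print(f"r = {r:.3f}, p = {p:.3f}")  # a positive r with a small p mirrors the pattern Rakap reported
```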
Some researchers have shown that persistence increases as the level of education
(Beqiri et al., 2010; Levy, 2007; Park & Choi, 2009) and the number of previous online
courses (Beqiri et al., 2010; Lee & Choi, 2011; Martinez-Caro, 2011; Pigliapoco &
Bogliolo, 2008) increase. Lykourentzou, Giannoukos, Nikolopoulos et al. (2009) also
reported that learners with more relevant experience in a field are more likely to persist in online
courses in that field than are novices (see also Beqiri et al., 2010; Hodge et al., 2011; Lee
& Choi, 2011). Beqiri et al. (2010) investigated the factors affecting online learner
satisfaction utilizing a quantitative cross-sectional survey with 240 participants. The
authors conducted a number of one-tailed t-tests and multiple regression to determine
predictors of learner satisfaction. According to their findings, learners were more
satisfied with their courses if they were older (graduate students were more satisfied than
undergraduates), male (gender had a significant effect), and liked online courses; these
three predictors explained 50.72% of the variance. Ke and Kwak (2013) were also
interested in how age, education level, and ethnicity affected perceived instructor support
and peer interactions, which were expected to affect learner satisfaction with online
learning and the specific course. The researchers surveyed 392 learners from a major
university in the U.S. and utilized structural equation modeling to determine which
predictors were significant. Satisfaction with eLearning was negatively correlated with
education level (β = -0.27); the more educated, the less learners liked learning online,
but was positively correlated to both perceived instructor support (β = 0.24) and
perceived peer interaction (β = 0.14). Satisfaction with the course, however, was only
significantly correlated with perceived instructor support (β = 0.43).
In addition to relevant experience, the literature identifies that certain skills
increase the chances that learners will persist in online courses (Lo, 2010). These
relevant skills include time management (Dibiase & Kidwai, 2010; Kim & Frick, 2011;
Michinov et al., 2011), an ability to balance life’s demands (Alshare et al., 2011; Deil-Amen, 2011; Lee & Choi, 2011; Michinov et al., 2011), computer training or experience
(Abdous & Yen, 2010; Al-Asfour, 2012; Al-Fahad, 2010; Ho & Kuo, 2010;
Lykourentzou, Giannoukos, Nikolopoulos et al., 2009; McGlone, 2011; Rakap, 2010),
and information literacy (Alshare et al., 2011; Bhuasiri et al., 2012; Bolliger & Halupa,
2012; Kiliç-Çakmak, 2010; Lee & Choi, 2011; Lee et al., 2011; Nagel et al., 2011).
Russ et al. (2010) found that instructors believed that success in eLearning required self-motivation and the ability for the learner to work independently. These findings were the
result of a quantitative survey study with 58 career and technical instructors from the
southeastern U.S. and were based on mean analysis of the appropriate survey questions.
Finally, studies show that psychological characteristics of learners have an effect
on whether they persist or drop out of eLearning. The main attitudes that affect retention
are locus of control (Levy, 2007), intrinsic and extrinsic motivation (Bhuasiri et al., 2012;
Hoskins, 2012; Kim & Frick, 2011; Rovai et al., 2007; Zepke & Leach, 2010), a belief in
one’s ability to learn and succeed (Hoskins, 2012; Kim & Frick, 2011; Shea & Bidjerano,
2010), and learner satisfaction (Beqiri et al., 2010; Harper & Ross, 2011; Lykourentzou,
Giannoukos, Nikolopoulos et al., 2009; Paechter et al., 2010). In a quantitative cross-sectional study, Lee, Choi, and Kim (2013) sought to determine factors of dropout and
persistence, and found by conducting a one-way MANOVA that students who persisted
had two significantly different factors from students who dropped out; academic locus of
control (β = 0.59) and self-regulation skills (β = 0.74). The authors determined that a
student’s belief that he or she controls learning success and possession of metacognitive self-regulatory skills were the most important influences on whether a learner dropped out or
not. Zepke and Leach (2010) conducted an extensive review of the literature on
completion and retention of students and found that motivation, which they characterized
as the learner’s perceptions of competency and autonomy, was highly correlated with
learner retention. Gupta and Bostrom (2009) found learner aptitudes that correlate with
persistence and positive learning outcomes can be characterized as either cognitive
abilities or motivation. Cognitive abilities were recognized as the learner’s learning style,
self-efficacy, and orientation to learning, while motivation subsumed persistence,
intensity, and direction of learning behaviors while learning (Gupta & Bostrom, 2009).
Alewine (2010) found that maintaining motivation requires emotional engagement, which
fosters perseverance and completion.
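A one-way MANOVA of the type reported by Lee, Choi, and Kim (2013) can be sketched as follows; the data and column names below are simulated and illustrative rather than the study's own:

```python
import numpy as np
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

# Simulated data: two dependent measures compared across persisters and
# dropouts; the column names are illustrative, not the study's instruments.
rng = np.random.default_rng(3)
n = 200
group = rng.choice(["persisted", "dropped"], size=n)
shift = (group == "persisted").astype(float)
df = pd.DataFrame({
    "group": group,
    "locus_of_control": rng.normal(size=n) + 0.6 * shift,
    "self_regulation": rng.normal(size=n) + 0.7 * shift,
})

# One-way MANOVA: do the dependent variables jointly differ between groups?
manova = MANOVA.from_formula("locus_of_control + self_regulation ~ group", data=df)
print(manova.mv_test())
```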
Online Course or Program Factors of Dropout
While there is little that an institution of higher learning can do about a learner’s
age, gender, GPA, or psychological attributes, it does have complete control over the
courses and the programs it offers. Institutions can instill an expectation of learner
participation and community in classes, of excellence in the quality of their service and
programs, and in course design. These three factors have been established as important in the
retention of online learners (Chen & Chih, 2012; DeLotell et al., 2010; Hoskins, 2012;
Johnson, Wisniewski, Kuhlemeyer, Isaacs, & Krzykowski, 2012; Lee & Choi, 2011;
Phelan, 2012; Travis & Rutherford, 2013; Tuquero, 2011), and will be explored more
fully in this section.
As learners participate in classes and interact with other learners and instructors
they are more likely to persist with their online programs (Belair, 2012; Joo et al., 2011;
Martinez-Caro, 2011; Phelan, 2012; Travis & Rutherford, 2013). The literature corroborates that the more a learner participates, the higher his or her probability of succeeding and completing (Boling et al., 2011; Cacciamani et al., 2012; Driscoll et
al., 2012; Kim & Frick, 2011). This participation has been discussed in several of the
eLearning success factors sections above. Learner participation in class improves learner
interest (Abrami et al., 2010; Bennett et al., 2012; Hoic-Bozic et al., 2009; Phelan, 2012)
and promotes team-building activities (Lee & Choi, 2011), which have been shown to
stimulate a positive climate in the classroom (Blaschke, 2012; Bradley, 2009; Cercone,
2008; Kim & Frick, 2011; Pelz, 2010), stave off feelings of isolation (Hughes & Berry,
2011; Jackson et al., 2010; Lee et al., 2011), and enhance both learning support
(Blaschke, 2012; Cheng et al., 2011; Pigliapoco & Bogliolo, 2008), and group
commitment (Joo et al., 2011; Pelz, 2010; Pigliapoco & Bogliolo, 2008). This
participation often requires interaction between the learner and other learners, which
fosters strong human connections (Boling et al., 2011; Ekmekci, 2013; Omar et al., 2011;
Wang et al., 2011), active citizenship (Hoskins, 2012), and a sense of community (Ke &
Hoadley, 2009; Oncu & Cakir, 2011; Phelan, 2012). All of these learner–learner factors
positively affect retention of online learners. Participation within a classroom setting also
entails interaction between the learner and the instructor, which improves information
support (Bhuasiri et al., 2012; Ferratt & Hall, 2009; Park & Choi, 2009; Pigliapoco &
Bogliolo, 2008) and instructional quality (DeLotell et al., 2010; Lassibille, 2011; Strang,
2012); both factors with a high correlation of retention in eLearning (DeLotell et al.,
2010; Falloon, 2011; Morrow & Ackermann, 2012; Oncu & Cakir, 2011). Researchers have proposed that the factors contributing to learner satisfaction differ between modalities (online, on-campus, or blended learning). In a longitudinal survey design evaluating the course evaluations of 1,124,979 participants, Dzubian and Moskal (2011) found that even though other research has determined that differing contexts may cause different factoring in the same tool, in their study "the dimensionality in each mode is one, the underlying factors are identical, and they do not differentiate the instructional idiosyncrasies found in the three modalities" (p. 239). This indicates that, when it comes to satisfaction, learners do not consider course mode important.
The online institution can provide quality support that engenders persistence by providing thoughtful and mission-specific administrative support, ensuring decent faculty compensation and training, and providing adequate student support services.
Administrative support that has been shown to minimize learner attrition includes admissions support (Lee et al., 2011; Park & Choi, 2009), registration assistance (Deil-Amen, 2011; Lee et al., 2011), scholarship programs (Lee et al., 2011; Moisey & Hughes, 2008), research assistance (Lee et al., 2011; Moisey & Hughes, 2008), trained staff who can support student life issues (Lee et al., 2011; Lee & Choi, 2011; Tuquero, 2011), and technical service engineers who provide technical support, ensure learner access, and provide initial orientations for new and inexperienced online learners (Crawford-Ferre & Wiest, 2012; Lee, 2010; Lee & Choi, 2011; Schultz, 2012). Also important in
administrative services is institutional commitment and emphasis on learner success
(Ally, 2008; Cornelius et al., 2011; Finn, 2011) and setting expectations for high levels of
learner achievement (Amrein-Beardsley & Haladyna, 2012; Ekmekci, 2013; O’Bannon &
McFadden, 2008). High quality instructors who provide excellent instructional quality
and content along with prompt and critical feedback increase retention (Archambault et
al., 2010; Bradford & Wyatt, 2010; Er et al., 2009; Hoic-Bozic et al., 2009), and need to
be fairly compensated (Lee & Choi, 2011) and properly trained in the use of the learning
environment (Crawford-Ferre & Wiest, 2012; Hussain, 2013). Online learning systems require staff to support learners. When this support staff models a learning environment that learners perceive to be easy and intuitive to use (Bhuasiri et al., 2012; Dibiase & Kidwai, 2010; Joo et al., 2011; Lee, 2010; Martinez-Caro, 2011), facilitates access to appropriate resources (Archambault et al., 2010; Hoic-Bozic et al., 2009; Martinez-Torres et al., 2011; Wang & Kania-Gosche, 2011), and allows for the efficient management of those resources (Blaschke, 2012; Hussain, 2013; Lee & Choi, 2011; Savery, 2010), there is a greater likelihood that learners will accept and be satisfied with the service quality and will be less likely to drop out (Bhuasiri et al., 2012; Lee, 2010). Support services
must not only provide service quality but also ensure system quality for the learning environment. Using a quantitative survey study with 872 Korean and
American students, Lee (2010) sought to determine whether learner perceptions of
service quality predicted learner satisfaction. While noting that Korean students perceived service quality as better, Lee found using structural equation modeling that perceived service quality predicted perceived ease of use (β = 0.648), perceived usefulness (β = 0.762), and learner satisfaction (β = 0.378). Perceived ease of use and perceived usefulness also added an indirect effect on learner satisfaction, for a total effect of β = 0.788. Lee et al. (2011) sought to
determine the relationships between learner perception of support from instructor, peers,
and technical support and learner satisfaction using a mixed-methods survey with 110
participants. All forms of support (instructor, r = .659; peer, r = .557; and technical, r = .541; all p < .01) were found to be correlated with learner satisfaction. System quality
emphasizes eight characteristics and has been shown to increase learner persistence with
online learning. These eight characteristics are data quality (Alshare et al., 2011; Lee &
Choi, 2011; Lu & Chiou, 2010), ease of use (Alshare et al., 2011; Bhuasiri et al., 2012;
Lu & Chiou, 2010), flexibility (Bhuasiri et al., 2012; Bolliger & Halupa, 2012; Lu &
Chiou, 2010), functionality (Alshare et al., 2011; Bhuasiri et al., 2012; Chen, 2010),
importance (Abrami et al., 2010; Alshare et al., 2011; Fidishun, 2011), integration
(Barber, 2012; Bhuasiri et al., 2012; Holton et al., 2009), portability (Bhuasiri et al.,
2012), and reliability (Alshare et al., 2011; Bhuasiri et al., 2012; Kiliç-Çakmak, 2010).
Lu and Chiou (2010), in a quantitative cross-sectional study with 522 participants, sought
through structural equation modeling to determine predictors of learner satisfaction based
on the learning management system. Their findings showed that learner satisfaction was
predicted by interface friendliness (β = 0.35), content richness (β = 0.37), and perceived
flexibility (β = 0.27), while perceived community was not found to be a significant
predictor.
Proper online course design that engages learners and fosters learner retention includes three elements: the material and resources for the class must be accurate (Alshare et al., 2011; MacLean & Scott, 2011), complete (Alshare et al., 2011; Desai et al., 2008; Henning, 2012; Lee & Choi, 2011; MacLean & Scott, 2011), suited to learner content needs (Alshare et al., 2011; Dykman & Davis, 2008b; Hannay et al., 2010; Kim & Frick, 2011), relevant (Abrami et al., 2010; Bass, 2012; Keengwe & Georgina, 2011; Marschall & Davis, 2012), and timely or current (Dykman & Davis, 2008a; Gill, 2010; Hannay et al., 2010); the design team needs to take a team approach (Barratt et al., 2012;
Lee, 2010; Tuquero, 2011); and subsequent learner support is essential (Abdous & Yen,
2010; DeLotell et al., 2010; Gupta & Bostrom, 2009; London & Hall, 2011). Learner
support includes the previously mentioned technical support (Bhuasiri et al., 2012;
Crawford-Ferre & Wiest, 2012; Lee, 2010; Lee et al., 2011; Schultz, 2012), and
supportive learning environment (Henschke, 2011; Lee & Choi, 2011; Stern & Kauer,
2010), but also instructional support (DeLotell et al., 2010; Gilbert et al., 2013; Holton et
al., 2009) and a learner-focused approach (Allen et al., 2009; Travis & Rutherford, 2013;
Tuquero, 2011). Instructional support may include academic advising (Lee et al., 2011; Park & Choi, 2009; Tuquero, 2011), counseling services (Deil-Amen, 2011; Kistler, 2011; Lee & Choi, 2011; Lee et al., 2011; Tuquero, 2011), and tutoring services (Hughes & Berry, 2011; Hussain, 2013; Sulčič & Lesjak, 2009), but it also requires the instructor to clarify his or her expectations of the learner, ensure understanding of assignments, activities, materials, and means of learner assessment (Abdous & Yen, 2010; Ali & Ahmad, 2011; O'Bannon & McFadden, 2008; Paechter et al., 2010), and hold regular office hours (Ferratt & Hall, 2009; Lee et al., 2011).
Learner Satisfaction and Online Course Dropout
Learner satisfaction is part of what constitutes quality in education, along with
institutional cost-effectiveness, learning effectiveness, faculty satisfaction, and access
(Omar et al., 2011; Yen & Abdous, 2011). Learner satisfaction results from and
encourages learner engagement, higher academic performance, greater learner motivation
and learning, and a better opportunity for online success (Bolliger & Halupa, 2012;
Donavant, 2009; Gunawardena et al., 2010; Joo et al., 2011; Yen & Abdous, 2011).
Chen and Chih (2012) defined learner satisfaction as “a positive feeling or attitude
toward learning activities” (p. 546). It is for this reason that learner satisfaction in an
online environment tends to expand learner motivation (Cheng et al., 2011; Er et al.,
2009; Martinez-Caro, 2011; Paechter et al., 2010; Wilson & Allen, 2011), increase the
quality of learning outcomes (Abdous & Yen, 2010; Ali & Ahmad, 2011; Diaz &
Entonado, 2009; Joo et al., 2011; Kupczynski et al., 2011), heighten academic
achievement (Abrami et al., 2010; Oncu & Cakir, 2011; Paechter et al., 2010; Phelan,
2012; Yen & Abdous, 2011), and ensure maximum learning transfer (Joo et al., 2011;
Lee et al., 2011; Strang, 2012; Wang et al., 2011). In a quantitative cross-sectional
survey study involving 29 universities in Austria and 2,196 students, Paechter et al.
(2010) sought to correlate the experiences and expectations of learners to their
achievement and satisfaction. Using multiple regression, the researchers found that a
learner’s expectations contributed significantly to learner satisfaction with eLearning
only through the acquisition of knowledge and skills (β = .11). The learner’s experiences
in the classroom, however, contributed to learner satisfaction much more; the structure of
the online class (β = .23), the expertise of the instructor (β = .16), the instructor’s
interaction and support (β = .13), and the learner’s own motivation (β = .15) were all
significant predictors of learner satisfaction. As Paechter et al. concluded, two factors, a learner's achievement goals and the instructor, strongly contributed to achievement and satisfaction.
When learners are satisfied with a course or program they tend to persist and are
less likely to drop out (Abdous & Yen, 2010; Ali & Ahmad, 2011; DeLotell et al., 2010;
Ferguson & DeFelice, 2010; Hoskins, 2012; Hrastinski & Jaldemark, 2012; Lo, 2010;
Revere & Kovach, 2011; Willging & Johnson, 2009; Yen & Abdous, 2011), while
conversely, the less satisfied the learner is, the more likely they are to withdraw (Jackson
et al., 2010; Lassibille, 2011; Lo et al., 2011; Lykourentzou, Giannoukos, Nikolopoulos
et al., 2009; Martinez-Caro, 2011; Park & Choi, 2009; Pigliapoco & Bogliolo, 2008).
Learner satisfaction has been shown to be a significant predictor of retention and
persistence in eLearning (Ali & Ahmad, 2011; Hoskins, 2012; Yen & Abdous, 2011).
Specific factors tend to degrade learner satisfaction and will be explored in this section
while factors that increase learner satisfaction will be discussed in the next section. Difference theory suggests that the gap between a learner's expected knowledge and skill gain from a course and the actual gain dictates much of his or her satisfaction. If the actual gain in knowledge or skill is very close to what was expected, then learners tend to be satisfied; however, if learners did not benefit from the course as much as they expected, they were more likely to be dissatisfied (Chen & Chih, 2012). A large minority (30%) of online learners tended to be less satisfied with
eLearning than with traditional learning (Jackson et al., 2010), so it is important that
eLearning instructors promote activities, communications, and quality that meet learner
expectations (Karge et al., 2011). Jackson et al.’s (2010) study correlated online actions
of instructors with learner satisfaction in two Texas community colleges. Despite the
minority of learners being dissatisfied with eLearning, the researchers found that a
majority of the variance (R2 = .693) in learner satisfaction was explained by climate,
activities, timeliness, expectations, and enthusiasm. Picciano et al. (2010) drew data
from 6 years of national online learning studies to identify specific issues, including
learner satisfaction. Factors that have been identified and correlated with negative
learner satisfaction online are (a) reduced face-to-face time with instructor and other
students (Fahy, 2008; Falloon, 2011; Picciano et al., 2010; Stein et al., 2009), (b)
technological difficulties with little or no support (Bennett et al., 2012; Bolliger &
Halupa, 2012; London & Hall, 2011; Picciano et al., 2010; Rakap, 2010), (c) heightened
cognitive load (Bhuasiri et al., 2012; Caine, 2010; Pass & Sweller, 2012; Picciano et al.,
2010; Rey & Buchwald, 2011), and (d) reduced instructor assistance (Ali & Ahmad,
2011; Lee, 2010; Omar et al., 2011; Picciano et al., 2010; Travis & Rutherford, 2013).
Factors that Engender eLearner Satisfaction
The previous discourse solidly identifies the factors that have been found to
accentuate or diminish dropout in online courses. These selfsame factors have been
found to engender satisfaction in eLearning: "satisfaction informs how eLearning is
received, accepted, and valued, and it attests to the quality of the learning experience”
(Gunawardena et al., 2010, p. 209). Al-Fahad (2010) posited that “a critical issue for
researchers and practitioners alike” is to clearly understand “the factors influencing
[learner] satisfaction with online courses” (p. 62). In this final section, the factors that
have been demonstrated to engender learner satisfaction online will be detailed. Similar
to the dropout factors there are both learner and program elements that have been shown
to contribute to learner satisfaction with eLearning courses.
Many studies show that as the instructor rewards greater learner effort and time
spent engaged in learning (Hussain, 2013; Jeffrey, 2009; Oncu & Cakir, 2011; Rey &
Buchwald, 2011; Shea & Bidjerano, 2010), and as a learner holds higher expectations for himself or herself (Chen & Chih, 2012; Driscoll et al., 2012; Ke, 2010; Rovai et al., 2007), the learner is more likely to persist (Jeffrey, 2009; Lee & Choi, 2011; Morrow & Ackermann, 2012), show greater commitment to learning (Chaves, 2009; Chickering & Gamson,
1987; Deil-Amen, 2011; Park & Choi, 2009), and be motivated to learn (Finn, 2011;
Henning, 2012). Motivation can be intrinsic or extrinsic (Bhuasiri et al., 2012; Dibiase &
Kidwai, 2010; Hoskins, 2012; Jeffrey, 2009; Moore, 2010), based on the acquisition of
knowledge for knowledge’s sake (Chen & Chih, 2012; Gupta & Bostrom, 2009; Hauer &
Quill, 2011; Paechter et al., 2010), or for the furthering of career goals (Ali & Ahmad,
2011; Chen & Chih, 2012; Taylor & Kroth, 2009a; Tolutiene & Domarkiene, 2010), or as
a means of social connection and extending relationships (Anderson, 2008b; Chen &
Chih, 2012; Ekmekci, 2013; Kellogg & Smith, 2009). This motivation improves when
the material (DeLotell et al., 2010; Dorin, 2007; Driscoll et al., 2012; Ferratt & Hall,
2009) and learning activities of the course are both interesting and interactive (Chen &
Chih, 2012; Ferratt & Hall, 2009; Ho & Kuo, 2010; Lee & Choi, 2011), and when the
learner participates more (Cacciamani et al., 2012; Diaz & Entonado, 2009; Ismail et al.,
2010; O’Bannon & McFadden, 2008).
In a mixed methods study, Dibiase and Kidwai (2010) investigated the
counterintuitive finding that older learners do better than younger learners online.
Although the researchers did not confirm this finding in their study, they did find that
older learners spent considerably more time on task and participated more, resulting in
significantly greater satisfaction. Kupczynski et al. (2011) confirmed this finding.
Through a qualitative case study, the researchers collected records from 1,631 online
learners at a university in southern Texas to determine whether there was a relationship
between time on task and frequency of participation online and the learner’s final course
grades. They found that older learners participated more and spent more time on task and
that this related significantly to increased grades. They also found that factors of
ethnicity and gender did not affect grades in the online classes. With increased
motivation follows greater learning achievement (Alewine, 2010; Ali & Ahmad, 2011;
Hoskins, 2012; Ruey, 2010) and better learning outcomes (Ali & Ahmad, 2011; Lee et
al., 2011; Revere & Kovach, 2011). Pih-Shuw and Jin-Ton (2012) conducted a survey
study with 64 students to determine whether learning motivation, as represented by social
relationship, external expectation, stimulation, or interest in acquiring knowledge, was
related to learning satisfaction, as represented by satisfaction with the environment, the
instructor, content, learning results, and human relationships. After data collection,
correlational analysis and multiple regression analysis were performed. The authors
found that social relationships as a motivator contributed strongly to satisfaction with the learning environment (β = .255, p < .05), while the desire to acquire knowledge as a motivator predicted satisfaction with the instructor (β = .378, p < .01) and with the content (β = .438, p < .001). Learner satisfaction has an affective element because more satisfied online
learners have better attitudes toward learning (Bhuasiri et al., 2012; Boling et al., 2011;
Chen & Chih, 2012; Driscoll et al., 2012; Lo et al., 2011), are likely to have an internal
locus of control (Cercone, 2008; Ferguson & DeFelice, 2010; Lee & Choi, 2011;
Lykourentzou, Giannoukos, Mpardis et al., 2009), a greater intent to transfer learning to
the real world (Abdous & Yen, 2010; Chaves, 2009; George, 2013; Joo et al., 2011), and
tend to derive enjoyment, pleasure, and even flow from the experience (Bhuasiri et al.,
2012; Gupta & Bostrom, 2009; Ho & Kuo, 2010; Hoskins, 2012; Rey & Buchwald,
2011).
Learners who take a deep learning approach and achieve at higher levels are also
more likely to be satisfied (Ally, 2008; DeLotell et al., 2010; Jeffrey, 2009; Ke & Xie, 2009). In a study that surveyed 51 learners in ten purely online courses, Ke and Xie
(2009) ensured that learner profiles, instructor competence, program context, and
technical support were the same while varying between three design models (wrap
around, content + support, and integrated) and three discussion types (open-ended,
closed-ended, and integrated). They found that the content + support courses best
encouraged knowledge construction, but the highest levels of learner satisfaction were in
integrated courses that fostered deep learning. This satisfaction appears to be because
these learners participate more (Chen & Chih, 2012; Hrastinski & Jaldemark, 2012; Kim
& Frick, 2011; Omar et al., 2011), are more responsive to interactions (Bhuasiri et al.,
2012; Bolliger & Halupa, 2012; Bradford & Wyatt, 2010; Lee, 2010), and have greater
cognitive absorption that meets their learning needs (Doherty-Restrepo et al., 2009; Ho &
Kuo, 2010; Hoskins, 2012; Ruey, 2010).
In a quantitative cross-sectional study of learner satisfaction and proposed
satisfaction components, Lo et al. (2011) surveyed 322 participants from 20 public
universities in Malaysia to determine the relationship between satisfaction and delivery
method (medium of learning transmission), content, system (infrastructure and technical
support), and interaction. Using Pearson correlations, the authors determined that all
factors correlated with one another moderately (r between .29 and .50, p < .01),
indicating the scales were acceptably independent. Using regression analysis, Lo et al.
determined that all of the satisfaction components were predictors of learner satisfaction.
Thus, content (β = .253, p < .001), the learning system (β = .214, p < .001), delivery
method (β = .148, p < .01), and interaction (β = .144, p < .01) were determined to have a
significant influence on eLearning satisfaction.
Chen and Chih (2012) explored the relationship between learner motivation and
satisfaction in conjunction with a number of moderating variables of a group of 64
graduate students. Correlational and multiple regression analysis was used to determine
the relationships between the variables. There was a significant positive correlation between the predictor variable of learner motivation (comprising interest in acquiring knowledge, professional development, escape or stimulation, social needs, and external expectations) and the criterion variable of learner satisfaction (which subsumed the learning environment, instructor teaching, content, learning results, and relationships). Among the 10 variables, 29 correlations were significant, with the highest correlation
between career development as the motivator and learning result as the satisfier, r = .612.
In their study, the greatest explanation of variance in learner satisfaction was provided by the motivators of career development, F(4, 59) = 34.19, p < .001, and interest in learning, F(4, 59) = 7.65, p < .01, R2 = 0.431. The second greatest explanation of variance in learner satisfaction was provided by the motivators of social needs, F(4, 59) = .472, p < .001, and interest in learning, F(4, 59) = 7.20, p < .01, R2 = 0.375. Chen and Chih concluded that
as learner interest increased so did learner satisfaction with the instructor, the content, the
relationships, and the learning results.
Learners tend to be dissatisfied online or ambivalent in negative environments
(Bradford & Wyatt, 2010; Ke, 2010; Levy, 2007; Martinez-Caro, 2011; Wilson, 2005),
where there is cognitive overload (Ally, 2008; Kalyuga, 2011; Kim & Frick, 2011), when
difficulties with technology are pervasive or the course is poorly designed (Ali & Ahmad,
2011; Bhuasiri et al., 2012; O’Toole & Essex, 2012; Travis & Rutherford, 2013), and
when methods of communication are difficult, time consuming, and impede information
fluency (Bradford & Wyatt, 2010; Falloon, 2011; Kellogg & Smith, 2009). Each of these
difficulties represents a course or program factor that can be improved with proper
planning and design. Bradford (2011), in an exploratory quantitative cross-sectional study with 1,401 learners designed to determine the relationships between cognitive load, academic performance, and learner satisfaction, found a significant positive correlation between satisfaction and cognitive load (r = .5, p < .01). Bradford equated cognitive load
to perceived mental effort. Some learners are pessimistic and unmotivated, but as has
been discussed in previous sections of this literature review, these traits can be improved
through proper application of teaching, cognitive, and social presence (George, 2013;
Hoskins, 2012; Joo et al., 2011; Travis & Rutherford, 2013). The relationship with the instructor has been shown to have a great impact on learner satisfaction if the instructor is highly present (Ekmekci, 2013; Hoskins, 2012), supports individual learners
to facilitate their learning (Bradley, 2009; Cacciamani et al., 2012; Fahy, 2008; Oncu &
Cakir, 2011), stimulates learner motivation (Chen & Chih, 2012; Paechter et al., 2010;
Tolutiene & Domarkiene, 2010), and is responsive to learner needs (Bolliger & Halupa,
2012; Lee, 2010). Instructors who provide prompt and meaningful feedback (Alshare et
al., 2011; Amrein-Beardsley & Haladyna, 2012; Ke, 2010; Lee, 2010; Lee & Choi,
2011), who orchestrate online discussions and facilitate collaborative learning (Ke, 2010;
Keengwe & Georgina, 2011; Oncu & Cakir, 2011; Paechter et al., 2010), and who have some
expertise in the implementation of an online class (Paechter et al., 2010) have been
shown to engender learner satisfaction and achievement.
Aside from the effects on satisfaction of teaching presence, satisfaction is
engendered in eLearning through proper functioning of the technology. Repeated studies
identify that a large factor in online satisfaction is the computer self-efficacy of the
learner; the more familiar the learner is with the technology, the more likely he or she
will be satisfied with the course (Alshare et al., 2011; Bolliger & Halupa, 2012; Dibiase
& Kidwai, 2010; Donavant, 2009; Gunawardena et al., 2010; McGlone, 2011).
Technology that engenders satisfaction in eLearning needs to be intuitive and easy to use (Bhuasiri et al., 2012; Joo et al., 2011; Lee, 2010; Martinez-Caro, 2011), should add to the value of the class rather than being disruptive (Kim & Frick, 2011), and should have high perceived quality, meaning that the content is relevant to the course and to the learner and allows some amount of learner customization and control (Caminotti & Gray, 2012; Cheng et al., 2011; Lee & Choi, 2011; Lo et al., 2011). Proper setup of technology
allows for presentation of content that is not overly complex or difficult to use,
decreasing the chance for cognitive overload (Ally, 2008; Kalyuga, 2011; Kim & Frick,
2011).
Information quality and the materials and content available for a course can
greatly affect the satisfaction of online learners and are important in course design (Lee &
Choi, 2011; McGlone, 2011; Paechter et al., 2010). Course design indirectly affects
learner satisfaction through how it involves the learner in the curriculum, different
methods of delivery and activities that promote learning, and through class management
(Barber, 2012; DeLotell et al., 2010; Ho & Kuo, 2010; Lee & Choi, 2011; Sharples et al.,
2007; Stein et al., 2009). A major factor of learner satisfaction in eLearning is the need
for reduced ambiguity in presenting a class with a clear structure of what learning will
occur, what materials are available, the course requirements, and how learners will be
assessed (Abdous & Yen, 2010; Ali & Ahmad, 2011; Jackson et al., 2010; O’Bannon &
McFadden, 2008; Paechter et al., 2010; Pelz, 2010). A final factor in course design is the
inclusion of interactive elements for inter-learner communication and interaction that
promotes social presence (Biasutti, 2011; George, 2013; Joo et al., 2011; Savery, 2010).
Social presence has been shown to contribute to an effective learning climate while
initiating, maintaining, and encouraging critical thinking (Driscoll et al., 2012; Ke, 2010;
Phelan, 2012; Schultz, 2012).
Summary
The literature exposes eight themes regarding online learning, dropout, and
learner satisfaction. These themes were explored in the foregoing paragraphs. A brief
introduction to the topic based on older works was included to establish the context for
important findings and models in eLearning dropout. Christensen (2013) identified
eLearning as a disruptive innovation that may yet completely transform higher education.
The benefits and affordances for institutions and learners were discussed in detail because it is these affordances that provide learners a unique opportunity to learn with no boundaries (Bradford & Wyatt, 2010; Cho & Lien, 2011; Ho & Kuo, 2010; Hoic-Bozic et al., 2009) and with greater flexibility of time (Antonis et al., 2011; Lee & Choi, 2011; Mancuso et al., 2010), while personalizing learning to suit their needs (Cheng et al., 2011; Ke & Kwak, 2013; London & Hall, 2011; Potter & Rockinson-Szapkiw, 2012), granting more learner autonomy and control (Boling et al., 2011; Stein et al., 2009; Wilson & Allen, 2011), and affording collaboration with others (Ali & Ahmad, 2011; Baran et al., 2011; Martinez-Torres et al., 2011; Ruey, 2010) and interactions (Abdous & Yen, 2010; Chen & Lien, 2011; George, 2013; Gupta & Bostrom, 2009; Kupczynski et al., 2011; Omar et al., 2011). The six factors that engender eLearning success were explored. These
sections demonstrated the importance in the online classroom of the learner–instructor
relationship, with its essential learning relationship (Crawford-Ferre & Wiest, 2012;
Hussain, 2013; Wang & Kania-Gosche, 2011), the chance for an engaging cognitive
presence (Darabi et al., 2011; Hoskins, 2012), reduced didactic role for the instructor
(Bradley, 2009; Chaves, 2009; Hoic-Bozic et al., 2009; Paechter et al., 2010), while
triggering classroom social presence (DeLotell et al., 2010; Joo et al., 2011). In addition, the value of learner–learner interactions was demonstrated (Oncu & Cakir, 2011; Wang, 2010), including their ability to generate social presence by meeting online learners' social needs in education (Cheng et al., 2011; Kellogg & Smith, 2009) while enhancing learner engagement through interaction (Abrami et al., 2010; Driscoll et al., 2012; Ismail et al., 2011), collaboration (Bradley, 2009; Ferratt & Hall, 2009; Lee et al., 2011; Phelan, 2012), and the use of technological means of communication (Abdous & Yen, 2010; Abrami et al., 2010; Er et al., 2009; Ferratt & Hall, 2009). Success in an eLearning
context was found to rely on the interaction of the learner with the material of the course
(Abdous & Yen, 2010; Ali & Ahmad, 2011; Chaves, 2009) and the need for reflection
(Caine, 2010; Green & Ballard, 2011; London & Hall, 2011), as well as the development
of a sense of community (Ke & Hoadley, 2009; Kellogg & Smith, 2009; Lear et al., 2010;
Shea & Bidjerano, 2010) and collaborating with the instructor, other learners, and the
content of the course (Bhuasiri et al., 2012; Cheng et al., 2011; Hrastinski & Jaldemark,
2012). The final factors for eLearning success, the need to relate the material to the person and make it relevant (Ally, 2008; Chaves, 2009; Ghost Bear, 2012; O'Toole & Essex, 2012; Stern & Kauer, 2010) and to seek ways to increase the learner's motivation (Albert & Johnson, 2011; Alewine, 2010; Marschall & Davis, 2012; Michinov et al., 2011), were also explored.
In conjunction with the examination of what has been found to work in eLearning,
it is just as important to probe the factors in the online context that bring failure and dropout. After establishing that attrition in online programs is almost universally
higher than for traditional programs (Cacciamani et al., 2012; Oncu & Cakir, 2011;
Travis & Rutherford, 2013; Tuquero, 2011), the factors relating to students (Lee & Choi,
2011; Lykourentzou, Giannoukos, Nikolopoulos et al., 2009; Park & Choi, 2009), to
the course or program (Deil-Amen, 2011; Travis & Rutherford, 2013; Tuquero, 2011),
and the learner’s environment (Lee & Choi, 2011) that heighten dropout were scrutinized.
Factors that can help predict learner persistence are the learner’s academic background,
whether they have affiliated experiences and requisite skills to meet the challenges of
online classes, and certain psychological attributes, such as an internal locus of control,
self-motivation, goal commitment, and a love of learning (Bhuasiri et al., 2012; Kellogg
& Smith, 2009; Paechter et al., 2010). Factors that influence retention that the institution
can affect are course design, institutional support, and encouraging interactions (Lee &
Choi, 2011). Course design includes program quality and team-building activities (Deil-Amen, 2011; Kuleshov, 2008; Travis & Rutherford, 2013; Tuquero, 2011), while
institutional support includes factors such as administrative support, a support
infrastructure, tutorial assistance, technical support, and program orientation (Ferratt &
Hall, 2009; Hoskins, 2012; Kellogg & Smith, 2009). Outside forces, which are considered environmental factors, have also been shown to precipitate dropout. These
influences usually stem from the learner’s work environment or from the peer or home
environment (Lee & Choi, 2011). Environmental factors were not discussed in great
detail.
Across the various factors of both success and failure of learners in eLearning, one learner attribute was consistently linked with them all: learner satisfaction (Ali & Ahmad, 2011; Hoskins, 2012; Yen & Abdous, 2011). The net finding is that when a learner is satisfied with his or her program he or she tends to persist (Abdous & Yen, 2010; Ali & Ahmad, 2011; DeLotell et al., 2010; Ferguson & DeFelice, 2010; Hoskins, 2012; Hrastinski & Jaldemark, 2012; Lo, 2010; Revere & Kovach, 2011; Willging & Johnson, 2009; Yen & Abdous, 2011), while learners who are not satisfied with their program are much more likely to drop out (Jackson et al., 2010; Lassibille, 2011; Lo et al., 2011; Lykourentzou, Giannoukos, Nikolopoulos et al., 2009; Martinez-Caro, 2011; Park & Choi, 2009; Pigliapoco & Bogliolo, 2008). Thus, factors that engender this attribute were also detailed and investigated.
Chapter 3: Research Method
The purpose of this quantitative, correlational study is to investigate relationships
between adult learner characteristics, instructional process design elements, and learner
satisfaction among adult learners in postsecondary online programs offered by institutions with at least one physical facility in Missouri. The problem addressed is the low satisfaction among adults in online postsecondary courses, since learner satisfaction has been considered the largest determinant in reducing online dropout (Chen & Lien, 2011; Kozub, 2010; Martinez-Caro, 2009). This study sought to extend current knowledge in the mitigation of adult
online course dropout through determining which adult learner characteristics and
instructional process design elements engender improvement in learner satisfaction
(Donavant, 2009; Gunawardena et al., 2010; Holton et al., 2009; Huang et al., 2012;
Taylor & Kroth, 2009b).
Research Method and Design
A correlational design is most appropriate for determining whether relationships
between the study variables exist, the strength of existing relationships, and the
mechanisms by which they relate (Aiken & West, 1991; Miles & Shevlin, 2001);
therefore, a correlational design will be used to explain the study constructs as
operationalized variables (Licht, 1995). A quantitative correlational design is appropriate
to evaluate the predictive value of 14 predictor variables: six adult learner characteristics
and eight instructional process design elements on the criterion variable of learner
satisfaction (Aiken & West, 1991; Miles & Shevlin, 2001). Results of this study may
support the applicability of the six adult learner characteristics of (a) intrinsic motivation to
learn, (b) prior experience, (c) need to know, (d) readiness to learn, (e) self-directed
learning, and (f) orientation to learn, and the eight andragogical process design elements
of (g) preparing the learner, (h) climate setting, (i) mutual planning, (j) diagnosis of
learning needs, (k) setting of learning objectives, (l) designing the learning experience,
(m) learning activities, and (n) evaluation in instructional models for adult online higher
education courses, and thereby improve the predictability of theory, characteristics, and
process elements.
Descriptive and qualitative studies dominate the field of adult learning, especially
with regard to andragogy; therefore, another qualitative study is not an optimal choice
for furthering knowledge in the field of adult education (Brookfield, 1986; Long et al.,
1980; Merriam et al., 2007; Rachel, 2002; Taylor & Kroth, 2009b). An experimental or
quasi-experimental study involves manipulated numeric variables while all others are
held constant, which can be problematic in an educational setting and is not appropriate for
the number of variables required for this study (Baron & Kenny, 1986; Campbell &
Stanley, 1963). Correlational studies are specifically useful in situations where
prediction of the effects of variables upon one another or for further exploration and
explication of these variables is desirable and thereby most appropriate for the current
study (Aiken & West, 1991; Licht, 1995; Miles & Shevlin, 2001). The study sought to
determine whether the variables are associated and predict online learner satisfaction.
Population
The population for this study consists of online postsecondary students who are
over age 24 and attend a postsecondary institution accredited by the HLC-NCA with at
least one physical facility in Missouri. In the State of Missouri, there are four schools
that comprise the state university system with an enrollment of 73,565 (University of
Missouri, 2011), nine public universities with an enrollment of 68,851 (U.S. Department
of Education [USDOE], 2009), and 23 private colleges and universities with an
enrollment of 87,369 (USDOE, 2009). Of the almost 230,000 students enrolled in
Missouri, about 37.2% are adults over the age of 24 (USDOE, 2009), and 31.3% have
taken at least one course online (Hoskins, 2012).
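A rough estimate of the target population size follows from these figures. The brief Python sketch below is illustrative only and assumes, as a simplification, that the 37.2% and 31.3% proportions apply uniformly across the three strata; the actual count would be confirmed from institutional records.

# Hypothetical back-of-the-envelope estimate of the target population size.
state_system = 73_565          # University of Missouri system enrollment
public_universities = 68_851   # public universities enrollment
private_institutions = 87_369  # private colleges and universities enrollment

total_enrollment = state_system + public_universities + private_institutions  # 229,785
adults_over_24 = total_enrollment * 0.372                                      # ~85,480 adults over age 24
adults_online = adults_over_24 * 0.313                                         # ~26,800 who have taken an online course

print(total_enrollment, round(adults_over_24), round(adults_online))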
Sample
The sample was chosen through stratified random sampling to ensure that the
individuals in the sample represent the population as nearly as possible (Campbell &
Stanley, 1963). By sampling within each stratum, any differences among students who choose public state schools versus public universities or private colleges and universities will be negated because each group exists in the sample in the same proportion as in the population. Schools will be chosen through stratified random sampling based on the strata listed above from the list of all qualifying schools that will serve as the sampling frame (see Appendix A), selecting sufficient schools so that the number of potential subjects is three times larger than is needed for the sample, with the same proportion as total enrollments (Newbold, Mehta,
& Forbus, 2010; Särndal, Swensson, & Wretman, 2003). Based on an assumed return
rate of 5% (Nulty, 2008), a total of 3,900 students in the target population will be
solicited for participation. One in three randomly selected qualifying students received
an email inviting them to participate in the study, with up to two follow-ups (Nulty,
2008). A representative sample of the target population was sought; demographic
information from the participants was used to ensure that the respondents are
representative based on the stratified categories of schools, as well as major, gender,
ethnicity, and age. Based on a G*Power analysis a minimum sample size of 194 is
required (Faul et al., 2009; see Appendix J).
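The required sample size reported above can be reproduced, at least approximately, outside of G*Power. The following is a minimal Python sketch, not the study's actual computation, that solves for N using the noncentral F distribution for a fixed-model multiple regression with 14 predictors, treating the reported effect size of 0.15 as Cohen's f-squared, with alpha = .05 and power = .95, and then applies the assumed 5% return rate; the helper name required_n is illustrative only.

import math
from scipy.stats import f as f_dist, ncf

def required_n(num_predictors=14, f2=0.15, alpha=0.05, power_target=0.95):
    """Smallest N whose noncentral-F power reaches the target for a fixed-model regression."""
    n = num_predictors + 2                        # smallest N that leaves positive error df
    while True:
        df1 = num_predictors                      # numerator df = number of predictors
        df2 = n - num_predictors - 1              # denominator (error) df
        ncp = f2 * n                              # noncentrality parameter, lambda = f^2 * N
        f_crit = f_dist.ppf(1 - alpha, df1, df2)  # critical F at the chosen alpha
        achieved = ncf.sf(f_crit, df1, df2, ncp)  # power = P(F > F_crit | lambda)
        if achieved >= power_target:
            return n
        n += 1

n = required_n()
print(n)                    # expected to be approximately 194, consistent with the G*Power result
print(math.ceil(n / 0.05))  # students to solicit assuming a 5% return rate (about 3,900)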
Materials/Instruments
An online survey using the Andragogy in Practice Inventory (API; see Appendix
C & D) and the Learner Satisfaction and Transfer of Learning Questionnaire (LSTQ; see
Appendix E & F) was presented to the study sample and used to collect demographic
data, as well as responses regarding the study’s 14 predictor variables. Since 2005,
researchers have focused on the creation of a measurement instrument to assess the
validity of adult learner characteristics and instructional process design elements
empirically (Holton et al., 2009; Taylor & Kroth, 2009b; Wilson, 2005). The API was
used to isolate and measure these adult learner characteristics and instructional process
design elements and their application in the classroom (Holton et al., 2009). The API has
been found to provide sound psychometric qualities that measure many of the adult
learner characteristics and instructional process design elements with validity and
reliability. Further, the API has previously been utilized in undergraduate and graduate university settings as a paper-and-pencil survey, but has not previously been administered online (Holton et al., 2009). The six adult learner characteristics that are measured in the
API are (a) intrinsic motivation to learn, (b) prior experience, (c) need to know, (d)
readiness to learn, (e) self-directed learning, and (f) orientation to learn (Holton et al.,
2009). The eight instructional process design elements are (g) preparing the learner, (h)
climate setting, (i) mutual planning, (j) diagnosis of learning needs, (k) setting of learning
objectives, (l) designing the learning experience, (m) learning activities, and (n)
evaluation (Holton et al., 2009). The API was originally produced because of a perceived
lack of empirical rigor in the practice of andragogy in the field. The current version of
the API consists of 60 five-point Likert-type scale questions. Of these, 24 questions
determine the learner’s assessment of whether a course conforms to the adult learner
characteristics propounded by Knowles and his associates, while 36 questions determine
conformity with andragogical instructional process design elements. A factor analysis of
the questions and constructs measured by the API was previously conducted. From that
analysis, the eigenvalues and Cronbach’s coefficient alpha showed each factor’s
reliability (see Table 1).
Table 1

Factors in the API and Learner Satisfaction scale on the LSTQ

Factor                                 Eigenvalue   Variance Explained   Cronbach's Alpha
Intrinsic motivation to learn             15.69           44.8%                .93
Prior experience                           1.63            4.7%                .84
Need to know                               1.51            4.3%                .76
Readiness to learn                         1.26            3.6%                .81
Self-directed learning                     1.10            3.2%                .74
Orientation to learn                         --              --                 --
Prepare the learner                        1.53            4.6%                .88
Climate setting                            3.14            7.5%                .91
Mutual planning                              --              --                 --
Diagnosis of learning needs                  --              --                 --
Setting of learning objectives            17.38           41.4%                .90
Designing the learning experience          1.48            3.5%                .94
Learning activities                        1.29            3.1%                .68
Evaluation                                 1.82            4.3%                .86
Learner online satisfaction                  --              --                .83
In a previous study in which the API was validated, three of the constructs did not emerge and one had weaker reliability than is optimal; the orientation to learn construct could not be differentiated from the motivation construct and was subsumed within it (Holton et al., 2009). In this version of the API, the
questions regarding orientation to learn have been modified to operationalize the
construct (R. Bates, personal communication, February 19, 2013). The mutual planning
construct was excluded from the earlier API as it was perceived that it would be
inappropriate in a higher education setting (Holton et al., 2009; Wilson, 2005). Questions
have been added to represent the construct of mutual planning in this version of the API
to determine if the original assumption regarding applicability in higher education was
correct (R. Bates, personal communication, February 19, 2013). The diagnosis of
learning needs construct did not emerge from the data in the previous study because the
questions were weak, cross-loaded with other factors, and did not provide sufficient
reliability (Holton et al., 2009; Wilson, 2005). The questions representing the construct
of diagnosis of learning needs have been modified to better represent the construct in this
version of the API (R. Bates, personal communication, February 19, 2013). The learning
activities construct in the API had a lower Cronbach’s alpha than normally acceptable for
internal consistency (Hair et al., 2009; Nunnally, 1978). Questions have been added to
operationalize the construct of learning activities in this version of the API (R. Bates,
personal communication, February 19, 2013), and a post hoc Cronbach’s alpha will be
calculated prior to data analysis to determine the reliability of the instrument.
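As an illustration of the planned post hoc reliability check, the following minimal Python sketch computes Cronbach's alpha from a respondents-by-items matrix; the function name cronbach_alpha and the small example matrix are hypothetical and are not drawn from the study's data.

import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents x n_items) array of item scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                              # number of items in the scale
    item_variances = items.var(axis=0, ddof=1)      # variance of each item
    total_variance = items.sum(axis=1).var(ddof=1)  # variance of the summed scale score
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical responses from five participants to a three-item subscale
example = [[4, 5, 4],
           [3, 3, 2],
           [5, 5, 5],
           [2, 3, 3],
           [4, 4, 5]]
print(round(cronbach_alpha(example), 2))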
The LSTQ (see Appendix E & F) was originally designed with eight subscales to
identify levels of online learner satisfaction in relationship to other predictor variables in
a separate study from the validation performed on the API (Gunawardena et al., 2010).
In the present research study, the learner satisfaction subscale will be utilized to measure the criterion variable of learner satisfaction; this subscale has previously demonstrated acceptable reliability as measured by Cronbach's alpha (see Table 1).
Operational Definition of Variables
In this study, six adult learner characteristics and eight instructional process design elements were examined for their grouped and individual impact on learner satisfaction in an online environment. The internal reliability of these
learner characteristics and instructional process design elements will also be calculated to
determine how well appropriate questions measure each construct. The 14 predictor
variables (a) intrinsic motivation to learn, (b) prior experience, (c) need to know, (d)
readiness to learn, (e) self-directed learning, (f) orientation to learn, (g) prepare the
learner, (h) climate setting, (i) mutual planning, (j) diagnosis of learning needs, (k) setting
of learning objectives, (l) designing of learning experience, (m) learning activities, and
(n) evaluation are operationally defined below, as is the single criterion variable, learner
online satisfaction.
Intrinsic motivation to learn. The andragogical learner characteristic of
intrinsic motivation to learn is an interval-level predictor variable (Miles & Shevlin,
2001) and will measure the amount of motivation the learner had to apply what was
learned to their life or work (Ali & Ahmad, 2011; Bye et al., 2007; Galbraith & Fouch,
2007; Kalyuga, 2011; Karge et al., 2011; Simonson et al., 1999). Intrinsic motivation to
learn is a construct that will be derived from the API and consists of four (1, 4, 5, & 9) 5-point Likert-type questions that range from 1 (strongly disagree) to 5 (strongly agree).
The scale scores will be between 4 and 20; a higher score indicates that internal
incentives and curiosity are more likely to drive the learner’s learning experience, while a
lower score indicates that external motivators are more likely to drive it.
Prior experience. The andragogical learner characteristic of prior experience is
an interval-level predictor variable and will measure whether the learner’s prior
experience was utilized in the learning experience (Allen et al., 2009; Blaschke, 2012;
Cabrera-Lozoya et al., 2012; Chen & Lien, 2011; Hurtado & Guerrero, 2009; Lee &
Choi, 2011; Tapscott & Williams, 2010). Prior experience is a construct that will be
derived from the API and consists of three (3, 10, & 17) 5-point Likert-type questions
that range from 1 (strongly disagree) to 5 (strongly agree). The scale scores will be
between 3 and 15; a higher score indicates that the learner’s life and experiences were a
rich resource contributing toward the learner’s learning experience, while a lower score
indicates that those life experiences are non-existent, or were not utilized in their online
experience.
Need to know. The andragogical learner characteristic of need to know is an
interval-level predictor variable and will measure how well the learning corresponded to
the learner’s needs (Baskas, 2011; Fidishun, 2011; Gibbons & Wentworth, 2001; Kenner
& Weinerman, 2011; Kiliç-Çakmak, 2010; Strang, 2009). Need to know is a construct
that will be derived from the API and consists of four (6, 7, 18, & 24) 5-point Likert-type
questions that range from 1 (strongly disagree) to 5 (strongly agree). The scale scores
will be between 4 and 20; a higher score indicates that the learner’s needs and motivation
for learning were important to the learner’s learning experience, while a lower score
indicates that the learner’s needs and motivations were largely ignored.
Readiness to learn. The andragogical learner characteristic of readiness to learn
is an interval-level predictor variable and will measure how well the learner took
responsibility for their learning (Cercone, 2008; Chyung & Vachone, 2005; Kenner &
Weinerman, 2011; Taylor & Kroth, 2009b). Readiness to learn is a construct that will be
derived from the API and consists of four (11, 15, 20, & 23) 5-point Likert-type questions
that range from 1 (strongly disagree) to 5 (strongly agree). The scale scores will be
between 4 and 20; a higher score indicates that the learner felt ready to engage in the
online learning experience, while a lower score indicates the learner perceived little value
to what was being learned.
Self-directed learning. The andragogical learner characteristic of self-directed
learning is an interval-level predictor variable and will measure the amount of control the
learner had over learning (Blanchard et al., 2011; Guilbaud & Jermoe-D’Emilia, 2008;
Kistler, 2011; McGlone, 2011). Self-directedness is a construct that will be derived from
the API and consists of five (2, 8, 12, 14, & 16) 5-point Likert-type questions that range
from 1 (strongly disagree) to 5 (strongly agree). The scale scores will be between 5 and
25; a higher score indicates the learner perceived he or she had greater control over her or
his learning experience, while a lower score indicates the learner perceived he or she had
little control over the learning experience.
Orientation to learn. The andragogical characteristic of orientation to learn is an
interval-level predictor variable and will measure how applicable the learner felt the
learning was to his or her needs and problems (Ghost Bear, 2012; Goddu, 2012; Knowles, 1995; Knowles et al., 2005; Lee et al., 2011; Taylor & Kroth, 2009b). Orientation to
learn is a construct that will be derived from the API and consists of four (13, 19, 21, &
22) 5-point Likert-type questions that range from 1 (strongly disagree) to 5 (strongly
agree). The scale scores will be between 4 and 20; a higher score indicates that the
learner perceived that the material learned was immediately applicable to their life, while
a lower score indicates the learner perceived little applicability in the course material.
Prepare the learner. The andragogical instructional process design element of
prepare the learner is an interval-level predictor variable and will measure how well
prepared the learner felt he or she was for the learning experience (Knowles et al., 2005;
Lee & Choi, 2011). Prepare the learner is a construct that will be derived from the API
and consists of five (25, 27, 29, 32, & 51) 5-point Likert-type questions that range from 1
(strongly disagree) to 5 (strongly agree). The scale scores will be between 5 and 25; a
higher score indicates that the learner perceived that the instructor properly prepared her
or him for learning, while a lower score indicates that the objectives, syllabus, and means
of achieving in the class were unclear and the learner felt unprepared for learning.
Climate setting. The andragogical instructional process design element of
climate setting is an interval-level predictor variable and will measure the comfort level
of the learner during the learning experience (Cercone, 2008; Jackson et al., 2010; Omar
et al., 2011). Climate setting is a construct that will be derived from the API and consists
of six (28, 30, 33, 35, 38, & 40) 5-point Likert-type questions that range from 1 (strongly
disagree) to 5 (strongly agree). The scale scores will be between 6 and 30; a higher score
indicates that the learner perceived the class's physical and psychological setting to be
conducive to learning, while a lower score indicates the learner was uncomfortable in the
classroom setting.
Mutual planning. The andragogical instructional process design element of
mutual planning is an interval-level predictor variable and will measure the amount of
planning the learner took part in with the instructor and other learners to determine what
was to be learned (Holton et al., 2009; Revere et al., 2012). Mutual planning is a
construct that will be derived from the API and consists of four (26, 37, 39, & 56) 5-point
Likert-type questions that range from 1 (strongly disagree) to 5 (strongly agree). The
scale scores will be between 4 and 20; a higher score indicates that the learner was
involved with the instructor in determining what and how learning would occur, while a
lower score indicates the learner was not involved in determining what and how they
would learn.
Diagnosis of learning needs. The andragogical instructional process design
element of diagnosis of learning needs is an interval-level predictor variable and will
measure whether analysis occurred to assist the learner in determining his or her learning
needs (Knowles et al., 2005; Taylor & Kroth, 2009b; Wilson, 2005). Diagnosis of
learning needs is a construct that will be derived from the API and consists of four (34,
41, 42, & 49) 5-point Likert-type questions that range from 1 (strongly disagree) to 5
(strongly agree). The scale scores will be between 4 and 20; a higher score indicates the
learner participated in some method of determining their learning starting point for the
online course, while a lower score indicates that little or no reflection occurred relative to
what was already known about the subject.
Setting of learning objectives. The andragogical instructional process design
element of setting of learning objectives is an interval-level predictor variable and will
measure the learner’s experience in setting individualized learning objectives (Lee &
Choi, 2011; Mezirow, 1997; Wang & Kania-Gosche, 2011). Setting of learning
objectives is a construct that will be derived from the API and consists of five (31, 43, 44,
45, & 47) 5-point Likert-type questions that range from 1 (strongly disagree) to 5
(strongly agree). The scale scores will be between 5 and 25; a higher score indicates that
the learner and instructor engaged in some negotiation about what was to be learned and
reached an agreement on what that was, while a lower score indicates no such negotiation
and agreement.
Designing the learning experience. The andragogical instructional process
design element of designing the learning experience is an interval-level predictor variable
and will measure how flexible the learning experience was regarding its design
(Bransford et al., 2005; Cornelius et al., 2011; Kash & Dessinger, 2010). Designing the
learning experience is a construct that will be derived from the API and consists of four
(36, 46, 50, & 52) 5-point Likert-type questions that range from 1 (strongly disagree) to 5
(strongly agree). The scale scores will be between 4 and 20; a higher score indicates the
learner acknowledged his or her responsibility to successfully complete the negotiated
agreement, while a lower score indicates the learner felt little or no responsibility toward
completing assignments.
Learning activities. The andragogical process element of learning activities is an
interval-level predictor variable and will measure the interactivity of the learning
environment (Allen et al., 2009; Baran et al., 2011; Chickering & Gamson, 1987; Revere
& Kovach, 2011). Learning activities is a construct that will be derived from the API and
consists of five (48, 53, 55, 57, & 59) 5-point Likert-type questions that range from 1
(strongly disagree) to 5 (strongly agree). The scale scores will be between 5 and 25; a
higher score indicates the learner and instructor carried out the learner’s lesson plan
through a variety of learning activities, while a lower score indicates limited and
conventional learning activities.
Evaluation. The andragogical instructional process design element of evaluation
is an interval-level predictor variable and will measure the utility of the evaluation
methods regarding the learner’s learning (Ally, 2008; Bradley, 2009; Bransford et al.,
2005; George, 2013; Ghost Bear, 2012). Evaluation is a construct that will be derived
from the API and consists of three (54, 58, & 60) 5-point Likert-type questions that range
from 1 (strongly disagree) to 5 (strongly agree). The scale scores will be between 3 and
15; a higher score indicates the learner took an active role in how she or he was
evaluated, while a lower score indicates that evaluation occurred through traditional
normative or summative testing.
Learner online satisfaction. Learner satisfaction is the interval-level criterion
variable. Researchers have shown that as learner satisfaction increases so does
persistence (Gunawardena et al., 2010; Hoskins, 2012; Joo et al., 2011; Lee & Choi,
2011) and learning outcomes (Gunawardena et al., 2010; Kozub, 2010; Martinez-Caro,
2009; McGlone, 2011). Learner satisfaction will be calculated from the Learner
satisfaction subscale of the LSTQ, which consists of five 5-point Likert-type questions that
range from 1 (strongly disagree) to 5 (strongly agree). The scale scores will be between
5 and 25; a higher score indicates the learner was satisfied overall with the learning
experience, while a lower score indicates dissatisfaction with the overall learning
experience.
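Because each of the scales above is scored as a simple sum of its 5-point items, the scoring step can be illustrated with a short Python sketch; the item column names (e.g., api_31) and the example responses are hypothetical stand-ins for the API items listed in Appendix C.

    import pandas as pd

    # Hypothetical responses to the "setting of learning objectives" items
    # (API items 31, 43, 44, 45, and 47), each coded 1 (strongly disagree) to 5 (strongly agree).
    responses = pd.DataFrame({
        "api_31": [4, 2], "api_43": [5, 3], "api_44": [4, 2],
        "api_45": [5, 1], "api_47": [3, 2],
    })

    objective_items = ["api_31", "api_43", "api_44", "api_45", "api_47"]
    responses["setting_objectives"] = responses[objective_items].sum(axis=1)

    # Five items coded 1-5 yield scale scores between 5 and 25, as described above.
    print(responses["setting_objectives"])  # 21 and 10 for these two illustrative rows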
Data Collection, Processing, and Analysis
The provosts of postsecondary HLC-NCA accredited programs with physical
facilities in the state of Missouri were approached, the study explained, and permission
requested to include learners from their schools in the study (see Appendix G).
Participating schools' provosts were requested to write a letter of endorsement
encouraging their learners to volunteer for the study (see Appendix H). The provosts were
also requested to send an e-mail with this endorsement and an electronic link to an online
survey to all learners who have taken at least one online course, either successfully or
unsuccessfully, and who are over the age of 24. Each email briefly described the study,
along with the school’s endorsement and request to participate. The questions were
presented to each participant in randomized blocks of 15 and were encrypted during
download, while answers were encrypted and stored on the servers at Survey Gizmo.
De-identified data were retrieved for analysis in encrypted form. The survey was a
combination of two pre-validated instruments: Holton et al.'s (2009) 66-item API, which
measures six adult learner characteristics and eight andragogical instructional process
design elements and includes six demographic questions (see Appendix C), and the 5-item
Satisfaction subscale of the LSTQ, used to determine learner satisfaction with the
participant's most recent online course (Gunawardena et al., 2010; see Appendix E). Based
on an a priori power analysis for linear multiple regression (fixed model) with 14
predictors, the number of participants who complete the entire survey will need to exceed
194 to provide sufficient power to detect a significant result if one exists (effect size
f² = 0.15, alpha = .05, beta = .05; see Appendix J).
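The required sample size reported above can be approximated outside of G*Power using the noncentral F distribution; the following sketch is not the author's actual G*Power run, and it assumes the stated parameters correspond to a fixed-model multiple regression with 14 predictors, effect size f² = 0.15, alpha = .05, and power = .95.

    from scipy import stats

    def required_sample_size(f2=0.15, alpha=0.05, target_power=0.95, predictors=14):
        """Smallest N at which the overall regression F test reaches the target power."""
        n = predictors + 2  # smallest N that leaves a nonzero error df
        while True:
            df1, df2 = predictors, n - predictors - 1
            f_crit = stats.f.ppf(1 - alpha, df1, df2)       # critical F at the chosen alpha
            power = stats.ncf.sf(f_crit, df1, df2, f2 * n)  # noncentrality taken as f^2 * N
            if power >= target_power:
                return n, power
            n += 1

    n, achieved = required_sample_size()
    print(n, round(achieved, 3))  # expected to land at or near the 194 reported above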
The collected data were reviewed to ensure completeness and analyzed using
hierarchical regression analysis for hypothesis testing (Aiken & West, 1991; Miles &
Shevlin, 2001) to assess the relationship of the predictor variables to the criterion variable
(Hair et al., 2009) using IBM SPSS Statistics Package Version 21. Hierarchical
regression analysis assesses any variance explained in online learner satisfaction by the
adult learner characteristics and instructional process design elements and determines
whether either set is a significant predictor of the criterion variable (Cohen et al., 2003).
Hierarchical regression analysis allows for measuring the total variance of a set of
variables, while controlling for the effects of the other set on online learner satisfaction
(Cohen et al., 2003). To verify the data assumptions associated with regression analysis
were met, five tests were conducted. A Durbin-Watson statistic was calculated to detect serial
correlation among the residuals; scatterplots and histograms were used to check for
linearity; and multicollinearity between predictor variables was checked using correlation
coefficients and tolerance and variance inflation factor (VIF) values. Finally, the studentized
residuals were examined using normal Q-Q plots to ensure the data were approximately
normally distributed. The results of the regression analysis were used to determine the
predictor relationship between the six learner characteristics, collectively, and eight
process design elements, collectively, and the criterion variable (Cohen et al., 2003).
Although violations of many of the preceding assumptions may be mitigated by
transforming the data or by other remedies, in the event that the regression assumptions
could not be met, a non-parametric Gaussian process regression would have been
conducted instead (Banerjee, Carlin, & Gelfand, 2004). As part of the data analysis, the
reliability of the modified API and the Satisfaction subscale of the LSTQ was also
analyzed using confirmatory factor
analysis (Albright & Park, 2008; Bartholomew, Steele, Moustaki, & Galbraith, 2008).
Although previous studies have determined that demographic variables such as gender,
ethnicity, and level of education are not significant predictors of learner satisfaction
(Bhuasiri et al., 2012), a separate analysis was conducted to determine whether any
significant relationships exist between the demographic variables (college major, gender,
ethnicity, education level, number of online classes, and age) and the criterion variable.
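A minimal sketch of the two-block hierarchical regression and the accompanying diagnostics described above is given below. It uses Python and statsmodels rather than SPSS, the simulated data stand in for the de-identified survey export, and the column names for the six learner characteristics, the eight process design elements, and the satisfaction score are hypothetical placeholders for the scale scores defined earlier.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    from statsmodels.stats.stattools import durbin_watson
    from statsmodels.stats.outliers_influence import variance_inflation_factor

    # Simulated stand-in for the de-identified survey export (hypothetical column names).
    rng = np.random.default_rng(0)
    characteristics = [f"char_{i}" for i in range(1, 7)]    # six learner characteristics
    process_elements = [f"proc_{i}" for i in range(1, 9)]   # eight process design elements
    df = pd.DataFrame(rng.normal(size=(194, 14)), columns=characteristics + process_elements)
    df["satisfaction"] = df.sum(axis=1) * 0.2 + rng.normal(size=194)

    y = df["satisfaction"]
    X1 = sm.add_constant(df[characteristics])                      # block 1
    X2 = sm.add_constant(df[characteristics + process_elements])   # blocks 1 + 2

    model1 = sm.OLS(y, X1).fit()
    model2 = sm.OLS(y, X2).fit()

    # F test of the R-squared change contributed by the second block.
    f_change, p_change, df_diff = model2.compare_f_test(model1)
    print(model1.rsquared, model2.rsquared, f_change, p_change)

    # Diagnostics: serial correlation of residuals and multicollinearity of predictors.
    print("Durbin-Watson:", durbin_watson(model2.resid))
    vifs = [variance_inflation_factor(X2.values, i) for i in range(1, X2.shape[1])]
    print("VIF range:", min(vifs), max(vifs))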
Assumptions
It is assumed that since correlational research depends upon associations between
variables (Adcock & Collier, 2001; Baron & Kenny, 1986), the API and the LSTQ will
appropriately measure the constructs for which they were designed (Gunawardena et al.,
2010; Holton et al., 2009; Wilson, 2005), and accurately reflect the dynamics between the
variables. Since invitations were only sent to present and previous online learners by the
Provosts or Chief Academic Officers, it is assumed that participants were online learners
at one of these schools. Finally, it is assumed that participants will provide honest,
forthright, and complete answers to the questions presented to them in the online
instrument.
Limitations
Surveys have many advantages, but they cannot always measure a target population
exactly on every index; they can only provide an estimate of the true population (Levy, 2013;
Thomas et al., 2010). A limitation may exist because participants may fail to complete the
survey, may not answer all questions accurately or completely, may intentionally misreport, or
may have poor recall of the events or circumstances asked about, resulting in limitations of
study results (Glasow, 2005). Randomly selected participants were invited to participate
in the study, and the demographic percentages of major, gender, ethnicity, education
level, and age were compared with the target population’s percentages to ensure
representativeness of the population to minimize possible selection bias (Dimitrov &
Rumrill, 2003; Mehta et al., 2007). If it is determined that certain strata have been
oversampled, statistical adjustments will be made to weight the data properly and to correct
for potential non-response bias (Perl, Greely, & Gray, 2006; Sax, Gilmartin, & Bryant,
2003). Even so, each participant self-selected to participate, meaning a limitation may
result from those who choose to participate versus those who do not (Li, 2012; Mehta et
al., 2007; Strang, 2009). Participants will be selected from schools with physical
locations in Missouri, which could limit the generalizability of the findings to other
states, regions, or countries; however, random sampling and a demographically
representative sample, as measured by major, gender, ethnicity, education level, and age,
should help ensure that the results are generalizable to the state of Missouri.
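The weighting adjustment for oversampled strata mentioned above can be sketched as follows; the three school-type strata follow the sampling plan, but the population shares and the example sample are assumed for illustration only.

    import pandas as pd

    # Assumed population shares of qualifying learners by school type (illustrative only).
    population_share = {"public_state": 0.55, "public": 0.25, "private": 0.20}

    sample = pd.DataFrame({"school_type": [
        "public_state", "public_state", "private", "public", "private", "private"]})

    # Post-stratification weight: stratum's population share divided by its sample share.
    sample_share = sample["school_type"].value_counts(normalize=True)
    sample["weight"] = sample["school_type"].map(
        lambda s: population_share[s] / sample_share[s])

    # Oversampled strata (sample share > population share) receive weights below 1,
    # undersampled strata receive weights above 1.
    print(sample.groupby("school_type")["weight"].first())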
Delimitations
Several delimitations were made to narrow the scope of this study. The first and
second are that participants will be adult postsecondary students and that they will be over
age 24. The
rationale for choosing mature adults (over the age of 24) and for choosing postsecondary
students is based on the theoretical framework of andragogy, whereby learners in this
environment are more likely to manifest the adult learning characteristics under study
(Knowles, 1980). In addition, between 2000 and 2010, the number of American students
in higher education who were over 25 rose by 42% and another increase of 20% in
enrollments in this age group is expected between 2010 and 2020 (Goddu, 2012). The
third delimitation is that participants have attended at least one online course, ensuring
that adult learner participants have some experience with the method of delivery. In
this way, learners' responses will be based on actual experience with an online course
rather than conjectured experience. The final delimitation of participants is that the
school they attended will have at least one physical facility in the state of Missouri and
be HLC-NCA accredited. HLC-NCA accreditation ensures that a school has a clearly
stated mission, bases its operations on that mission, acts in an ethical manner, provides
high-quality education through all of its delivery methods (including online), continually
seeks to improve its offerings, and has sufficient resources to continue meeting these
criteria (HLC-NCA, 2013). Because of the
expected high standards and quality of an accredited school, the researcher may
reasonably assume that the online program meets certain minimum requirements. Finally, there are
many accredited programs, so to delimit the scope of the study it was determined to only
include postsecondary schools that have at least one physical facility in the state of
Missouri. Despite the stratified random sampling method to obtain a generalizable
sample from Missouri-based colleges and universities, there is no evidence that this
sample would be further generalizable to other U.S. higher education institutions, to
online higher education learners who are under 25, or to learners who participate in
training other than online higher education.
Ethical Assurances
Before collecting any data, the appropriate forms were completed and submitted
to Northcentral University’s Institutional Review Board (IRB), along with any IRB
approval’s from the collaborating colleges, and approval to conduct the research
received (see Appendix H). Following IRB approvals, solicitation emails were sent to
potential participants. Each participant was presented with an informed consent form
written at an eighth-grade comprehension level that included the following elements: (a) an
explanation of the research being conducted, (b) associated risks, (c) what the study is
designed to determine, (d) a statement regarding confidentiality, (e) researcher contact
information, and (f) a statement regarding voluntary participation and non-consequential
withdrawal (USDHHS, 1979; see Appendix I). An option was provided for the learner to
print the informed consent page. Since the informed consent was entered online,
acceptance consisted of the participant clicking on a link to acknowledge understanding
and enter the study. No possibility existed for collecting data unknowingly from
participants since informed consent was presented to each participant before collection of
data, and participant names or identifying information were not collected as part of the
data. Although no identifying information was collected, participants’ confidentiality
was further assured through data encryption and electronic storage and by reporting
potentially identifying information in findings and reports, such as demographic data, only
in the aggregate.
Summary
The purpose of this quantitative correlational study is to investigate relationships
between the six adult learner characteristics and eight instructional process design
elements in an adult online learning environment, jointly and severally, and learner
satisfaction. The population of interest in this study consists of postsecondary learners
who have taken at least one online course from an HLC-NCA accredited program with
physical facilities in the state of Missouri. Data were collected from participants, after
obtaining appropriate informed consent, through access to an online survey adapted from
two pre-validated instruments: the API (Holton et al., 2009) and the Satisfaction subscale
of the LSTQ (Gunawardena et al., 2010). Data were analyzed using hierarchical
regression analysis to determine relationships among the six learner characteristics and
eight instructional process design elements and learner satisfaction (Aiken & West,
1991; Cohen et al., 2003; Miles & Shevlin, 2001). The study results may add to the
limited store of quantitative empirical research on the effect of andragogy on learning
outcomes (Henschke, 2011; Holton et al., 2009; Taylor & Kroth, 2009b), and
specifically, whether adult learner characteristics and instructional process design
elements predict and enhance learner satisfaction in a postsecondary online environment.
Chapter 4: Findings
Results
Descriptive information.
Demographic analysis.
Instrument reliability.
Research question one.
Validation tests and assumptions.
Results.
Research question two.
Validation tests and assumptions.
Results.
Evaluation of Findings
Summary
Chapter 5: Implications, Recommendations, and Conclusions
Implications
Research question one.
Research question two.
Recommendations
Conclusions
References
Abdous, M., & Yen, C.-J. (2010). A predictive study of learner satisfaction and outcomes
in face-to-face, satellite broadcast, and live video-streaming learning
environments. Internet and Higher Education, 13(2010), 248-257.
http://dx.doi.org/10.1016/j.iheduc.2010.04.005
Abela, J. (2009). Adult learning theories and medical education: A review. Malta
Medical Journal, 21(1), 11-18. Retrieved from
http://www.um.edu.mt/umms/mmj/PDF/234.pdf
Abrami, P. C., & Bernard, R. M. (2006). Research on distance education: In defense of
field experiments. Distance Education, 27(1), 5-26.
http://dx.doi.org/10.1080/01587910600653116
Abrami, P. C., Bernard, R. M., Bures, E. M., Borokhovski, E., & Tamim, R. (2010).
Interaction in distance education and online learning: Using evidence and theory
to improve practice. The Evolution from Distance Education to Distributed
Learning. Symposium conducted at Memorial Union Biddle Hotel, Bloomington,
IN. http://dx.doi.org/10.1007/s12528-011-9043-x
Adamson, C. W., & Bailie, J. W. (2012). Education versus learning: Restorative practices
in higher education. Journal of Transformative Education, 10, 139-156.
http://dx.doi.org/10.1177/1541344612463265
Adcock, R., & Collier, D. (2001). Measurement validity: A shared standard for
qualitative and quantitative research. American Political Science Review, 95, 529-546. http://dx.doi.org/10.1017/S0003055401003100
Aiken, L. S., & West, S. G. (1991). Multiple regression: Testing and interpreting
interactions. Thousand Oaks, CA: Sage.
Al-Asfour, A. (2012). Examining student satisfaction of online statistics courses. Journal
of College Teaching & Learning, 9(1), 33-38. Retrieved from
http://journals.cluteonline.com/index.php/TLC/article/view/6764
Albert, L. J., & Johnson, C. S. (2011). Socioeconomic status- and gender-based
differences in students' perceptions of e-learning systems. Decision Sciences
Journal of Innovative Education, 9, 421-436. http://dx.doi.org/10.1111/j.1540-4609.2011.00320.x
Albright, J. J., & Park, H. M. (2008). Confirmatory factor analysis using Amos, LISREL,
Mplus, and SAS/STAT CALIS [Technical Working Paper]. The University
Information Technology Services (UITS) Center for Statistical and Mathematical
Computing, Indiana University. Retrieved from
http://www.indiana.edu/~statmath/stat/all/cfa/
Alewine, H. S. (2010). Andragogical methods and readiness for the correctional GED
classroom. Journal of Correctional Education, 61(1), 9-22. Retrieved from
http://www.ceanational.org/journal.htm
Al-Fahad, F. N. (2010). The learners’ satisfaction toward online e-learning implemented
in the college of applied studies and community service, King Saud University,
Saudi Arabia: Can e-learning replace the conventional system of education?
Turkish Online Journal of Distance Education (TOJDE), 11(2), 61-72. Retrieved
from https://tojde.anadolu.edu.tr/
Ali, A., & Ahmad, I. (2011). Key factors for determining students’ satisfaction in
distance learning courses: A study of Allama Iqbal Open University.
Contemporary Educational Technology, 2, 118-134. Retrieved from
http://cedtech.net/
Allen, B., Crosky, A., McAlpine, I., Hoffman, M., & Munroe, P. (2009). A blended
approach to collaborative learning: Making large group teaching more student-centred. The International Journal of Engineering Education, 25, 569-576.
Retrieved from http://www.ijee.ie/
Allen, I. E., & Seaman, J. (2011). Going the distance: Online education in the United
States, 2011. Retrieved from Babson Survey Research Group website:
http://www.babson.edu/Academics/centers/blank-center/globalresearch/Documents/going-the-distance.pdf
Allen, I. E., & Seaman, J. (2013). Changing course: Ten years of tracking online
education in the United States. Babson Park, MA: Babson Survey Research Group and
Quahog Research Group. Retrieved from
http://www.onlinelearningsurvey.com/reports/changingcourse.pdf
Alley, K. (2011). Logistics regression to determine the influence of Bean and Metzner’s
persistence factors as defined by the community college survey of student
engagement (CCSSE) on nontraditional students (Doctoral dissertation,
University of Missouri, Columbia). Retrieved from
https://mospace.umsystem.edu/xmlui/bitstream/handle/10355/15762/public.pdf?s
equence=1
Ally, M. (2008). Foundations of educational theory for online learning. In T. Anderson
(Ed.), The theory and practice of online learning (pp. 15-44). Edmonton, AB:
Athabasca University.
Alshare, K. A., Freeze, R. D., Lane, P. L., & Wen, H. J. (2011). The impacts of system
and human factors on online learning systems use and learner satisfaction.
Decision Sciences: Journal of Innovative Education, 9, 437-461.
http://dx.doi.org/10.1111/j.1540-4609.2011.00321.x
Amrein-Beardsley, A. A., & Haladyna, T. T. (2012). Validating a theory-based survey
to evaluate teaching effectiveness in higher education. Journal on Excellence in
College Teaching, 23(1), 17-42. Retrieved from ERIC database. (EJ865432)
Anderson, T. (2008a). Teaching in an online learning context. In T. Anderson (Ed.), The
theory and practice of online learning (pp. 343-365). Edmonton, AB: Athabasca
University.
Anderson, T. (2008b). Towards a theory of online learning. In T. Anderson (Ed.), The
theory and practice of online learning (pp. 45-74). Edmonton, AB: Athabasca
University.
Anderson, T., Rourke, L., Garrison, D. R., & Archer, W. (2001). Assessing teaching
presence in a computer conferencing context. Journal of Asynchronous Learning
Networks, 5(2), 1-17. Retrieved from
http://sloanconsortium.org/sites/default/files/v5n2_anderson_1.pdf
Andrews, R., & Haythornthwaite, C. (2007). Introduction to e-learning research. In R.
Andrews, & C. Haythornthwaite (Eds.). The Sage handbook of e-learning
research (pp. 1-51). Los Angeles, CA: Sage.
Antonis, K., Daradoumis, T., Papadakis, S., & Simos, C. (2011). Evaluation of the
effectiveness of a web‐based learning design for adult computer science courses.
IEEE Transactions on Education, 54, 374‐380.
http://dx.doi.org/10.1109/TE.2010.2060263
Archambault, L., Wetzel, K., Fouger, T. S., & Williams, M. K. (2010). Professional
development 2.0: Transforming teacher education pedagogy with 21st century
tools. Journal of Digital Learning in Teacher Education, 27(1), 4-11. Retrieved
from http://www.iste.org/learn/ publications/journals/jdlte.aspx
Astin, A. W. (1999). Student involvement: A developmental theory for higher education.
Journal of College Student Development, 40, 518-529. (Reprinted from Astin, A.
(1984). Student involvement: A developmental theory for higher education.
Journal of College Student Personnel 25, 297-308). Retrieved from
https://www.middlesex.mass.edu/ace/downloads/astininv.pdf
Ayers, D. F. (2011). A critical realist orientation to learner needs. Adult Education
Quarterly, 61, 341-357. http://dx.doi.org/10.1177/0741713610392769
Bahr, N., & Rohner, C. (2004, July). The judicious utilization of new technologies
through authentic learning in higher education: A case study. Annual Conference
Proceedings of Higher Education Research and Development Society of
Australasia, Miri, Sarawak, 27. Retrieved from http://www.herdsa.org.au/wpcontent/uploads/conference/2004/papers/bahr.pdf
Bala, S. (2010). Adopting advancements of ICT: A necessity for the empowerment of
teacher educators. GYANODAYA: The Journal of Progressive Education, 3(1),
29-35. Retrieved from
http://indianjournals.com/ijor.aspx?target=ijor:gjpe&type=home
Banerjee, S., Carlin, B. P., & Gelfand, A. E. (2004). Hierarchical modeling and analysis
for spatial data. Abingdon, England: Taylor and Francis.
Baran, E., Correia, A., & Thompson, A. (2011). Transforming online teaching practice:
Critical analysis of the literature on the roles and competencies of online teachers.
Distance Education, 32, 421-439.
http://dx.doi.org/10.1080/01587919.2011.610293
Barber, J. P. (2012). Integration of learning: A grounded theory analysis of college
students' learning. American Educational Research Journal, 49, 590-617.
http://dx.doi.org/10.3102/0002831212437854
Baron, R. M., & Kenny, D. A. (1986). The moderator-mediator variable distinction in
social psychological research: Conceptual, strategic, and statistical considerations.
Journal of Personality and Social Psychology, 51, 1173-1182. Retrieved from
http://www.public.asu.edu/~davidpm/classes/psy536/Baron.pdf
Barrett, B. F. D., Higa, C., & Ellis, R. A. (2012). Emerging university student
experiences of learning technologies across the Asia Pacific. Computers &
Education, 58, 1021-1027. http://dx.doi.org/10.1016/j.compedu.2011.11.017
Barros, S. R. (2013). Of metaphors and spaces within: The language of curriculum and
pedagogy in the hyperspace. Journal of Curriculum and Pedagogy, 10, 140-157.
http://dx.doi.org/10.1080/15505170.2013.838613
Bartholomew, D. J., Steele, F., Moustaki, I, & Galbraith, J. (2008). Analysis of
multivariate social science data (2nd ed.). London, UK: Chapman & Hall.
Baskas, R. S. (2011). Applying adult learning and development theories to educational
practice. Retrieved from ERIC Database. (ED519926)
Bass, C. (2012). Learning theories and their application to science instruction for adults.
The American Biology Teacher, 74, 387-390.
http://dx.doi.org/10.1525/abt.2012.74.6.6
Bean, J. P., & Metzner, B. S. (1985). A conceptual model of nontraditional undergraduate
student attrition. Review of Educational Research, 55, 485-540.
http://dx.doi.org/10.3102/00346543055004485
Behar, P. A. (2011). Constructing pedagogical models for e-learning. International
Journal of Advanced Corporate Learning, 4(3), 16-22.
http://dx.doi.org/10.3991/ijac.v4i3.1713
Belair, M. (2012). The investigation of virtual school communications. Techtrends:
Linking Research & Practice to Improve Learning, 56(4), 26-33.
http://dx.doi.org/10.1007/s11528-012-0584-2
Bell, F. (2011). Connectivism: Its place in theory-informed research and innovation in
technology-enabled learning. International Review Of Research In Open &
Distance Learning, 12(3), 98-118. Retrieved from ERIC database. (EJ920745)
Belzer, A. (2004). “It's not like normal school”: The role of prior learning contexts in
adult learning. Adult Education Quarterly, 55, 41-59.
http://dx.doi.org/10.1177/0741713604268893
Bennett, S., Bishop, A., Dalgarno, B., Waycott, J., & Kennedy, G. (2012). Implementing
web 2.0 technologies in higher education: A collective case study. Computers &
Education, 59, 524-534. http://dx.doi.org/10.1016/j.compedu.2011.12.022
Beqiri, M. S., Chase, N. M., & Bishka, A. (2010). Online course delivery: An empirical
investigation of factors affecting student satisfaction. Journal of Education for
Business, 85, 95-100. http://dx.doi.org/10.1080/08832320903258527
Bhuasiri, W., Xaymoungkhoun, O., Zo, H., Rho, J. J., & Ciganek, A. P. (2012). Critical
success factors for e-learning in developing countries: A comparative analysis
between ICT experts and faculty. Computers & Education, 58, 843-855.
http://dx.doi.org/10.1016/j.compedu.2011.10.010
Biasutti, M. (2011). The student experience of a collaborative e-learning university.
Computers & Education, 57, 1865-1875.
http://dx.doi.org/10.1016/j.compedu.2011.04.006
Blanchard, R. D., Hinchey, K. T., & Bennett, E. E. (2011). Literature review of residents
as teachers from an adult learning perspective. Paper presented at the annual
meeting of the American Educational Research Association, New Orleans, LA.
Retrieved from
http://www.eric.ed.gov/ERICWebPortal/contentdelivery/servlet/ERICServlet?acc
no=ED521385
Blaschke, L. (2012). Heutagogy and lifelong learning: A review of heutagogical practice
and self-determined learning. International Review of Research in Open and
Distance Learning, 13(1), 56-71. Retrieved from
www.irrodl.org/index.php/irrodl/article/download/1076/2113
Boling, E. C., Hough, M., Krinsky, H., Saleem, H., & Stevens, M. (2011). Cutting the
distance in distance education: Perspectives on what promotes positive, online
learning experiences. Internet and Higher Education.
http://dx.doi.org/10.1016/j.iheduc.2011.11.006
Bolliger, D. U., & Halupa, C. (2012). Student perceptions of satisfaction and anxiety in
an online doctoral program. Distance Education, 33(1), 81-98.
http://dx.doi.org/10.1080/01587919.2012.667961
Bradford, G. R. (2011). A relationship study of student satisfaction with learning online
and cognitive load: Initial results. Internet and Higher Education, 14, 217-228.
http://dx.doi.org/10.1016/j.iheduc.2011.05.001
Bradford, G., & Wyatt, S. (2010). Online learning and student satisfaction: Academic
standing, ethnicity and their influence on facilitated learning, engagement, and
information fluency. Internet and Higher Education, 13, 108-114.
http://dx.doi.org/10.1016/j.iheduc.2010.02.005
Bradley, J. (2009). Promoting and supporting authentic online conversations – which
comes first – the tools or instructional design? International Journal of
Pedagogies and learning, 5(3), 20-31. http://dx.doi.org/10.5172/ijpl.5.3.20
Bransford, J., Vye, N., Stevens, R., Kuhl, P., Schwartz, D., Bell, P., . . . Sabelli, N.
(2006). Learning theories and education: Toward a decade of synergy. In P.
Alexander & P. Winne (Eds.), Handbook of educational psychology (pp. 1-95).
Mahwah, NJ: Erlbaum.
Brookfield, S. D. (1986). Understanding and facilitating adult learning. San Francisco,
CA: Jossey-Bass.
Brown, G. H. (1947). A comparison of sampling methods. Journal of Marketing, 11, 331-337. Retrieved from
http://www.jstor.org/discover/10.2307/1246272?uid=3739448&uid=2129&uid=2
&uid=70&uid=3737720&uid=4&sid=21103110710457
Brown, J. L. M. (2012). Online learning: A comparison of web-based and land-based
courses. Quarterly Review of Distance Education, 13, 39-42. Retrieved from
http://www.infoagepub.com/quarterly-review-of-distance-education.html
Burke, L. A., & Hutchins, H. M. (2007). Training transfer: An integrative literature
review. Human Resource Development Review, 6, 263-296.
http://dx.doi.org/10.1177/1534484307303035
Bye, D., Pushkar, D., & Conway, M. (2007). Motivation, interest, and positive affect in
traditional and nontraditional undergraduate students. Adult Education Quarterly,
57, 141-158. http://dx.doi.org/10.1177/0741713606294235
Cabrera-Lozoya, A., Cerdan, F., Cano, M.‐D., Garcia‐Sanchez, D., & Lujan, S. (2012).
Unifying heterogeneous e-learning modalities in a single platform: CADI, a case
study. Computers & Education, 58, 617‐630.
http://dx.doi.org/10.1016/j.compedu.2011.09.014
Cacciamani, S., Cesareni, D., Martini, F., Ferrini, T., & Fujita, N. (2012). Influence of
participation, facilitator styles, and metacognitive reflection on knowledge
building in online university courses. Computers & Education, 58, 874-884.
http://dx.doi.org/10.1016/j.compedu.2011.10.019
Caine, G. (2010). Making connections between e-learning and natural learning. E-Learning Handbook, 33, 1-10. Retrieved from
http://www.cainelearning.com/PRODUCTS/From_natural_learning_to_elearning.
pdf
Callens, J. C. (2011). Is the assumed necessity to give learner control in distance
education an ‘urban legend’? Proceedings of 14th International Conference on
Interactive Collaborative Learning Conference, Piestany, Slovakia, 279-280.
http://dx.doi.org/10.1109/ICL.2011.6059590
Caminotti, E., & Gray, J. (2012). The effectiveness of storytelling on adult learning.
Journal of Workplace Learning, 24, 430-438.
http://dx.doi.org/10.1108/13665621211250333
Campbell, D. T., & Stanley, J. (1963). Experimental and quasi-experimental designs for
research. Boston, MA: Cengage Learning.
Canning, N. (2010). Playing with heutagogy: Exploring strategies to empower mature
learners in higher education. Journal of Further & Higher Education, 34, 59-71.
http://dx.doi.org/10.1080/03098770903477102
Carnevale, A. P., Smith, N., & Strohl, J. (2010). Help wanted: Projections of jobs and
education requirements through 2018. Retrieved from Georgetown University
Center on Education and the Workforce website:
http://www9.georgetown.edu/grad/gppi/hpi/cew/pdfs/FullReport.pdf
Cercone, K. (2008). Characteristics of adult learners with implications for online learning
design. Association for the Advancement of Computing in Education Journal
(AACE), 16, 137-159. Retrieved from http://www.editlib.org/j/AACEJ
Chan, S. (2010). Applications of andragogy in multi-disciplined teaching and learning.
Journal of Adult Education, 39(2), 25-35. Retrieved from ERIC database.
(EJ930244)
Chaves, C. A. (2009). On-line course curricula and interactional strategies: The
foundations and extensions to adult e-learning communities. European Journal of
Open, Distance and E-Learning, 1. Retrieved from
http://www.eurodl.org/materials/contrib/2009/Christopher_Chaves.pdf
Chen, L.-C., & Lien, Y.-H. (2011). Using author co-citation analysis to examine the
intellectual structure of e-learning: A MIS perspective. Scientometrics, 89, 867-886. http://dx.doi.org/10.1007/s11192-011-0458-y
Chen, P.-S., & Chih, J.-T. (2012). The relations between learner motivation and
satisfaction with aspects of management training. International Journal of
Management, 29, 545-561. Retrieved from
http://www.internationaljournalofmanagement.co.uk/
Cheng, B., Wang, M., Yang, S. J. H., Kinshuk, & Peng, J. (2011). Acceptance of
competency-based workplace e-learning systems: Effects of individual and peer
learning support. Computers & Education, 57, 1317-1333.
http://dx.doi.org/10.1016/j.compedu.2011.01.018
Chickering, A. W., & Gamson, Z. F. (1987). Seven principles for good practice in
undergraduate education. American Association for Higher Education Bulletin,
39(7), 3-7. Retrieved from ERIC database. (ED282491)
Christensen, C. (2013). Disruptive innovation. Retrieved from Clayton Christensen
website: http://www.claytonchristensen.com/key-concepts/
Christensen, C., Johnson, C. W., & Horn, M. B. (2008). Disrupting class: How disruptive
innovation will change the way the world learns. New York, NY: McGraw-Hill.
Christie, M. M., & Garrote Jurado, R. R. (2009). Barriers to innovation in online
pedagogy. European Journal of Engineering Education, 34, 273-279.
http://dx.doi.org/10.1080/03043790903038841
Chu, R. J., Chu, A. Z., Weng, C., Tsai, C.-C., & Lin, C.-C. (2012). Transformation for
adults in an Internet-based learning environment: Is it necessary to be self-directed? British Journal of Educational Technology, 43, 205-216.
http://dx.doi.org/10.1111/j.1467-8535.2010.01166.x
Chyung, S. Y., & Vachon, M. (2005). An investigation of the profiles of satisfying and
dissatisfying factors in e-learning. Performance Improvement Quarterly, 59, 227-245. http://dx.doi.org/10.1177/0741713609331546
Clapper, T. C. (2010). Beyond Knowles: What those conducting simulation need to know
about adult learning theory. Clinical Simulation in Nursing, 6, e7-e14.
http://dx.doi.org/10.1016/j.ecns.2009.07.003
Cohen, J., Cohen, P., West, S. G., & Aiken, L. S. (2003). Applied multiple
regression/correlation analysis for the behavioral sciences (3rd ed.). New York,
NY: Routledge/Taylor & Francis Group.
Conceicao, S. (2002). The sociocultural implications of learning and teaching in
cyberspace. New Directions for Adult and Continuing Education, 96, 37-45.
http://dx.doi.org/10.1002/ace.77
Connors, S. (2005.). Assessing mentor characteristics in an online environment: A study.
In C. Crawford et al. (eds.), Proceedings of Society for Information Technology &
Teacher Education International Conference 2005 (pp. 319-323). Chesapeake,
VA: AACE.
Cornelius, S., Gordon, C., & Ackland, A. (2011). Towards flexible learning for adult
learners in professional contexts: An activity-focused course design. Interactive
Learning Environments, 19, 381-393.
http://dx.doi.org/10.1080/10494820903298258
Cox, T. (2013). Adult learning orientations: The case of language teachers in Peru.
International Forum of Teaching and Studies, 9(1), 3-10. Retrieved from
http://americanscholarspress.com/IFST.html
Crawford-Ferre, H., & Wiest, L. R. (2012). Effective online instruction in higher
education. Quarterly Review of Distance Education, 13, 11-14. Retrieved from
http://www.infoagepub.com/quarterly-review-of-distance-education.html
Darabi, A., Arrastia, M. C., Nelson, D. W., Cornille, T., & Liang, X. (2010). Cognitive
presence in asynchronous online learning: A comparison of four discussion
strategies. Journal of Computer Assisted Learning, 27, 216-227.
http://dx.doi.org/10.1111/j.1365-2729.2010.00392.x
Deil-Amen, R. (2011). Socio-academic integrative moments: Rethinking academic and
social integration among two-year college students in career-related programs.
Journal of Higher Education, 82(1), 54-91.
http://dx.doi.org/10.1353/jhe.2011.0006
DeLotell, P. J., Millam, L. A., & Reinhardt, M. M. (2010). The use of deep learning
strategies in online business courses to impact student retention. American
Journal of Business Education, 3(12), 49-55. Retrieved from
http://journals.cluteonline.com/index.php/AJBE/article/viewFile/964/948
Desai, M. S., Hart, J., & Richards, T. C. (2008). E-learning: Paradigm shift in education.
Education, 129(2), 327-334. Retrieved from ERIC Database. (EJ871567)
Deulen, A. A. (2013). Social constructivism and online learning environments: Toward a
theological model for Christian educators. Christian Education Journal, 10(1),
90-98. Retrieved from
http://mmljournalfeed.wordpress.com/2013/04/09/christian-education-journal-vol10-no-1/
Dewey, J. (1938/1997). Experience and education. New York, NY: Free Press.
Diaz, L. A., & Entonado, F. B. (2009). Are the functions of teachers in e-learning and
face-to-face learning environments really different? Educational Technology &
Society, 12, 331-343. Retrieved from http://www.ifets.info/
Dibiase, D., & Kidwai, K. (2010). Wasted on the young? Comparing the performance
and attitudes of younger and older US adults in an online class on geographic
information. Journal of Geography in Higher Education, 34, 299-326.
http://dx.doi.org/10.1080/03098265.2010.490906
Dimitrov, D. M., & Rumrill, P. D., Jr. (2003). Pretest-posttest designs and measurement
of change. Work, 20, 159-165. Retrieved from
http://www.phys.lsu.edu/faculty/browne/MNS_Seminar/JournalArticles/Pretestposttest_design.pdf
Doherty-Restrepo, J. L., Hughes, B. J., Del Rossi, G., & Pitney, W. A. (2009). Evaluation
models for continuing education program efficacy: How does athletic training
continuing education measure up? Athletic Training Education Journal, 4, 117-124. Retrieved from http://nataej.org/4.3/Doherty-Restrepo_Final.pdf
Donavant, B. W. (2009). The new, modern practice of adult education: Online instruction
in a continuing professional education setting. Adult Education Quarterly, 59,
227-245. http://dx.doi.org/10.1177/0741713609331546
Dorin, M. (2007). Online education of older adults and its relation to life satisfaction.
Educational Gerontology, 33, 127-143.
http://dx.doi.org/10.1080/03601270600850776
Driscoll, A., Jicha, K., Hunt, A. N., Tichavsky, L., & Thompson, G. (2012). Can online
courses deliver in-class results?: A comparison of student performance and
satisfaction in an online versus a face-to-face introductory sociology course.
Teaching Sociology, 40, 312-331. http://dx.doi.org/10.1177/0092055x12446624
Drouin, M. A., (2008). The relationship between students’ perceived sense of community
and satisfaction, achievement, and retention in an online course. The Quarterly
Review of Distance Education, 9, 267-284. Retrieved from
http://www.infoagepub.com/index.php?id=89&i=2
Dykman, C. A., & Davis, C. K. (2008a). Online education forum: Part one—The shift
toward online education. Journal of Information Systems Education, 19, 11-16.
Retrieved from http://www.jise.appstate.edu/Issues/19/V19N1P11abs.pdf
Dykman, C. A., & Davis, C. K. (2008b). Online education forum: Part two—Teaching
online versus teaching conventionally. Journal of Information Systems Education,
19, 157-164. Retrieved from http://www.jise.appstate.edu/Issues/19/V19N2P157abs.pdf
Dykman, C. A., & Davis, C. K. (2008c). Online education forum: Part three—A quality
online educational experience. Journal of Information Systems Education, 19,
281-290. Retrieved from http://www.jise.appstate.edu/Issues/19/V19N3P281abs.pdf
Dziuban, C., & Moskal, P. (2011). A course is a course is a course: Factor invariance in
student evaluation of online, blended and face-to-face learning environments. The
Internet and Higher Education, 14, 236-241.
http://dx.doi.org/10.1016/j.iheduc.2011.05.003
Ekmekci, O. (2013). Being there—Establishing instructor presence in an online learning
environment. Higher Education Studies, 3, 29-38.
http://dx.doi.org/10.5539/hes.v3n1p29
Ekstrand, B. (2013). Prerequisites for persistence in distance education. Online Journal of
Distance Learning Education, 16(3). Retrieved from
http://www.westga.edu/~distance/ojdla/fall163/ekstrand164.html
Eneau, J., & Develotte, C. (2012). Working together online to enhance learner autonomy:
Analysis of learners' perceptions of their online learning experience. ReCALL,
24(1), 3-19. http://dx.doi.org/10.1017/S0958344011000267
Er, E., Ozden, M. Y., & Arifoglu, A. (2009). LIVELMS: A blended e-learning
environment: A model proposition for integration of asynchronous and
synchronous e-learning. International Journal of Learning, 16, 449-460.
Retrieved from http://ijl.cgpublisher.com/product/pub.30/prod.2066
Fahy, P. J. (2008). Characteristics of interactive online learning media. In T. Anderson
(Ed.), The theory and practice of online learning (pp. 167-199). Edmonton, AB:
Athabasca University.
Falloon, G. (2011). Making the connection: Moore's theory of transactional distance and
its relevance to the use of a virtual classroom in postgraduate online teacher
education. Journal of Research on Technology in Education, 43, 187-209.
Retrieved from http://www.iste.org/learn/publications/journals/jrte
Ferratt, T. W., & Hall, S. R. (2009). Extending the vision of distance education to
learning via virtually being there and beyond. Communications of the Association
for Information Systems, 25, 425-436. Retrieved from http://aisel.aisnet.org/cais/
Faul, F., Erdfelder, E., Buchner, A., & Lang, A.-G. (2009). Statistical power analyses
using G*Power 3.1: Tests for correlation and regression analyses. Behavior
Research Methods, 41, 1149-1160. http://dx.doi.org/10.3758/brm.41.4.1149.
Ferguson, J. M., & DeFelice, A. E. (2010). Length of online course and student
satisfaction, perceived learning, and academic performance. International Review
of Research in Open and Distance Learning, 11, 73-84. Retrieved from
http://www.irrodl.org/index.php/irrodl
Fidishun, D. (2011, March). Andragogy and technology: Integrating adult learning
theory as we teach with technology. Retrieved from
http://frank.mtsu.edu/~itconf/proceed00/ fidishun.html
Finn, D. (2011). Principles of adult learning: An ESL context. Journal of Adult
Education, 40(1), 34-39. Retrieved from http://www.mpaea.org/publications.htm
Fulmer, S. M., & Frijters, J. C. (2009). A review of self-report and alternative approaches
in the measurement of student motivation. Educational Psychology Review, 21,
219-246. http://dx.doi.org/10.1007/s10648-009-9107-x
Galbraith, D. D., & Fouch, S. E. (2007). Principles of adult learning: Application to
safety training. Professional Safety, 52(9), 35-40. Retrieved from
http://www.vista-training.com/principles-adult-learning.pdf
Garrison, D. R. (1997). Self-directed learning: Toward a comprehensive model. Adult
Education Quarterly, 48, 18-33. http://dx.doi.org/10.1177/074171369704800103
George, V. P. (2013). A communication-focused model for learning and education.
Business Education and Accreditation, 5, 117-130. Retrieved from
http://www.theibfr.com/bea.htm
Ghost Bear, A. (2012). Technology, learning, and individual differences. MPAEA
Journal of Adult Education, 41(2), 27-42. Retrieved from
https://www.mpaea.org/?page=publications
Gibbons, H. S., & Wentworth, G. P. (2001, June). Andrological and pedagogical training
differences for online instructors. Proceedings of the Distributed Learning
Association, Callaway, GA. Retrieved from
http://www.westga.edu/~distance/ojdla/fall43/gibbons wentworth43.html
Gilbert, M. J., Schiff, M., & Cunliffe, R. H. (2013). Teaching restorative justice:
Developing a restorative andragogy for face-to-face, online and hybrid course
modalities. Contemporary Justice Review, 16, 43-69.
http://dx.doi.org/10.1080/10282580.2013.769305
Gill, R. (2010). Conceptual framework for using computers to enhance employee
engagement in large offices. Human Resource Development Review, 9, 115-143.
http://dx.doi.org/10.1177/1534484309354707
Glasow, P. A. (2005). Fundamentals of survey research methodology. Maclean, VA:
MITRE. Retrieved from
http://www33.homepage.villanova.edu/edward.fierros/pdf/Glasow.pdf
Glassman, M., & Kang. M. J. (2010). Pragmatism, connectionism, and the internet: A
mind’s perfect storm. Computers in Human Behavior, 26, 1412-1418.
http://dx.doi.org/10.1016/j.chb.2010.04.019
Goddu, K. (2012). Meeting the CHALLENGE: Teaching strategies for adult learners.
Kappa Delta Pi Record, 48, 169-173.
http://dx.doi.org/10.1080/00228958.2012.734004
Gonzalez-Gomez, F., Guardiola, J., Rodriguez, O. M., & Alonso, M. A. M. (2012).
Gender differences in e-learning satisfaction. Computers & Education, 58, 283-290. http://dx.doi.org/10.1016/j.compedu.2011.08.017
Gorges, J., & Kandler, C. (2012). Adults' learning motivation: Expectancy of success,
value, and the role of affective memories. Learning and Individual Differences,
22, 610-617. http://dx.doi.org/10.1016/j.lindif.2011.09.016
Green, G., & Ballard, G. H. (2011). No substitute for experience: Transforming teacher
preparation with experiential and adult learning practices. Southeastern Regional
Association of Teacher Educators (SRATE) Journal, 20(1), 12-20. Retrieved from
ERIC database. (EJ948702)
Greener, S. L. (2010). Plasticity: The online learning environment's potential to support
varied learning styles and approaches. Campus-Wide Information Systems, 27,
254-262. http://dx.doi.org/10.1108/10650741011073798
Guilbaud, P., & Jerome-D’Emilia, B. (2008). Adult instruction & online learning:
Towards a systematic instruction framework. International Journal of Learning,
15(2), 111-121. Retrieved from
http://ijl.cgpublisher.com/product/pub.30/prod.1638
Gunawardena, C. N., Linder-VanBerschot, J. A., LaPointe, D. K., & Rao, L. (2010).
Predictors of learner satisfaction and transfer of learning in a corporate online
education program. The American Journal of Distance Education, 24(4), 207-226.
http://dx.doi.org/10.1080/08923647.2010.522919
Gupta, S., & Bostrom, R. P. (2009). Technology-mediated learning: A comprehensive
theoretical model. Journal of the Association for Information Systems (JAIS), 10,
686-714. Retrieved from http://aisel.aisnet.org/jais/vol10/iss9/1
Hair, J. F., Black, B., Babin, B., & Anderson, R. E. (2009). Multivariate data analysis
(7th ed.). Upper Saddle River, NJ: Prentice Hall.
Hannay, M., Kitahara, R., & Fretwell, C. (2010). Student-focused strategies for the
modern classroom. Journal of Instructional Pedagogies, 2, 1-16. Retrieved from
http://www.aabri.com/manuscripts/09406.pdf
Harper, L., & Ross, J. (2011). An application of Knowles' theories of adult education to
an undergraduate interdisciplinary studies degree program. Journal of Continuing
Higher Education, 59(3), 161-166.
http://dx.doi.org/10.1080/07377363.2011.614887
Hauer, J., & Quill, T. (2011). Educational needs assessment, development of learning
objectives, and choosing a teaching approach. Journal of Palliative Medicine, 14,
503-508. http://dx.doi.org/10.1089/jpm.2010.0232
Haythornthwaite, C., Bruce, B. C., Andrews, R., Kazmer, M. M., Montague, R.-A., &
Preston, C. (2007). Theories and models of and for online learning. First Monday,
12(8). Retrieved from
http://www.uic.edu/htbin/cgiwrap/bin/ojs/index.php/fm/article/view/1976/1851
Henning, T. (2012). Writing professor as adult learner: An autoethnography of online
professional development. Journal of Asynchronous Learning Networks, 16(2), 9-26. Retrieved from http://sloanconsortium.org/jaln/v16n2/writing-professor-adultlearner-autoethnography-online-professional-development
Henschke, J. A. (2008). Reflections on the experiences of learning with Dr. Malcolm
Shepherd Knowles. New Horizons in Adult Education and Human Resource
Development, 22(3-4), 44-52. http://dx.doi.org/10.1002/nha3.10316
Henschke, J. A. (2011). Considerations regarding the future of andragogy. Adult
Learning, 22(1), 34-37. http://dx.doi.org/10.1177/104515951102200109
Higher Learning Commission: A Commission of the North Central Association (HLC-NCA). (2013). The criteria for accreditation and core components. Retrieved from
http://www.ncahlc.org/Information-for-Institutions/criteria-and-corecomponents.html
Ho, L.-A., & Kuo, T.-H. (2010). How can one amplify the effect of e-learning? An
examination of high-tech employees' computer attitude and flow experience.
Computers in Human Behavior, 26(2), 23-31.
http://dx.doi.org/10.1016/j.chb.2009.07.007
Hodge, P., Wright, S., Barraket, J., Scott, M., Melville, R., & Richardson, S. (2011).
Revisiting 'how we learn' in academia: Practice-based learning exchanges in three
Australian universities. Studies in Higher Education, 36, 167-183.
http://dx.doi.org/10.1080/03075070903501895
Hoic-Bozic, N., Mornar, V., & Boticki, I. (2009). A blended learning approach to course
design and implementation. IEEE Transactions on Education, 52(1), 19-30.
http://dx.doi.org/10.1109/GTE.2007.914945
Holbert, K. E., & Karady, G. G. (2009). Strategies, challenges and prospects for active
learning in the computer-based classroom. IEEE Transaction on Education, 52(1),
31-38. http://dx.doi.org/10.1109/te.2008.917188
Holmberg, B. (1989). The evolution, principles and practices of distance education. In U.
Bernath, F. W. Busch, D. Garz, A. Hanft, T. Hulsmann, B. Moschner, . . . O.
Zawacki-Richter (Eds.), Studien und Berichte det Arbeitsselle
Fernstudienforschung der Carl von Ossietzky Universitat Oldenburg (Vol. 11).
Oldenburg, Germany: BIS-Verlag der Carl von Ossietzky Universitat.
Holton, E., Wilson, L., & Bates, R. A. (2009). Toward development of a generalized
instrument to measure andragogy. Human Resource Development Quarterly,
20(2), 169-193. http://dx.doi.org/10.1002/hrdq.20014
Hoskins, B. J. (2012). Connections, engagement, and presence. Journal of Continuing
Higher Education, 60(1), 51-53. http://dx.doi.org/10.1080/07377363.2012.650573
Hrastinski, S. (2008). The potential of synchronous communication to enhance
participation in online discussions: A case study of two e-learning courses.
Information and Management, 45, 499-506.
http://dx.doi.org/10.1016/j.im.2008.07.005
Hrastinski, S., & Jaldemark, J. (2012). How and why do students of higher education
participate in online seminars? Education and Information Technologies, 17, 253-271. http://dx.doi.org/10.1007/s10639-011-9155-y
Hsieh, P.-A. J., & Cho, V. (2011). Comparing e-learning tools' success: The case of
instructor-student interactive vs. self-paced tools. Computers & Education, 57,
2025-2038. http://dx.doi.org/10.1016/j.compedu.2011.05.002
Hsu, Y., & Shiue, Y. (2005). The effect of self-directed learning readiness on
achievement comparing face-to-face and two-way distance learning instruction.
International Journal of Instructional Media, 32, 143-156. Retrieved from
http://www.adprima.com
Huang, E. Y., Lin, S. W., & Huang, T. K. (2012). What type of learning style leads to
online participation in the mixed-mode e-learning environment? A study of
software usage instruction. Computers & Education, 58(1), 338-349.
http://dx.doi.org/10.1016/j.compedu.2011.08.003
Hughes, B. J., & Berry, D. C. (2011). Self-directed learning and the millennial athletic
training student. Athletic Training Education Journal, 6(1), 46-50. Retrieved from
http://nataej.org/6.1/0601-046050.pdf
Hurtado, C., & Guerrero, L. A. (2009). A PDA-based collaborative tool for learning
chemistry skills. Proceedings of the 13th international conference on computer
supported cooperative work in design. CSCWD’09, Santiago, Chile, 378-383.
http://dx.doi.org/10.1109/cscwd.2009.4968088
Hussain, I. (2013). A study of learners' reflection on andragogical skills of distance
education tutors. International Journal of Instruction, 6(1), 123-138. Retrieved
from ERIC Database. (ED539907)
Ismail, I., Idrus, R. M., Baharum, H., Rosli, M., & Ziden, A. (2011). The learners'
attitudes towards using different learning methods in e‐learning portal
environment. International Journal of Emerging Technologies in Learning, 6(3),
49‐52. Retrieved from http://www.online-journals.org/i-jet
Ismail, I., Gunasegaran, G., & Idrus, R. M. (2010). Does e-learning portal add value to
adult learners? Current Research Journal of Social Sciences, 2, 276-281.
Retrieved from http://maxwellsci.com/print/crjss/v2-276-281.pdf
Jackson, L. C., Jones, S. J., & Rodriguez, R. C. (2010). Faculty actions that result in
student satisfaction in online courses. Journal of Asynchronous Learning
Networks, 14(4), 78-96. Retrieved from
http://jaln.sloanconsortium.org/index.php/jaln
Jeffrey, L. M. (2009). Learning orientations: Diversity in higher education. Learning and
Individual Differences, 19, 195-208.
http://dx.doi.org/10.1016/j.lindif.2008.09.004
Johnson, T., Wisniewski, M., Kuhlemeyer, G., Isaacs, G., & Krzykowski, J. (2012).
Technology adoption in higher education: Overcoming anxiety through faculty
bootcamp. Journal of Asynchronous Learning Networks, 16, 63-72. Retrieved
from http://sloanconsortium.org/jaln/v16n2/technology-adoption-highereducation-overcoming-anxiety-through-faculty-bootcamp
Joo, Y. J., Lim, K. Y., & Kim, E. K. (2011). Online university students' satisfaction and
persistence: Examining perceived level of presence, usefulness and ease of use as
predictors in a structural model. Computers & Education, 57, 1654-1664.
http://dx.doi.org/10.1016/j.compedu.2011.02.008
Kaliski, J. A., Booker, Q. E., & Schumann, P. L. (2012). An architecture for dynamic e-learning environment based on student activity and learning styles. Business
Education and Accreditation, 4, 113-124. Retrieved from
http://www.theibfr.com/bea.htm
Kalyuga, S. (2011). Informing: A cognitive load perspective. Informing Science: The
International Journal of an Emerging Transdiscipline, 14, 33-45. Retrieved from
http://www.inform.nu/Articles/Vol14/ISJv14p033-045Kalyuga586.pdf
Karge, B. D., Phillips, K. M., Dodson, T. J., & McCabe, M. (2011). Effective strategies
for engaging adult learners. Journal of College Teaching and Learning, 8(12), 53-56. Retrieved from
http://journals.cluteonline.com/index.php/TLC/article/view/6621
Kash, S., & Dessinger, J. C. (2010). Paulo Freire's relevance to online instruction and
performance improvement. Performance Improvement, 49(2), 17-21.
http://dx.doi.org/10.1002/pfi.20129
Ke, F. (2010). Examining online teaching, cognitive, and social presence for adult
students. Computers & Education, 55, 808-820.
http://dx.doi.org/10.1016/j.compedu.2010.03.013
Ke. F., & Carr-Chellman, A. (2006). Solitary learner in online collaborative learning: A
disappointing experience? The Quarterly Review of Distance Education, 7, 249-265. Retrieved from http://www.editlib.org/p/106766
Ke, F., & Hoadley, C. (2009). Evaluating online learning communities. Educational
Technology Research & Development, 57(1), 487-510.
http://dx.doi.org/10.1007/s11423-009-9120-2
Ke, F., & Kwak, D. (2013). Online learning across ethnicity and age: A study on learning
interaction, participation, perceptions, and learning satisfaction. Computers and
Education, 61, 43-51. http://dx.doi.org/10.1016/j.compedu.2012.09.003
Ke, F., & Xie, K. (2009). Toward deep learning for adult students in online courses.
Internet and Higher Education, 12, 136-145.
http://dx.doi.org/10.1016/j.iheduc.2009.08.001
Kear, K., Chetwynd, F., Williams, J., & Donelan, H. (2012). Web conferencing for
synchronous online tutorials: Perspectives of tutors using a new medium.
Computers & Education, 58, 953-963.
http://dx.doi.org/10.1016/j.compedu.2011.10.015
Keengwe, J., & Georgina, D. (2012). The digital course training workshop for online
learning and teaching. Educational and Information Technologies, 17, 365-379.
http://dx.doi.org/10.1007/s10639-011-9164-x
Kellogg, D. L., & Smith, M. A. (2009). Student-to-student interaction revisited: A case
study of working adult business students in online courses. Decision Sciences
Journal of Innovative Education, 7(2), 433-456. http://dx.doi.org/10.1111/j.1540-4609.2009.00224.x
Kember, D. (1989a). A longitudinal-process model of dropout from distance education.
Journal of Higher Education, 60, 278-301. Retrieved from
http://www.ashe.ws/?page=186
Kember, D. (1989b). An illustration, with case studies, of a linear-process model of dropout from distance education. Distance Education, 10, 196-211.
http://dx.doi.org/10.1080/0158791890100205
Kenner, C., & Weinerman, J. (2011). Adult learning theory: Applications to
nontraditional college students. Journal of College Reading and Learning, 41(2),
87-96. Retrieved from http://www.crla.net/journal.htm
Khowaja, S., Ghufran, S., & Ahsan, M. J. (2011). Estimation of population means in
multivariate stratified random sampling. Communications in Statistics: Simulation
and Computation, 40, 710-718. http://dx.doi.org/10.1080/03610918.2010.551014
Kiener, M. (2010). Examining college teaching: A coaching perspective. Rehabilitation
Education, 24(1-2), 69-74. http://dx.doi.org/10.1891/088970110805029840
Kiliç-Çakmak, E. (2010). Learning strategies and motivational factors predicting
information literacy self-efficacy of e-learners. Australasian Journal of
Educational Technology, 26(2), 192-208. Retrieved from ERIC Database.
(EJ886194)
Kim, K., & Frick, T. W. (2011). Changes in student motivation during online learning.
Journal of Educational Computing Research, 44(1), 1-23.
http://dx.doi.org/10.2190/ec.44.1.a
Kirschner, P. A., Sweller, J., & Clark, R. E. (2006). Why minimal guidance during
instruction does not work: An analysis of the failure of constructivist, discovery,
problem-based, experiential, and inquiry-based teaching. Educational
Psychologist, 41, 75-86. http://dx.doi.org/10.1207/s15326985ep4102_1
Kistler, M. J. (2011). Adult learners: Considerations for education and training.
Techniques: Connecting Education and Careers, 86(2), 28-30. Retrieved from
ERIC Database. (ED926047)
Knowles, M. S. (1973, 1990). The adult learner: A neglected species (4th ed.). Houston,
TX: Gulf Publishing.
Knowles, M. S. (1975). Self-directed learning: A guide for learners and teachers.
Englewood Cliffs, NJ: Prentice Hall/Cambridge.
Knowles, M. S. (1980). The modern practice of adult education; Andragogy versus
pedagogy. Englewood Cliffs, NJ: Prentice Hall/Cambridge.
Knowles, M. S. (1984). Andragogy in action: Applying modern principles of adult
learning. San Francisco, CA: Jossey-Bass.
Knowles, M. S. (1995). Designs for adult learning: Practical resources, exercises, and
course outlines from the father of adult learning. Alexandria, VA: American
Society for Training and Development.
Knowles, M. S., Holton, E. F. III, & Swanson, R. A. (2005). The adult learner: The
definitive classic in adult education and human resource development (6th ed.).
London, UK: Elsevier.
Kobsiripat, W., Kidrakarn, P., & Ruangsuwan, C. (2011). The development of self-directed learning by using SDL e-training system. European Journal of Social
Science, 21, 556-562. Retrieved from
http://www.europeanjournalofsocialsciences.com/
Korr, J., Derwin, E., Greene, K., & Sokoloff, W. (2012). Transitioning an adult-serving
university to a blended learning model. Journal of Continuing Higher Education,
60(1), 2-11. http://dx.doi.org/10.1080/07377363.2012.649123
Kozub, R. M. (2010). An ANOVA analysis of the relationships between business
students' learning styles and effectiveness of web based instruction. American
Journal of Business Education, 3(3), 89-98. Retrieved from
http://journals.cluteonline.com/index.php/AJBE
Kroth, M., Taylor, B., Lindner, L., & Yopp, M. (2009). Improving your teaching using
synergistic andragogy. Journal of Industrial Teacher Education, 46, 135-141.
Retrieved from http://scholar.lib.vt.edu/ejournals/JITE/v46n3/pdf/kroth.pdf
Kuleshov, G. G. (2008). Computerized education: What is behind the attractive curtain?
In M. Iskander (Ed.), Innovative techniques in instruction technology, e-learning,
e-assessment, and education (pp. 123-128). London, England: Springer
Science+Business Media.
Kupczynski, L., Gibson, A. M., Ice, P., Richardson, J., & Challoo, L. (2011). The impact
of frequency on achievement in online courses: A study from a south Texas
University. Journal of Interactive Online Learning, 10(3), 141-149. Retrieved
from http://www.ncolr.org/jiol
Lam, P., & Bordia, S. (2008). Factors affecting student choice of e-learning over
traditional learning: Student and teacher perspectives. The International Journal
of Learning, 14(12), 131-139. Retrieved from
http://ijl.cgpublisher.com/product/pub.30/prod.1585
Lapsley, R., Kulik, B., Moody, R., & Arbaugh, J. B. (2008). Is identical really identical?
An investigation of equivalency theory and online learning. Journal of Educators
Online, 5(1), 1-19. Retrieved from http://www.thejeo.com/
Lassibille, G. (2011). Student progress in higher education: What we have learned from
large-scale studies. The Open Education Journal, 4, 1-8.
http://dx.doi.org/10.2174/1874920801104010001
Lassibille, G., & Navarro Gómez, L. (2008). Why do higher education students drop out?
Evidence from Spain. Educational Economics, 16, 89-105. http://dx.doi.org/
10.1080/09645290701523267
Law, K. M. Y., Lee, V. C. S., & Yu, Y. T. (2010). Learning motivation in e-learning
facilitated computer programming courses. Computers and Education, 55, 218-228. http://dx.doi.org/10.1016/j.compedu.2010.01.007
Lear, J. L., Ansorge, C., & Steckelberg, A. (2010). Interactivity/community process
model for the online education environment. MERLOT Journal of Online
Learning and Training, 6(1), 71-77. Retrieved from
http://jolt.merlot.org/vol6no1/lear_0310.htm
Lee, D., Redmond, J. A., & Dolan, D. (2008). Lessons from the e-learning experience in
South Korea in traditional universities. In M. Iskander (Ed.), Innovative
techniques in instruction technology, e-learning, e-assessment, and education (pp.
216-222). London, England: Springer Science+Business Media.
Lee, J.-W. (2010). Online support service quality, online learning acceptance, and student
satisfaction. Internet and Higher Education, 13, 277-283.
http://dx.doi.org/10.1016/j.iheduc.2010.08.002
Lee, S. J., Srinivasan, S., Trail, T., Lewis, D., & Lopez, S. (2011). Examining the
relationship among student perception of support, course satisfaction, and
learning outcomes in online learning. Internet and Higher Education, 14, 158-163. http://dx.doi.org/10.1016/j.iheduc.2011.04.001
Lee, Y., & Choi, J. (2011). A review of online course dropout research: Implication for
practice and future research. Educational Technology Research and
Development, 59, 593-618. http://dx.doi.org/10.1007/s11423-010-9177-y
Lee, Y., Choi, J., & Kim, T. (2013). Discriminating factors between completers of and
dropouts from online learning courses. British Journal of Educational
Technology, 44, 328-337. http://dx.doi.org/10.1111/j.1467-8535.2012.01306.x
Levy, P. S. (2013). Sampling of populations: Methods and applications. Hoboken, NJ:
Wiley & Sons.
Levy, Y. (2007). Comparing dropouts and persistence in e-learning courses. Computers
& Education, 48, 185-204. http://dx.doi.org/10.1016/j.compedu.2004.12.004
Li, S. D. (2012). Testing mediation using multiple regression and structural equation
modeling analyses in secondary data. Evaluation Review, 35, 240-268.
http://dx.doi.org/10.1177/0193841X11412069
Licht, M. H. (1995). Multiple regression and correlation. In L. G. Grimm & P. R.
Yarnold (Eds.), Reading and understanding multivariate statistics (pp. 19-64).
Washington, DC: American Psychological Association.
Lint, A. H. (2013). Academic persistence of online students in higher education impacted
by student progress factors and social media. Online Journal of Distance
Learning Administration, 16(4). Retrieved from
http://www.westga.edu/~distance/ojdla/winter164/lint164.html
Liu, X., Liu, S., Lee, S.-H., & Magjuka, R. J. (2010). Cultural differences in online
learning: International student perceptions. Educational Technology & Society,
13(3), 177-188. Retrieved from http://www.ifets.info/journals/13_3/16.pdf
Lo, C. C. (2010). How student satisfaction factors affect perceived learning. Journal of
the Scholarship of Teaching and Learning, 10(1), 47-54. Retrieved from
http://josotl.indiana.edu/
Lo, M.-C., Ramayah, T., & Hong, T. C. (2011). Modeling user satisfaction in e-learning:
A supplementary tool to enhance learning. Review of Business Research, 11, 128-133. Retrieved from http://www.freepatentsonline.com/article/Review-Business-Research/272616353.html
London, M., & Hall, M. J. (2011). Unlocking the value of web 2.0 technologies for
training and development: The shift from instructor-controlled, adaptive learning
to learner-driven, generative learning. Human Resource Management, 50, 757-775.
http://dx.doi.org/10.1002/hrm.20455
Long, H., Hiemstra, R., & Associates (1980). Changing approach to studying adult
education. San Francisco, CA: Jossey-Bass.
Lu, H., & Chiou, M. (2010). The impact of individual differences on e-learning system
satisfaction: A contingency approach. British Journal of Educational Technology,
41, 307-323. http://dx.doi.org/10.1111/j.1467-8535.2009.00937.x
Lykourentzou, I., Giannoukos, I., Mpardis, G., Nikolopoulos, V., & Loumos, V. (2009).
Early and dynamic student achievement prediction in e-learning courses using
neural networks. Journal of the American Society for Information Science and
Technology, 60, 372-380. http://dx.doi.org/10.1002/asi.20970
Lykourentzou, I., Giannoukos, I., Nikolopoulos, V., Mpardis, G., & Loumos, V. (2009).
Dropout prediction in e-learning courses through the combination of machine
learning techniques. Computers and Education, 53, 950-965.
http://dx.doi.org/10.1016/j.compedu.2009.05.010
MacLean, P., & Scott, B. (2011). Competencies for learning design: A review of the
literature and a proposed framework. British Journal of Educational Technology,
42, 557-572. http://dx.doi.org/10.1111/j.1467-8535.2010.01090.x
Mahle, M. (2011). Effects of interactivity on student achievement and motivation in
distance education. Quarterly Review of Distance Education, 12, 207-215.
Retrieved from http://www.infoagepub.com/index.php?id=89&i=142
Major, C. H. (2010). Do virtual professors dream of electric students? University faculty
experiences with online distance education. Teachers College Record: The Voice
of Scholarship in Education, 112, 2154-2208. Retrieved from
http://www.tcrecord.org/
Malik, S. K., & Khurshed, F. (2011). Nature of teacher-students’ interaction in electronic
learning and traditional courses of higher education – a review. Turkish Online
Journal of Distance Education (TOJDE), 12, 157-166. Retrieved from
https://tojde.anadolu.edu.tr/
Mancuso, D. S., Chlup, D. T., & McWhorter, R. R. (2010). A study of adult learning in a
virtual world. Advances in Developing Human Resources, 12, 681-699.
http://dx.doi.org/10.1177/1523422310395368
Marques, J. (2012). The dynamics of accelerated learning. Business Education and
Accreditation, 4(1), 101-112. Retrieved from http://www.theibfr.com/bea.htm
Marschall, S., & Davis, C. (2012). A conceptual framework for teaching critical reading
to adult college students. Adult Learning, 23, 63-68.
http://dx.doi.org/10.1177/1045159512444265
Martinez-Caro, E. (2011). Factors affecting effectiveness in e-learning: An analysis in
production management courses. Computer Applications in Engineering
Education, 19, 572-581. http://dx.doi.org/10.1002/cae.20337
Martinez-Torres, M. R., Toral, S. L., & Barrero, F. (2011). Identification of the design
variables of elearning tools. Interacting With Computers, 23, 279-288.
http://dx.doi.org/10.1016/j.intcom.2011.04.004
McGlone, J. R. (2011). Adult learning styles and on-line educational preference.
Research in Higher Education Journal, 12, 1-9. Retrieved from
http://www.aabri.com/rhej.html
McGrath, V. (2009). Reviewing the evidence on how adult students learn: An
examination of Knowles' model of andragogy. Adult Learner: The Irish Journal
of Adult and Community Education, 99-110. Retrieved from
http://www.aontas.com/download/pdf/adult_learner_2009.pdf
Means, B., Toyama, Y., Murphy, R., Bakia, M., & Jones, K. (2009). Evaluation of
evidence-based practices in online learning: A meta-analysis and review of online
learning studies. Washington, DC: U.S. Department of Education, Office of
Planning, Evaluation, and Policy Development.
Mehta, A., Clayton, H., & Sankar, C. S. (2007). Impact of multi-media case studies on
improving intrinsic learning motivation of students. Journal of Educational
Technology Systems, 36, 79-103. http://dx.doi.org/10.2190/ET.36.1.f
Merriam, S. B., Caffarella, R. S., & Baumgartner, L. (2007). Learning in adulthood: A
comprehensive guide (3rd ed.). San Francisco, CA: Jossey-Bass.
Mezirow, J. (1997). Transformative learning: Theory to practice. New Directions for
Adult and Continuing Education, 74, 5-12. Retrieved from
http://www.dlc.riversideinnovationcentre.co.uk/wp-content/uploads/2012/10/Transformative-Learning-Mezirow-1997.pdf
Mezirow, J. (2000). Learning as transformation: Critical perspectives on a theory in
progress. San Francisco, CA: Jossey-Bass.
Michinov, N., Brunot, S., Le Bohec, O., Juhel, J., & Delaval, M. (2011). Procrastination,
participation, and performance in online learning environments. Computers and
Education, 56, 243-252. http://dx.doi.org/10.1016/j.compedu.2010.07.025
Miles, J., & Shevlin, M. (2001). Applying regression & correlation: A guide for students
and researchers. Los Angeles, CA: Sage.
Milheim, K. L. (2011). The role of adult education philosophy in facilitating the online
classroom. Adult Learning, 22(2), 24-31.
http://dx.doi.org/10.1177/104515951102200204
Minter, R. L. (2011). The learning theory jungle. Journal of College Teaching and
Learning, 8(6), 7-15. Retrieved from
http://journals.cluteonline.com/index.php/TLC/article/view/4278/4365
Moisey, S. D., & Hughes, J. A. (2008). Supporting the online learner. In T. Anderson
(Ed.), The theory and practice of online learning (pp. 419-439). Edmonton, AB:
Athabasca University.
Moore, K. (2010). The three-part harmony of adult learning, critical thinking, and
decision-making. Journal of Adult Education, 39(1), 1-10. Retrieved from
https://www.mpaea.org/docs/pdf/Vol39No12010.pdf
Moore, M. G. (1989). Editorial: Three types of interaction. The American Journal of
Distance Education, 3(2), 1-6. Retrieved from
http://aris.teluq.uquebec.ca/portals/598/t3_moore1989.pdf
Morrow, J., & Ackermann, M. E. (2012). Intention to persist and retention of first-year
students: The importance of motivation and sense of belonging. College Student
Journal, 46(3), 483-491. Retrieved from
http://www.projectinnovation.com/College_Student_Journal.html
Muilenburg, L. Y., & Berge, Z. L. (2005). Student barriers to online learning: A factor
analytic study. Distance Education, 26(1), 29-48.
http://dx.doi.org/10.1080/01587910500081269
Muirhead, B. (2004). Contemporary online education challenges. International Journal
of Instructional Technology & Distance Learning (ITDL), 1(10). Retrieved from
http://itdl.org/journal/oct_04/article05.htm
Muniz-Solari, O., & Coats, C. (2009). Integrated networks: National and international
online experiences. International Review of Research in Open and Distance
Learning, 10(1), 1-19. http://dx.doi.org/10.1016/j.ejor. 2007.11.053
Muth, B. (2008). Radical conversations: Part one social-constructivist methods in the
ABE classroom. Journal of Correctional Education, 59, 261-281. Retrieved from
http://coe.csusb.edu/programs/correctionalEd/documents/RadicalConversationsIFi
nal.pdf
Nagel, S. L., Maniam, B., & Leavell, H. (2011). Pros and cons of online education for
educators and students. International Journal of Business Research, 11(6), 136-142. Retrieved from http://www.freepatentsonline.com/article/International-Journal-Business-Research/272485035.html
Nandeshwar, A., Menzies, T., & Nelson, A. (2011). Learning patterns of university study
retention. Expert Systems with Applications, 38, 14984-14996.
http://dx.doi.org/10.1016/j.eswa.2011.05.048
Newbold, J. J., Mehta, S. S., & Forbus, P. (2010). A comparative study between nontraditional and traditional students in terms of their demographics, attitudes,
behavior, and educational performance. International Journal of Education
Research, 5, 1-25. Retrieved from
http://www.thefreelibrary.com/A+comparative+study+between+nontraditional+and+traditional+students...-a0222252932
Newell, C. C. (2007). Learner characteristics as predictors of online course completion
among nontraditional technical college students (Doctoral dissertation).
University of Georgia. Retrieved from
http://athenaeum.libs.uga.edu/bitstream/handle/10724/9793/newell_chandler_c_2
00705_edd.pdf?sequence=1
Nichols, A. J., & Levy, Y. (2009). Empirical assessment of college student-athletes’
persistence in e-learning courses: A case study of a U.S. National Association of
Intercollegiate Athletics (NAIA) institution. Internet and Higher Education, 12,
14-25. http://dx.doi.org/10.1016/j.iheduc.2008.10.003
Nikolaki, E., & Koutsouba, M. I. (2012). Support and promotion of self-regulated
learning through the educational material at the Hellenic Open University.
Turkish Online Journal of Distance Education, 13, 226-238. Retrieved from
ERIC Database. (EJ997819)
North Central Association of Colleges and Schools (NCA). (n.d.). Online colleges in
Missouri (MO): Finding accredited online schools. Retrieved from
http://www.onlinecolleges.net/missouri/
Nummenmaa, M., & Nummenmaa, L. (2008). University students' emotions, interest and
activities in a web-based learning environment. British Journal of Educational
Psychology, 78, 163-178. http://dx.doi.org/10.1348/000709907X203733
Nulty, D. D. (2008). The adequacy of response rates to online and paper surveys: What
can be done? Assessment and Evaluation in Higher Education, 33, 301-314.
http://dx.doi.org/10.1080/02602930701293231
Nunnally, J. (1978). Psychometric theory. New York, NY: McGraw-Hill.
O'Bannon, T., & McFadden, C. (2008). Model of experiential andragogy: Development
of a non-traditional experiential learning program model. Journal of
Unconventional Parks, Tourism & Recreation Research, 1(1), 23-28. Retrieved
from http://juptrr.asp.radford.edu/Volume_1/Experiential_Andragogy.pdf
Oblinger, D. G., & Hawkins, B. L. (2006). IT myths: The myth about no significant
difference. EDUCAUSE Review, 14-15. Retrieved from
http://net.educause.edu/ir/library/pdf/erm0667.pdf
Omar, A., Kalulu, D., & Belmasrour, R. (2011). Enhanced instruction: The future of
e-learning. International Journal of Education Research, 6(1), 21-37. Retrieved
from http://www.journals.elsevier.com/international-journal-of-educational-research/
Oncu, S., & Cakir, H. (2011). Research in online learning environments: Priorities and
methodologies. Computers & Education, 57, 1098-1108.
http://dx.doi.org/10.1016/j.compedu.2010.12.009
Orvis, K. A., Horn, D. B., & Belanich, J. (2008). The roles of task difficulty and prior
videogame experience on performance and motivation in instructional
videogames. Computers in Human Behavior, 24, 2415-2433.
http://dx.doi.org/10.1016/j.chb.2008.02.016
O'Toole, S., & Essex, B. (2012). The adult learner may really be a neglected species.
Australian Journal of Adult Learning, 52(1), 183-191. Retrieved from
http://www.ala.asn.au
Paas, F., & Sweller, J. (2012). An evolutionary upgrade of cognitive load theory: Using
the human motor system and collaboration to support the learning of complex
cognitive tasks. Educational Psychology Review, 24(1), 27-45.
http://dx.doi.org/10.1007/s10648-011-9179-2
Paechter, M., Maier, B., & Macher, D. (2010). Students’ expectations of, and experiences
in e-learning: Their relation to learning achievements and course satisfaction.
Computers and Education, 54, 222-229.
http://dx.doi.org/10.1016/j.compedu.2009.08.005
Packham, G., Jones, G., Miller, C., & Thomas, B. (2004). E-learning and retention: Key
factors influencing student withdrawal. Education and Training, 46, 335-342.
http://dx.doi.org/10.1108/00400910410555240
Park, J.-H., & Choi, H. J. (2009). Factors influencing adult learners’ decision to drop out
or persist in online learning. Journal of Educational Technology & Society, 12(4),
207-217. Retrieved from http://www.ifets.info/journals/12_4/18.pdf
Patterson, B., & McFadden, C. (2009). Attrition in online and campus degree programs.
Online Journal of Distance Learning Administration, 12(2). Retrieved from
http://www.westga.edu/~distance/ojdla/summer122/patterson112.html
Pelz, B. (2010). (My) three principles of effective online pedagogy. Journal of
Asynchronous Learning Networks, 14(1), 103-116. Retrieved from
http://sloanconsortium.org/publications/jaln_main
Perl, P., Greely, J. Z., & Gray, M. M. (2006). What proportion of adult Hispanics are
Catholic? A review of survey data and methodology. Journal for the Scientific
Study of Religion, 45, 419-436. http://dx.doi.org/10.1111/j.1468-5906.2006.00316.x
Phelan, L. (2012). Interrogating students' perceptions of their online learning experiences
with Brookfield's critical incident questionnaire. Distance Education, 33(1), 31-44. http://dx.doi.org/10.1080/01587919.2012.667958
Picciano, A. G., Seaman, J., & Allen, I. E. (2010). Educational transformation through
online learning: To be or not to be. Journal of Asynchronous Learning Networks,
14(4), 17-35. Retrieved from
http://sloanconsortium.org/sites/default/files/2_jaln14-4_picciano_0.pdf
Pigliapoco, E. E., & Bogliolo, A. A. (2008). The effects of psychological sense of
community in online and face-to-face academic courses. International Journal of
Emerging Technologies in Learning, 3(4), 60-69. Retrieved from
http://www.online-journals.org/i-jet
Pih-Shuw, C., & Jin-Ton, C. (2012). The relations between learner motivation and
satisfaction with aspects of management training. International Journal of
Management, 29(2), 545-561. Retrieved from
http://www.internationaljournalofmanagement.co.uk/
Potter, S. L., & Rockinson-Szapkiw, A. J. (2012). Technology integration for
instructional improvement: The impact of professional development. Performance
Improvement, 51(2), 22-27. http://dx.doi.org/10.1002/phi.21246
Rachal, J. R. (2002). Andragogy’s detectives: A critique of the present and a proposal for
the future. Adult Education Quarterly, 53, 210-227.
http://dx.doi.org/10.1177/0741713602052003004
Radović-Marković, M. (2010). Advantages and disadvantages of e-learning in
comparison to traditional forms of learning. Annals of the University of Petroşani,
Economics, 10, 289-298. Retrieved from
http://upet.ro/annals/economics/pdf/2010/20100227.pdf
Rakap, S. (2010). Impacts of learning styles and computer skills on adult students' learning
online. Turkish Online Journal of Educational Technology, 9, 108-115. Retrieved
from http://www.tojet.net/articles/v9i2/9212.pdf
Reushle, S. (2006, October). A framework for designing higher education e-learning
environments. Paper presented at the World Conference on E-Learning in
Corporate, Government, Healthcare and Higher Education (e-Learn), Honolulu,
Hawaii. Retrieved from
http://www.academia.edu/292305/A_Framework_for_Designing_Higher_Educati
on_E-Learning_Environments
Reushle, S., & Mitchell, M. (2009). Sharing the journey of facilitator and learner: Online
pedagogy in practice. Journal of Learning Design, 3(1), 11-20. Retrieved from
ERIC database. (EJ903915)
Revere, L., Decker, P., & Hill, R. (2012). Assessing learning outcomes beyond
knowledge attainment. Business Education Innovation Journal, 4(1), 72-79.
Retrieved from http://www.beijournal.com/home.html
Revere, L., & Kovach, J. V. (2011). Online technologies for engaged learning: A
meaningful synthesis for educators. The Quarterly Review of Distance Education,
12(2), 113-124. Retrieved from http://www.infoagepub.com/quarterly-review-of-distance-education.html
Rey, G. D., & Buchwald, F. (2011). The expertise reversal effect: Cognitive load and
motivational explanations. Journal of Experimental Psychology – Applied, 17(1),
33-48. http://dx.doi.org/10.1037/a0022243
Rhode, J. F. (2009). Interaction equivalency in self-paced online learning environments:
An exploration of learner preferences. The International Review of Research in
Open and Distance Learning, 10(1). Retrieved from
http://files.eric.ed.gov/fulltext/EJ831712.pdf
Rodrigues, K. J. (2012). It does matter how we teach math. Journal of Adult Education,
41(1), 29-33. Retrieved from https://www.mpaea.org/?page=publications
Ross-Gordon, J. M. (2011). Research on adult learners: Supporting the needs of a student
population that is no longer nontraditional. Peer Review, 13, 26-29. Retrieved
from http://www.aacu.org/peerreview/pr-wi11/prwi11_RossGordon.cfm
Rovai, A. P. (2003). In search of higher persistence rates in distance education online
programs. Internet and Higher Education, 6, 1-16.
http://dx.doi.org/10.1016/S1096-7516(02)00158-6
Rovai, A. P., Ponton, M. K., Wighting, M. J., & Baker, J. D. (2007). A comparative
analysis of student motivation in traditional classroom and e-learning courses.
International Journal on E-Learning, 6(3), 413-432. Retrieved from
http://www.aace.org/pubs/ijel/
Ruey, S. (2010). A case study of constructivist instructional strategies for adult online
learning. British Journal of Educational Technology, 41(5), 706-720.
http://dx.doi.org/10.1111/j.1467-8535.2009.00965.x
Russ, C. L., Mitchell, G. W., & Durham, S. K. (2010). Components that affect success in
distance learning as perceived by career and technical educators. Business
Education Innovation Journal, 2, 73-79. Retrieved from
http://www.beijournal.com/
Ryan, A. B., Connolly, B., Grummell, B., & Finnegan, F. (2009). Beyond redemption?
Locating the experience of adult learners and educators. Adult Learner: The Irish
Journal of Adult and Community Education, 129-133. Retrieved from
http://www.aontas.com/pubsandlinks/publications/the-adult-learner-2009/
Sandlin, J. A. (2005). Andragogy and its discontents: An analysis of andragogy from
three critical perspectives. Pennsylvania Association of Adult Continuing
Education (PAACE) Journal of Lifelong Learning, 14, 25-42. Retrieved from
http://www.iup.edu/page.aspx?id=17489
Särndal, C.-E., Swensson, B., & Wretman, J. (2003). Model assisted survey sampling.
New York, NY: Springer.
Savery, J. R. (2010). Be VOCAL: Characteristics of successful online instructors.
Journal of Interactive Online Learning, 9, 141-152. Retrieved from
http://www.ncolr.org/jiol/issues/pdf/4.2.6.pdf
Sax, L. J., Gilmartin, S. K., & Bryant, A. N. (2003). Assessing response rates and
nonresponse bias in Web and paper surveys. Research in Higher Education, 44,
409-432. http://dx.doi.org/10.1023/A:1024232915870
Scanlon, L. (2009). Identifying supporters and distracters in the segmented world of the
adult learner. Studies in Continuing Education, 31(1), 29-43.
http://dx.doi.org/10.1080/01580370902741878
Schnotz, W., Fries, S., & Horz, H. (2009). Motivational aspects of cognitive load theory.
In M. Wosnitza, S. A. Karabenick, A. Efklides, & P. Nenniger (Eds.),
Contemporary motivation research: From global to local perspectives (pp. 69-96). New York, NY: Hogrefe & Huber.
Schultz, R. B. (2012). A critical examination of the teaching methodologies pertaining to
distance learning in geographic education: Andragogy in an adult online
certificate program. Review of International Geographical Education Online, 2,
45-60. Retrieved from http://www.rigeo.org/
Sharples, M., Taylor, J., & Vavoula, G. (2007). A theory of learning for the mobile age.
In R. Andrews, & C. Haythornthwaite (Eds.), The Sage handbook of e-learning
research (pp. 219-247). Los Angeles, CA: Sage.
Shea, P., & Bidjerano, T. (2010). Learning presence: Towards a theory of self-efficacy,
self-regulation, and the development of communities of inquiry in online and
blended learning environments. Computers and Education, 55, 1721-1731.
http://dx.doi.org/10.1016/j.compedu.2010.07.017
Shea, P., Fredericksen, E., & Pickett, A. (2006). Student satisfaction and perceived
learning with on-line courses: Principles and examples from the SUNY learning
network. Journal of Asynchronous Learning Networks, 4(2), 2-31. Retrieved from
http://sloanconsortium.org/publications/jaln_main
Shih, M., Feng, J., & Tsai, C.-C. (2008). Research and trends in the field of e-learning
from 2001 to 2005: A content analysis of cognitive studies in selected journals.
Computers and Education, 51, 955-967.
http://dx.doi.org/10.1016/j.compedu.2007.10.004
Shinsky, E. J., & Stevens, H. A. (2011). Teaching in educational leadership using Web
2.0 applications: Perspectives on what works. Peer Reviewed Articles, 8.
Retrieved from
http://scholarworks.gvsu.edu/cgi/viewcontent.cgi?article=1003&context=coe_arti
cles
Simonson, M., Schlosser, C., & Hanson, D. (1999). Theory and distance education: A
new discussion. American Journal of Distance Education, 13(1), 60-75.
http://dx.doi.org/10.1080/08923649909527014
Sims, R. (2008). Rethinking e-learning: A manifesto for connected generations. Distance
Education, 29, 153-164. http://dx.doi.org/10.1080/01587910802154954
Sinclair, A. (2009). Provocative pedagogies in e-learning: Making the invisible visible.
International Journal of Teaching and Learning in Higher Education, 21(2), 197-209. Retrieved from ERIC Database. (EJ899306)
Smith, T. C. (2005). Fifty-one competencies for online instruction. The Journal of
Educators Online, 2(2), 1-18. Retrieved from
http://www.thejeo.com/Ted%20Smith%20Final.pdf
So, H.-J., & Bonk, C. J. (2010). Examining the roles of blended learning approaches in
computer-supported collaborative learning (CSCL) environments: A Delphi
study. Educational Technology & Society, 13(3), 189–200. Retrieved from ERIC
Database. (EJ899878)
Stavredes, T., & Herder, T. (2014). A guide to online course design: Strategies for
student success. San Francisco, CA: Jossey-Bass.
Stein, D. S., Calvin, J., & Wanstreet, C. E. (2009). How a novice adult online learner
experiences transactional distance. Quarterly Review of Distance Education, 10,
305-311. Retrieved from http://www.infoagepub.com/index.php?id=89&i=43
Stern, C., & Kauer, T. (2010). Developing theory-based, practical information literacy
training for adults. International Information and Library Review, 42, 69-74.
http://dx.doi.org/10.1016/j.iilr.2010.04.011
Strang, K. D. (2009). Measuring online learning approach and mentoring preferences of
international doctorate students. International Journal of Educational Research,
48, 245-257. http://dx.doi.org/10.1016/j.ijer.2009.11.002
Strang, K. D. (2012). Skype synchronous interaction effectiveness in a quantitative
management science course. Decision Sciences Journal of Innovative Education,
10(1), 3-23. http://dx.doi.org/10.1111/j.1540-4609.2011.00333.x
Styer, A. J. (2007). A grounded meta-analysis of adult learner motivation in online
learning from the perspective of the learner (Doctoral dissertation). Available
from ProQuest Dissertations and Theses database. (UMI No. 3249903)
Sulčič, V., & Lesjak, D. (2009). E-learning and study effectiveness. Journal of Computer
Information Systems, 49(3), 40-47. Retrieved from
http://iacis.org/jcis/articles/Abrahams_Macmillan_2009_49_3.pdf
Tallent-Runnels, M. K., Thomas, J. A., Lan, W. Y., Cooper, S., Ahern, T. C., Shaw, S.
M., & Liu, X. (2006). Teaching courses online: A review of the research. Review
of Educational Research, 76(1), 93-135.
http://dx.doi.org/10.3102/00346543076001093
Tapscott, D., & Williams, A. D. (2010). Innovating the 21st-century university: It’s time!
EDUCAUSE Review, 45, 16-18, 20-24, 26, 28-29. Retrieved from
http://net.educause.edu/ir/library/pdf/erm1010.pdf
Taran, C. (2006). Enabling SMEs to deliver synchronous online training—Practical
guidelines. Campus-Wide Information Systems, 23, 182-195.
http://dx.doi.org/10.1108/10650740610674193
Taylor, B., & Kroth, M. (2009a). A single conversation with a wise man is better than ten
years of study: A model for testing methodologies for pedagogy or andragogy.
Journal of the Scholarship of Teaching and Learning, 9(2), 42-56. Retrieved from
http://josotl.indiana.edu/
Taylor, B., & Kroth, M. (2009b). Andragogy's transition into the future: Meta-analysis of
andragogy and its search for a measurable instrument. Journal of Adult
Education, 38(1), 1-11. Retrieved from ERIC Database. (EJ891073)
Thomas, L., Buckland, S. T., Rexstad, E. A., Laake, J. L., Strindberg, S., Hedley, S. L. . .
. Burnham, K. P. (2010). Distance software: Design and analysis of distance
sampling surveys for estimating population size. Journal of Applied Ecology, 47,
5-14. http://dx.doi.org/10.1111/j.1365-2664.2009.01737.x
Tinto, V. (1975). Dropout from higher education: A theoretical synthesis of recent
research. Review of Educational Research, 45, 89-125.
http://dx.doi.org/10.3102/00346543045001089
Tinto, V. (1993). Leaving college: Rethinking the causes and cures of student attrition
(2nd ed.). Chicago, IL: University of Chicago Press.
Tolutiene, G., & Domarkiene, J. (2010). Learning needs and the possibilities of their
satisfaction: The case of prospective andragogues. Tiltai, 50(1), 147-158.
Retrieved from http://www.ku.lt/leidykla/tiltai.php
Travis, J. E., & Rutherford, G. (2012). Administrative support of faculty preparation and
interactivity in online teaching: Factors in student success. National Forum of
Educational Administration & Supervision Journal, 30(1), 30-44. Retrieved from
http://www.scribd.com/doc/110649809/Administrative-Support-of-FacultyPreparation-and-Interactivity-in-Online-Teaching-Factors-in-Student-Success-byDr-Jon-E-Travis-and-Grace-Rutherfo
Tuquero, J. (2011). A meta-ethnographic synthesis of support services in distance
learning programs. Journal of Information Technology Education, 10, 157-179.
Retrieved from http://www.jite.org/documents/Vol10/JITEv10IIPp157179Tuquero974.pdf
University of Missouri System. (2011). University of Missouri system facts. Retrieved
from http://www.umsystem.edu/ums/about/facts/
U.S. Department of Education, National Center for Education Statistics. (2009). The
integrated postsecondary education data system (IPEDS). Retrieved from
http://nces.ed.gov/fastfacts/index.asp?faq=FFOption5#
U.S. Department of Health and Human Services, National Commission for the Protection
of Human Subjects of Biomedical and Behavioral Research [DHHS]. (1979). The
Belmont Report: Ethical principles and guidelines for the protection of human
subjects of research (45 CFR 46). Retrieved from
http://www.hhs.gov/ohrp/humansubjects/guidance/belmont.html
Varmecky, J. (2012). Learning for life transitions. Mountain Plains Adult Education
Association (MPAEA) Journal of Adult Education, 41(2), 1-11. Retrieved from
https://www.mpaea.org/?page=publications
Vogel-Walcutt, J. J., Gebrim, J. B., Bowers, C., Carper, T. M., & Nicholson, D. (2010).
Cognitive load theory vs. constructivist approaches: Which best leads to efficient,
deep learning? Journal of Computer Assisted Learning, 27, 133-145.
http://dx.doi.org/10.1111/j.1365-2729.2010.00381.x
Vroom, V. H. (1994). Work and motivation. San Francisco, CA: Jossey-Bass.
Vygotsky, L. S. (with M. Cole, V. John-Steiner, S. Scribner, & E. Souberman [Eds.]).
(1978). Mind in society: The development of higher psychological processes.
Cambridge, MA: Harvard University Press.
Walther, J. B., Gay, G., & Hancock, J. T. (2005). How do communication and technology
researchers study the Internet? Journal of Communication, 55, 632-657.
http://dx.doi.org/10.1111/j.1460-2466.2005.tb02688.x
Wang, H. (2010). 10 ways to make e-learning more exciting. Online Cl@ssroom: Ideas
for Effective Online Instruction, 7-8. Retrieved from
http://www.worldcat.org/title/online-clssroom-ideas-for-effective-onlineinstruction/oclc/60638446
Wang, M., Vogel, D., & Ran, W. (2011). Creating a performance-oriented e-learning
environment: A design science approach. Information & Management, 48, 260-269. http://dx.doi.org/10.1016/j.im.2011.006.003
Wang, V. X., & Kania-Gosche, B. (2011). Assessing adult learners using web 2.0
technologies. International Journal of Technology in Teaching and Learning,
7(1), 61-78. http://dx.doi.org/10.4018/ijtem.2011070103
Watkins, R. (2005). Developing interactive e-learning activities. Performance
Improvement, 44(5), 5-7. http://dx.doi.org/10.1002/pfi.4140440504
Werth, E. P., & Werth, L. (2011). Effective training for millennial students. Adult
Learning, 22(3), 12-19. http://dx.doi.org/10.1177/104515951102200302
Willging, P. A., & Johnson, S. D. (2009). Factors that influence students' decision to
dropout of online courses. Journal of Asynchronous Learning Networks, 13(3),
115-127. Retrieved from http://sloanconsortium.org/jaln/v13n3/factors-influencestudents%E2%80%99-decision-dropout-online-courses-previously-publishedjaln-84
Williams, R., Karousou, R., & Mackness, J. (2011). Emergent learning and learning
ecologies in web 2.0. International Review of Research in Open and Distance
Learning, 12(3). Retrieved from
http://www.irrodl.org/index.php/irrodl/article/view/883/1686
Wilson, D., & Allen, D. (2011). Success rates of online versus traditional college
students. Research in Higher Education Journal, 14, 1-9. Retrieved from
http://www.aabri.com/manuscripts/11761.pdf
Wilson, L. S. (2005). A test of andragogy in a post-secondary educational setting
(Doctoral dissertation, Louisiana State University and Agricultural and
Mechanical College). Retrieved from http://etd.lsu.edu/docs/available/etd-06152005-122402/unrestricted/Wilson_dis.pdf
Yang, Y., & Cornelious, L. F. (2005). Preparing instructors for quality online instruction.
Online Journal of Distance Learning Administration, 8(1). Retrieved from
http://www.westga.edu/~distance/ojdla/spring81/yang81.htm
Yen, C.-J., & Abdous, M. (2011). A study of the predictive relationships between faculty
engagement, learner satisfaction and outcomes in multiple learning delivery
modes. International Journal of Distance Education Technologies, 9(4), 57-70.
http://dx.doi.org/10.4018/jdet.2012010105
Young, S. F. (2008). Theoretical frameworks and models of learning: Tools for
developing conceptions of teaching and learning. International Journal for
Academic Development, 13(1), 41-49.
http://dx.doi.org/10.1080/13601440701860243
Zepke, N., & Leach, L. (2010). Improving student engagement: Ten proposals for action.
Active Learning in Higher Education, 11, 167-177.
http://dx.doi.org/10.1177/1469787410379680
Zhao, Y., Lei, J., Yan, B., Tan, H. S., & Lai, C. (2005). What makes the difference: A
practical analysis of effectiveness of distance education. Teachers College
Record, 107, 1836-1884. http://dx.doi.org/10.1111/j.1467-9620.2005.00544.x
Appendixes
Appendix A: Higher Learning Commission of the North Central Association of Colleges
and Schools Institutions with Physical Facilities in Missouri
University of Missouri System
Missouri University of Science and Technology
University of Missouri – Columbia
University of Missouri – Kansas City
University of Missouri – St. Louis
Public Universities
DeVry University – Missouri
Harris-Stowe State University
Missouri Southern State University
Missouri State University – Springfield
Missouri State University – West Plains
Northwest Missouri State University
Southeast Missouri State University
Missouri Western State University
University of Central Missouri
University of Phoenix – Kansas City Campus
University of Phoenix – Springfield Campus
University of Phoenix – St. Louis Campus
Private Colleges and Universities
Central Methodist University – College of Graduate and Extended Studies
Columbia College
Cottey College
Cox College
Crowder College
Culver-Stockton College
Drury University
East Central College
Fontbonne University
Hannibal-LaGrange College
Jefferson College
Lindenwood University
Linn State Technical College
Maryville University of Saint Louis
Metropolitan Community College – Longview
Metropolitan Community College – Penn Valley
Missouri Valley College
National American University – Independence
National American University – Zona Rosa
North Central Missouri College
Saint Louis Community College – Florissant Valley
Saint Louis Community College – Forest Park
Saint Louis Community College – Meramec
Saint Louis Community College – Wildwood
Saint Louis University
Saint Charles Community College
State Fair Community College
Stephens College
Three Rivers Community College
Webster University
William Woods University
Appendix B: Results of Random Selection of Schools
University of Missouri System
1. University of Missouri – Columbia *
2. Missouri University of Science and Technology
3. University of Missouri – St. Louis
Public Universities
1. Missouri State University – West Plains *
2. Southeast Missouri State University *
3. University of Phoenix – St. Louis Campus *
4. Missouri Western State University *
5. Northwest Missouri State University *
6. DeVry University – Missouri
7. University of Central Missouri
8. Missouri State University – Springfield
Private Colleges and Universities
1. Saint Louis University [refused to participate]
2. Columbia College *
3. Cox College *
4. Crowder College *
5. St. Louis Community College – Florissant Valley *
6. Webster University *
7. Metropolitan Community College – Penn Valley
8. Lindenwood University
9. Maryville University of Saint Louis
10. East Central College
Appendix C: Andragogy in Practice Inventory
Appendix D: Permission to Use API
Appendix E: Learner Satisfaction and Transfer of Learning Survey
Learner Satisfaction Subscale of the LSTQ
1 = strongly disagree, 2 = disagree, 3 = neither agree nor disagree, 4 = agree, 5 = strongly agree
As a result of my experience in this online course, I would like to participate in
another online course in the future.
1 2 3 4 5
I would recommend this learning opportunity to others
1 2 3 4 5
The online class was a useful learning experience
1 2 3 4 5
The online course met my expectations
1 2 3 4 5
In the online class I was able to keep up with the workload
1 2 3 4 5
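The five items above use a 1-5 Likert response format. As an illustration only, the following is a minimal Python sketch that assumes the five responses are averaged into a single satisfaction score on the same 1-5 scale; the LSTQ's actual scoring procedure is not reproduced in this appendix, so the averaging rule and the function name are assumptions, not part of the instrument.

# Illustrative scoring sketch for the five-item subscale above. It assumes the
# items are simply averaged into one satisfaction score on the same 1-5 scale;
# this is an assumption for illustration, not the LSTQ's documented method.

def satisfaction_score(responses):
    """Average the 1-5 Likert responses to the five subscale items."""
    if len(responses) != 5:
        raise ValueError("Expected responses to all five subscale items.")
    if any(r not in (1, 2, 3, 4, 5) for r in responses):
        raise ValueError("Each response must be an integer from 1 to 5.")
    return sum(responses) / len(responses)

# Example: a respondent who agrees (4) with most items and is neutral (3) on one.
print(satisfaction_score([4, 4, 5, 4, 3]))  # prints 4.0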
Appendix F: Permission to Use LSTQ
Appendix G: Request Letter to Chief Academic Officers
Dear Dr. XXX,
I thank you in advance for your time and consideration of the following
request. I am currently a doctoral candidate pursuing a Doctor of Philosophy
(Ph.D.) in eLearning at Northcentral University.
In sending this letter I am hoping to enlist your assistance and support for my
dissertation study. The purpose of my study, titled Andragogy and Online
Course Satisfaction: A Correlation Study, is to investigate relationships between
adult learner characteristics, instructional process design elements, and learner
satisfaction among adult learners in postsecondary online programs offered by
institutions with at least one physical facility in Missouri.
As part of a stratified random sample of HLC-NCA schools in the state of
Missouri, <school name> was one of 12 schools selected to take part in this study.
The assistance that I am seeking from you is two-fold:
- First, I’d like to meet with you in the near future so that we can discuss the
study in enough detail that you would feel comfortable endorsing it.
- Second, I’d like you to arrange for an e-mail with your endorsement and my
invitation to participate in the study to be sent to students at <school name>
who meet the study criteria. The rest of the study will be anonymous and
conducted completely online using Survey Gizmo.
I will never have access to any student identifying information, nor will I have
access to your students unless they choose to participate in the study by
acknowledging and accepting the attached informed consent form.
The study criteria include: (a) students who have attended or are attending <school
name>, (b) who have taken, successfully or not, at least one completely online
class, and (c) who are 25 years old or older.
Thank you again for your consideration and support,
<signature>
Stephen W. Watts, M.Ed.
allstarts@hotmail.com
314.749.6368
Appendix H: Responses from Provosts and Chief Academic Officers
Appendix I: Informed Consent Form
Andragogy and Online Course Satisfaction: A Correlation Study
What is the study about? You are invited to participate in a research study being
conducted for a dissertation at Northcentral University in Prescott Arizona. The
researcher is interested in your opinions about your most recent online educational
experience. You were selected to participate in the study because you are at least 25
and have participated in a college or university course online. There is no deception
in this study.
What will be asked of me? You will be asked to answer some questions in an online
survey regarding your feelings about your most recent online college or university
course. Please answer the questions in the survey as they apply to your experiences.
It is estimated that the survey will take 20-25 minutes of your time.
Who is involved? The following people are involved in this research project and can
be contacted at any time through email. The researcher or the chair would be happy
to answer any questions that may arise about the study. Please direct any questions or
comments to:
Principal Researcher:
Stephen Watts, M.Ed.
allstarts@hotmail.com
Dissertation Chair:
Dr. Robin Throne
rthrone@ncu.edu
Are there any risks? There are no known risks in this study. Because some of the
questions ask about behavior of college or university faculty, this could be distressing
to some people; however, you may stop the study at any time. You can choose not to
answer any question that you feel uncomfortable in answering.
What are some benefits? There are no direct benefits to you for participating in this
research. No incentives are offered. The results have scientific interest that may
eventually have benefits for the improvement in the teaching of online courses.
Is the study anonymous/confidential? The data collected in this study are
anonymous and confidential. Your name or personal information is not linked to the
data. The data from the survey are not linked to an email address. Only the
researchers in this study will see the data and the data will be stored on a secure
encrypted server.
Can I stop participating in the study? You have the right to withdraw from the study
at any time without penalty. You can skip any question you do not want to answer.
What if I have questions about my rights as a research participant or
complaints? If you have questions about your rights as a research participant, any
complaints about your participation in the research study, or any problems that
occurred in the study, please contact the researchers identified in the consent form.
Or, if you prefer to talk to someone outside the study team, you can contact
Northcentral University’s Institutional Review Board at irb@ncu.edu or
1.888.327.2877 ex 8014.
We would be happy to answer any questions that may arise about the study. Please
direct your questions or comments to: Stephen Watts (allstarts@hotmail.com), or Dr.
Robin Throne (rthrone@ncu.edu).
Participant Online Consent Signature.
I have read the description above for the study titled Andragogy and Online Course
Satisfaction: A Correlation Study. I understand what the study is about and what is being
asked of me. In lieu of a signed consent form, my participation in the study by
answering the questions in the survey indicates that I have read and understand the
informed consent form and agree to participate in the study.
[X] I have read, understand, and desire to participate in the study.
[X] I have read, understand, and do not desire to participate in the study.
Appendix J: G*Power A Priori Analysis
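The G*Power output itself is not reproduced in this text version of the appendix. As a rough companion, the following is a minimal Python sketch of the kind of a priori calculation involved for a fixed-model multiple regression F test (the test G*Power labels "Linear multiple regression: Fixed model, R² deviation from zero"). Every parameter value below (effect size f², number of predictors, alpha, target power) is an illustrative assumption, not a value read from the appendix.

# Minimal a priori power sketch for a fixed-model multiple regression F test,
# using the noncentral F distribution (lambda = f^2 * N, as in G*Power).
# All parameter values are illustrative assumptions, not values from Appendix J.
from scipy.stats import f as f_dist, ncf

def minimum_sample_size(f_squared=0.10, n_predictors=14, alpha=0.05,
                        target_power=0.80):
    """Return the smallest N whose achieved power reaches target_power."""
    n = n_predictors + 2  # smallest N that leaves a positive denominator df
    while True:
        df_num = n_predictors           # numerator df = number of predictors
        df_den = n - n_predictors - 1   # denominator df
        ncp = f_squared * n             # noncentrality parameter lambda
        f_crit = f_dist.ppf(1 - alpha, df_num, df_den)
        power = 1 - ncf.cdf(f_crit, df_num, df_den, ncp)
        if power >= target_power:
            return n, power
        n += 1

n, achieved = minimum_sample_size()
print(f"Minimum N = {n} (achieved power = {achieved:.3f})")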