Higher Education Research & Development
Vol. 28, No. 3, June 2009, 303–318
E-learning in higher education: some key aspects and their
relationship to approaches to study
Robert A. Ellis*a, Paul Ginnsb and Leanne Piggottc
a
Institute of Teaching and Learning, University of Sydney, Australia; bFaculty of Education
and Social Work, University of Sydney, Australia; cFaculty of Economics and Business,
University of Sydney, Australia
*Corresponding author. Email: r.ellis@vcc.usyd.edu.au
(Received 13 September 2007; final version received 7 November 2008)
While there has been systematic and on-going research into e-learning in universities
for over two decades, there has been comparatively less evidence-based research
into how key aspects of e-learning are internally constituted from a student
perspective and how these aspects might be related to university students’ learning
experiences. The purpose of this paper is to explore key aspects of e-learning that
might be related to university student approaches to study, so that a better
understanding of the internal structure of these aspects is achieved. Student
responses to surveys are analysed at the level of each item to identify which items
made the most sense to over 200 third-year economics students. Data were also
analysed both at the variable level, to identify which items coalesce to determine
the structure of e-learning variables, and at the student level, to see if there were
groups of students in the sample that shared similar experiences of e-learning when
it was used to support a predominately campus-based learning experience. The
results suggest several implications for improving particular aspects of the student
experience of e-learning when it is used to support a campus-based experience.
Keywords: approaches to study; experiences of e-learning; quantitative analyses
Introduction
E-learning is being introduced as a fundamental part of the student learning experience in higher education. It is no longer core business only for those universities with a mission for distance education; its affordances are being systematically integrated into the student learning experience by predominately campus-based universities.
Evidence of this widespread uptake can be seen in reputable research journals and on
the websites of national bodies responsible for leading learning and teaching in higher
education. Examples of these include the websites of the Higher Education Academy
in the UK, Educause in the USA and the Australian Learning and Teaching Council
in Australia.
While we can recognize sustained research interest into e-learning in the student
experience in higher education over the last two decades (Goodyear, 1984, 1991;
Goodyear, Jones, Asensio, Hodgson, & Steeples, 2005; Laurillard, 1993, 2002;
Salmon, 2002a, 2004), more focused explorations into how key aspects of e-learning
are associated with the students’ face-to-face experience of learning are relatively
sparse. There is comparatively little research into how both online and face-to-face
contexts play a relational role in helping students achieve their learning outcomes. A
growing use of e-learning to support face-to-face experiences presupposes there is an
understanding of what the key aspects of e-learning are, how they are internally
constituted and externally associated with each other and how they are related to key
aspects of the face-to-face experience. Without these fundamental understandings, the
quality of the student experience of learning comprising online and face-to-face experiences is likely to be put at risk. There is a need for more evidence-based research to
inform the ways we think about creating and designing such experiences so that the
quality of learning is likely to be enhanced.
This study investigates how e-learning is used to support the face-to-face experience of third-year business students at university. In this paper, e-learning is defined as information and communication technologies used to support students in improving their learning (Higher Education Funding Council of England, 2005). Students studying Government Foreign and Defence policy in their business degree experienced tutorials and lectures as part of their weekly schedule. They were also expected to lead and run a tutorial by engaging in pre- and post-class online contributions in the form of discussions and related submissions to structure and inform the debate. This study
explores how the students perceived key aspects of the online experience, such as the
design of their course website and the submissions made by others and themselves.
The study also investigated how these perceptions are related to their approaches to
study, measured by the revised Study Process Questionnaire, the R-SPQ (Biggs,
Kember, & Leung, 2001) and to their academic achievement. The term ‘explore’ is
used here intentionally, as while we have some idea of the broad focus of these aspects
from previous research, we seek to understand how these aspects are internally structured and which parts might coalesce within the online environment and with study
approaches. The research questions of the study are:
● What are some of the key aspects of the student experience of e-learning when it supports a face-to-face experience?
● How are the parts constituted?
● How do these parts relate to student approaches to study?
● Is there any relationship between variations in the student experience of e-learning, approaches to study and their achievement?
Theory of learning and prior research
Seminal research into student learning in higher education over the last few decades
has focused on key aspects of the student experience of learning, as shown in Figure 1.

Figure 1. Presage-Process-Product Model of Student Learning (adapted from Prosser & Trigwell, 1999).
Research into the student experience of learning in higher education has focused
on: student characteristics, such as the conceptions of learning with which they enter
courses; course context, such as teaching methods; learning context, such as student
perceptions of the quality of teaching and quantity of work; student approaches to
learning, what they do and why they approach learning in particular ways; and the
quality of their learning outcomes (Prosser & Trigwell, 1999; Ramsden, 2002). This
research has shown that variation in the way students approach their learning is related
to how they perceive their context, what they think they are learning and the quality
of their learning outcomes. This study adds to this research by considering associations between student approaches to learning and their experience of e-learning. In
particular, the focus is on four aspects: (1) interactivity, (2) approaches to e-moderating, (3) issues related to course design, and (4) workload awareness.
Interactivity
Definitions of interactivity in the higher education literature include those that refer to
action and activity amongst people and those that refer to action and/or activity
between a student and a computer/program. Barretto, Piazzalunga, Ribeiro, Dalla and
Filho (2003) provide a general definition of the term ‘interactivity’ as an ‘activity and/
or action between individuals and/or machines’ (p. 272). Educational software is
considered interactive and interactivity in this case occurs between the computer
program and the user. More specifically, it is possible to describe four types of interaction: learner-content, learner-instructor, learner-learner and learner-interface. Interactivity for the purposes of learning is recognized as one of the key ways to capture
affordances of e-learning to increase the learner’s knowledge (Laurillard, 2002; Sabry
& Baldwin, 2003). In order to do this purposefully, the use of the interaction needs to
be closely tied to the objectives of the course (Martens, Valcke, & Portier, 1997). It is
suggested that it also needs to focus on student control of the learning so as to encourage active engagement that will be meaningful (Sims, 1997) and may require in-depth
case studies to understand how technologies support the development of interactivity
and comprehension (Godwin, Thorpe, & Richardson, 2008). In this study, we refer to
interaction as activity and action amongst the students in the learning process, i.e.
student-student interaction. Interaction between student and teacher is seen as part of
a concept of e-teaching and moderating.
Approaches to e-moderating
A key aspect of teaching related to e-learning is e-moderating. This can be broadly
described as a process of facilitating the development of small and large groups through
conferencing (Brace-Govan, 2003; Salmon, 2002b). Key teaching strategies in
e-moderating include summarizing and weaving together the ideas of students who
have made a contribution to a topic of discussion. The role of e-moderator is particularly
important in helping learners to reflect about the issues under discussion. E-moderation
of online discussions has been shown to be beneficial for promoting group work and
collaborative learning and it is considered to be an effective method to achieve structured group interaction (Dewiyanti, Brand-Gruwel, & Jochems, 2005). In this study,
the concept of e-moderation is broadened to include online communication between
student and teacher that is not necessarily part of a structured discussion in a small
group. It includes online feedback from the teacher about class activities or submitted
written work, as well as online communication to keep students informed about matters
relevant to their learning. In this study, we refer to all these activities as e-teaching.
Issues related to course design
The incorporation of e-learning into the student experience of learning typically
requires consideration of the interaction between students, teachers and the technology within a design framework. Often cohesion of the design elements is found
through alignment to intended learning outcomes (Biggs, 2005). Some studies have
highlighted the importance of using an outcomes-based approach to course design, so
that the learners’ outcomes are one of the key rationales for design decisions, rather
than the provision of content (Littlejohn, 2002; O’Toole & Absalom, 2003). Some
research has itemized aspects of design from an instructional perspective, such as the
use of the technology, instructional objectives, testing, multimedia materials and the
learning activities that arise through combinations of these aspects (Hashim, 1999).
There has been significant research into approaches to learning that are moderated by
design approaches, such as problem-based learning. This approach has been particularly influential in international medical education and is readily adapted for online
learning (Oliver & Omari, 1999; Pearson, 2006). The concept of design in this study
focuses on associations between the face-to-face and online contexts of the students’
learning experience and how the design of the online materials and activities helps the
students to learn and understand the whole experience.
Workload awareness
Students’ perceptions of workload have been found to be related to the quality of their
learning experiences. The research is often quantitative and seeks to identify associations between the different aspects of the student experience. Research into the relationships between learning approaches, study motivation, hours of study and
perceived workload suggests that workload and motivation are significantly associated
with the quality of the experience (Kember, Ng, Tse, Wong, & Pomfret, 1996). Other
research has used self-report inventories and closed-ended questionnaires to investigate associations between learning and workload. For example, Richardson (2003)
administered the Approaches to Study Inventory and the Course Experience Questionnaire to students enrolled in an online course and found that students’ final results
were most strongly associated with perceptions of appropriate workload. In the
present study, workload is conceptualised as the amount of online work additional to
the work the students are expected to do in class.
The above areas of research are a way of unpacking key aspects of e-learning. This
study explores how these issues are internally structured from a student perspective,
and how they are related to the students’ whole experience of learning, both at the
variable and student levels. To investigate key aspects of the whole student experience
of learning, the revised Study Process Questionnaire (R-SPQ: Biggs et al., 2001) is
used. We report results from a new questionnaire, labelled ‘Questionnaire 1’, which
draws on some of the above research in the development of its items. This questionnaire is designed to explore how e-learning helps students learn in a context which
also involves significant face-to-face learning. Analyses are then performed to investigate associations between the experiences tapped by the two questionnaires.
Learning context
Third-year students studying Government as part of their undergraduate business
degree were supported with significant online resources designed to integrate with,
and extend, their face-to-face learning.
The course aimed to help students understand the factors leading to the direction
taken and decisions made in Australian Government foreign and defence policy over
the last one hundred years. Particular attention was paid to links between foreign
policy-making and international relations. To engage with these aims in a meaningful
way students were expected to not only attend lectures and tutorials on the key issues,
but also lead discussions on one of the issues – largely through the online
environment. A typical approach adopted by students was to contribute to both pre- and post-class discussions. They first researched the topic, then posted key issues and
summaries on the online discussion board before the class, fielded any queries leading
up to the face-to-face tutorial and, finally, posted key questions for debate online after
all the students had had a chance to engage in the debate in the tutorial. Their online
role post-tutorial required them to moderate the debate, for which they received guidelines and training earlier. Their leadership of a topic online was worth 20% and their
contributions to the postings of other students' topics were worth another 20%. This
made the online part of their course experience worth 40% of their final mark. To post
and moderate the online discussions, students used Blackboard 6.1.
Methods
We believe that relatively little is known about the internal structure of the online
experience and how key aspects relate to the face-to-face experience. Consequently a
semi-exploratory approach was adopted (Goodyear et al., 2005). It is semi-exploratory
in the sense that the content of the e-learning items in the new questionnaire has been informed by prior research and, while we have some idea about how these items might coalesce, we seek to explore the students' experience of e-learning as it links to, integrates with and extends their face-to-face experience. In analysing the data we strive
to maintain a balance between pre-conceived and emerging patterns (Prosser &
Trigwell, 1999). In other words, our aim is to use our understanding and experience
of the phenomenon being researched to inform the way we interrogate the data and let
patterns within the data reveal new insights to improve the types of research questions
being asked.
Three types of statistical analyses are used. Factor analyses are used to look at
the structural relationships amongst the items of the questionnaire, as there were
expectations that a smaller number of underlying constructs may explain students’
responses to individual items. Pearson correlation coefficients are used to investigate
the strength of the relationships between pairs of constructs that were identified
through grouping the items. The cluster analyses are at the level of the student.
These look for subgroups identified on the basis of similarities of the variables being
investigated.
Data
Data gathered included student responses to Questionnaire 1, a twenty-item, five-point
Likert scale questionnaire investigating the student experience of e-learning when it is
designed to support a face-to-face experience; student responses to the revised R-SPQ
(Biggs et al., 2001); an ‘Overall Satisfaction with Quality’ item, to check on the validity of scale scores used in some of the analyses; and the students’ achievement
measured by the mark they received for their online participation.
Results
The results are offered in four parts. Firstly, student perceptions of e-learning within
their course experience are considered using Questionnaire 1. Descriptive and
frequency analyses of the items that offer the most interesting differences in students’
ratings are discussed. We then examine how the items relate to each other at a variable
level using factor analysis. Based on these results, scale scores were constructed. To
investigate how key aspects of the e-learning experience identified by the factor analysis are associated with student approaches to study, we calculated deep-approach and
surface-approach scale scores for the R-SPQ and we offer correlation analyses of the
scores from the scales in Questionnaire 1, the R-SPQ scales, the ‘overall satisfaction
with quality’ item and student achievement. Finally, to investigate if there were any
similar experiences of learning at the level of the student across the sample, a cluster
analysis was used to look for groupings of the variables.
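To make the scale-score construction concrete, the following sketch (not part of the original study) sums item-level responses into deep-approach and surface-approach scores. It assumes the standard R-SPQ-2F keying of ten deep and ten surface items (Biggs et al., 2001); the file and column names are hypothetical.

```python
# Sketch only: R-SPQ-2F scale scoring under the standard keying assumption.
import pandas as pd

rspq = pd.read_csv("rspq_responses.csv")  # hypothetical file: columns spq1..spq20, 1-5 Likert
deep_items = [f"spq{i}" for i in (1, 2, 5, 6, 9, 10, 13, 14, 17, 18)]      # deep motive + strategy
surface_items = [f"spq{i}" for i in (3, 4, 7, 8, 11, 12, 15, 16, 19, 20)]  # surface motive + strategy

rspq["deep_approach"] = rspq[deep_items].sum(axis=1)        # possible range 10-50
rspq["surface_approach"] = rspq[surface_items].sum(axis=1)  # possible range 10-50
print(rspq[["deep_approach", "surface_approach"]].describe())
```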
Student perceptions of e-learning
The main source of data for how the students experienced e-learning is Questionnaire
1. Student ratings ranged from 1 (strongly disagree) to 5 (strongly agree). In explaining the analysis, observations are offered about the student responses to the individual
items. These are categorized according to similarities amongst the foci of the items
and the frequency of student responses to the 5-point Likert scale summarized in three
groupings as percentages: agree, disagree and neutral. We then present an analysis of
the students’ responses to the items around groupings or themes using a principal
components factor analysis. The methodology for this section is adapted from Goodyear et al. (2005) and is particularly useful for exploring which items have the most face-validity to students at a discrete level and how patterning in the data then coalesces at the variable level.
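As an illustration of how such frequency summaries can be produced, the sketch below collapses the 5-point responses into the three bands reported in Tables 1-5, assuming disagree = 1-2, neutral = 3 and agree = 4-5; the input file and banding are assumptions, not part of the original study.

```python
# Sketch only: per-item means and banded Likert percentages, as in Tables 1-5.
import pandas as pd

responses = pd.read_csv("questionnaire1.csv")  # hypothetical file: one column per item, values 1-5

def summarise(item: pd.Series) -> pd.Series:
    # Band the 1-5 ratings: (0,2] = Disagree, (2,3] = Neutral, (3,5] = Agree.
    bands = pd.cut(item, bins=[0, 2, 3, 5], labels=["Disagree", "Neutral", "Agree"])
    pct = bands.value_counts(normalize=True).reindex(["Disagree", "Neutral", "Agree"]).mul(100).round()
    return pd.concat([pd.Series({"Mean": round(item.mean(), 2)}), pct])

print(responses.apply(summarise).T)  # rows: items; columns: Mean, Disagree, Neutral, Agree
```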
Items 12, 13 and 2 in Table 1 identify student perceptions about the submissions made by themselves and by others in the online part of their learning experience. Items 12 and 13 attracted two of the highest agreement response frequencies from students, as well as the highest means. The affordance of e-learning to facilitate the sharing of alternative perspectives on key issues seems to be captured here. While self-submissions also seemed to be an important perception, the corresponding item, item 2, did not attract the same level of agreement amongst the students.
Table 1. Items focusing on submissions made by students.

                                                              Likert scale response (%)
Item                                                          Mean   Disagree   Neutral   Agree
12  Other students' online submissions helped me
    understand my ideas from a new perspective.               3.86       8         15       77
13  The submissions from other students helped
    develop my understanding of particular topics.            3.76       9         15       76
2   I felt my submissions to the course website
    were valued by other students.                            3.45      13         33       54
Some items, given in Table 2, focused on the design of the online materials that
students were using to help them achieve their learning outcomes.
There was a mixed response by students to the items related to the design of their course websites. Student responses to items 11 and 20 suggest that the design of online activities is a relatively important issue for their learning. Student perceptions of the website design as a whole, and the way online materials explain things, seem to be less important to students, according to their responses to items 19 and 4.
Some items focused on workload, informed by previous research (Ramsden,
1991). These are presented in Table 3.
The volume of workload was rated as a significant issue for students, according to their responses to items 6 and 16. This may capture student concern about adding significant e-learning activities to existing face-to-face activities without allowing for the additional time involved.
Table 4 includes some items focused on the skills related to how the teacher
engaged with students online.
Student perceptions of teacher responses, as indicated by the mean scores and
Likert scale responses, seemed to vary. Responses to item 9 suggested that the majority of the students felt that the teacher’s online responses were helpful. Responses to
item 1 suggest that the teacher’s online responses did not necessarily motivate the
majority to engage in deeper learning.
Table 2.
Items related to design issues.
Likert scale response (%)
Item
11
20
19
4
7
The online activities are designed to get the best
out of students.
The online learning materials in this course are
designed to really try to make topics
interesting to students.
The design of the website (online experiences in
this course) helped my learning.
The online learning materials in this course are
extremely good at explaining things.
The design of the website in this course made
me want to explore the issues more.
Mean
Disagree
Neutral
Agree
3.39
20
25
55
3.16
24
35
41
3.16
19
49
33
3.14
21
45
34
2.87
38
38
25
Table 3. Items related to student workload.

                                                              Likert scale response (%)
Item                                                          Mean   Disagree   Neutral   Agree
16  The sheer volume of work for the online component
    of this course means it can't all be thoroughly
    comprehended. (reversed)                                  3.81      13         20       67
6   The workload for the online component of this
    course is too heavy. (reversed)                           3.74      15         19       66
8   I generally had enough time to understand the
    things I had to learn online.                             3.08      44         27       39
15  The workload online didn't stop me from developing
    my understanding of the key issues in this course.        3.01      25         31       44

Table 4. Items related to online responses made by the teachers.

                                                              Likert scale response (%)
Item                                                          Mean   Disagree   Neutral   Agree
9   I didn't receive enough helpful online feedback
    from my teacher. (reversed)                               3.40      57         24       19
1   The teacher's responses online motivated me to
    learn more deeply.                                        2.58      48         31       21
Some items focused on the students' experience of the link between the online activities and the face-to-face context are presented in Table 5.

Table 5. Items relating online resources with face-to-face context.

                                                              Likert scale response (%)
Item                                                          Mean   Disagree   Neutral   Agree
17  The online activities helped me to understand the
    face-to-face activities in this course.                   2.99      16         64       20
18  The online learning materials helped me to learn
    during the face-to-face situations in this course.        2.96      17         64       19
Student responses to items 17 and 18 about how the online activities may have
helped learning in the face-to-face context did not attract widespread agreement. A
significant percentage of students were unsure about the relationships between the
online activities and the face-to-face context. The design of the materials may have
contributed to this. Attention to the design, especially how the activities are supposed
to link learning across the online and face-to-face contexts, may be a way to remove
the students’ uncertainty.
To investigate the patterns of the student data in greater detail (i.e. to see if there were any coherent groupings of items), factor analyses including all items were conducted. Using principal components estimation with varimax rotation to simple structure and applying the Kaiser-Guttman rule (Kaiser & Caffrey, 1965), which suggests that the number of factors to retain is equal to the number of eigenvalues greater than 1, we identified a subset of seventeen items that provide a clear factor
structure across four factors when low loadings of less than 0.4 are omitted. (Before
the analysis, negatively worded items were reversed to ensure the consistency of the
interpretation of the data.) Table 6 shows the loadings of four independent factors,
which suggest the existence of four coherent subsets of variables.
Table 6. Exploratory factor analysis structure for Questionnaire 1 (student experience of e-learning supporting a face-to-face experience).

                                                          Factor 1     Factor 2   Factor 3   Factor 4
Item                                                      E-teaching   Design     Workload   Interactivity
14  The teacher helped to guide online discussions
    between students.                                        .67          .26        .12        .33
1   The teacher's responses online motivated me to
    learn more deeply.                                       .84          .18        .15        .06
5   The teacher's interactivity with me online
    encouraged me to get the most out of my learning.        .70          .30        .26        .26
9   I didn't receive enough helpful online feedback
    from my teacher. (reversed)                              .75         −.03       −.13        .23
3   The teacher helped to focus online discussions.          .86          .19        .12       −.01
17  The online activities helped me to understand the
    face-to-face activities in this course.                  .07          .74        .02        .03
18  The online learning materials helped me to learn
    during the face-to-face situations in this course.       .04          .75       −.06        .16
11  The online activities are designed to get the best
    out of students.                                         .24          .63       −.01        .36
19  The design of the website (online experiences in
    this course) helped my learning.                         .23          .63        .15        .19
7   The design of the website in this course made me
    want to explore the issues more.                         .18          .61        .19        .05
16  The sheer volume of work for the online component
    of this course means it can't all be thoroughly
    comprehended. (reversed)                                 .03          .08        .80        .24
6   The workload for the online component of this
    course is too heavy. (reversed)                          .13         −.02        .79       −.05
8   I generally had enough time to understand the
    things I had to learn online.                            .19          .30        .52        .35
12  Other students' online submissions helped me
    understand my ideas from a new perspective.              .23          .24       −.06        .76
10  I interacted with students' online postings/
    submissions even if they weren't assessed.              −.01         −.01        .22        .59
13  The submissions from other students helped
    develop my understanding of particular topics.           .21          .28       −.01        .75
2   I felt my submissions to the course website were
    valued by other students.                                .17          .12        .19        .65

Notes: Eigenvalues 5.8, 1.3, 1.6 and 1.7; 61% of variance explained.

Factor 1 in Table 6 shows five coherent items underlying aspects related to the way the teacher taught using the course website. From a student perspective, the way the teacher supported online discussions by guiding and focusing them (items 14 and 3), the way the teacher responded and interacted with students online (items 1 and 5) and
the way the teacher provided feedback online to students (item 9) were items that were
coherent at the level of a variable. For the remainder of this study, this variable will
be referred to as student perceptions of e-teaching.
Factor 2 shows five items that loaded onto one factor focusing on student perceptions of online resources. From a student perspective, items reflecting the design of
activities and the website (items 7, 11 and 19) and the links between design and the
face-to-face experience (items 17 and 18) are coherent at the level of the variable. For
the remainder of this study, this variable will be referred to as student perceptions of
online design.
Factor 3 shows three items that loaded onto one factor (items 6, 16 and 8). These
items capture workload issues related to e-learning. As negative items were reversed
to ensure consistency of analysis, this variable is referred to as student perceptions of
appropriate workload online.
Factor 4 shows four items (12, 10, 13 and 2) that loaded onto one factor reflecting
the ways in which submissions made by students were perceived as helpful for learning.
In this study, this variable is referred to as student perceptions of online interactivity.
The factor analysis suggests that at the level of variables, three items were not part
of the structure of any of the factors identified: item 4, ‘The online learning materials
in this course are extremely good at explaining things’; item 20, ‘The online learning
materials in this course are designed to really try to make topics interesting to
students’; and item 15, ‘The workload online didn’t stop me from developing my
understanding of the key issues in this course'.
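For readers who wish to reproduce this style of analysis, the sketch below follows the procedure described above: principal components estimation, varimax rotation, Kaiser-Guttman retention and suppression of loadings below 0.4. The third-party factor_analyzer package and the column names are assumptions of the sketch, not part of the original study.

```python
# Sketch only: exploratory factor analysis as described in the text.
import pandas as pd
from factor_analyzer import FactorAnalyzer  # assumed available: pip install factor-analyzer

items = pd.read_csv("questionnaire1.csv")  # hypothetical file: negatively worded items already reversed

# Kaiser-Guttman rule: retain as many factors as there are eigenvalues > 1.
probe = FactorAnalyzer(rotation=None, method="principal")
probe.fit(items)
eigenvalues, _ = probe.get_eigenvalues()
n_factors = int((eigenvalues > 1).sum())

fa = FactorAnalyzer(n_factors=n_factors, rotation="varimax", method="principal")
fa.fit(items)

loadings = pd.DataFrame(fa.loadings_, index=items.columns)
print(loadings.where(loadings.abs() >= 0.4).round(2))  # omit loadings below 0.4, as in Table 6
```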
Associations amongst student perceptions of e-learning and the quality of the
course, student approaches to study and achievement
Next, we investigated patterns of student perceptions of e-learning by analysing the
scores of the subscales reflected by the four factors (e-teaching, design, workload and
interactivity) as well as the scores of subscales for student approaches to study (deep
and surface) based on the student responses to the R-SPQ. Table 7 shows correlations
amongst the variables and two outcome indicators, that is, the item measuring student
satisfaction with the quality of the online materials and activities and the student
online grade.
Table 7. Correlations between student perceptions of e-learning in their face-to-face experience, perception of overall quality, R-SPQ scores and online grade.

Variables                                                  1       2       3       4       5       6       7      8
Perceptions of e-learning
1. Questionnaire 1 factor 1: e-teaching (α = 0.87)         1
2. Questionnaire 1 factor 2: design (α = 0.75)           .45**     1
3. Questionnaire 1 factor 4: interactivity (α = 0.68)    .47**   .49**     1
4. Questionnaire 1 factor 3: workload (α = 0.65)         .33**   .29**   .38**     1
Approaches to study
5. Deep Approach (R-SPQ) (α = 0.81)                      .22**   .13     .22**   .09       1
6. Surface Approach (R-SPQ) (α = 0.78)                  −.05    −.10    −.03     .03     −.05      1
Outcome indicators
7. Overall quality of online materials and activities    .58**   .71**   .65**   .46**    .19*   −.05      1
8. Online Grade                                         −.02     .12     .12     .00      .05    −.29**   .05     1

Notes: n = 140; *p < 0.05 (2-tailed); **p < 0.01 (2-tailed).
Table 7 also shows the degree of reliability of the scales using Cronbach’s (1951)
estimate of internal consistency. All scales included in the analysis showed acceptable
levels of internal consistency (Schmitt, 1996). Pearson product-moment correlations
were then calculated between these scale scores, students’ overall ratings of the quality of the online learning materials and activities and a mark awarded for the online
component of the course.
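As an illustration, reliability and correlation computations of this kind can be sketched as follows. The item-to-scale mapping comes from Table 6; the file and column names are hypothetical, and the code is not from the original study.

```python
# Sketch only: Cronbach's alpha per scale and Pearson correlations between scale scores.
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    # Cronbach (1951): alpha = k/(k-1) * (1 - sum of item variances / variance of total score)
    k = items.shape[1]
    return k / (k - 1) * (1 - items.var(ddof=1).sum() / items.sum(axis=1).var(ddof=1))

responses = pd.read_csv("questionnaire1.csv")  # hypothetical file: reversed items already recoded
scales = {
    "e-teaching":    ["q14", "q1", "q5", "q9", "q3"],
    "design":        ["q17", "q18", "q11", "q19", "q7"],
    "workload":      ["q16", "q6", "q8"],
    "interactivity": ["q12", "q10", "q13", "q2"],
}

for name, cols in scales.items():
    print(name, round(cronbach_alpha(responses[cols]), 2))

scores = pd.DataFrame({name: responses[cols].mean(axis=1) for name, cols in scales.items()})
print(scores.corr(method="pearson").round(2))  # cf. the upper-left block of Table 7
```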
Inspection of the correlation matrix indicated several results of note. Firstly,
using a Type One error rate (α) of 0.05 for statistical significance, students’
responses on each of the proposed scales correlated with ratings of the overall quality of the online materials and activities. These correlations ranged from .71 for
Good Design to .46 for Appropriate Workload, indicating moderate to strong levels
of correlation. As is standard practice with the Course Experience Questionnaire
(e.g. Lawless & Richardson, 2002; Wilson, Lizzio, & Ramsden, 1997), the overall
rating of quality of the online materials and activities was included as a check on the
validity of identified scales. The results of the current study indicate the constructs
underlying the proposed scales were perceived by students as important facets of
e-learning supporting a face-to-face experience.
Secondly, two of the four e-learning experience scales were significantly correlated with a deep approach to study. There was a positive correlation between ratings
of Good e-teaching and self-reports of a deep approach, r = .22; and between ratings
of Student interactivity and self-reports of a deep approach, r = .22. There was also a
statistically significant correlation between students’ overall ratings of the quality of
the online materials and activities and a deep approach, r = .19. Finally, students who
tended to report a surface approach to learning in the course were more likely to
receive a lower grade for the online component of the unit, r = −.29.
The final analysis undertaken was a cluster analysis. Cluster analyses were
conducted to look for distributions of qualitatively different experiences in the student
population. The methodology used for this analysis is based on related studies by
Seifert (1995), Prosser, Ramsden, Trigwell and Martin (2003) and Ellis and Calvo
(2006). We used a hierarchical cluster analysis on standardised variable scores for this
purpose, followed by a between-groups contrast analysis on individual variables.
Using Ward’s minimum variance method, we found that a two-cluster solution was
the most parsimonious explanation of the relations between variables and group
membership. The means and standard deviations of the two groups on the variables
are given in Table 8.
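A minimal sketch of this clustering step, assuming the scale and outcome scores are already assembled one row per student (the file and variable names are hypothetical):

```python
# Sketch only: hierarchical clustering with Ward's method, cut at two clusters.
import pandas as pd
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.stats import zscore

scores = pd.read_csv("student_scores.csv")  # hypothetical file: one row per student
X = scores.apply(zscore)                    # standardise each variable

Z = linkage(X, method="ward")                               # Ward's minimum variance method
scores["cluster"] = fcluster(Z, t=2, criterion="maxclust")  # two-cluster solution

# Cluster means and standard deviations on each variable, as in Table 8
print(scores.groupby("cluster").agg(["mean", "std"]).round(2))
```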
We interpret the cluster analysis results as follows. The first cluster of students may be characterised as a group that, on average, gave relatively low ratings to the items on e-teaching, design of resources, student interaction and appropriate workload, and tended to score higher on items reflecting a surface approach to learning. They also had relatively lower grades for the online component of the course. The second cluster, in comparison, gave relatively high ratings to items reflecting e-teaching, design of resources, student interaction and workload, and had relatively low scores on the items reflecting a surface approach to learning. They also had relatively higher
grades for the online component of the course. The difference between the clusters
on the deep approach variable, while in the expected direction, was not statistically
significant.
The cluster analysis results indicate that students who have different perceptions
of the e-learning environment take different approaches to study and achieve at
different levels on an assessment task associated with online learning.
Table 8. Summary statistics for student perceptions of e-learning, approaches to study and achievement.

                                        Cluster 1 (n = 43)      Cluster 2 (n = 97)
Variable                                   M        SD             M        SD
Student perceptions of e-learning
  Good teaching*                         −.71       .79           .34       .93
  Good design*                           −.99       .91           .41       .70
  Student interaction*                   −.80      1.12           .32       .71
  Appropriate workload*                  −.43      1.02           .18       .95
Approaches to study
  Deep                                   −.15       .96           .03      1.01
  Surface*                                .52      1.32          −.23       .74
Achievement
  Online grade*                          −.46      1.35           .20       .74

Notes: n = 140; *denotes a statistically significant difference between clusters (p < 0.05).
Discussion
This study began by recognizing that the experience of campus-based students is
being systematically supported through the use of e-learning materials and resources.
It was noted that in campus-based experiences supported by e-learning we have little
research-based evidence about how key aspects of e-learning might be constituted and
how these relate to key aspects of the whole student experience (Bliuc, Goodyear, &
Ellis, 2007; Sharpe & Benfield, 2005; Sharpe, Benfield, Roberts, & Francis, 2006).
Adopting a semi-exploratory approach, we drew on previous research suggesting that aspects such as e-moderating, online communication, the design of materials and websites, workload issues and submissions made by students online might be some of the areas worth investigating (Prosser & Trigwell, 1999; Ramsden, 1991).
The frequency analyses gave the researchers a feel for levels of agreement
amongst the student population as to which items were perceived to be the most relevant. In terms of further exploration in this area, some items that did not load are
almost as important as those that did. For example, item 15, ‘The workload online
didn’t stop me from developing my understanding of the key issues in this course’, did
not load significantly onto the workload subscale, but item 8, ‘I generally had enough
time to understand the things I had to learn online’, did. In order to understand which
items have the most meaning for students, and how these might be related to their
subscale, and other aspects of the experience, more work on the constituent structure
of all the subscales is needed.
The identification of the four underlying factors described as e-teaching, design,
workload and interactivity is an important contribution to research into the most
meaningful aspects of e-learning when it is used to support students in a predominately face-to-face experience. We do not suggest that these are the only possible aspects to be identified here, nor that they are necessarily pitched at a level of abstraction that cannot change. For example, e-teaching as a subscale in the factor
analysis is made up of items addressing online discussions. In other studies it may
include other things. The main outcome for this study is the fact that we identified four
independent factors accounting for different aspects of the learning experience. The
way they related to each other and the outcome variables (satisfaction and mark)
provide an improved understanding of how e-learning is related to other key aspects
of the student experience.
Specifically, at the variable level, significant correlations were identified amongst the e-learning, approaches and outcome variables. Significant, moderate to strong positive correlations were found between all the e-learning variables and students' perceptions of the quality of the e-learning experience. Significant positive correlations were also found between a deep approach, two of the e-learning variables (e-teaching and interactivity) and perceptions of the quality of e-learning, while a surface approach was negatively associated with achievement. We interpret these results as evidence of the importance of careful structuring and design of e-learning activities and resources, especially in relation to the broader student experience.
At the level of groups of students, significant differences were found amongst
students in terms of their perceptions, approaches to study and achievement. The
differences were consistent with the results suggested by the analyses at the variable level. It is perhaps these results that have the clearest implications for practice.
The analyses suggest that students who had negative perceptions of the quality of
teaching, design, interactivity and workload tended to approach their studies in the
course in a comparatively poor way and tended to achieve relatively poorly online.
These associations suggest that if we wish to improve the quality of the student
experience online, addressing student perceptions about what the e-learning experience involves and how it can be useful for learning is essential. For example, about
a third of the students (cluster 1, n = 43) did not perceive the value of the submissions made by other students, of the interaction with the teacher online or of the learning process facilitated by online activities. This suggests that some awareness-raising about the nature and purpose of submissions and online feedback
would be a useful teaching strategy if we wish to improve the quality of e-learning.
We cannot assume that the mere existence of e-learning activities and materials
supporting a face-to-face experience of learning will improve the quality of the
experience. How students perceive and use the activities and materials represents
one of the keys to unlocking the full value of e-learning in the student learning
experience at university.
References
Barretto, S.F.A., Piazzalunga, R., Ribeiro, V.G., Dalla, M.B.C., & Filho, R.M.L. (2003).
Combining interactivity and improved layout while creating educational software for the
Web. Computers and Education, 40, 271–284.
Biggs, J.B. (2005). Aligning teaching for constructing learning. Retrieved August 20, 2008,
from: http://www.heacademy.ac.uk/embedded_object.asp?id=21686&filename=Biggs
Biggs, J., Kember, D., & Leung, D.Y.P. (2001). The revised two-factor Study Process
Questionnaire: R-SPQ-2F. British Journal of Educational Psychology, 71, 133–149.
Bliuc, A., Goodyear, P., & Ellis, R.A. (2007). Research focus and methodological choices in studies into students' experiences of blended learning in higher education. Internet and Higher Education, 10, 231–244.
Brace-Govan, J. (2003). A method to track discussion forum activity: The Moderators’
Assessment Matrix. Internet and Higher Education, 6, 303–325.
Cronbach, L.J. (1951). Coefficient alpha and the internal structure of tests. Psychometrika, 16,
297–334.
Dewiyanti, S., Brand-Gruwel, S., & Jochems, W. (2005). Applying reflection and moderation in an asynchronous computer-supported collaborative learning environment in campus-based higher education. British Journal of Educational Technology, 36, 673–676.
Ellis, R.A., & Calvo, R.A. (2006). Discontinuities in university student experiences of
learning through discussions. British Journal of Educational Technology, 37, 55–68.
Godwin, S.J., Thorpe, M.S., & Richardson, J.T.E. (2008). The impact of computer-mediated
interaction on distance learning. British Journal of Educational Technology, 39, 52–70.
Goodyear, P. (1984). LOGO: A guide to learning through programming. London: Heinemann.
Goodyear, P. (1991). Teaching knowledge and intelligent tutoring. Norwood, NJ: Ablex.
Goodyear, P., Jones, C., Asensio, M., Hodgson, V., & Steeples, C. (2005). Networked
learning in higher education: Students’ expectations and experiences. Higher Education,
50, 473–508.
Hashim, Y. (1999). Are instructional design elements being used in module writing? British
Journal of Educational Technology, 30, 341–358.
Higher Education Funding Council of England. (2005). HEFCE strategy for e-learning. Retrieved January 14, 2006, from http://www.hefce.ac.uk/pubs/hefce/2005/05_12/
Kaiser, H.F., & Caffrey, J. (1965). Alpha factor analysis. Psychometrika, 30, 1–14.
Kember, D., Ng, S., Tse, H., Wong, E.T.T., & Pomfret, M. (1996). An examination of the
interrelationships between workload, study time, learning approaches and academic
outcomes. Studies in Higher Education, 21, 347–358.
Laurillard, D. (1993). Rethinking university teaching: A framework for the effective use of
educational technology (1st ed.). London: Routledge.
Laurillard, D. (2002). Rethinking university teaching: A framework for the effective use of
educational technology (2nd ed.). London: Routledge.
Lawless, C.J., & Richardson, J.T.E. (2002). Approaches to study and perceptions of academic
quality in distance education. Higher Education, 44, 257–282.
Littlejohn, A.H. (2002). Improving continuing professional development in the use of ICT.
Journal of Computer Assisted Learning, 18, 166–174.
Martens, R.L., Valcke, M.M.A., & Portier, S.J. (1997). Interactive learning environments to
support independent learning: The impact of discernability of embedded support devices.
Computers and Education, 28, 187–197.
Oliver, R., & Omari, A. (1999). Using online technologies to support problem-based learning:
Learners’ responses and perceptions. Australian Journal of Educational Technology, 15,
58–79.
O’Toole, J.M., & Absalom, D.J. (2003). The impact of blended learning on student outcomes:
Is there room on the horse for two? Journal of Educational Media, 28, 179–189.
Pearson, J. (2006). Investigating ICT using problem-based learning in face-to-face and online
learning environments. Computers & Education, 47, 56–73.
Prosser, M., & Trigwell, K. (1999). Understanding learning and teaching: The experience in
higher education. Milton Keynes, UK: Society for Research into Higher Education &
Open University Press.
Prosser, M., Ramsden, P., Trigwell, K., & Martin, E. (2003). Dissonance in experience of
teaching and its relation to the quality of student learning. Studies in Higher Education,
28, 37–48.
Ramsden, P. (1991). A performance indicator of teaching quality in higher education: The
Course Experience Questionnaire. Studies in Higher Education, 16, 129–150.
Ramsden, P. (2002). Learning to teach in higher education (2nd ed.). London: Routledge.
Richardson, J.T.E. (2003). Approaches to studying and perceptions of academic quality in a
short web-based course. British Journal of Educational Technology, 34, 433–442.
Sabry, K., & Baldwin, L. (2003). Web-based learning interaction and learning styles. British
Journal of Educational Technology, 34, 443–454.
Salmon, G. (2002a). E-tivities: The key to active online learning. London: Taylor & Francis.
Salmon, G. (2002b). Mirror, mirror on my screen: Exploring online reflections. British
Journal of Educational Technology, 33, 379–391.
Salmon, G. (2004). E-moderating: The key to teaching and learning online (2nd ed.). London:
Taylor & Francis.
Schmitt, N. (1996). Uses and abuses of coefficient alpha. Psychological Assessment, 8,
350–353.
Seifert, T. (1995). Characteristics of ego- and task-orientated students: A comparison of two
methodologies. British Journal of Educational Psychology, 65, 125–138.
Sharpe, R., & Benfield, G. (2005). The student experience of e-learning in higher education:
A review of the literature. Brookes eJournal of Learning and Teaching, 3, 1–10.
Sharpe, R., Benfield, G., Roberts, G., & Francis, R. (2006). The undergraduate experience of
blended e-learning: A review of UK literature and practice. Retrieved January 14, 2006,
from www.heacademy.ac.uk
Sims, R. (1997). Interactivity: A forgotten art. Computers in Human Behavior, 13, 157–180.
Trigwell, K., & Prosser, M. (1997). Towards an understanding of individual acts of teaching
and learning. Higher Education Research & Development, 16, 241–252.
Wilson, K.L., Lizzio, A., & Ramsden, P. (1997). The development, validation and application
of the Course Experience Questionnaire. Studies in Higher Education, 22, 33–53.