Capturing information to improve
learner retention and completion
of courses
Canberra Institute of Technology
E-standards for Training
June 2014
flexiblelearning.net.au
Acknowledgements
This research was funded through the National VET E-Learning Strategy and
managed by the New Generation Technologies business activity. We acknowledge
and thank them for their support.
Canberra Institute of Technology wishes to acknowledge and thank participants
across the Institute, the primary researcher Penny Neuendorf and the following
supporters:
Jasmin Kientzel (CIT)
John Smith (CIT)
Michael de Raadt (Moodle HQ)
Lead author: Penny Neuendorf (penny.neuendorf@cit.edu.au)
Disclaimer
The Australian Government, through the Department of Industry, does not accept any
liability to any person for the information or advice (or the use of such information or advice) which is
provided in this material or incorporated into it by reference. The information is provided on the basis
that all persons accessing this material undertake responsibility for assessing the relevance and
accuracy of its content. No liability is accepted for any information or services which may appear in any
other format. No responsibility is taken for any information or services which may appear on any linked
websites.
With the exception of the Commonwealth Coat of Arms, the Department’s logo, any material protected
by a trade mark and where otherwise noted all material presented in this document is provided under a
Creative Commons Attribution 3.0 Australia (http://creativecommons.org/licenses/by/3.0/au/) licence.
Table of Contents
1 Executive Summary .......................................................................................... 1
2 Background ....................................................................................................... 2
2.1 National VET E-learning Strategy ............................................................................ 2
2.2 New Generation Technologies Business Activity .................................................... 2
3 Introduction .................................................................................................... 3
3.1 Purpose and Structure of the Project ....................................................................... 3
4 Literature Review .............................................................................................. 4
4.1 Introduction .............................................................................................................. 4
4.2 Learning Analytics .................................................................................................... 4
Definition ....................................................................................................................................5
Use of Learning Analytics ..........................................................................................................6
4.3 Completion rates .................................................................................................... 10
Driving the push for increased completion rates ......................................................................10
Subject Load Pass Rate ..........................................................................................................10
4.4 Analytic Tools ......................................................................................................... 12
Key ..............................................................................................................................12
Teacher-centric tools ...............................................................................................................13
SNAPP ..................................................................................................................................13
LOCO-Analyst........................................................................................................................13
Pentaho .................................................................................................................................13
Gephi .....................................................................................................................................13
AWStats .................................................................................................................................13
Many Eyes .............................................................................................................................14
Excel ......................................................................................................................................14
R ............................................................................................................................................14
Tableau Software...................................................................................................................14
Student-centric Tools ...............................................................................................................15
E2Coach.................................................................................................................................15
Course Signals ......................................................................................................................15
Persistence +PLUS................................................................................................................15
Platform-centric Tools ..............................................................................................................15
GISMO – Graphic Interactive Student Monitoring Tool for Moodle ........................................15
Blackboard Analytics for Learn ..............................................................................................16
Moodle Course Completion Block..........................................................................................16
Moodle Progress Bar .............................................................................................................16
Desire2Learn Insights ............................................................................................................16
Moodog ..................................................................................................................................16
Other ............................................................................................................................17
5 Hypotheses ...................................................................................................... 18
6 Methodology .................................................................................................... 18
6.1 Data Selection and Collection ................................................................................ 21
Set 1 – Fully online courses .....................................................................................................21
Set 2 – Fully online courses .....................................................................................................22
Set 3 – Blended Delivery courses ............................................................................................22
Set 4 – Blended Delivery courses ............................................................................................22
Set 5 – Courses using Virtual Learning Environments .............................................................22
Set 6 – Courses using forums..................................................................................................22
Data collection limitation ..........................................................................................................22
Tools Selection ........................................................................................................................23
6.2 Process .................................................................................................................. 23
7 Results ............................................................................................................. 24
7.1 Comparing 4 week data to completion .................................................................. 30
7.2 Teacher interviews ................................................................................................. 45
Process ..................................................................................................................................45
Interview data ..........................................................................................................................48
General comments: .................................................................................................................48
Expectations ............................................................................................................................50
Forums and student success ...................................................................................................50
Student participation and unexpected results ..........................................................................50
Value of additional data points .................................................................................................51
Learning Analytics usefulness and skills: Excel .......................................................................52
Learning Analytics usefulness and skills: GISMO ....................................................................53
Learning Analytics usefulness and skills: SNAPP ....................................................................53
Perceived added value of Learning Analytics (LA) tools ..........................................................53
Perceived value of technology as learner engagement tool ....................................................54
Final comments and preferred LA tool: ....................................................................................55
8 Discussion ....................................................................................................... 56
8.1 LMS usage and academic performance ................................................................ 56
8.2 Excel ...................................................................................................................... 56
8.3 R ............................................................................................................................. 59
8.4 SNAPP ................................................................................................................... 60
8.5 GISMO ................................................................................................................... 60
8.6 Correlational Analysis ............................................................................................ 60
8.7 Hits/Clicks .............................................................................................................. 60
8.8 Dwell time............................................................................................................... 61
8.9 Virtual Classroom Participation .............................................................................. 61
8.10 Forum rich courses .............................................................................................. 61
8.11 Fully Online courses ............................................................................................ 62
8.12 Teachers .............................................................................................................. 62
8.13 Students ............................................................................................................... 62
9 Conclusion....................................................................................................... 64
10 References ..................................................................................................... 65
More Information ................................................................................................ 71
1 Executive Summary
This report provides an analysis and evaluation of a range of current Learning
Analytics (LA) literature and tools within the context of improving learner retention
and course completions. From a range of LA solutions, four tools were selected
(GISMO, SNAPP, Excel and R) for an in-depth investigation. These tools were tested
with 15 courses (competencies from a range of programs including Fashion Design,
Business, Music, Health, Aged Care, Population Health, Information and
Communication Technology and Tourism), in which 578 students were enrolled.
To better understand educators’ LA needs, nine teachers were interviewed in the
context of a demonstration of the four tools. All teachers expressed keen interest in
using ‘one click’ visualization software to monitor student progress and prompt them
to make student contact. None of the currently available LA solutions, including the four tools tested in this project, had the capability to add to VET teachers' knowledge
of their students’ progress. This result could be due to the small class sizes (under 25
participants) and the teachers’ accumulated knowledge of their students. (Students
often work with the same teacher for multiple courses over whole qualifications, a
fact highlighted in the teacher interviews.) Many teachers, including the project
participants, already look at Learning Management System (LMS) logs to monitor
online engagement prior to meeting with students.
Gathering LA data from LMS logs would be more effective if courses were designed
with collection of these data in mind, a topic which has been discussed extensively in
the relevant literature. For example, a course would need to have engagement
exercises, individualised feedback or assessment oriented content in the first four
weeks. Future research projects with purpose-designed courses and visualization
tools that are available to students and teachers are the next logical step.
2 Background
2.1 National VET E-learning Strategy
The National VET E-learning Strategy (Strategy) aims to strengthen the Australian
training sector’s use of new learning technologies and leverage opportunities
provided by such projects as the National Broadband Network (NBN) to make major
advances in the achievement of government training objectives.
The Strategy seeks to build the capability of registered training organisations (RTOs),
industry and community stakeholders to create more accessible training options and
facilitate new ways of learning through technology. It also aims to stimulate e-learning ventures to support individual participation in training and employment, and
the alignment of workforce skill levels with economic needs.
The Strategy is driven by the vision:
A globally competitive Australian training system underpinned by world
class e-learning infrastructure and capability.
and has the following three goals:
1. Develop and utilise e-learning strategies to maximise the benefits of the
national investment in broadband.
2. Support workforce development in industry through innovative training
solutions.
3. Expand participation and access for individuals through targeted
e-learning approaches.
2.2 New Generation Technologies Business Activity
The New Generation Technologies Business Activity incorporates the E-standards
for Training activity and primarily contributes to Goal 1 of the National VET E-learning
Strategy. It has the following objective:
Support the capacity of the VET system to use broadband and emerging
technologies for learning, through research, standards development and advice.
3 Introduction
In 2013-2014, the Flexible Learning Advisory Group (FLAG) prioritised a program of
applied research into various specific technical issues. The goal of this project, "Capturing information to improve learner retention and completion of courses", was
to identify tools for use by teachers and organisations to gain insight into data from various sources, allowing them to assess learner capability and develop early
intervention strategies to improve learner engagement, improve students’
performance and increase completion rates.
With the growing use of online technologies there is an increasing amount of data generated by student and teacher activity. Most of the available data originates
from the Student Management System (SMS), activity in Learning
Management Systems (LMSs) and Virtual Learning Environments (VLEs). For the
purposes of this document, an LMS is where the majority of interaction is asynchronous
(delayed interaction) and a VLE is synchronous (live interaction).
Siemens and Long (2011) have claimed that this data can be used to contribute to
better decisions about learning by institutions, educators and students. Researchers
and practitioners alike are becoming more interested in how to best harvest the data
and use it to improve learning experiences, completion rates and teaching quality.
3.1 Purpose and Structure of the Project
This project is embedded in the Learning Analytics (LA) research field and utilises
knowledge generated by previously conducted research studies for application to the
Vocational Education and Training (VET) sector.
The project investigates the use of LMS server logs, forum participation and virtual
classroom participation as predictive indicators of successful student course
completion in a VET context. The project builds on previous work carried out in the
higher education context.
The project investigates and organises relevant streams of current LA research
literature with a focus on indicators that have emerged as useful for prediction of
learner success. To this end, structured search procedures on relevant bibliographic
search engines (e.g. Google Scholar) were carried out; relevant results were used to refine the search parameters and were then classified into categories to organise the resulting literature review. Research articles and reports that were found to be
relevant for this project were retained after a ranking exercise involving the members
of the research team.
In a similar fashion, a list of relevant LA tools was established to give an overview of
the most commonly referenced software packages.
The resulting bibliography and software review informed the subsequent
investigations and served as a guideline for the selection of software packages, data
sets, evaluation methodology and analysis.
According to Siemens (2013), research into learning analytics has been mainly
“focused on Higher Education and P – 12”, which could be why detailed information
about LA in the VET sector is scarce.
The results of this project contribute to the current debate on learning analytics in the
vocational and higher education sectors with a particular focus on application in a
TAFE setting.
4 Literature Review
4.1 Introduction
Learning analytics has only recently emerged as a field of research in its own right
(Siemens and Long, 2011). “As a cross-disciplinary field between educational,
statistical and computational sciences, much of its influence is owed to advances in
artificial intelligence, data mining and educational technology” (Kraker et al., 2011).
According to the NMC Horizon Report: 2014 Higher Education Edition (Johnson, L.,
Adams Becker, S., Estrada, V., Freeman, A., 2014), Learning Analytics is “steadily
gaining the interest of education policymakers, leaders, and practitioners”, and is a
development that the authors place on their one year or less time-to-adoption
horizon.
To understand and guide this project’s research efforts, a review of the available
literature was conducted and used in:
• the selection of applicable learning analytics tools,
• the research approach, and
• the development of testable hypotheses.
Due to the interdisciplinary nature of the research area, the literature review is
presented in three major sections:
1. In the first section, literature on the theoretical and practical foundations of the
learning analytics field is investigated.
2. The second section discusses completion rates and how learning analytics,
within the context of LMSs and VLEs, can be used to enhance them.
3. Building on the first two sections, the third, and major, section investigates
several currently available LA tools with a focus on identifying properties that
make them suitable to be tested within the limits of this study.
4.2 Learning Analytics
This section of the literature review looks at recent Learning Analytics research
including but not limited to work by:
• The Society for Learning Analytics Research (SOLAR),
• the Centre for Educational Technology & Interoperability Standards,
• EDUCAUSE,
• the International Learning Analytics & Knowledge Conference, and
• the Moodle Research Conference.
In addition to the work of these organisations, academic and policy-based literature
has also been consulted to create an overview of the learning analytics field.
Definition
This review will use the definition of Learning Analytics provided by Siemens and
Long (2011):
“learning analytics is the measurement, collection, analysis and reporting
of data about learners and their contexts, for purposes of understanding
and optimising learning and the environments in which it occurs.”
Learning Analytics sits within a broader field of related initiatives that are using data
mining and analytics methods in the education sector.
According to Siemens et al. (2011), a concept closely linked to LA is the field of
academic analytics that provides information at the institutional level to be used for
intelligence and decision-making.
They distinguish the two as follows:
1. “Learning analytics is the measurement, collection, analysis and reporting of
data about learners and their contexts, for purposes of understanding and
optimizing learning and the environments in which it occurs. Learning
analytics are largely concerned with improving learner success.
2. Academic analytics is concerned with the improvement of organizational
processes, workflows, resource allocation, and institutional measurement
through the use of learner, academic, and institutional data. Academic
analytics, akin to business analytics, are concerned with improving
organizational effectiveness”
Some authors go further and provide a more detailed classification of the Learning
Analytics category. Ferguson and Shum (2012) developed a ‘taxonomy’ of Learning
Analytics to account for the variety of techniques available to educational
practitioners and researchers including:
• "Social network analytics – a technique which is based on the analysis of interpersonal relationships on social platforms;
• Discourse analytics – in which language is the primary tool for knowledge negotiation and construction;
• Content analytics – where user-generated content is one of the defining characteristics of Web 2.0;
• Disposition analytics – for which intrinsic motivation to learn is a defining feature of online social media, and lies at the heart of engaged learning, and innovation; and
• Context analytics – based on mobile computing transforming access to both people and content".
Use of Learning Analytics
Siemens (2009) states that learning analytics are important to
• reduce attrition through early detection of at-risk students;
• personalize learning process and content;
• create adaptive content;
• extend and enhance learner achievement; and
• improve teacher time and effort.
Siemens and Long (2011) identify learning analytics as a simple idea with
transformative potential, which
“… provides a new model for college and university leaders to improve
teaching, learning, organizational efficiency, and decision making and,
as a consequence, serve as a foundation for system change”.
They warn, however, that it is important to “think carefully about what we need to
know and what data is most likely to tell us what we need to know”. Brown (2012)
concurs with this caution and adds that “In any analytics initiative, the selection of
data directly affects the accuracy of the predictions and the validity of the analysis”.
Suthers and Chu (2012) draw attention to the many different ways that learners can
participate in a digital learning environment, listing, for example, "…threaded discussion, synchronous chats, wikis, whiteboards, profiles, and resource sharing", and signal that this variety of media can complicate the task of analysis.
Data captured by SMS, VLE and LMS for e-learning administration can be a useful
resource in the identification of at-risk students that may require additional support
(Dawson et al., 2008). However, Diaz and Fowler (2012) put forward the idea that,
whilst the LMS can be a “gold mine of information” there is a risk that an approach to
learning analytics focused on tracking “…student behavior, such as frequency of
interactions within an LMS” may be “too narrow and inadvertently limiting”.
They agree that it is an important requirement “… to discover and determine what
data are significant and why”. They conclude that “the real challenge lies in
developing a process for actually defining learning: measuring individual student’s
content consumption, applications, even collaborative contributions and
understanding how these behaviours map to student success”.
There is agreement that Learning Analytics aims ‘to examine students’ engagement,
performance, and progress on tasks.’ (Phillips et al., 2012), and, as the previous
discussion shows, there is also agreement that the definition of specific indicators
can be a challenging task.
Identifying appropriate indicators for predictions of learning success must take into
account the difficulties students can face when engaging in an online study
environment. Engagement may be hampered by lack of contact with peers, loss of
motivation or technical problems (Mazza and Dimitrova, 2004). In addition, socio-economic status and school performance have been found to be factors impacting on student success (Bradbury et al, 2014). These factors are not evident in the log files of online learning environments; however, research by Whitmer (2012) suggests that, when appropriately filtered, usage data from a Learning Management System is a better indicator of likely success than traditional demographics.
The importance of selecting the right measures of activity can be illustrated with the
concept of “dwell time”. While it seems intuitive to measure the time a student
“dwells” on a particular online activity before moving on to the next task, dwell time is
not a reliable measurement of a student's engagement and effort in an asynchronous
Web environment. “Dwell time” simply tells how long the page or activity was open
for or the time between opening one page and the next. It doesn’t factor in the
proportion of that time the student was answering the phone, making a cup of tea,
picking the kids up from school, etc. In other words, it does not provide complete
information about the quality of learning behaviour or how much cognitive effort was
spent on a given task. Additionally, students may shift their learning activity to nononline engagement environments, which cannot be captured by data from LMS
alone.
Whitmer et al (2012) have reported that statistical analysis suggests that dwell time is
not a reliable measure.
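The point can be made concrete with a small sketch in R. The column names and values below are invented example data, not the project's log format; they simply show that "dwell time" is no more than the gap between a user's consecutive log events.

    # A minimal sketch of how "dwell time" is derived from LMS logs: the gap
    # between one logged event and the same user's next event.
    log <- data.frame(
      userid = c(1, 1, 1, 2, 2),
      time   = as.POSIXct(c("2013-08-05 09:00:00", "2013-08-05 09:12:00",
                            "2013-08-05 10:40:00", "2013-08-05 09:03:00",
                            "2013-08-05 09:05:00"))
    )
    log <- log[order(log$userid, log$time), ]
    # Dwell time cannot distinguish active study from a page left open.
    log$dwell_minutes <- ave(as.numeric(log$time), log$userid,
                             FUN = function(t) c(diff(t) / 60, NA))
    print(log)  # the 88-minute gap may be reading - or a school pick-up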
Kruse and Pongsajapan (2012) express concern that Learning Analytics places too
much emphasis on the analytics and not enough on the learning. They suggest that
“inquiry-guided analytics” needs to be implemented and “to reimagine analytics in the
service of learning, we should transform it into a practice characterized by a spirit of
questioning and inquiry”. The thought behind this paradigm shift is to move from an
intervention-based, teacher controlled approach aimed at students at risk to one that
puts the tools in the hands of learners. Such an approach enables the learner to be a partner and "co-interpreter" of their data and to take part "…in the identification and gathering of the data", giving them opportunities to become more actively involved in the construction of their knowledge.
Whilst there are different approaches and methods being used to inform learning
analytics, a common underlying principle suggested is “determining actionable steps
to use the information gathered” (Diaz and Fowler, 2012). “Actionable steps”, as
defined by Diaz and Fowler (2012), should result from a conscious and effective
definition of indicators that can be used to inform institutions, educators and students
themselves about successful completion.
Ali and colleagues (2012) have drawn attention to “analysis of learner activities in
learning environments” and its value to teachers in the adaptation or continuous
improvement of online courses when they say
“Educators’ awareness of how students engage in the learning process,
how they perform on the assigned learning and assessment tasks, and
where they experience difficulties is the imperative for this adaptation”.
However, Drachsler and Geller (2012) have reported that, when it comes to Learning
Analytics, there is “...much uncertainty and hesitation, even extending to scepticism”.
In their study of educators’ perceptions of Learning Analytics, respondents across 31
countries expressed general uncertainty and hesitation about Learning Analytics
(Drachsler et al, 2012). Expectations among educators were certainly high within the
context of this Learning Analytics survey, most of the respondents expressed
significant interest in timely information about students’ learning processes and better
insights with regards to course-level dynamics. In addition to receiving more detailed
information, a significant number of educators expected information as to their own
performance (37%) and access to otherwise “hidden” information (44%). In this
particular survey, only 16% of responding educators were convinced that Learning
Analytics can reliably predict students’ learning performance. This opinion was linked
to a low confidence in statistical methods that underlie Learning Analytics and
scepticism that the use of these methods would lead to more objective assessment
or provide insight into students’ true knowledge levels (Drachsler et al., 2012).
Other concerns expressed by educators in this study included privacy requirements,
ethical policies, data ownership, and transparency of education (Drachsler et al.,
2012). Ethical issues associated with Learning Analytics are also discussed by Slade
and Prinsloo (2013) who consider
“…the location and interpretation of data; informed consent, privacy and
the de-identification of data; and the classification and management of
data”.
While the study was international in scope and only 11% of the respondents came from the VET sector, it is important to keep in mind that the Learning Analytics movement is global in nature. For most educators, tracking data
can often appear to be incomprehensible and poorly organised (Mazza and
Dimitrova, 2007). While Learning Analytics can assist teaching staff by collecting,
analysing and presenting data in an appropriate format, data is typically provided in a
static format as pre-defined by system developers (Dyckhoff et al., 2012). This
problem can be overcome, as Ali et al. (2012) suggest, by visualisation as an
“effective means to deal with larger amounts of data in order to sustain
the cognitive load of educators at an acceptable level”.
Again, the indicators for which the data is collected need to be appropriate for better
data understanding (Glahn, 2009).
As well as providing students, teachers and institutions with predictions about course outcomes based on student study behaviours, data on LMS usage can also
shed light on teacher involvement. LMS systems and Learning Analytic tools can be
useful for students and teachers to assess learner performance and can also help
teachers to reflect on their work. Again, the definition and implementation of
appropriate indicators for measurement needs to be carefully considered for this
purpose. Studies that have focused on teacher needs include Ali et al (2012),
Drachsler and Geller (2012) and Dyckhoff et al. (2012).
While Learning Analytics is a promising and innovative addition to traditional
evaluation of education and provision of business intelligence, it is not without
criticism from practitioners and researchers alike.
One of the major drawbacks of Learning Analytics in its current form is the assumption that students are passive recipients who need to be managed to avoid
educational failure (Kruse and Pongsajapan, 2012). In their report, these researchers
illustrate this point by using Purdue University’s “Signals” project as a case study.
This Learning Analytics tool uses a traffic light-like system where students at risk of
failing are alerted with orange and red "lights" while students on track for success
get a “green light”.
Yuan (no date) suggests that “learning is a complex social activity and all
technologies, regardless of how innovative or advanced they may be, remain unable
to capture the full scope and nuanced nature of learning". In addition, citing Gardner, the author emphasises "that analytics might encourage a more reductive
approach towards learning which is dangerous for promoting deeper learning and
meaningful learning”. Booth (2012) reiterates this point of view stating that “learning
analytics risks becoming a reductionist approach for measuring a bunch of "stuff" that
ultimately doesn't matter. In my world, learning matters.” Similar concerns have been
voiced by other researchers, e.g. Ferguson and Shum (2011) stating that learning
analytics has the potential to “devalue the role of teachers/mentors who will never be
replicated by machine intelligence, disempowering learners by making them rely on
continuous machine feedback”.
The literature reviewed here draws attention to the potential and shortcomings of
Learning Analytics and, importantly, exposes the difference between learning and
studying. Phillips et al (2010) have drawn attention to this difference when they quote
Goodyear and Retalis (2010):
“it is useful to distinguish between learning – which we take as a label for
a set of psychological processes which lead to greater competence or
understanding – and studying – which is a useful descriptor for a set of
real-world activities in which people engage, for the purposes of
intentional learning” (2010, p.8).
Phillips et al (2010, p.763) make the important point that "… the process of learning
is relatively difficult to observe. What is easier to observe is studying”. In this project it
will be the activity of studying that is measured and interpreted in an attempt to make
predictions about learning, recognising that there are many factors that impact on
learning success, which cannot be captured by measures of activity.
4.3 Completion rates
This second section of the literature review focuses on student completion rates and
the defining of appropriate indicators for Learning Analytics methods and tools. It
recognises that information gathered from Learning Management Systems, Student
Management Systems and Virtual Learning Environments reflects only some of the factors that influence the likelihood of successful course completion.
Driving the push for increased completion rates
Australian Governments have indicated a need to increase the efficiency of the
Vocational Education and Training (VET) System. The National Agreement for Skills
and Workforce Development states that there is a need for “… high quality,
responsive, equitable and efficient training and training outcomes” (National
Agreement for Skills and Workforce Development).
The major focus of this push for increased efficiency is on qualification completion - getting more people to complete more, higher-level, VET qualifications - but subject
completion rates are also of concern. The Productivity Commission’s 2012 report on
the impact of COAG reforms states that “…there can be gains when people acquire
competencies and skill sets, even if they do not obtain a qualification …” (Productivity
Commission p.95).
The value of subject completion rather than qualification completion has also been
raised during the current VET Reform consultations, which noted that "… full qualifications (as
opposed to skill sets) are not always needed or fit for purpose …” (Department of
Industry, 2012).
The Productivity Commission report also notes that there is a “… paucity of data ...”
in this area (p.95) and recommends that collection of data be improved (p.108).
Subject Load Pass Rate
The National Centre for Vocational Education Research (NCVER) uses "Subject Load Pass Rate" as a measure of subject completion:
“A subject load pass rate is the ratio of hours studied by students who
passed their subject(s) to the total hours committed to by all students
who passed, failed or withdrew from the corresponding subject(s).”
(Bednarz, 2012)
The author also states that "we can think of a subject load pass rate as the ratio of
‘profitable hours’ to the total hours undertaken by all students in the same year”.
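As a worked example of this ratio, consider the following R calculation (the figures are invented for illustration, not drawn from the report's data):

    # Illustrative subject load pass rate calculation (invented figures).
    passed_hours    <- 720  # hours committed by students who passed their subjects
    failed_hours    <- 180  # hours committed by students who failed
    withdrawn_hours <- 100  # hours committed by students who withdrew
    slpr <- passed_hours / (passed_hours + failed_hours + withdrawn_hours)
    slpr  # 0.72: 72% of all committed hours were "profitable hours"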
The idea of cost effectiveness being impacted by completion rate is supported by Tyler-Smith, who states that attrition rates are "… important in assessing the relative effectiveness of the cost of online learning compared to traditional classroom-based teaching …" (Tyler-Smith, 2006).
Much has been written about the factors influencing overall drop-out rates over many
years – seven decades according to Berge and Huang (2004). These authors
propose:
• Personal variables,
• Institutional variables and
• Circumstantial variables
as the three key areas impacting drop-out rates but, in summarising their thoughts,
also warn that:
“Generalizations about retention can be misleading because each
institution is dynamically unique in terms of academic emphasis and
institutional culture. Retention issues can be further complicated because of
the necessity to understand the student population, their educational goals,
and their specific circumstances." (Berge and Huang, 2004)
A multivariate analysis of factors affecting Module Load Completion Rates at a West
Australian TAFE in 2000 identified
• student factors,
• college and delivery factors and
• course and program factors,
as the three primary categories of determinants that may affect completion rates
(Uren, 2001, p.2).
Uren defines Module Load Completion Rate as “… the proportion of hours of delivery
which result in a successful module completion. It is used as a surrogate measure of
output efficiency". It is a measure comparable to the Subject Load Pass Rate.
Arnold (2010), in considering a student’s risk status, states that
“while demographic information, academic preparation (represented by
admissions data), and performance (indicated by grades in the course)
are important, we saw a vital need to include the hugely important,
dynamic variable of behaviour”.
Whilst it is acknowledged that there are many factors beyond the reach of LMS log
files, predictions about likely course completion based on student log data from a
Learning Management System are being used to improve student outcomes (Baker and Siemens, 2013). Beer et al (2010) have identified a positive correlation between the volume of student activity, counted as 'clicks', in an online course and the final grade for the course. Whitmer (2012) has also suggested that a possible way to
track student progress is to look at behaviour patterns, time series analysis, and
social interactions.
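The kind of click-grade relationship Beer et al (2010) describe can be checked with a simple correlation test in R. A minimal sketch, using invented example data rather than project results:

    # Illustrative test of the click-volume/grade relationship.
    clicks <- c(120, 45, 300, 80, 210, 15, 170, 95)  # total LMS clicks per student
    grades <- c(68, 52, 85, 60, 78, 40, 74, 63)      # final course grade (%)
    cor.test(clicks, grades)  # Pearson correlation with significance test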
4.4 Analytic Tools
This section describes the results of a search and comparison of analytics tools and
has been guided by the papers used in the literature review, e.g. Kraan and Sherlock
(2012), Verbert et al. (2013) and Ferguson (2012).
Elias (2011) cites Campbell and Oblinger (2008) stating that appropriate tools should incorporate "five steps: capture, report, predict, act, and refine". Kraan and Sherlock (2012) describe an analytic tools workflow – collection and acquisition, storage, cleaning, integration, analysis, representation and visualisation, and alerting – as a base for their overview of relevant tools. This workflow will be used as the basis for our tool analysis.
Key
• Collection and Acquisition
• Storage
• Cleaning
• Integration
• Analysis
• Representation and Visualisation
• Alerting
It is important to note that the majority of tools need to be configured for purpose,
requiring a user to exhibit a certain level of skill and knowledge to achieve a useful
configuration. Li (2012) states that there is a “gap between what technology promises
to do and what people can do with existing data sources and tools in reality.” Kraan
and Sherlock (2012) note “ready-made solutions may be too expensive for
experimentation”. Also, “analytics initiatives depend heavily on identifying the right
variables, if a ready-made solution doesn’t cover it, it may be of little use.”
In the following sections, we will review some of the currently available learning
analytic tools, separated into three main categories: teacher-centric, student-centric
and platform-centric tools. Some of the LA tools discussed could fall into more than one category; where this occurred, the tool was placed in its main category.
Teacher-centric tools
SNAPP
Social Networks Adapting Pedagogical Practice (SNAPP) is a browser add-in. This
tool is mainly forum focused. A teacher selects the forum they are interested in for
analysis and then activates SNAPP. SNAPP gives a visual representation of how
interactive each participant is. Only the teacher can see this information. Teachers
can then encourage students to participate. Project partners include University of
Wollongong, RMIT, Murdoch University, University of Queensland and University of
British Columbia. SNAPP can be used with Internet Explorer, Firefox, Chrome and
Safari. It can analyse forums in Moodle, Blackboard Learn, Desire2Learn and Sakai.
LOCO-Analyst
“LOCO-Analyst is an educational tool aimed at providing teachers with feedback”
(Jovanovic, J., Gasevic, D., Brooks, C., Devedzic, V., & Hatala, M. 2007). It takes the
user tracking data and gives teachers information on how cohorts of students or
individual students are progressing. LOCO-Analyst is being tested by Simon Fraser
University, University of Saskatchewan and University of British Columbia. LOCOAnalyst can be used with the iHelp Courses LMS.
Pentaho
Pentaho (http://www.pentaho.com/product/embedded-analytics) is a big data analysis
tool that has been used with other educational products to analyse and report on
data. Pentaho is currently being used at University of Oklahoma and Loma Linda
University.
Gephi
Gephi is an open-source interactive visualization and exploration tool. It can analyse data from a range of sources, with benefits mainly at the institutional or educator-specific level. It is mentioned in Dietz-Uhler and Hurn's paper from Miami University.
AWStats
AWStats analyses server logs to generate web pages with graphics to show usage statistics. San Francisco State University is using AWStats to show statistics for which pages are most popular.
Many Eyes
With Many Eyes, data sets in CSV form can be uploaded and converted to a graphical representation. The University of Illinois at Urbana-Champaign is using Many Eyes as a visualization tool for qualitative, quantitative and geographic data.
Excel
Excel is well-known, easy-to-use spreadsheet software with extensive analytic and graphical capabilities. While its main purpose is not learning analytics, it provides an easy-to-understand way of processing data. Its relatively limited analytic functions, however, can be a disadvantage. Excel is currently being used or has been trialled as a student analytic tool at the University of Puerto Rico and Beuth University Germany, and was used in the research of Phillips, Maor, Cumming-Potvin, Roberts, Herrington, Moore and Perry (Murdoch University), and Preston (University of Newcastle).
R
R (the R Project) is an open source language and environment for general purpose statistical computing and graphics. Baker and Siemens (2013) have recommended it for its "ability for researchers to create specific R packages to address research needs in specific fields". As it is very data-centric, users usually need a certain level of knowledge to understand its core functions. The University of Auckland is known as the birthplace of the R Project. R has also been used at the University of Warwick.
Tableau Software
Tableau Software is a general purpose visualisation software program that can be
cloud or desktop-based. Its market has primarily been big companies, but it can be used by an institution or by students individually (a desktop version is free for full-time students). Information would need to be gathered from systems (learning management systems, student management systems, Google Analytics, etc.) to be displayed. Tableau has been used by the University at Buffalo (part of the State University of New York) since 2009 (Feroleto, L., Mantione, J. & Sedor, M., Analytics in Institutional Research), the University of Washington, DePaul University, Cornell University and Utah State University, to name a few.
Student-centric Tools
E2Coach
E2Coach (Hubert, M., Michelotti, N., & McKay, T. 2012) is a tailored system from the University of Michigan that collates and analyses information from a range of systems. This
system has a strong student focus and alerts students with personalised messages
on how they are performing and what they can do to improve their results. It was
launched in 2012 with 2,234 students participating in the trial. This system sounds
very promising but does not seem to be available in a commercial or open source format.
Course Signals
Course Signals (Arnold, K. & Pistilli, M. 2012) is designed for early intervention with at-risk students. Teachers and students get to see a traffic light status for the student, and computer-generated emails are sent to at-risk students. Purdue University has piloted the program and student feedback has generally been positive. Ellucian Course Signals is the commercially licensed version of the product. Course Signals can be used with the Blackboard and Moodle LMSs.
Persistence +PLUS
Persistence +PLUS is a mobile phone application used to motivate students and
prompt them when upcoming work is due. This product is subject or course based
and claims to use “sophisticated data analytics”. University of Washington Tacoma
was involved in the initial pilot.
Platform-centric Tools
GISMO – Graphic Interactive Student Monitoring Tool for Moodle
GISMO is a Moodle tool that automatically uses the LMS system's log data to give the
teacher a “clear picture” of how the class is participating and progressing. The
teacher can see the class as a whole and also zoom in on individual student activity.
Blackboard Analytics for Learn
Blackboard Analytics for Learn is an enterprise product designed for use with
BlackBoard LMS. It provides analytics at an institute, course/class level and at a
student level for teachers and students.
Moodle Course Completion Block
The Moodle course completion block is a core component of Moodle. It shows students the required course of action to complete the course (which is set up by the
teacher and can be automatic or manually marked as complete).
Moodle Progress Bar
The Progress Bar is a Moodle specific plug-in that helps students with time
management and a visualisation of their current learning status in the course. It also
shows teachers an overview of all the students’ progress in a course. The tool has
been shown to have positive effects on learning and retention (de Raadt & Dekeyser,
2009).
Desire2Learn Insights
Desire2Learn Insights is a Desire2Learn-specific tool. It provides analytics at an
institute, course/class level and at a student level for teachers and students.
Moodog
Moodog was a project of the University of California in 2006, and has been presented in numerous papers including Elias (2011), Shang et al (2010) and Govaerts et al (2010).
Other
The following products have not been investigated in depth but have been noted as they have been mentioned in research documentation cited for this project.
• Discourse Analytics – poll driven, not academically focused
• e-asTTle – online assessment tool, not an analytic product
• Argos reporting – reporting tool that is typically used with an SMS
• ELLI (Effective Lifelong Learning Inventory) – looks at individual learning; information is gathered by an online questionnaire and does not cover activity in a course
• Next Generation Analytics Consortium – visual analytics research, not specifically education focused
• Contextualized Attention Metadata (CAM) – data is collected from digital repositories and client-side sources like software and web browsers
• Oracle Student Information Analytics – very institute focused, more into grades, scheduling and faculty workload rather than education outcomes
• Google Analytics – gives general visitor information, not academic specific
• Processing – data visualisation tool
• IBM ILOG – very business focused, not educational; would need tweaking
• SoftActivity – not directly useful, as it tracks every key stroke
• iEducator – Apple product specific; does not appear to be an analytic tool, limited information
5 Hypotheses
Having discussed the relevant policy and academic background literature, as well as
currently available learning analytic tools, the following hypotheses were developed
for testing within the context of this study.
These hypotheses have been informed by research outcomes in comparable studies.
Our main contribution will be to test these assumptions within the context of an
Australian VET organisation.
• Students with high participation rates in either online courses or blended delivery courses will have better learning outcomes, as evidenced by higher pass rates and a smaller likelihood of drop-out;
• students are more likely to engage with content, i.e. be more active in the course, when the activity involves interactive resources including chat, discussion, social media and collaboration activities;
• students are more likely to engage with content if this content will be assessed (Dale and Lane, 2007);
• a student's number of views on a discussion forum can be used as a reliable indicator of interest (this differs from the number of postings, as some students may read the postings but not create a post);
• generated Learning Analytics will be perceived as a helpful tool for teachers to identify which types of activities engage students;
• log reports from Learning Management Systems are only a small component of students' outputs and only show their online behaviour; and
• analytics with real-time visualizations for students and teachers will be perceived as most effective by both groups.
These hypotheses are tested using data generated within the context of Canberra
Institute of Technology (CIT) Online Learning Environment administered courses.
6 Methodology
The project looks at data generated in Semester 2 2013 to allow comparison of
participation/engagement with results.
Baker and Siemens (2013) discuss different methodologies that are currently used in
educational data mining and learning analytics research.
Available methodologies include
• prediction methods that involve model development to infer a single aspect of the data from combinations of other data aspects;
• structure discovery algorithms that find structure in the data without a prior idea of what results might be identified;
• relationship mining to discover relationships in datasets with large numbers of variables;
• distillation methods for human judgement that focus on data visualisation for educators;
• the use of previously established models; and
• generally available LA tools, such as Excel, R, SNAPP or GISMO.
After reviewing the methodologies discussed by Baker and Siemens (2013), it was decided to use a combination of two of them: a prediction model, applied using two general-purpose analytic tools (R and Excel), and a structure discovery method, social network analysis, applied using two dedicated tools (SNAPP and GISMO). The prediction method was chosen because it is well suited to small data sets, as in this study, and can be statistically validated and applied at greater scale. Social network analysis was chosen for its visual analysis of the strength of group connections.
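As an illustration of the structure discovery side, the sketch below shows the kind of social network analysis that SNAPP visualises, built in R with the igraph package. The reply pairs are invented example data, not project data, and this is a generic sketch rather than what SNAPP or GISMO do internally.

    # Minimal social network analysis of forum interactions (invented data).
    library(igraph)
    # Each row: a forum post author and the author being replied to.
    replies <- data.frame(
      from = c("student_A", "student_B", "student_A", "teacher", "student_C"),
      to   = c("teacher", "student_A", "student_B", "student_C", "teacher")
    )
    g <- graph_from_data_frame(replies, directed = TRUE)
    degree(g, mode = "all")  # interaction counts per participant
    plot(g)                  # visualise the strength of group connections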
Log data from the LMS is one of the most commonly used data sources (e.g. EDUCAUSE, 2011; Baker and Siemens, 2013; Phillips et al., 2011; de Raadt, 2012; Hoel, n.d.; Kraan & Sherlock, 2012). This project therefore uses completion results from our student management system to select courses with high and low completion rates, and analyses both fully online and blended delivery courses so that at least two main types of course delivery, and their effects on student course outcomes, can be compared.
This data will be cleaned in Excel, and pivot tables will then be used to analyse the information, building on the work of Dierenfeld & Merceron (2012); R will then be used to compare the tools. The categories for our prediction models are listed below.
Table 1 – Classifications of log data into values
Assess | Engage | Content | Participation
quiz attempt (learner started quiz) | choice view (learner views the choice activity) | book print (learner prints the book activity) | quiz view (learner views the quiz)
quiz close (learner closed quiz) | choice choose (learner makes a choice) | book view (learner looks at the book activity) | quiz view summary (learner looks at the summary of the quiz)
quiz close attempt (learner closed quiz) | forum view discussion (learner views forum discussion) | book view chapter (learner looks at a chapter in the book activity) | quiz view all (learner views all quizzes in the course)
assign submit (learner submits assignment) | forum view forum (learner views a forum) | resource view (learner views a resource file) | assign view (learner views an assignment in the course)
assign submit for grading (learner has submitted the assignment for grading) | forum add discussion (learner adds a thread to the discussion) | glossary view (learner looks at the glossary activity) | assign view all (learner views all assignments in the course)
lesson end (learner completed the lesson activity) | forum add post (learner posts to forum) | page view (learner views a page) | lesson view (learner views a lesson activity in the course)
workshop view (learner views the workshop activity) | equella view (learner views a resource in the digital repository) | course view (learner is on the course page of the course) | lightbox gallery view (learner views the lightbox gallery activity)
url view (learner clicks on a link to a url) | label view all (learner views all the labels in the course – a list) | lightboxgallery view all (learner views all lightboxgallery activities) | folder view all (learner views all the folders in the course – a list)
feedback startcomplete (learner starts and completes the feedback activity) | page view all (learner views all the pages in the course – a list) | quiz review (learner reviews quiz) | equella view all (learner views all resources in the digital repository for the course)
assign view feedback (learner views teacher feedback for an assignment) | resource view all (learner views all the resources in the course – a list) | lesson start (learner starts the lesson tool activity) |
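A category mapping like Table 1 can be applied directly to cleaned log data. The sketch below shows the idea in R; it is illustrative only, the mapping is abbreviated, and the sample data is invented.

# Illustrative sketch: map cleaned Moodle log actions to the four
# value categories of Table 1. The mapping shown is abbreviated.
category_map <- c(
  "quiz attempt"   = "Assess",
  "forum add post" = "Engage",
  "resource view"  = "Content",
  "course view"    = "Participation"
)
logs <- data.frame(action = c("quiz attempt", "resource view", "course view"))
logs$category <- category_map[logs$action]
table(logs$category)  # hit counts per value category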
While log data is commonly used in learning analytics as a data source, Baker & Siemens (2012) recall that in Baker's first analysis of educational log data, almost two months were needed to transform the logged data into a usable form. They go on to say that "today, there are standardized formats for logging specific types of educational data".
This research used what Buckingham Shum & Deakin Crick (2012) described as 'data exhaust'. Raw server log data does require cleaning, which Kraan & Sherlock (2012) define as "rectifying anomalies and inconsistencies, and normalising the syntax of the data". This study relied on the Moodle course logs. Using the course logs limited the hit count (the number of times a student clicks on a link or activity), time on course and time online to activity directly relevant to the selected course, rather than general navigation in the LMS.
The data has been further aggregated by removing some of the detail contained in the course logs. For example:
resource view (http://elearn.cit.edu.au/mod/resource/view.php?id=627419)
(URL not public)
resource view (http://elearn.cit.edu.au/mod/resource/view.php?id=627421)
(URL not public)
were both truncated to "resource view", to count hits on resource views.
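A minimal R sketch of this truncation step is shown below, assuming the log's action column has been exported to CSV; the file and column names are hypothetical.

# Illustrative sketch: strip the bracketed URL detail from each log entry
# so hits can be counted per action type. File/column names hypothetical.
logs <- read.csv("moodle_course_log.csv", stringsAsFactors = FALSE)
logs$action <- trimws(sub("\\(.*\\)", "", logs$action))
# e.g. "resource view (http://...?id=627419)" becomes "resource view"
sort(table(logs$action), decreasing = TRUE)  # hit counts per action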
Across LMSs the cleaning process would be similar but not identical, as illustrated by the following Blackboard log data example:
content_type 'resource/x-bb-folder/name' or 'resource/x-bb-journallink.name'
Beer, Clark and Jones (2010) plotted LMS hits by final grade using Blackboard and Moodle for comparison; students using the Blackboard LMS recorded far more hits than students using the Moodle LMS. Richards (2011) reflects that "Moodle has more efficient access architecture".
This research chose to use the course logs, as this information is readily accessible
to course teachers and doesn’t require the use of SQL to extract the relevant data
from server logs. This is important as it keeps the activity in the hands of the teacher;
they don’t need special skills or access privileges to the server.
6.1 Data Selection and Collection
Three courses will be analysed in each set. Only courses with a single subject in
them will be considered (as opposed to courses that were delivered holistically with
multiple competencies).
Set 1 – Fully online courses
Student Management System – report on courses with highest completions
Report parameters will include: Qualification Level - Advanced Diploma or below; minimum number of participants = 16
Learning Management System – course logs on selected courses
Set 2 – Fully online courses
Student Management System – report on courses with lowest completions
Report parameters will include: Qualification Level - Advanced Diploma or below; minimum number of participants = 16
Learning Management System – course logs on selected courses
Set 3 – Blended delivery courses
Student Management System – report on courses with highest completions
Report parameters will include: Qualification Level - Advanced Diploma or below; minimum number of participants = 16
Learning Management System – course logs on selected courses
Courses will be checked to ensure the learning management system was used in the course.
Set 4 – Blended delivery courses
Student Management System – report on courses with lowest completions
Report parameters will include: Qualification Level - Advanced Diploma or below; minimum number of participants = 16
Learning Management System – course logs on selected courses
Courses will be checked to ensure the learning management system was used in the course.
Set 5 – Courses using virtual learning environments
Virtual Learning Environment – report on largest number of users
Report parameters will include: Qualification Level - Advanced Diploma or below; minimum number of participants = 16
Student Management System – completion data for selected courses
Learning Management System – course logs of selected courses
Set 6 – Courses using forums
Learning Management System – report on courses with forums
Student Management System – completion data for selected courses
Report parameters will include: Qualification Level - Advanced Diploma or below; minimum number of participants = 16
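As a sketch of how such a selection report might be reproduced from an exported completions file (illustrative only; the field names qual_level, participants and completion_rate are hypothetical):

# Illustrative sketch: filter candidate courses for a data set.
# Field names are hypothetical.
courses <- read.csv("smis_completions.csv", stringsAsFactors = FALSE)
levels_ok <- c("Certificate I", "Certificate II", "Certificate III",
               "Certificate IV", "Diploma", "Advanced Diploma")
eligible <- subset(courses, qual_level %in% levels_ok & participants >= 16)
head(eligible[order(-eligible$completion_rate), ], 3)  # highest completions
head(eligible[order(eligible$completion_rate), ], 3)   # lowest completions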
Data collection limitation
It is acknowledged that the data being analysed is already six months old (2013 Semester 2) and so deals with retrospective information, but the outcomes will be useful in providing guidance to teachers about whether the prediction model is a good indicator of student success. However, it will not address whether students would change their learning behaviours if they were aware of their predicted results.
Tools Selection
As mentioned before, all learning analytics tools need to be configured and/or adapted before they can be used. Four tools were selected to analyse LMS
course log data: Excel, R, SNAPP and GISMO. Excel and R have not been
specifically developed for learning analytic purposes, but are both readily available.
Excel is also relatively easy to use while R, an open source tool, can be used for
professional quality statistical analysis. GISMO is designed for LA and for Moodle
and will be analysed for its effectiveness. SNAPP is designed for LA in forums and
will be used for data set 6.
6.2 Process
All sets of data will be analysed and visually represented using the Excel, R, GISMO and, where appropriate, SNAPP analytic tools. All tools will be evaluated for usefulness with a selection of teachers.
This will allow results to be compared with outcomes reported by Whitmer, Fernandes and Allen (2012) in their paper Analytics in Progress: Technology Use, Student Characteristics, and Student Achievement, which notes that 'Numerous studies have demonstrated a relationship between the frequency of student LMS usage and academic performance.'
A review of the course content will be undertaken to see if specific activity is linked to
successful completion or engagement in the courses.
Fifteen courses will be selected, using a range of reports from the student management system (course numbers and completions), the learning management system (use and frequency of forums) and the virtual learning environment (minutes of use).
Data on completion of these courses will be gathered from the student management system. The SNAPP tool will be used to see if activity in the discussion boards had an effect on the completion of the course. SNAPP is not LMS specific and is an internet browser add-in. GISMO will be used to measure overall learner activity in a course, including forum use, posts versus posts read, and resource views.
After data collection and collation, teachers will be invited to participate in an
interview to give their opinion of the usefulness of the tools.
Interview process:
• Teachers were asked to comment on the student cohort, grades and how they currently engage with students.
• Teachers were shown Excel tables and charts and asked to comment on them.
• Teachers in the forum-rich courses were shown the SNAPP tool in action and asked to comment on its usefulness and compare it to GISMO forum information.
• Teachers were shown GISMO in action in their courses and asked to comment on its usefulness.
• Teachers were asked which tool they would prefer to use and which would be the most useful.
7 Results
These results focus on 15 courses selected using the data selection and collection method described. The total number of students in the selected courses was 578. The courses came from a range of colleges and programs including Fashion Design, Business, Music, Health and Disability, Aged Care, Population Health, Information and Communication Technology, and Tourism.
Key: F – forum rich courses, MM – mixed mode/blended delivery, On – fully online
Table 2 – Teacher minutes online
Course code | Teacher time (minutes) | Teacher minutes per student | Completion rate
F1 | 983 | 46 | 94%
F2 | 491 | 26 | 56%
F3 | 521 | 31 | 48%
MM1 | 368 | 4 | 96%
MM2 | 1175 | 5 | 100%
MM3 | 238 | 8 | 26%
MM4 | 271 | 18 | 100%
MM5 | 359 | 24 | 20%
MM6 | 453 | 28 | 15%
On1 | 469 | 22 | 74%
On2 | 138 | 6 | 49%
On3 | 1524 | 63 | 72%
On4 | 1498 | 58 | 76%
On5 | 1694 | 45 | 55%
On6 | 103 | 5 | 38%
Radar chart representation of Table 2 – Teacher minutes per student
Table 2 Summary
As neither overall teacher time nor teacher time spent per student correlates well with course completion, the aim was to test whether the differences in these measures between courses are statistically significant.
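As a quick check of the correlation claim, the sketch below computes the correlation between teacher minutes per student and completion rate using the Table 2 figures (illustrative only; this was not the project's Excel workflow).

# Illustrative sketch: correlation between teacher minutes per student
# and completion rate, using the fifteen courses in Table 2.
minutes_per_student <- c(46, 26, 31, 4, 5, 8, 18, 24, 28, 22, 6, 63, 58, 45, 5)
completion_rate <- c(0.94, 0.56, 0.48, 0.96, 1.00, 0.26, 1.00, 0.20,
                     0.15, 0.74, 0.49, 0.72, 0.76, 0.55, 0.38)
cor.test(minutes_per_student, completion_rate)  # Pearson correlation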
Table 2 presents aggregate data, including teacher time spent (in minutes) on forum (F), mixed mode (MM) and online (On) courses. The table also gives teacher time spent per student (in minutes) and the overall course completion rates. This project is interested not only in assessing whether overall teacher time per student is a predictor of course completion, but also in comparing courses within a category (i.e. low completion vs. high completion in each of the three categories) as well as between categories (i.e. high completion rate courses vs. low completion rate courses). Overall teacher minutes, teacher minutes per student and course completion were therefore tested for statistical significance. For this purpose, Student's t-test was used as the verification procedure. The t-test is one of the most common statistical techniques for hypothesis testing on the basis of differences in sample means; it can assist in determining whether a difference in two measurements is statistically meaningful rather than due to chance or randomness. Applied to Table 2, the aim was to verify whether differences in teacher time spent overall, or time spent on students, and completion rates were statistically different for a range of course comparisons:
• high and low completion rate courses across categories (e.g. a mixed mode and an online course with both completion rates above a certain threshold),
• high and low completion rate courses within a category (e.g. two high-performing mixed mode courses), and
• high vs. low completion rate courses within a category (e.g. a high- vs. a low-completion online course).
The analysis was conducted in Excel (with the "Data Analysis" add-in) by performing paired two-sample t-tests for means. The two-sample tests were performed for each possible combination of courses in the table to detect any potential statistical pattern. The t-tests were conducted at the 95% confidence level.
As none of these was found to be statistically significant, the decision was taken to
create four course groups, as defined by the level of completion. The courses were
grouped, irrespective of delivery mode, in these four groups and again tested for
statistical significance.
Table 3 – Regrouping
Completion rate category | Courses
Level 1: 90-100% | F1, MM1, MM2, MM4
Level 2: 70-90% | On4, On1, On3
Level 3: 40-70% | F2, On5, On2, F3
Level 4: 15-40% | On6, MM3, MM5, MM6
The groups can be seen in Table 3 above. After re-grouping the courses into the four
categories, the following data table was used as a basis for analysis:
Table 4 – Regrouped minutes and completion rates
Course name | Teacher time (minutes) | Teacher minutes per student | Completion rate
MM6 | 453 | 28 | 0.15
MM5 | 359 | 24 | 0.20
MM3 | 238 | 8 | 0.26
On6 | 103 | 5 | 0.38
F3 | 521 | 31 | 0.48
On2 | 138 | 6 | 0.49
On5 | 1694 | 45 | 0.55
F2 | 491 | 26 | 0.56
On3 | 1524 | 63 | 0.72
On1 | 469 | 22 | 0.74
On4 | 1498 | 58 | 0.76
F1 | 983 | 46 | 0.94
MM1 | 368 | 4 | 0.96
MM2 | 1175 | 5 | 1.00
MM4 | 271 | 18 | 1.00
When the course completion rate categories 1-4 were tested for statistical significance (for both overall teacher time in minutes and teacher time per student) using the same method described above, significance was only established when differentiating between category 2 and category 4 courses. Hence, the results suggest that teacher time, and teacher time spent per student, did influence whether the completion rate fell between 70-90% or between 15-40%.
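A minimal R sketch of this category 2 vs category 4 comparison is shown below, using the teacher minutes per student values from Table 4. It is illustrative only: a Welch two-sample t-test is shown, rather than Excel's paired variant, since the two groups differ in size.

# Illustrative sketch: two-sample t-test comparing teacher minutes per
# student for completion categories 2 (70-90%) and 4 (15-40%).
cat2 <- c(58, 22, 63)    # On4, On1, On3
cat4 <- c(5, 8, 24, 28)  # On6, MM3, MM5, MM6
t.test(cat2, cat4, conf.level = 0.95)  # Welch two-sample t-test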
While this seems to be an interesting result, it raises the question of why it was achieved. Some words of caution seem appropriate in this context: as indicated, all categories used for analysis included a number of different delivery modes (forum, mixed mode and online), but this project was not able to differentiate between these delivery modes. Additionally, due to the small number of data points in each category, interpretation and recommendations based on these results need to be carefully considered.
As can be seen from the Table 4 data summary, the mixed mode courses are distributed in a bi-modal pattern: they sit either in the highest completion rate group (90-100%) or in the lowest (15-40%). Hence, the decision was taken to run the t-test by course completion level specifically for the mixed mode courses. When testing, however, no statistical significance was detected. This result once more reinforces the need to be careful with the interpretation and testing of data at the aggregate level, as raw teacher time spent, and time spent per student, do not automatically prove significant within courses of a specific type (in this context, mixed mode courses) with different completion outcomes.
Table 5 – Grades at CIT
Grade | Explanation
Distinction (D) | Pass with Distinction
Credit (CR) | Pass with Credit
Pass (P) | Pass/all outcomes achieved
Ungraded Pass (UP) | Pass achieved, ungraded
Fail (F) | Subject/module outcome not achieved – has attempted all assessment items but hasn't successfully completed
Withdrawal without attendance (WW) | Student enrolled in the course and didn't attend, or attended for a short period of time but did not participate or engage
Withdrawal with attendance (WA) | Student enrolled in the course, participated and completed an assessment item but did not attempt the final assessment item
Extension Granted (EG) | Student granted an extension; teachers need to enter a result within six weeks
Academic Performance (AP) | Mid-term result for on-the-job training when off-the-job assessment satisfactorily completed
Table 6 – Grades and average minutes online per student
Course code | AP/EG | WW | WA | F | UP | P | CR | D
F1 | – | 59 | 270 | – | – | 497 | 265 | 506
F2 | – | 0.5 | 132 | – | 223 | – | – | –
F3 | – | 0 | 99 | 222 | 311 | – | – | –
MM1 | – | – | 0 | – | 170 | – | – | –
MM2 | – | – | – | – | 111 | – | – | –
MM3 | 55 | 0 | 16 | – | 79 | – | – | –
MM4 | – | – | – | – | 109 | – | – | –
MM5 | 19 | 0 | – | – | 66 | – | – | –
MM6 | 32 | 0 | – | – | 116 | – | – | –
On1 | – | – | 72 | 807 | 336 | – | – | –
On2 | 0 | 0 | – | – | 5 | – | – | –
On3 | – | 41 | 101 | – | 266 | – | – | –
On4 | – | 0 | 334 | – | – | 272 | 1165 | 643
On5 | – | 2 | 283 | – | 864 | – | – | –
On6 | – | 10 | – | – | 63 | – | – | –
Table 6 Summary
AP and EG grades indicate that the grade hasn't been finalised, i.e. the students have been given more time to complete, whether that is finishing a workplace placement or having more time to do assessment items. These grades were therefore not considered in the analysis.
Total time in the course appears to be a good indicator for WA grade recipients in courses with forums, although this conclusion may not hold up when looking at time spent online on a weekly basis. There was a difference of around 200 online minutes between a WA and a pass or above.
For mixed mode/blended delivery courses, the WA grade is a good predictor of a student's overall outcome/grade, and the same is true for fully online courses.
Students who received an F grade had spent a considerable amount of time in the course. The difference between an F grade and a WA grade is that students who receive an F grade have participated and engaged in the course up to and including the final assessment, whereas with a WA grade the student has completed some activities or assessment items but not all of them.
For grades of UP and above, time online is not a consistent predictor of overall grade, but as shown in Table 6 there are only three of fifteen cases, meaning it is not a good predictor.
Table 7 – Unsuccessful (WW, WA, F) compared to successful (UP, P, CR, D) student minutes online
Course | Unsuccessful | Successful
F1 | 329 | 1268
F2 | 132 | 223
F3 | 321 | 311
MM3 | 16 | 79
On1 | 879 | 336
On3 | 142 | 266
On4 | 334 | 1830
On5 | 285 | 864
On6 | 10 | 63
Table 7 Summary
Courses that had either no unsuccessful participants, or participants who had yet to complete, were excluded from this table. In all cases except F3, students who were successful spent more time online than students who were unsuccessful. The F3 data contained students who received a Fail grade, which means they attempted all assessment items but were not yet competent in at least one of them.
7.1 Comparing 4 week data to completion
Table 8 – F1 – Course with forums: average hits in values at 4 weeks and at course completion (94% completion rate)
Dimension | WA | P | CR | D
Participation 4 weeks | 17 | 11 | 12 | 24
Participation completion | 30 | 106 | 122 | 122
Content 4 weeks | 6 | 13 | 7 | 16
Content on completion | 9 | 39 | 41 | 51
Engagement 4 weeks | 26 | 28 | 14 | 39
Engagement on completion | 61 | 79 | 153 | 132
Assess 4 weeks | 0 | 0 | 0 | 0
Assess on completion | 0 | 4.8 | 3 | 2.5
Brief course content overview
This course is delivered in a blended delivery mode; the online component is primarily used as a repository (resources only, no activities), with the only engagement activity being the forums. The forums are designed more for giving information than as a portal for discussion. Two of the forums were attached to assessment (which was not compulsory). Of the 19 participants, 16 participated in the summative assessment forums. The only student who completed all nine forums did receive a Distinction grade; however, other students who received Distinction grades participated in seven or fewer forums.
Only three people completed the course satisfaction survey. Two were "happy about most things", with one "undecided about most things".
Summary Table 8
The average number of hits in the first 4 weeks is not a predictor of successful completion. Withdrawal with attendance (WA) students were not less active in this timeframe than Pass (P) or Credit (CR) students, so it should be investigated what happens between 4 weeks and the end of the course for them to abandon the learning process.
However, not surprisingly, students who passed the course with Distinction were the most active during the 4 week participation timeframe. For participation at completion, students who obtained Distinctions were just as active as Credit students, while, unsurprisingly, Pass students recorded a lower average number of hits. For content activity in the first 4 weeks, there was no major difference between withdrawal with attendance (WA) students and Credit (CR) students, while Pass (P) and Distinction (D) students recorded almost double the number of hits. This could indicate that student engagement in the first four weeks is more important for content than for participation. Interestingly, engagement at 4 weeks was higher for WA students than for CR students, who recorded 26 and 14 hits respectively. On completion, Credit (CR) students had the highest number of engagement hits, while Distinction students came in second, ahead of the Pass (P) students.
Table 9 – F2 – Course with forums: average hits in values at 4 weeks and at course completion (56% completion rate)
Dimension | WW | WA | UP
Participation 4 weeks | 0 | 12 | 33
Participation completion | 0 | 19 | 46
Content 4 weeks | 0 | 11 | 17
Content on completion | 0 | 16 | 19
Engagement 4 weeks | 0 | 21 | 30
Engagement on completion | 0 | 27 | 44
Assess 4 weeks | 0 | 0.5 | 3
Assess on completion | 0 | 1.2 | 3.6
Brief course content overview
This course was delivered in a blended delivery format. The online component was interactive, with content, links, forums, quizzes and resources. The forum activity is mainly students providing information and the teacher commenting; the forum instructions do not require or encourage student-to-student discussion.
Table 9 Summary
The data presented in Table 9 is based on a course with a 56% completion rate. Not surprisingly, withdrawal without attendance (WW) students did not record any activity. Withdrawal with attendance (WA) students were less active across all four dimensions than ungraded pass (UP) students. The reasons for the higher activity of ungraded pass (UP) students as opposed to withdrawal with attendance (WA) students should be investigated.
Table 10 – F3 – Course with forums: average hits in values at 4 weeks and at course completion (48% completion rate)
Dimension | WW | WA | F | UP
Participation 4 weeks | 0 | 9 | 2 | 13
Participation completion | 0.8 | 19 | 27 | 48
Content 4 weeks | 0 | 4 | 1 | 7
Content on completion | 0.1 | 10 | 26 | 16
Engagement 4 weeks | 0 | 6 | 8 | 26
Engagement on completion | 2 | 28 | 55 | 86
Assess 4 weeks | 0 | 0 | 0 | 0
Assess on completion | 0 | 0 | 0 | 0
Brief course content overview
This course was delivered in a blended delivery format. The online component is interactive, with links, forums, quizzes and resources. The forum activities are set up in a journal format, which did encourage some student-to-student interaction.
Summary Table 10
Compared with F2 (56% completion), the difference in activity between 4 weeks and completion for UP students is larger. Interestingly, Fail (F) students seem to be less active in the participation dimension at 4 weeks than the withdrawal with attendance (WA) students, which could point to explanations for withdrawing other than course content or failure to engage with the material.
Table 11 – MM1 – Mixed mode delivery course: average hits in values at 4 weeks and at course completion (96% completion rate)
Dimension | WA | UP
Participation 4 weeks | 0 | 15
Participation completion | 0 | 71
Content 4 weeks | 0 | 11
Content on completion | 0 | 39
Engagement 4 weeks | 0 | 1
Engagement on completion | 0 | 8
Assess 4 weeks | 0 | 0
Assess on completion | 0 | 4
Brief course content overview
This course was delivered in a blended delivery format, and the online component was set up as a repository with one quiz online. One person who passed spent less than five minutes in the online component of the course.
Summary Table 11
For the courses delivered in mixed mode (MM1 – MM6), the number of average hits in the participation, content and engagement dimensions was not predictive of higher course completion rates when comparing courses with high and low completion rates. This trend is already apparent when comparing the three courses with the highest completion rates, MM2 (100%), MM4 (100%) and MM1 (96%). MM1 had higher hit numbers across all dimensions (participation, content and engagement) than MM2 but a lower overall completion rate (96% vs. 100%). While this difference may be due to the inclusion of withdrawal with attendance (WA) students in the MM1 course but not the MM2 course, it still makes for an interesting result.
Table 12 – MM2 – Mixed mode delivery course: average hits in values at 4 weeks and at course completion (100% completion rate)
Dimension | UP
Participation 4 weeks | 7
Participation completion | 42
Content 4 weeks | 8
Content on completion | 22
Engagement 4 weeks | 0
Engagement on completion | 0.5
Assess 4 weeks | 0
Assess on completion | 4
Brief course content overview
This course was delivered in a blended delivery mode. The online component contains a range of resources; the online assessment is a quiz. Two people who successfully completed the course did not attempt the online quiz.
Table 13 – MM3 – Mixed mode delivery course: average hits in values at 4 weeks and at course completion (26% completion rate)
Dimension | AP | WW | WA | UP
Participation 4 weeks | 29 | 0 | 7 | 34
Participation completion | 29 | 0 | 7 | 34
Content 4 weeks | 21 | 0 | 5 | 15
Content on completion | 14 | 0 | 5 | 15
Engagement 4 weeks | 3 | 0 | 0 | 9
Engagement on completion | 3 | 0 | 0 | 10
Assess 4 weeks | 1 | 0 | 0 | 3
Assess on completion | 1 | 0 | 0 | 3
Brief course content overview
This course is delivered in a blended mode. The online course is well set out, with resources that support the face-to-face delivery, and the online resources were discussed in class. The only online assessment is a quiz; not all students who successfully completed the course participated in it.
Table 13 Summary
As for the mixed mode delivery courses with low course completion rates of under 30% (MM3, MM5, MM6), all three courses were marked by student withdrawals with or without attendance (WA and WW) at different stages, and by extensions granted (EG). The granting of extensions overall seems to be an indicator of potentially low course completion rates in this delivery mode. Overall, average hits across all dimensions were significantly lower than for the high completion rate courses (MM1, MM2, MM4), mainly in the content, engagement and assessment categories.
Table 14 – MM4 – Mixed mode delivery course: average hits in values at 4 weeks and at course completion (100% completion rate)
Dimension | UP
Participation 4 weeks | 12
Participation completion | 43
Content 4 weeks | 4
Content on completion | 17
Engagement 4 weeks | 1
Engagement on completion | 2
Assess 4 weeks | 0
Assess on completion | 3
Brief course content overview
This course was delivered in a blended delivery model. It had a limited amount of resources online, but the teacher posted a summary after each session. The course contained four assignment drop boxes that were used well by the students. There was only one forum, and only one student posted in it.
Table 15 – MM5 – Mixed mode delivery course: average hits in values at 4 weeks and at course completion (20% completion rate)
Dimension | EG | WW | UP
Participation 4 weeks | 0 | 0 | 0
Participation completion | 7 | 1 | 20
Content 4 weeks | 0 | 0 | 0
Content on completion | 7 | 0 | 10
Engagement 4 weeks | 0 | 0 | 0
Engagement on completion | 0 | 1 | 0
Assess 4 weeks | 0 | 1 | 0
Assess on completion | 0 | 1 | 1
Brief course content overview
This course is delivered in a blended mode. The online content is well set out, with very clear instructions. The course is a mix of CIT-developed content and links to other resources. There is an assignment drop box online, but only 9 participants used it, and all of them passed.
Table 16 – MM6 – Mixed mode delivery course: average hits in values at 4 weeks and at course completion (15% completion rate)
Dimension | EG | WW | UP
Participation 4 weeks | 0 | 1 | 1
Participation completion | 12 | 1 | 22
Content 4 weeks | 0 | 0 | 0
Content on completion | 10 | 0 | 13
Engagement 4 weeks | 0 | 0 | 0
Engagement on completion | 0 | 0 | 0
Assess 4 weeks | 0 | 0 | 0
Assess on completion | 0 | 0 | 1
Brief course content overview
This course was delivered in a blended delivery mode. The online content is well set out, with very clear instructions. The course is a mix of CIT-developed resources and links to other resources. There is an assignment drop box that not all of the students who passed used.
Table 17 – On1 – Online delivery course: average hits in values at 4 weeks and at course completion (74% completion rate)
Dimension | F | WA | UP
Participation 4 weeks | 21 | 6 | 16
Participation completion | 118 | 13 | 62
Content 4 weeks | 12 | 4 | 9
Content on completion | 51 | 10 | 31
Engagement 4 weeks | 1 | 0 | 0
Engagement on completion | 1 | 0 | 1
Assess 4 weeks | 0 | 0 | 0
Assess on completion | 2 | 0 | 1
Brief course content overview
This course is fully online. It is well designed, with plenty of resources and online assessment. All students who successfully completed the course attempted the online assessment.
Table 18 – On2 – Online delivery course: average hits in values at 4 weeks and at course completion (49% completion rate)
Dimension | EG | WW | UP
Participation 4 weeks | 1 | 0 | 15
Participation completion | 1 | 0 | 15
Content 4 weeks | 1 | 0 | 15
Content on completion | 1 | 0 | 15
Engagement 4 weeks | – | – | –
Engagement on completion | – | – | –
Assess 4 weeks | – | – | –
Assess on completion | – | – | –
Brief course content overview
This course is fully online, primarily using third-party resources with CIT-developed assessment. All students who completed the course participated in the online assessment.
Table 18 Explanation
Data is missing from this course because the log file was too big to be processed on our learning management system.
Table 19 – On3 – Online delivery course: average hits in values at 4 weeks and at course completion (72% completion rate)
Dimension | WW | WA | UP
Participation 4 weeks | 6 | 18 | 17
Participation completion | 19 | 44 | 172
Content 4 weeks | 3 | 3 | 11
Content on completion | 10 | 38 | 50
Engagement 4 weeks | 1 | 6 | 4
Engagement on completion | 1 | 6 | 18
Assess 4 weeks | 0 | 1 | 1
Assess on completion | 1 | 4 | 27
Brief course content overview
The course was delivered 100% online. The course is interactive, with links, forums, non-assessable quizzes and a range of resources. The forum was not compulsory, and only 4 people participated in it. The non-assessable quizzes saw good activity, with 62 of the 94 people attempting them. All successful participants completed the compulsory assessment tasks. The four people who dropped out of the course spent less than 5 minutes online.
14 people completed the course evaluation; most comments were positive, with 8 saying that the teacher was responsive to their needs, one undecided, 2 disagreeing and 3 considering the question not applicable.
Table 20 – On4 – Online delivery course: average hits in values at 4 weeks and at course completion (76% completion rate)
Dimension | WW | WA | P | CR | D
Participation 4 weeks | 0 | 42 | 12 | 65 | 74
Participation completion | 0 | 85 | 76 | 243 | 230
Content 4 weeks | 0 | 35 | 8 | 41 | 38
Content on completion | 0 | 89 | 50 | 142 | 120
Engagement 4 weeks | 0 | 1 | 0 | 11 | 7
Engagement on completion | 0 | 4 | 2 | 31 | 10
Assess 4 weeks | 0 | 0 | 1 | 1 | 2
Assess on completion | 0 | 0 | 5 | 5 | 6
Brief course content overview
This fully online course is interactive, with links, forums, voice-overs and other resources. There are 15 forums in the course, but only one student posted, in one forum; there was no assessment attached to the forums. The course delivery included weekly synchronous sessions in the online learning environment (an Adobe Connect room). The live classroom sessions were well attended.
Table 21 – On5 – Online delivery course: average hits in values at 4 weeks and at course completion (55% completion rate)
Dimension | WW | WA | UP
Participation 4 weeks | 1 | 21 | 91
Participation completion | 2 | 42 | 231
Content 4 weeks | 1 | 102 | 102
Content on completion | 1 | 155 | 155
Engagement 4 weeks | 0 | 58 | 58
Engagement on completion | 0 | 130 | 130
Assess 4 weeks | 0 | 1 | 1
Assess on completion | 0 | 6 | 6
Brief course content overview
This course is fully online. The course is well set out, with very clear instructions. The course contains resources, forums, links, assignment submissions and a theory exam. The teacher spent 1694 minutes online.
Table 22 – On6 – Online delivery course: average hits in values at 4 weeks and at course completion (38% completion rate; course logs were too big to download)
Dimension | WW | UP
Participation 4 weeks | 1 | 11
Participation completion | 7 | 55
Content 4 weeks | 0 | 0
Content on completion | 0 | 0
Engagement 4 weeks | 0 | 0
Engagement on completion | – | –
Assess 4 weeks | – | –
Assess on completion | – | –
Brief course content overview
The course is fully online. The online component is very brief and gives links to external resources that students need to complete; students then submit the resulting certificate through an assignment drop box in the course.
Tables 17 – 22 Summary
For the online delivered courses (On1-On6), the three courses with high completion rates (On1, On3 and On4) were marked by large increases in participation activity between the 4 week and completion stages, with smaller increases in the content and engagement dimensions.
For the courses with low completion rates (On2, On5 and On6), the content and engagement categories were almost non-existent for On2 and On6, while On5 had very high average click activity across the participation, content and engagement dimensions at both the 4 week and completion stages. Interestingly, however, these high averages did not translate into higher activity at the assessment stage.
When comparing high completion rate courses (forum, mixed mode and online) on one side of the spectrum with low completion rate courses (forum, mixed mode and online) on the other, there is no definite trend visible at first sight between activity and subsequent completion rates.
For all course types analysed (forum, mixed mode and online), participation at 4 weeks seems to be a poor indicator of activity at completion.
Hence, providing educators with information at two stages (4 weeks and completion) may be insufficient for them to monitor and guide their students effectively. As there appear to be mechanisms operating between 4 weeks and the completion stage that drive subsequent course completion rates, the information provided to educators by learning analytics tools needs to be detailed enough (in terms of time and individual student information) to provide insight for evidence-based interventions.
7.2 Teacher interviews
For the teacher interviews, a semi-structured qualitative ethnographic approach (e.g. Drever, 1995) was taken to better understand educators' assessments of different learning analytics (LA) tools and their usefulness in helping to understand student data. The student data emanates from the same data set used for the analyses conducted above. The educators were selected based on their experience in teaching the courses that were included in the data set. After contacting the teachers by phone and email, in-person appointments in their work environment (i.e. their office, outside teaching time) were made to demonstrate the selected LA tools (Excel, SNAPP and GISMO) and to record their feedback. Following the interviewing guidelines developed previously, the educators' feedback was audio-recorded to allow for subsequent transcription into Word and data analysis.
Process
The overall research process is outlined below:
• A brief project explanation and a description of the role of the teacher interviews in the research project was provided.
• The teachers were asked to look back at the course and to give general comments about the student cohort.
• Teachers were shown completion results and asked to comment.
• Teachers were asked how they usually monitor their students.
• Teachers were asked if the student results were a surprise.
• Teachers who had been selected because of their use of forums in their courses were asked specifically about the forums and their relation to student success, and were shown the SNAPP tool in action.
Example of SNAPP forum participation visualisation
• Teachers were then shown Excel graphs and asked questions about their relevance and about the teachers' Excel skills. See the examples below.
Excel spreadsheet: completion grades at four weeks
Excel spreadsheet: grades at completion
• Teachers were shown GISMO in action on their courses. See the examples below.
GISMO overview of students' use of resources
Key: the darker the pink, the more times the resource has been clicked on
GISMO forum overview
Key: red – learner posted; grey – learner read a post
• Teachers were asked if these tools added to their current knowledge of how students were going; did they add value?
• Teachers were asked if they had any other comments that would help with the research.
• Finally, teachers were asked which tool they would be more likely to use.
Interview data
The interview data was transcribed in Word format, imported and coded into NVivo
(Richards 2005) software for qualitative text analysis. The resulting nodes were
grouped into thematic areas and linked using the software.
General comments:
The first group of comments related to general perceptions about the student group
and the course in general. Comments varied and included topics such as Language,
Literacy and Numeracy (LLN), lack of students’ workplace experience, the
importance of core competencies in industry placements, program characteristics,
eLearn (CIT LMS) access issues and assessment of different delivery modes.
The main two comment categories focused on (1) students’ competencies
“Couple of students that aren't improving have LLN issues. Different options
were suggested and offered [but] not taken up.” (Respondent 1)
and (2) program requirements:
“This program is very new. Half the group have scholarships, and sponsored
students take longer to finish. That accounts for the EGs.” (Respondent 2).
The complete transcripts were used to generate word frequency clouds, which can be used to interpret and reflect the importance of topics on the educators' minds. While frequency and importance of words are not the same concept, visualising word clouds in a semi-structured interview context can assist with furthering the understanding of interviewees' assessment of a topic (see e.g. Ramsden and Bate, 2008). As can be seen in the word cloud below, the main concepts and concerns for teachers relate to student-centred topics in relation to learning analytics (LA). While this result seems intuitive, it is also revealing, in that their focus is not mainly on their own opinion of these types of tools.
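The frequency counts underlying such a cloud are straightforward to reproduce; the sketch below is illustrative only, and the transcript text is a placeholder.

# Illustrative sketch: word frequency counts of interview transcripts,
# the raw material behind a word cloud. Transcript text is a placeholder.
transcript <- "students engage online students resources value students online"
words <- unlist(strsplit(tolower(transcript), "\\s+"))
sort(table(words), decreasing = TRUE)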
Word cloud teacher general comments (generated with abcya.com)
Student underperformance and monitoring:
The second thematic category related to the issue of student performance and
monitoring. Teachers’ feedback mechanisms included written and oral feedback, an
open door policy, Q&As and generally keeping open communication channels.
Reasons for under-performance that were given included personal issues:
"WA's disengaged pretty quickly and dropped out of whole qualifications due to personal issue. Another student was a surprise, he chose to withdraw as it wasn't the career for him" (Respondent 3).
"WA - disappointed he didn't complete but had family issues. WW withdrew from very beginning. EG's were disappointing, as they were slack. All of the students mature and employed" (Respondent 4).
Additionally, reasons mentioned for under-performance included students with
disabilities and potential issues with non-native speakers of English:
“lots of people with English as second language or English as 4th language or
3rd language of some of the students” (Respondent 5).
As expected, the range of responses for under-performance and monitoring success
and failure is very wide. This can also be seen from the word frequency clouds.
Students and teachers were the most frequently mentioned concepts while
resources, online, value and evidence were also commonly included.
Word cloud Student under-performance and monitoring (generated with
abcya.com)
Expectations
Most teachers interviewed were not surprised by the data, with one exception, who indicated that "…it did surprise me - how to interpret the data, physically on site should indicate a better grade but might be that they are having difficulty interpreting the information" (Respondent 5).
Most other respondents indicated that they knew what to expect from individual
students and student groups:
“ There isn't much online, won't expect them to be online more than ½ hour.
Not really surprised as it is one of the last units, so they might not get to it”
(Respondent 6).
Forums and student success
When probed on whether the forums had contributed to student success, the evaluation was generally positive, on the condition that students engaged in the activities offered online:
“When the student engage and there is discourse they work quite well.
Students used to Facebook so they are familiar to them. Works for teacher as a
delivery system” (Respondent 7)
Student participation and unexpected results
While educators generally teach students with a wide range of abilities and motivations, some teachers expressed surprise at student results given the students' participation rates. One of the teachers mentioned that mere activity does not equate with better results:
“[I was surprised only at]... The fail as he did spend a lot of time on the site.”
(Respondent 8)
The issue was raised that sometimes students were unaware of the participation
requirements online as
“sometimes students don’t know they are enrolled so would contact them to
check and get them to engage” (Respondent 9).
Generally, the 4 week data was considered a useful indicator to prevent potential
surprises. As an educator stated,
“ the 4 week data would help focus on which students might need a catch up,
spending either too much or too little time” (Respondent 10).
Word cloud Student participation and unexpected results (generated with
abcya.com)
Value of additional data points
When probed on whether, in addition to overall results, data at the 4 week stage would help teachers prepare adequate intervention strategies to improve completion rates, most interviewed teachers indicated interest.
One of the main advantages mentioned was that
“…it would be useful…it would prompt communication” (Respondent 11)
as well as promoting teacher engagement:
“…it would allow me to send message to students and enquire how they are
going to get more engagement” (Respondent 12)
It was highlighted by several teachers that more data and information could be specifically useful in assisting international students with course completion and success.
However, opinions also included caution: even with more data, teachers may not be able to understand the underlying causes of engagement and disengagement, which may include confidence issues and computer access.
Word cloud Value of additional data points (generated with abcya.com)
Learning Analytics usefulness and skills: Excel
All of the interviewees agreed that Excel pivot tables would be a useful tool for reviewing statistics and data on their students' online participation.
However, opinions were also voiced that the aim of an e-learning course is not to
collect LA statistics and that “students already receive emails from the coordinator”
(Respondent 13) in this regard.
Some respondents considered their skills sufficient to analyse and interpret data in Excel accurately, while many others did not currently have the abilities needed to work with it without significant help.
It was suggested though by several participants that while their “Excel skills are ok, a
template would be better” (Respondent 14).
Hence, while Excel seemed to perform reasonably well as a LA tool in this context, it
is important to emphasise that even with a commonly used application like Excel,
assistance in the form of a user-friendly template would have to be provided.
All of them, however, were prepared to acquire the necessary skills to be able to use a potential Excel LA template.
Learning Analytics usefulness and skills: GISMO
While feedback on Excel was overall positive, all interviewees expressed significant
interest in the information on student participation available in GISMO.
Reasons for this interest included the user-friendly interface, fewer clicks needed and
better visualisation than data logs.
Information on how students use online resources proved especially popular, with one teacher stating that they "want it now… [It] gives a feel how they [i.e. students] are using each resources, when they are looking at the more complex resources. Really good to see what they are using! It would be one of the best evaluation tools for the unit" (Respondent 15).
Another respondent voiced similar support, stating that
“Information would be handy especially when talking to students about
performance. Can see what students look at, would guide how much effort put
into certain resources especially looking at the hits on video. Attendance at
class and looking at how they are going online.” (Respondent 16).
Consequently, all respondents were interested in learning how to use GISMO. This test bore out what was expected, i.e. that user-friendly interfaces and data visualisation attract more support from teachers than raw data logs.
Word cloud Learning Analytics usefulness and skills: GISMO (generated with
abcya.com)
Learning Analytics usefulness and skills: SNAPP
In contrast to Excel and GISMO, SNAPP was generally deemed to be "too slow" (Respondent 17), and hence interest among participants was limited.
Perceived added value of Learning Analytics (LA) tools
Learning Analytics (LA) tools overall were evaluated positively by the interviewees.
Positive aspects mentioned included bridging the “remoteness to teaching students
online” (Respondent 18) and detecting struggle earlier.
Additionally, the tools were considered to be a good verification tool in case "someone blames the teacher" (Respondent 19).
They were also deemed to be a great tool for "assessments and knowledge" (Respondent 20), but only if incorporated "in the first four weeks" (Respondent 21).
Other respondents indicated that it would alter their teaching style as it
“would change how I interact as it gives evidence to what they are doing online.
Gives more confidence to the teacher to recommend resources for the
students” (Respondent 22)
Perceived value of technology as learner engagement tool
All respondents expressed positive opinions on the role of e-learning technology as a
means to enhance engagement with learners.
Positive aspects mentioned included the potential for more interesting and fun learning material and the fact that it "works well for students" (Respondent 23), as they "come already with their own technology" (Respondent 24).
Additionally, it was suggested that it actually increases student participation as “they
want to do the course via technology as they don't want to do face-to-face.”
(Respondent 25).
However, concerns were raised with regard to "students' digital literacy and network connection issues" (Respondent 26).
Word cloud Perceived value of technology as learner engagement tool
(generated with abcya.com)
Final comments and preferred LA tool:
Participants were encouraged to give final comments, providing the opportunity to mention aspects that had not been discussed during the interview.
Suggestions to improve e-learning experiences for both students and teachers included more feedback sessions during the semester, using technology to send reminders, improving students' digital literacy, and integration with other internal student engagement platforms such as Moodle.
Overall, teachers deemed students to be tech-savvy enough to be able to cope with
increasing technology integration.
Among the three LA tools presented (Excel, GISMO and SNAPP), GISMO was
deemed to be the most useful.
Word cloud Final comments and preferred LA tool (generated with abcya.com)
8 Discussion
8.1 LMS usage and academic performance
Numerous studies (Campbell, 2007; Macfadyen & Dawson, 2010; Morris, Finnegan & Wu, 2005; Rafaeli & Ravid, 2000), most of which focus on higher education, have suggested that there is a relationship between LMS usage and academic performance. This project has not tried to replicate these causal/correlational studies but presents descriptive data and focuses on how LMS usage data can be represented for use by teaching staff. Therefore, in the case of hypothesis one - that students with high participation rates in either online or blended delivery courses will have better learning outcomes, as evidenced by higher grades and a smaller likelihood of drop-out - the null hypothesis cannot be rejected.
There is some risk that the results of earlier research are not applicable to the VET environment. For example, assessment in VET is generally criterion based rather than norm referenced. Also, VET academic outcomes are usually dichotomous (competent/not yet competent).
The competency based approach used in the VET sector is closely allied to Bloom's
'Mastery Learning' (Bloom, 1968), an educational method in which each student
continues with their learning activities until they are deemed to have mastered them.
In this approach, where the aim is for all students to become competent, some
students may take longer than others to complete.
For example, at the time the research data was initially collected, courses MM3,
MM5 and MM6 had a total of 26 grades outstanding (AP/EG). Nearly half of these
have since been updated to a pass as the work placement or the assignments have
now been successfully completed.
In the absence of scores, a more useful measure for VET could be time to achieve
competence.
8.2 Excel
Moodle allows logs to be exported into Excel format. The log reports were used
because they contain all reportable actions on the LMS, but they needed to be
cleaned to be useful (Le, 2002). Buckingham Shum & Deakin Crick (2012) coin the
phrase 'data exhaust': "this data is a by-product of learner activity left in the online
platform as learners engage with each other and learning resources".
For this project it was decided to use the action column of the course log data as the
primary source of the action the student performed, e.g.:
course view (http://elearn.cit.edu.au/course/view.php?id=96333) (URL not public)
url view (http://elearn.cit.edu.au/mod/url/view.php?id=627424) (URL not public)
resource view (http://elearn.cit.edu.au/mod/resource/view.php?id=627419)
(URL not public)
In Excel the Moodle log data was cleaned by trimming this column to ignore
everything in brackets; for the examples above the clean data is:
course view
url view
resource view
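The same trimming step can also be scripted. The following is a minimal sketch in R
(the language discussed in section 8.3), assuming the exported log has been saved
as a CSV file with an Action column; the file and column names are illustrative rather
than the project's actual ones.

    # Minimal sketch: trim the bracketed URL from exported Moodle log actions.
    # "moodle_log.csv" and the "Action" column name are assumptions.
    log <- read.csv("moodle_log.csv", stringsAsFactors = FALSE)

    # Drop everything from the first opening bracket onwards, then trim
    # trailing whitespace, leaving e.g. "course view" or "resource view".
    log$CleanAction <- trimws(sub("\\(.*$", "", log$Action))

    head(log$CleanAction)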
A pivot table in Excel is a highly flexible contingency table created from a dataset
(Dierenfeld & Merceron, 2012). The cleaned log data was used to construct the data
sets that were analysed (see Table 1, classifications of log data into values, above).
Example of Excel Pivot Tables: On the left is the original data source, in the
centre is the data set value count, and on the right is the pivot table.
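The equivalent contingency table can also be produced in R with the base table()
function; this is a sketch under the same assumptions as the cleaning example
above (a data frame named log with illustrative User and CleanAction columns).

    # Cross-tabulate cleaned actions per student: the same contingency
    # table the Excel pivot table described above produces.
    pivot <- table(log$User, log$CleanAction)
    pivot

    # Row totals give each student's overall activity count.
    rowSums(pivot)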
In interviews the teachers consistently expressed a preference for visual
representations of the data, but the majority do not currently have the Excel skills to
produce them. As an outcome of the research, CIT is considering using an Access
database to help them.
Mock-up of Access database
Mock-up of Access graph output
8.3 R
The second tool used to show teachers the data exported from Moodle was the
statistical software package R. This open source software can be customized and
adapted to several purposes, including statistical analysis and data mining. Although
the tool is very versatile, it is driven by scripts written in the R language and hence
does not allow for intuitive use.
While R can be used for analytical purposes such as the analysis of Twitter hashtags
and tweet content (e.g. Walker, 2012), it must be emphasised that this use, like many
other analytical options, requires a "package" that enables the particular type of
analysis. As mentioned above, scripting and developing a package for learning
analytics purposes is not an easy task and requires above-average technical
computer skills.
When it came to testing R as a learning analytics tool for teachers to review their
students' activities, problems were encountered that prevented the software from
being a useful alternative to Excel and other learning analytics systems.
R needs to be downloaded and installed on each research participant's workstation
(as opposed to Excel, which is already installed as part of CIT's standard operating
environment), and no installation file is available in the format most people are used
to from commercial packages, so the starting screen for the installation can be
enough to discourage potential users. While there is an online handbook that can be
downloaded for free, many teachers may not be comfortable referring to a 500-page
document. In addition, information has to be gathered from several sections of the
handbook to cover all required functions. Consequently, a considerable time
investment is required to become comfortable with the system.
The R interface is designed to provide advanced features for analysis but does not
aim to be particularly visually appealing. To partially remedy this issue, RStudio (an
open source development environment for R) was installed, as it provides a more
intuitive interface for conducting data analysis.
Problems with both the R and RStudio interfaces began in the installation phase.
Most users (Windows, Mac and Linux) can install R from binaries, which must be
obtained from a Comprehensive R Archive Network (CRAN) mirror. Most users will
also want to install R packages beyond those included in the initial installation, and
an Internet connection is needed because the program connects to a package
repository on the user's command.
Different functions and programs in R have to be downloaded separately; when the
user attempts to use a function, the software usually warns that the required package
is not installed. If the user wishes to proceed, the software will attempt to download
and install all the packages necessary to perform that action.
In our case, however, due to institutional arrangements and ICT policy, software
access to external data repositories is severely restricted. Hence, we had to
manually download all the packages required for a specific data transformation
function and install them from a local folder. As R does not immediately list all the
packages needed for a specific function, users have to go back and install the
missing packages one by one, which takes a substantial amount of time and was
impractical for this study. Consequently, to allow for a smooth flow of the interviews
with participants, it was decided not to confront the staff with this exercise.
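For illustration, the manual workaround looks like the following in R. The package
name, version and path are assumptions, but the install.packages() arguments
shown are the standard way to install from a local file rather than a repository.

    # Install a package (and, one by one, each of its dependencies) from a
    # file downloaded outside the restricted network. The path and file
    # name are illustrative.
    install.packages("C:/downloads/ggplot2_0.9.3.zip",
                     repos = NULL, type = "win.binary")

    # With normal repository access this is a single call that resolves
    # dependencies automatically:
    # install.packages("ggplot2")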
8.4 SNAPP
The general idea of presenting student engagement and activity in a readily
accessible visual format is a good one; however, the project was not able to get
SNAPP to produce results that were consistent with the analysis of the raw data. The
teachers' opinions were favourable in as much as they could see the potential, but
they perceived the tool as being in an early stage of development. The value of
SNAPP is limited in our environment: the pedagogical assessment of the forums
shows that they are not being used for discussion, so the social relationships
between participants are not forming, and SNAPP's strength is reporting on social
networks.
8.5 GISMO
The visual representation of student activity is valued by teachers as it gives them
quick access not only to how the students are going but also to which resources they
are using. One teacher noted that the videos he spends large amounts of time
preparing are not visited by the students after the first two or three in the course, thus
supporting project hypothesis five, that generated learning analytics will be perceived
as a helpful tool for teachers to identify which types of activities engage students.
When comparing SNAPP's visual format with the GISMO forum overview, teachers
found GISMO more useful and informative. Of special note was the timeline slider,
which let teachers look back at when students were more active. One issue is the
version of the LMS being used: some of GISMO's functionality was not available for
Moodle 2.6 at the time of testing.
8.6 Correlational Analysis
The difficulty presented by the data sets is that the group sizes were too small for
statistical correlation to be a valuable measurement. Group/class sizes ranged from
16 to 202 participants; these numbers are small compared to massive open online
courses (MOOCs) or some higher education class sizes.
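For completeness, the kind of test that larger cohorts would support is sketched
below; with classes of 16 the resulting confidence intervals are too wide to be
informative, which is the limitation described above. The vector names are
assumptions (per-student totals produced elsewhere, e.g. in the dwell-time analysis
of section 8.8).

    # Illustrative only: rank correlation between time online and a numeric
    # grade score. "dwell" and "grade_score" are assumed per-student vectors.
    cor.test(dwell, grade_score, method = "spearman")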
8.7 Hits/Clicks
The number of hits/clicks may differ between LMSs because of the navigation
structure of each LMS (Beer et al., 2010). This could also be an issue with course
design, as poorly designed courses will register fewer hits or clicks. Therefore, our
analysis was course specific when investigating hits as a measure; there were no
comparisons across courses with the raw data.
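Keeping the tally course specific is straightforward once the log has been cleaned; a
minimal sketch under the same illustrative names as in section 8.2:

    # Count hits/clicks per student within a single course's log, so that
    # comparisons stay within one course design. "course_log" and "User"
    # are illustrative names.
    hits <- sort(table(course_log$User), decreasing = TRUE)
    head(hits)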
8.8 Dwell time
Log files were filtered by duration: only activities taking longer than one minute were
included in the data set. This is similar to Whitmer (2012, p.55), who "excluded dwell
time of less than five seconds". Dwell time, or time online in the course, does seem
to relate to the type of grade students receive. Students with Withdraw with
Attendance grades (who have attempted at least one assessment item but have not
attempted the final piece) in some cases spent more time online than students who
passed. This might indicate that these students were struggling to make sense of the
information. When teachers were presented with the four-week data as opposed to
the completion data, they indicated that they would contact all students to see if they
needed additional support, but might change the content of the contact depending on
whether the student was yet to get online or had spent excessive amounts of time
engaging.
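As a sketch of the filtering step: Moodle logs record event timestamps rather than
durations, so a per-event dwell time has to be derived first. Deriving it as the gap to
the same user's next event is an assumption here, as is every column name; the
report does not specify the project's exact method.

    # Derive per-event dwell time as the gap (in seconds) to the user's
    # next event. Assumes "Time" is parsed as POSIXct; names illustrative.
    log <- log[order(log$User, log$Time), ]
    log$Duration <- ave(as.numeric(log$Time), log$User,
                        FUN = function(t) c(diff(t), NA))

    # Keep only activities longer than one minute, as in this study.
    filtered <- log[!is.na(log$Duration) & log$Duration > 60, ]

    # Total time online per student, for comparison against final grades.
    dwell <- tapply(filtered$Duration, filtered$User, sum)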
Lodge and Lewis (2012, p.3) comment: "although more time on the learning
management system may correlate with higher grades, this may reflect a strategic
rather than a deep, lasting engagement with the content and body of knowledge, and
that is reliant on the contested idea that learning can be categorised cleanly into one
of three categories: deep, surface or strategic." This research showed that not all
students interact with information in the same way: some students go back to the
same information repeatedly while others take only a brief look, yet both complete
successfully. "The amount of time spent on the LMS is not an indication of deep or
surface learning" (Lodge & Lewis, 2012, p.3).
8.9 Virtual Classroom Participation
Virtual classrooms are part of the virtual learning environment; they are used to
support learners by simulating a classroom or tutorial experience online. Six of the
fifteen teachers have access to the CIT Virtual Classroom environment (Adobe
Connect). One teacher actively uses the virtual classroom with students every week,
and the students who actively participated in these sessions successfully completed
the course. The teacher commented that once student technology skills were
established, the sessions added value to the course.
8.10 Forum rich courses
This research hypothesised (hypothesis four) that the number of views a student
makes on a discussion forum can be used as a reliable indicator of interest; this
differs from the number of postings, as some students may read the postings but
never create a post. This hypothesis could neither be confirmed nor rejected by this
research. However, when data on students with excessively high numbers of views
(in one case 592 views across 10 forums) was examined more closely in the teacher
interviews, potential explanations for this phenomenon included English as a
second language, as well as Language, Literacy and Numeracy (LLN) issues, in
conjunction with learning difficulties.
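Outliers like the 592-view case can be flagged directly from the cleaned log. A
minimal sketch follows, again with illustrative names; note that Moodle's actual log
labels distinguish several forum actions (e.g. viewing a forum versus a discussion),
so the single "forum view" label here is an assumption.

    # Count forum views per student and inspect the top of the
    # distribution for outliers. "CleanAction" and the "forum view"
    # label are illustrative.
    forum_views <- log[log$CleanAction == "forum view", ]
    views_per_student <- sort(table(forum_views$User), decreasing = TRUE)
    head(views_per_student)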
8.11 Fully Online courses
For fully online courses the first four weeks of data was included, and this information
was considered an unreliable predictor of participants' completions.
Of special note is that students who received a Fail grade spent more time in the
online course than other participants, whereas the Fail students in forum-rich courses
with blended delivery did not engage with the online component of the course.
Students who received a Withdraw with Attendance grade often spent more time in
the course than students who received Pass grades. This could indicate that these
students were struggling with the content.
8.12 Teachers
In the interviews conducted for this project, teachers expressed a conviction that they
have a more holistic view of students than that provided by LA and that the analytic
data only opens a window on a small component of the students' study activity, thus
concurring with hypothesis six.
The teacher interviews revealed that teachers with smaller class sizes have a good
understanding of who their students are and what they are capable of. The learning
analytics data presented to the teachers led them to conclude that it did not provide a
full picture of who the students are. This supports Yuan (no date), who states that
"learning is a complex social activity and all technologies, regardless of how
innovative or advanced they may be, remain unable to capture the full scope and
nuanced nature of learning".
LA data needs to be interpreted by a skilled analyst (Richards, 2011). Richards goes
on to say that "ideally, data mining enables the visualization of interesting data that in
turn sparks the investigation of apparent patterns", prompting teachers or students to
suggest improvements. Further research into tools that let students see at a glance,
or with a click, how they are performing (such as the Progress Bar; de Raadt, 2012)
would help to confirm or reject hypothesis seven, that analytics with real-time
visualisations for students and teachers will be perceived as most effective by both
groups. This research confirmed that teachers did value the real-time visualisations
of the analytics.
8.13 Students
Project hypothesis one (students with high participation rates in either online courses
or blended delivery courses will have better learning outcomes, as evidenced by
higher grades and a lower likelihood of drop-out) was not conclusively confirmed:
students who received a Withdrawal without Attendance (WW) grade spent less time
in their online course (Table 6). However, the hypothesis is not conclusive for
Withdrawal with Attendance (WA) grades, as participants in On4 (Table 6) who
received WA spent
more time online than students who received a Pass (P) grade. Hypothesis one is
also inconclusive for students who received a Fail (F) grade, as the two fails in this
study (in F3 and On1) spent more time in their online course than students who were
successful.
Project hypothesis two (students are more likely to engage with content, i.e. be more
active in their courses, when the activity involves interactive resources including
chat, discussions, social media and collaborative activities) was neither confirmed
nor denied, as the data confirmed it in some cases (F1, On4, On5) but not in others
(F3 or On1).
Project hypothesis three (students are more likely to engage with content if it is
assessed, i.e. be more active in the course when an activity involving interactive
resources such as chat, discussion, social media and collaborative activities forms
part of the assessment strategy) was not proven in this research.
This research did not look at LA from a student perspective, because data from a
previous semester was used to measure activity against results.
9 Conclusion
The original intention of this research was to determine whether using learning
analytics information could help improve learner retention and completion rates.
While some of the original hypotheses in this project could not be conclusively
proved or rejected, the teachers involved did express keen interest in using 'one
click' visualization software to monitor student progress and prompt them to make
student contact. Tools like GISMO and SNAPP, which are relatively simple to use
and do not require extensive configuration, overcome the significant technology skill
hurdle that other tools such as Excel and R can present for teachers.
This project has not found currently available tools that could add to VET teachers'
knowledge of their students' progress. This could be due to the small class sizes
(under 25 participants) and the teachers' accumulated knowledge (students often
work with the same teacher for multiple courses across whole qualifications), a fact
highlighted in the teacher interviews. Many teachers already look at LMS logs to
monitor online engagement prior to meeting with students.
Gathering LA data from LMS logs would be more effective if courses were designed
with this in mind; this is not a new discovery and has been discussed extensively in
the relevant literature. For their study "Analytics in Progress: Technology Use,
Student Characteristics and Student Achievement", Whitmer, Fernandes and Allen
(2012) had teachers team up "with instructional designers to redesign their courses
for increased student engagement". For example, a course would need to have
engagement exercises, individualised feedback or assessment-oriented content in
the first four weeks. Future research projects with purpose-designed courses and
visualization tools that are available to students and teachers are the next logical
step.
10 References
1st International Conference on Learning Analytics and Knowledge, Banff, Alberta,
February 27–March 1, 2011, https://tekri.athabascau.ca/analytics/
Ali, L., Hatala, M., Gašević, D., & Jovanović, J. (2012). A qualitative evaluation of
evolution of a learning analytics tool. Computers & Education, 58(1), 470-489.
Arnold, K.E. (2010). Signals: Applying Academic Analytics. Educause Quarterly,
33(1)
http://www.educause.edu/ero/article/signals-applying-academic-analytics (accessed
03-Apr-2014)
Arnold, K., & Pistilli, M. (2012) Course Signals at Purdue: Using Learning Analytics to
Increase Student Success, LAK'12, 29 April – 2 May 2012, Vancouver, BC, Canada
Bednarz, Alice, 2012, Lifting the lid on completion rates in the VET sector: how they
are defined and derived, National Centre for Vocational Education Research,
Adelaide
Berge, Z & Huang, Y (2004) A Model for Sustainable Student Retention: A Holistic
Perspective on the Student Dropout Problem with Special Attention to e-Learning.
DEOSNEWS, Volume 13 (5) (http://learningdesign.psu.edu/deos/deosnews13_5.pdf
accessed 18-Mar-2014)
Bailey, T., Calcagno, J.C., Jenkins, D., Kienzl, G., and Leinbach, T. (2005)
Community College Student Success: What institutional characteristics make a
difference? Community College Research Center Teachers College, Columbia
University
Baker, R.S.J.d., and Siemens, G. (2013) Educational Data Mining and Learning
Analytics.
http://www.columbia.edu/~rsb2162/BakerSiemensHandbook2013.pdf
Beer, C, Clark, K., & Jones, D. (2010). Indicators of engagement. In C.H. Steel, M.J.
Keppell, P. Gerbic & S. Housego (Eds.), Curriculum, technology & transformation for
an unknown future. Proceedings ascilite Sydney 2010 (pp. 75-86).
Bloom, B. (1968) Learning for Mastery. Instruction and Curriculum: RELCV Topical
Papers and Reprints No. 1. Reprinted from Evaluation Comments 1 (2), May 1967,
University of California at Los Angeles. Retrieved from
http://ruby.fgcu.edu/courses/ikohn/summer/PDFfiles/LearnMastery2.pdf Accessed
online 27 May 2014
Booth, M. (2012) Learning Analytics: The New Black, New Horizons (Technologies
Ahead) http://www.educause.edu/ero/article/learning-analytics-new-black
Bradbury, B., Redmond, G., Katz, I & Wong, M (2014) Intergenerational mobility: new
evidence from the Longitudinal Surveys of Australian Youth, NCVER
Brown, M. (2012) Learning Analytics: Moving from Concept to Practice, EDUCAUSE
Learning Initiative http://net.educause.edu/ir/library/pdf/ELIB1203.pdf
Buckingham Shum, S & Deakin Crick, R (2012). Learning dispositions and
transferable competencies: pedagogy, modelling and learning analytics. In: 2nd
International Conference on Learning Analytics & Knowledge. 29 Apr- 02 May 2012:
Vancouver, British Columbia, Canada
Campbell, J. (2007) Utilizing Student Data within the Course Management System to
Determine Undergraduate Student Academic Success: An Exploratory Study,
doctoral dissertation, Educational Studies, Purdue University, USA
COAG 2012, National Agreement for Skills and Workforce Development
(http://www.federalfinancialrelations.gov.au/content/npa/skills/skillsreform/national_agreement.pdf accessed 18-Mar-2014)
Dale, C and Lane, A. (2007) A Wolf in Sheep’s Clothing? An Analysis of Student
Engagement with Virtual Learning Environments, Journal of Hospitality, Leisure,
Sport & Tourism Education, Vol. 6, No 2 (p101 – 108)
Dawson, S., Bakharia, A., & Heathcote, E. (2010). SNAPP: Realising the affordances
of real-time SNA within networked learning environments. Paper presented at the
Networked Learning Conference 2010, Aalborg, Denmark
http://www.snappvis.org/?page_id=4
Dawson, S., McWilliam, E. & Tan, J.P.L (2008). Teaching smarter: How mining ICT
data can inform and improve learning and teaching practice. In Hello! Where are you
in the landscape of educational technology? Proceedings ascilite Melbourne 2008.
Department of Industry (2014) Vocational Education and Training Reform, Summary
of stakeholder feedback from Canberra workshops February 2014,
http://vetreform.industry.gov.au/node/300 accessed 18 March 2014
de Raadt, M. (2012) Salvetore: The Blog of Michael de Raadt.
http://salvetore.wordpress.com/2012/07/26/analytics-getting-helpful-information-out-of-an-lms/
de Raadt, M., & Dekeyser, S. (2009): A simple time-management tool for students'
online learning activities. Proceedings of the 26th Annual Ascilite International
Conference, Auckland, 6-9 December 2009. 194 - 199.
Diaz, V., & Gowler, S. (2012) Leadership and Learning Analytics, EDUCAUSE
Learning Initiative https://net.educause.edu/ir/library/pdf/ELIB1205.pdf
Dierenfeld, H., Merceron, A (2012) Learning Analytics with Excel Pivot Tables. 1st
Moodle Research Conference Proceedings (pp115 – 121).
Dietz-Uhler, B and Hurn, J (2013) Using Learning Analytics to Predict (and Improve)
Student Success: A Faculty Perspective, Journal of Interactive Online Learning,
Volume 12, Number 1, Spring 2013 [ONLINE]
http://www.ncolr.org/jiol/issues/pdf/12.1.2.pdf Accessed April 2014
Dimopoulos, I., Petropoulou, O., Boloudakis, M., and Retalis, S. (2012) Using
Learning Analytics in Moodle for assessing students' performance. 1st Moodle
Research Conference Proceedings (pp. 40-46).
Drachsler, H., & Greller, W. (2012, December). Confidence in Learning Analytics. In
LAK12: 2nd International Conference on Learning Analytics & Knowledge.
Drever, Eric. Using semi-structured interviews in small-scale research: a teacher's
guide. No. 129. Glasgow: Scottish Council for Research in Education, 1995.
Dyckhoff, A. L., Zielke, D., Bültmann, M., Chatti, M. A., & Schroeder, U. (2012).
Design and Implementation of a Learning Analytics Toolkit for Teachers. Journal of
Educational Technology & Society, 15(3).
EDUCAUSE (2011) 7 Things You Should Know about …. First-Generation Learning
Analytics. http://net.educause.edu/ir/library/pdf/ELI7079.pdf
Elias, T (2011) Learning Analytics: Definitions, Processes and Potential, Learning
Analytics
http://learninganalytics.net/LearningAnalyticsDefinitionsProcessesPotential.pdf
Feroleto, L., Mantione, J., & Sedor, M. (2007) Analytics in Institutional Research: The
Final Frontier, presented at Tableau Customer Conference
http://apb.buffalo.edu/data-reports/tableau-analytics.html
Ferguson, R. and Buckingham Shum, S. (2012). Social Learning Analytics: Five
Approaches. Proc. 2nd International Conference on Learning Analytics & Knowledge,
(29 Apr-2 May, Vancouver, BC). ACM Press: New York. Eprint:
http://oro.open.ac.uk/32910
Goodyear, P., & Retalis, S. (2010). Learning, technology and design. In P. Goodyear
& S. Retalis (Eds.), Technology-enhanced learning: design patterns and pattern
languages (Vol. 2, pp. 1-28). Rotterdam: Sense Publishers.
Govaerts, S., Verbert, K., Klerkx, J. & Duval, E. (2010) Visualizing Activities for
Self-reflection and Awareness,
https://lirias.kuleuven.be/bitstream/123456789/283362/1/icwlsten.pdf
Hoel, T. (no date) Will Analytics transform Education? A critical view on the data we
gather about the learners. Learning Frontiers
http://www.learningfrontiers.eu/?q=story/will-analytics-transform-education [accessed
20 March 2014]
Hubert, M., Michelotti, N. & McKay, T. (2013) E2Coach: Tailoring Support for
Students in Introductory STEM Courses, EDUCAUSEreview,
http://www.educause.edu/ero/article/e2coach-tailoring-support-students-introductory-stem-courses
Johnson, L., Adams Becker, S., Cummins, M., Estrada, V., Freeman, A., and
Ludgate, H. (2013). NMC Horizon Report: 2013 Higher Education Edition. Austin,
Texas: The New Media Consortium
Johnson, L., Adams Becker, S., Estrada, V., Freeman, A. (2014). NMC Horizon
Report: 2014 Higher Education Edition. Austin, Texas: The New Media Consortium
Jovanovic, J., Gasevic, D., Brooks, C., Devedzic, V. & Hatala, M. (2007)
LOCO-Analyst: A Tool for Raising Teachers' Awareness in Online Learning
Environments, EC-TEL 2007, LNCS 4753 (pp. 112-126)
http://www.sfu.ca/~mhatala/pubs/2007Jovanovic-et-al-EC-TEL07.pdf
Kraan, W & Sherlock, D. (2012) Analytics Tools and Infrastructure, Vol 1, No 11.
CETIS Analytics Series. Bolton: The University of Bolton.
Kraker, P., Leony, D., Reinhardt, W., & Beham, G. (2011). The case for an open
science in technology enhanced learning. International Journal of Technology
Enhanced Learning, 3(6), 643-654.
Kruse, A., & Pongsajapan, R. (2012) Student-Centered Learning Analytics, CNDLS
Thought Papers
https://cndls.georgetown.edu/m/documents/thoughtpaperkrusepongsajapan.pdf
Lodge, J & Lewis, M (2012) Pigeon pecks and mouse clicks: Putting the learning
back into learning analytics. Proceeding ascilite 2012, Wellington NZ Available at:
http://www.ascilite.org.au/conferences/wellington12/2012/images/custom/lodge,_jason_-_pigeon_pecks.pdf
[Accessed 20 December 2013]
Li, Y. (2012) Will Analytics transform Education? A critical view on the data we gather
about the learners. Learning Frontiers
http://www.learningfrontiers.eu/?q=story/will-analytics-transform-education and
http://blogs.cetis.ac.uk/cetisli/ [accessed 20 March 2014]
Mazza, R., & Dimitrova, V. (2004, May). Visualising student tracking data to support
instructors in web-based distance education. In Proceedings of the 13th international
World Wide Web conference on Alternate track papers & posters (pp. 154-161).
ACM.
Mazza, R., & Dimitrova, V. (2007). CourseVis: A graphical student monitoring tool for
supporting instructors in web-based distance courses. International Journal of
Human-Computer Studies, 65(2), 125-139.
Meares, C. (2013) What is Online Engagement and Why Should You Measure It?
(Part One). MaassMedia E-Marketing Analytics. [ONLINE] Available at:
http://maassmedia.com/blog/what-is-online-engagement-and-why-should-you-measure-it-part-one/.
[Accessed 27 December 2013].
Mercado, M. and Ramos (2011) What software is in almost every desktop PC? MS
Excel: Academic analytics of historical retention patterns for decision-making using
customized software for Excel. [ONLINE] Available at:
http://www.arecibo.inter.edu/biblioteca/pdf/what_software_is_in_almost_every_deskt
op_pc.pdf [Accessed January 2014]
Macfadyen, L. & Dawson, S. (2010) Mining LMS Data to Develop an 'Early Warning
System' for Educators: A Proof of Concept, Computers & Education, 54(2) (February
2010): 588-599; see page 12.
Morris, L., Finnegan, C. & Wu, S. (2005) Tracking student behaviour, persistence,
and achievement in online courses. The Internet and Higher Education, Volume 8,
Issue 3, 3rd Quarter 2005, Pages 221-231
NVivo qualitative data analysis software; QSR International Pty Ltd. Version 10,
2012.
Phillips, R., Maor, D., Preston, G. and Cumming-Potvin, W. (2012) Exploring learning
analytics as indicators of study behaviour. In: World Conference on Educational
Multimedia, Hypermedia and Telecommunications (EDMEDIA) 2012, 25 - 29 June
2012, Denver, CO.
Phillips, R., Maor, D., Cumming-Potvin, W., Roberts, P., Herrington, J., Preston, G. &
Moore, E. (2011). Learning analytics and study behaviour: A pilot study. In G.
Williams, P. Statham, N. Brown & B. Cleland (Eds.), Changing Demands, Changing
Directions. Proceedings ascilite Hobart 2011. (pp. 997-1007).
Phillips, R. A., Preston, G., Roberts, P., Cumming-Potvin, W., Herrington, J., & Maor,
D. (2010). Using academic analytic tools to investigate studying behaviours in
technology-supported learning environments. In C.H. Steel, M.J. Keppell, P. Gerbic &
S. Housego (Eds.), Curriculum, technology & transformation for an unknown future.
Proceedings ascilite Sydney 2010 (pp.761-771).
http://ascilite.org.au/conferences/sydney10/Ascilite%20conference%20proceedings%
202010/Phillips-full.pdf (accessed 03-Apr-2014)
Productivity Commission 2012, Impacts of COAG Reforms: Business Regulation and
VET, Research Report, Volume 1 – Overview, Canberra.
(http://www.pc.gov.au/projects/study/coag-reporting/report accessed 18-Mar-2014)
Rafaeli, S. & Ravid, G. (2000) Online, Web Based Learning Environment for an
Information Systems Course: Access Logs, Linearity and Performance, presented at
Information Systems Education Conference (ISECON '97), Information Systems
Journal (2000): 1-14
Ramsden, Andrew, and Andrew Bate. (2008) "Using word clouds in teaching and
learning." University of Bath.
Richards, Lyn. Handling Qualitative Data: A Practical Guide. Sage Publications,
London, 2005. ISBN 0-7619-4259-9
Richards, G. (2011) Measuring Engagement: Learning Analytics in Online Learning,
presented at “Electronic Kazan 2011” Accessed online 26 May 2014
http://www.academia.edu/779650/Measuring_Engagement_Learning_Analytics_in_Online_Learning
Siemens, G. (2009) Technology Enhanced Knowledge Research Institute (TEKRI)
https://tekri.athabascau.ca/content/learning-analytics-knowledge
Siemens, G. (2013) Structure and Logic of Analytics (video). Presentation at
Teachers College, Columbia University, http://www.learninganalytics.net/
Siemens, G., and Long, P. (2011) “Penetrating the Fog: Analytics in Learning and
Education,” EDUCAUSE Review 46, no. 5 (Sept./Oct. 2011),
http://www.educause.edu/ero/article/penetrating-fog-analytics-learning-and-education
Slade, Sharon and Prinsloo, Paul (2013). Learning analytics: ethical issues and
dilemmas. American Behavioral Scientist, 57(10) pp. 1509–1528.
http://oro.open.ac.uk/36594/2/ECE12B6B.pdf (accessed 03-Apr-2014)
Suthers, Dan and Chu, Kar-Hai, 2012, Multi-mediated Community Structure in a
Socio-Technical Network, LAK '12, Proceedings of the 2nd International Conference
on Learning Analytics and Knowledge, Pages 43-53
http://lilt.ics.hawaii.edu/papers/2012/Suthers-Chu-LAK-2012.pdf accessed online
03-Apr-2014
Tyler-Smith, Keith, 2006, Early Attrition among First Time eLearners: A Review of
Factors that Contribute to Drop-out, Withdrawal and Non-completion Rates of Adult
Learners undertaking eLearning Programmes, MERLOT Journal of Online Learning
and Teaching / Vol. 2 / No. 2 / June 2006 (http://jolt.merlot.org/vol2no2/tyler-smith.htm
accessed 18-Mar-2014)
Uren, Judith, 2001, Increasing successful outcomes for TAFE students, 2001
AVETRA Conference Research to Reality: Putting VET Research to Work 28-30
March 2001, Adelaide.
http://www.avetra.org.au/abstracts_and_papers_2001/Uren_full.pdf (accessed
18-Mar-2014)
Walker and Lyndo (2012) Twitter learning analytics in R. Australasian Society for
Computers in Learning in Tertiary Education.
Whitmer, J. (2012) Logging on to Improve Achievement, Using learning analytics to
explore relationships between use of the learning management system, student
characteristics and academic achievement in a hybrid large enrolment undergraduate
course. Dissertation University of California
http://johnwhitmerdotnet.files.wordpress.com/2013/01/jwhitmer_dissertation_complet
e_1-21-2013.pdf Accessed online 7 December 2013.
Whitmer, J., Fernandes, K., & Allen, W.R. (2012). Analytics in Progress: Technology
Use, Student Characteristics, and Student Achievement. Accessed online 7
December 2013.
Yuan, Li (no date), Will Analytics transform Education? A critical view on the data we
gather about the learners, Learning Frontiers
http://www.learningfrontiers.eu/?q=story/will-analytics-transform-education accessed
online 12 March 2014
Zhang, H. & Almeroth, K. (2010). Moodog: Tracking Student Activity in Online Course
Management Systems. Journal of Interactive Learning Research, 21(3), 407-429.
Chesapeake, VA: AACE
More Information
National VET E-learning Strategy
Email: flag_enquiries@natese.gov.au
Website: flexiblelearning.net.au
New Generation Technologies
incorporating E-standards for Training
Email: e-standards@flexiblelearning.net.au
Websites:
New Generation Technologies: ngt.flexiblelearning.net.au
E-standards for Training: e-standards.flexiblelearning.net.au