Open-source LA and “what the student does”
Dr Phill Dawson, Monash
Dr Tom Apperley, Melbourne
Structure
• Until 3:30pm
• You will get a break
• I will talk a bit (background concepts)
• I will show some tools
• You will talk a lot
• You will write an algorithm
• You will probably argue about ethics
Dr Phillip (Phill) Dawson
• Lecturer in Learning and Teaching at Monash
• Led a small grant in learning analytics
• Interested in how academics make decisions
Who are you?
• LA researchers?
• Educational designers?
• LMS administrators?
• University managers?
• Faculty-based academics?
• Academic developers?
• Non-university?
“the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs” (SoLAR)
What LA do you use currently in everyday teaching? (i.e. not research)
• Nothing?
• Reports? (e.g. ‘who has logged in?’; ‘who has submitted assignment one?’)
• Dashboards?
• Something else?
“Who is struggling or not engaging with my course?”
Free, modular, configurable, extendable, open-source learning analytics block for teachers to identify students at risk
“the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs” (SoLAR)
Learning happens because of…
1. What the student is?
2. What the teacher does?
3. What the student does?
Biggs 1999
“learning is what the student does”
Activity - pairs
• What do students do to learn in your context?
– Long list
– Specific
– Verb stems
– Online and offline
– Effective and ineffective
– Deep and strategic
Which of these can easily be captured by LA?
Which of these can’t possibly be captured by LA?
Flickr user sndrv http://www.flickr.com/photos/sndrv/4519088620/ CC-BY
Sci-fi LA vs Real LA
Flickr user dpape http://www.flickr.com/photos/dpape/2720632752/ CC-BY
Typical Open-Source LA Tools
• Gather data on student use of parts of the LMS
• No integration with Student Management Systems
• Teacher dashboards
• Reports
• Some synthesis
• Inconsistent design and language
– At code + UI levels
State of the Actual:
Free/open/built-in LA
• Open-source tools
– Engagement Analytics (documentation, demo vid)
– Gismo
– Analytics and Recommendations
• Vendor-supplied reports
– (e.g. Desire2Learn)
Modular ‘indicator’ architecture
(so you can make additional indicator plugins)
[Diagram: each indicator – Forum (posting, reading), Assessment (submitting on time), Login (logins per week) – produces a 0..1 score; configurable weightings combine these scores into an overall 0..100% risk. Possible additional indicator plugins: Facebook, Attendance, Completion, Downloads.]
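To make the architecture concrete, here is a minimal sketch of the weighted-combination idea in Python. The class names, the example LoginIndicator logic and the weights are illustrative assumptions only, not the block’s actual implementation (the real block is a Moodle plugin).

# Illustrative sketch of the indicator architecture; names and weights are
# assumptions, not the Engagement Analytics block's real code.

class Indicator:
    """Each indicator returns a risk score between 0 (no risk) and 1 (high risk)."""
    def risk(self, student) -> float:
        raise NotImplementedError

class LoginIndicator(Indicator):
    """Example indicator: fewer logins per week than expected means higher risk."""
    def __init__(self, expected_logins_per_week: float = 3.0):
        self.expected = expected_logins_per_week

    def risk(self, student) -> float:
        shortfall = 1.0 - (student.logins_per_week / self.expected)
        return min(max(shortfall, 0.0), 1.0)  # clamp to 0..1

def overall_risk(student, weighted_indicators) -> float:
    """Combine weighted 0..1 indicator scores into a single 0..100% risk figure."""
    total_weight = sum(weight for _, weight in weighted_indicators)
    if total_weight == 0:
        return 0.0
    combined = sum(ind.risk(student) * weight for ind, weight in weighted_indicators)
    return 100.0 * combined / total_weight

Adding a new indicator (Facebook, attendance, completion, downloads) would then just mean writing another Indicator subclass and giving it a weighting.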
How does the assessment indicator work?
• If an assignment is very late, it is riskier than if it is just a little late
• If an assignment is worth a greater percentage of the course’s assessment, it is riskier than if it is worth a small percentage
• If an assignment is past its due date and not submitted, it is riskier than if it was submitted late
for each assignment, quiz or lesson in the course whose due date has passed
{
    // How late is it, scaled between the grace period and the maximum overdue period?
    daysLateWeighting = ((number of days late) - overdueGraceDays) /
                        (overdueMaximumDays - overdueGraceDays)

    // How much is this task worth relative to all assessment in the course?
    assessmentValueWeighting = (value of this task) / totalAssessmentValue

    // Clamp the lateness weighting to the range 0..1
    if (daysLateWeighting > 1)
    {
        daysLateWeighting = 1
    }
    else if (daysLateWeighting < 0)
    {
        daysLateWeighting = 0
    }

    if (task was submitted late)
    {
        risk = risk + daysLateWeighting * assessmentValueWeighting * overdueSubmittedWeighting
    }
    else if (task was not submitted)
    {
        risk = risk + daysLateWeighting * assessmentValueWeighting * overdueNotSubmittedWeighting
    }
}
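Read as runnable code rather than pseudocode, the same logic might look like the sketch below. The Task fields, parameter names and default values are assumptions for illustration, not the plugin’s actual settings.

# Illustrative Python translation of the pseudocode above; field names and
# default weightings are assumptions, not the plugin's real configuration.
from dataclasses import dataclass

@dataclass
class Task:
    value: float            # how much this task is worth
    days_late: float        # 0 if on time
    submitted: bool         # has the student submitted at all?
    due_date_passed: bool   # is the task past its due date?

def assessment_risk(tasks, overdue_grace_days=0.0, overdue_maximum_days=14.0,
                    overdue_submitted_weighting=0.5,
                    overdue_not_submitted_weighting=1.0) -> float:
    """Return a 0..1 risk score from late and missing assessment tasks."""
    total_value = sum(t.value for t in tasks) or 1.0
    risk = 0.0
    for task in tasks:
        if not task.due_date_passed:
            continue
        # Scale lateness between the grace period and the maximum overdue period...
        days_late_weighting = ((task.days_late - overdue_grace_days) /
                               (overdue_maximum_days - overdue_grace_days))
        # ...and clamp it to 0..1.
        days_late_weighting = min(max(days_late_weighting, 0.0), 1.0)
        assessment_value_weighting = task.value / total_value
        if task.submitted and task.days_late > 0:
            risk += (days_late_weighting * assessment_value_weighting *
                     overdue_submitted_weighting)
        elif not task.submitted:
            risk += (days_late_weighting * assessment_value_weighting *
                     overdue_not_submitted_weighting)
    return risk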
Activity: specify a new ‘indicator’ of learning in your context
• What it does in one sentence
• Procedure to give a number between 0% (no risk) and 100% (high risk) – see the sketch after this list for a hypothetical example
– Words?
– Pictures?
– Flowchart?
– Algorithm?
• What variables could we tweak?
• How important is this indicator?
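For a worked (and entirely hypothetical) illustration of what such a specification might turn into, here is a forum-participation indicator sketched in Python; the expected-posts parameter and the time window are made-up tweakable variables, not part of any existing tool.

# Hypothetical example indicator: forum participation.
# In one sentence: students who post to the course forums less than expected
# over the last fortnight are flagged as higher risk.

def forum_participation_risk(posts_this_fortnight: int,
                             expected_posts_per_fortnight: int = 4) -> float:
    """Return risk as a percentage: 0% = posting as expected, 100% = no posts."""
    if expected_posts_per_fortnight <= 0:
        return 0.0  # indicator disabled
    shortfall = 1.0 - (posts_this_fortnight / expected_posts_per_fortnight)
    return 100.0 * min(max(shortfall, 0.0), 1.0)

# Tweakable variables: expected posts, the length of the window, which forums count.
# How important it is would be set through the weighting given to this indicator.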
“From our students’ point of view, the assessment always defines the actual curriculum”
Ramsden 1992, p. 187
“Students can, with difficulty, escape from the effects of poor teaching…”
Boud 1995, p. 35
“…they cannot (by definition if they want to graduate) escape the effects of poor assessment”
Boud 1995, p. 35
[Diagram linking Objectives/Outcomes, Teaching/Learning and Assessment]
LA are only as good as the curriculum
• Training statistical LA on assessment or retention outcomes ≠ learning
• The teacher needs to be in control of LA use and specify learning
• LA makes it more difficult to “escape the effects of bad teaching”
“whether through denial, pride, or ignorance, students who need help the most are least likely to request it”
Martin & Arendale 1993, p. 2
It’s the end of week 2 and student X hasn’t ever logged in.
What do we do?
How can we make follow-up effective?
• Personal or robotic?
• Paint a grim picture?
• Refer on or see personally?
• Specific guidance
It’s the week of the census and modeling suggests student X is 70% likely to fail.
What do we do?
How can we make follow-up ethical?
• Do students have a
– Right to try (and fail?)
– Right to give up
– Right to be strategic
• Will draconian measures lead to LMS-farming?
• Student-identified triggers
It’s week 10 and student X already has 60% of the course grade but hasn’t logged in for two weeks.
What do we do?
Discussion and close:
the near-future for open-source learning analytics
Extra time options
• Live demonstration of the Engagement Analytics tool
• Further specifying an indicator into pseudocode for a developer
• Develop strategies for following up students at risk
• Discuss student views of analytics tools
• Discuss open-source
References and sources
• SoLAR definition of LA
• Memes are from Quickmeme
• NetSpot Innovation Fund logo courtesy of NetSpot. (You should apply for an open-source development grant through them.)
• Biggs, J. (1999). What the student does: teaching for enhanced learning. Higher Education Research & Development, 18(1), 57-75. doi:10.1080/0729436990180105
• Boud, D. (1995). Assessment and learning: contradictory or complementary? In P. Knight (Ed.), Assessment for Learning in Higher Education (pp. 35-48). London: Kogan Page.
• Ramsden, P. (1992). Learning to teach in higher education. London: Routledge.
• Martin, D., & Arendale, D. (1993). Supplemental Instruction: Improving First-Year Student Success in High-Risk Courses. The Freshman Year Experience: Monograph Series (2nd ed., Vol. 7). Columbia, SC: National Resource Center for the First Year Experience and Students in Transition, University of South Carolina.