The Role of Learning Analytics: A Personal Journey
5/22/2013
Oded Meyer, Department of Mathematics and Statistics, Georgetown University

Some background about myself…

Learning Analytics
A data-driven approach to understanding and improving learning and the environment in which it occurs.

Carnegie Mellon Open Learning Initiative (OLI)
Scientifically based online learning environments that integrate technology, the science of learning, and teaching. OLI is designed to simultaneously improve learning and facilitate learning research.

The OLI Statistics Course
Educational mission of the funder (The William and Flora Hewlett Foundation, 2002): provide open access to high-quality post-secondary education and educational materials to those who otherwise would be excluded due to:
– Geographical constraints
– Financial difficulties
– Social barriers
To meet this goal:
– A complete, stand-alone, web-based introductory statistics course.
– Openly and freely available to individual learners online.

An Important Form of Analytics: The Science of Learning
General principles:
• Make the structure and “big picture” salient
• Immediate and targeted feedback: students achieve the desired level of performance faster
Discipline-specific principles:
• Hands-on activities
• Use real data
• De-emphasize calculations

Start Analytics Early in the Design Process
After one module:
• Qualitative feedback
• Observe students
• Talk-aloud protocols

Instructor Feedback Loop Analytics
• Feedback to the instructor about students’ learning

Learning Dashboard (developed by a team led by Dr. Marsha Lovett)
– Presents the instructor with a measure of student learning for each learning objective.
– More detailed information:
  • The class’s learning of sub-objectives
  • Learning of individual students
  • Common misconceptions

Instructor Feedback Loop Analytics
Key:
• Clearly defined learning objectives.
• Tying each activity to a learning objective.
Benefits:
• Can reveal misconceptions
• Impacts how you spend class time
• Can reveal “expert blind spots”
Example: students who achieved proficiency in finding the median did poorly on an assessment item tied to the same sub-skill.
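The dashboard described above depends on the two “key” points listed: clearly defined learning objectives, and every activity being tagged with one. As a rough illustration only, and not the OLI dashboard’s actual model (which was built by Dr. Lovett’s team using richer statistical estimates of learning), the sketch below shows how tagged activity logs could be rolled up into class-level and student-level views per objective. The data model, the `Attempt` record, and the 0.7 proficiency threshold are hypothetical.

```python
# A minimal sketch of the instructor-dashboard idea, assuming a simplified data
# model: each record ties one student's attempt on one activity to a single
# learning objective. The real OLI dashboard uses richer statistical models of
# learning; the names and threshold below are hypothetical.
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Attempt:
    student: str
    objective: str      # the learning objective the activity is tagged with
    correct: bool       # whether the attempt was successful

def objective_summary(attempts, proficiency_threshold=0.7):
    """Aggregate attempts into a per-objective view of class learning."""
    by_objective = defaultdict(list)
    for a in attempts:
        by_objective[a.objective].append(a)

    summary = {}
    for objective, recs in by_objective.items():
        # Class-level measure: share of attempts on this objective that were correct.
        class_rate = sum(r.correct for r in recs) / len(recs)

        # Student-level measure: each student's own success rate on this objective.
        per_student = defaultdict(list)
        for r in recs:
            per_student[r.student].append(r.correct)
        student_rates = {s: sum(v) / len(v) for s, v in per_student.items()}

        summary[objective] = {
            "class_rate": class_rate,
            "needs_attention": class_rate < proficiency_threshold,
            "struggling_students": sorted(
                s for s, rate in student_rates.items() if rate < proficiency_threshold
            ),
        }
    return summary

# Example: a dashboard of this kind would flag "find the median" for class time.
attempts = [
    Attempt("s1", "find the median", True),
    Attempt("s1", "find the median", False),
    Attempt("s2", "find the median", False),
    Attempt("s2", "interpret a boxplot", True),
]
for objective, info in objective_summary(attempts).items():
    print(objective, info)
```

The point of the roll-up is the one made in the slides: because each activity is tied to an objective, the instructor can see which objectives, sub-skills, and individual students need attention before class, rather than after the exam.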
Larger Scale Analytics: Assessing the Effectiveness of the Course When Used in Blended (Hybrid) Mode
The Hewlett Foundation’s “Accelerate Learning Challenge”: can students using the OLI course in blended mode learn the same material as in the traditional course in a shorter time, and still have equal or better learning gains?

Three Accelerated Studies
#1 Small class, expert instructor (2007)
#2 Replication with larger class (2009)
   – With retention follow-up 4+ months later
#3 Replication with new instructor (2010)
   – Experienced statistics instructor
   – New to the OLI Statistics course and hybrid mode

Study 1: Method
~180 students enrolled; 68 volunteered for the special section:
– 24 students, adaptive/accelerated condition
– 44 students, traditional control condition

Adaptive/Accelerated vs. Traditional
– Class meetings: two 50-minute classes/wk vs. four 50-minute classes/wk
– Length of instruction: eight weeks vs. fifteen weeks
– Homework: complete OLI activities on a schedule vs. read the textbook and complete problem sets
– Tests (same for both): three in-class exams, a final exam, and the CAOS test
Same content, but a different kind of instruction.

Dependent Measure
CAOS = Comprehensive Assessment of Outcomes in a First Statistics course (delMas, Garfield, Ooms, & Chance, 2006)
– Forty multiple-choice items measuring students’ “conceptual understanding of important statistical ideas”
– Content validity: positive evaluation by 18 content experts
– Reliability: high internal consistency
– Aligned with the content of the course (both sections)
– Administered as a pre/posttest

Study 1: CAOS Test Results
The Adaptive/Accelerated group gained more pre/post on CAOS (18% vs. 3%) than did the Traditional Control, p < .01. (An illustrative sketch of this kind of gain comparison appears at the end of this section.)
These analytics got the attention of education leaders in the U.S. who are facing the “cost disease” in higher ed. William Bowen (former President of Princeton) replicated our study in the “Interactive Learning Online at Public Universities” study. That study further indicated that blended learning offers the potential of more economical and rapid pathways to mastery.

Analytics on Students’ Learning Habits
• Students in both groups recruited to complete time logs
• Self-report for both groups
• Analogous point in the course (2/3 of the way through)
• Six consecutive days: Wednesday – Monday

Study 1: Time Spent Outside of Class
No significant difference between groups in the time students spent on statistics outside of class.

Study 2: Replication & Extension
• Same method, same procedure, same instructor
• Larger class (52 students in Adaptive/Accelerated)
• Follow-up retention study conducted 4+ months later

Study 2: CAOS Test Results
The Adaptive/Accelerated group gained more pre/post on CAOS than did the Traditional Control, p < .01.

Using Analytics to Assess Retention
(Timeline, Jan–Oct: the Adapt/Acc section ends, then the Traditional section ends, then the follow-up begins. Adapt/Acc delay group: 13 students; Traditional delay group: 14 students.)

Study 2, Retention: Re-taking CAOS
At the 6-month delay, the Adaptive/Accelerated group scored higher on CAOS than the Traditional Control, p < .01.

Study 3: Further Replication & Extension
• Same method, same procedure
• New instructor
  – Not involved in the development of the OLI course
  – New to OLI Statistics and the hybrid teaching mode
  – Instructor held constant for both Adapt/Acc and Control conditions
• Larger class (40 students in Adaptive/Accelerated)

Study 3: CAOS Test Results
(Chart: pretest and posttest CAOS scores for Adapt/Acc and Trad Control, with the chance level marked.)
The Adaptive/Accelerated group gained more pre/post on CAOS than did the Traditional Control, p < .01.

Current and Future Analytics
• Continued “gap analysis.”
• Better alignment between learning objectives/sub-objectives and activities.
• A Student Dashboard to provide learners with insight into their own learning habits and give recommendations for improvement.
• Learner-facing analytics that allow learners to compare their own performance against an anonymous summary of their course peers.

Lessons Learned…
1. Pedagogy must drive technology, not the other way around.
2. Developing online materials is a collaborative effort.
3. Developing online materials is an iterative process.
4. Steep learning curve.
5. Growth as a teacher.

“Improvement in postsecondary education will require converting teaching from a ‘solo sport’ to a community-based research activity.”
– Herbert Simon, Last Lecture Series, Carnegie Mellon, 1998
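Illustrative sketch referenced in the Study 1 results above: one simple way to compare pre/post CAOS gains between the two sections is an independent-samples test on per-student gain scores. The numbers below are made up for illustration and the Welch t-test is only a stand-in; the studies’ actual statistical analyses are not specified in these slides.

```python
# A minimal sketch of a pre/post gain comparison, assuming gain scores
# (posttest minus pretest CAOS percentage) are compared between groups with an
# independent-samples t-test. The scores below are made-up placeholders, NOT
# the study data, and the published analyses may have used different models.
import numpy as np
from scipy import stats

def gain_scores(pre, post):
    """Per-student learning gain: posttest minus pretest score."""
    return np.asarray(post) - np.asarray(pre)

# Hypothetical CAOS percentage scores, for illustration only.
accel_pre,  accel_post = [44, 50, 38, 47, 41], [60, 68, 55, 66, 58]
trad_pre,   trad_post  = [45, 49, 40, 46, 42], [47, 52, 41, 50, 44]

accel_gain = gain_scores(accel_pre, accel_post)
trad_gain  = gain_scores(trad_pre, trad_post)

# Welch's t-test avoids assuming equal variances across the two sections.
t_stat, p_value = stats.ttest_ind(accel_gain, trad_gain, equal_var=False)

print(f"Mean gain (Adapt/Acc):    {accel_gain.mean():.1f} points")
print(f"Mean gain (Trad Control): {trad_gain.mean():.1f} points")
print(f"Welch t = {t_stat:.2f}, p = {p_value:.4f}")
```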