CSU LEARNING ANALYTICS STRATEGY
Version 1.3
Table of Contents
1. Introduction
2. Glossary
3. Purpose and Vision
4. Drivers
5. Values
6. Objectives
7. Strategies
   A. Gathering learning analytics data
   B. Technical architecture
   C. Proactive Analytics Strategies
8. Development and review of policies
9. Governance
10. Specific roles and responsibilities
11. Ethical issues and risk mitigation mechanisms
12. High level indicators of successful analytics usage at CSU
1. INTRODUCTION
This strategy covers learning analytics, and acknowledges the critical role of the learner, the teacher, and course and subject design in the educational process. Note that learning and academic analytics overlap and a clear distinction might not be possible. Learning analytics is situated within the general notion of “analytics” (see Glossary below). Learning analytics can be uni-dimensional, but the goal is to develop multi-dimensional models in which various factors are integrated and analysed in a systemic way, expressing the agency of those involved in the learning and teaching systems at all levels.
The Working Party acknowledges that learning is a complex social activity and that technical methods do not fully capture the scope and nuanced nature of learning. It is also critical that the type and nature of interactions are analysed, and not merely the amount of interaction.
Aspects of learning analytics depend on the availability of “big data” and as such involve research that uses massive amounts of data.
Learning analytics can be provided to the student, teaching staff, student support staff, teaching support staff, and
administrators to support adaptive practice and adaptive systems. It can be used to support a wide range of
activities from day-to-day teaching and learning activities, to institutional support for at risk students, and the
generation of new insights regarding teaching and learning patterns and effectiveness.
Below is a description of possible uses of learning analytics, adapted from work by George Siemens (2012).1

Focus of analytics | Who benefits?

Items below can be described as “learning analytics”:
- Subject-level: alignment with learning experience design, social networks, conceptual development, language analysis | Learners, teaching staff, support staff
- Aggregate (big data): predictive modeling, patterns of success/failure | Learners, teaching staff, support staff

Items below can be described as “academic analytics”:
- Institutional: learner profiles, performance of teaching staff, quality of course and subject design, resource allocation | Administrators, IR, funders, marketing, learners
- Regional & National (state/provincial): comparisons between systems | Governments, administrators
- International: ‘world class universities’ | National governments (OECD)

1 Siemens, G. (2012). Learning analytics: new insight or new buzzword? ACODE webinar, October 2012.
Internationally there is a progression to multi-dimensional and systemic learning analytics that include, for example, data external to the LMS and data from support services. Focussed research occurs through the international Society for Learning Analytics Research (SoLAR) (http://www.solaresearch.org/).
There are a number of types of learning analytics systems (George Siemens, 2012): Dashboards; Recommender
Systems; Predictive models; and Alerts/warnings/interventions.
Analytics can be provided in the following areas (George Siemens, 2012), which may be applied to students or teaching staff:
- Analytics around social interactions;
- Analytics around learning content;
- Analytics in different spaces;
- Analytics on interaction with the university system;
- Analytics on intervention and adaptation; and
- Assessment of analytics.
Please see Section 10 on roles and responsibilities of various players and for more information on developments at
CSU.
The working party established by the ILSC to create this strategy comprised: Assoc Prof Philip Uys (convenor) (DSL); Nina Clemson (P&A); Simon Thomson (Office of Student Services); Liz Smith (Academic Support); Paul Bristow (DIT); Nadine Mckeown (DSL); and Assoc Prof Barney Dalgarno (Sub-Dean L&T, Faculty of Education). The working party was supported by Kate Rose (DSL).
Special input was received from Assoc Prof Alan Bain (Faculty of Education).
2. GLOSSARY
Academic analytics: “The application of business intelligence tools and strategies to guide decision-making practices in educational institutions. The goal ... is to help those charged with strategic planning in a learning environment to measure, collect, decipher, report and share data in an effective manner.” (http://searchcio.techtarget.com/definition/academicanalytics) Note that learning analytics and academic analytics are interrelated and that the course and learning designs determine what a university might wish to analyse through academic analytics.

Analytics: “The use of data, statistical analysis, and explanatory and predictive models to gain insights and act on complex issues.” (EDUCAUSE)

AUSSE: Australasian Survey of Student Engagement

BICC: Business Intelligence Competency Centre

Business Intelligence: A set of methods and techniques that are used by organisations for tactical and strategic decision making.

Business Intelligence Competency Centre: An organisational team that has defined tasks, roles, responsibilities and processes for supporting and promoting the effective use of Business Intelligence across an organisation.

CEQ: Course Experience Questionnaire

Community of practice: A group of people who share a craft and/or a profession and/or an interest.

Data warehousing: A consolidation of data from a variety of sources that is designed to support strategic and tactical decision making.

DWBI: Data warehousing and business intelligence

ILSC: Information and Learning Systems Committee

Learning analytics: The measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs. (SoLAR) Note that the learner context referred to above includes relevant computer systems, learning experience design, and the role of teaching staff as well as learning and teaching support staff.

Master data: Persistent, non-transactional data that defines a business entity for which there is, or should be, an agreed upon view across the organisation.

SES: CSU Student Experience Survey

UES: University Experience Survey
3. PURPOSE AND VISION
Purpose:
- To provide a university-wide framework to guide the use of learning analytics at CSU, and to improve our understanding of models of learning and teaching at CSU and its performance in the context of:
  o International and national developments and drivers contextualised to CSU;
  o Institutional values, objectives and strategic priorities;
  o Relevant institutional policies, governance, roles and responsibilities; and
  o Ethical issues and institutional risks;
- Including:
  o High level objectives and strategies;
  o High level indicators of successful usage;
  o Infrastructure and human resource issues;
  o Risk mitigation strategies; and
  o Policy and governance issues.
This Strategy informs the CSU Educational Technology Framework, the DIT ICT Strategy in this area and the work of
the BI Steering Committee.
Vision:
- A high quality student centred learning experience supported by subject, course and institutional analytics that reflect the activities of students, teaching and support staff.
4. DRIVERS
EXTERNAL
1. Remaining competitive in an increasingly global market;
2. Increasing public accountability and transparency;
3. Performance in external benchmarking quality indicators such as CEQ, UES, AUSSE and International Student
Barometer;
4. CSU’s Compact with the Government highlights retention in the context of increasingly diverse intakes;
5. TEQSA is increasingly focussing on course delivery and support processes requiring increased accountability;
6. Evidence of successful use of analytics within Higher Education internationally (e.g. Purdue University) and in other sectors of society (e.g. business); and
7. Increasing expectations by students, as customers, of a rich and responsive online environment.
INTERNAL
The need for:
1. Improved student retention and progress;
2. More efficient and effective targeting of teaching and support activities and resources;
3. Improved understanding of the relationship between models of learning and teaching and performance
indicators at CSU;
4. Evidence-based professional learning of teaching staff;
5. Prioritisation of investment in learning systems and activities; and
6. Improved overall satisfaction of CSU students.
5. VALUES
1. Alignment – we seek to improve alignment between models of learning and teaching and performance
indicators.
2. Supporting student success – we recognise that the analysis of learning and teaching related behaviours and
data can provide valuable insights into the student experience. We use this data for the purpose of
supporting student progress and retention and promoting teaching excellence.
3. Student centred – we place students at the centre of the learning experience by accommodating diverse individual characteristics in the learning process; providing them with choice; and allowing them to be active learners who are capable of managing their own learning, including through the use of analytics.
4. Continuous improvement – we recognise the learning and teaching environment and needs of students
change constantly. We reassess our practice on a regular basis to ensure we are collecting and analysing
relevant and actionable data that presents significant opportunities for improving the student learning and
teaching experience.
5. Evidence based decision making – we refine learning and teaching and the support of students in response
to differing and changing needs and behaviours and ensure that actions are linked to strong evidence.
6. Respect – we abide by all related confidentiality and privacy regulations and ensure that actions taken as a result of access to student and staff data are carried out in an ethical, supportive and respectful manner.
6. OBJECTIVES
LEARNING AND TEACHING
1. To provide feedback to students on their learning interactions and progress to improve their likelihood of
success, their choice of subjects and their self-management of learning on the basis of models of learning
and teaching at CSU.
2. To provide alerts and reports relating to student activity and progress to teaching and teaching support staff
that would enable and inform appropriate intervention strategies on the basis of models of learning and
teaching at CSU.
3. To provide feedback to teaching staff on the effectiveness of their learning designs and learning and teaching
interactions on the basis of models of learning and teaching at CSU.
4. To provide reports to Course Directors to help inform revisions to the learning designs within subjects and
courses as part of curriculum renewal and course reviews using the Smart Tools project as appropriate.
5. To provide alerts and reports to Heads of School that would enable and inform appropriate management
interventions and professional development strategies.
ORGANISATIONAL
1. To support the student experience through the enhancement of University systems and processes.
2. To provide information to support institutional strategic planning in the learning and teaching area, including
indicators to allow the achievement of learning and teaching objectives to be more effectively monitored.
3. To provide information to support the improvement of relevant administrative processes.
4. To allow benchmarking of online learning interactions against other institutions.
5. To support the selection and prioritisation of systems and services through increased awareness of actual usage by students and staff.
6. To provide dynamic, adaptive systems informed by analytics to enable adaptive learning.
7. STRATEGIES
The CSU Learning Analytics Strategy will be underpinned by a whole of institution approach that ensures consistency
of approach for all students combined with the flexibility to allow teaching staff and school specific initiatives
tailored to the needs of specific student cohorts.
Relevant theories, learning and teaching models, analytical methods and technologies will be used to implement the
strategies below.
A. GATHERING LEARNING ANALYTICS DATA
At an institutional level we will collect and use data related to areas like the following:
1. The design of courses and subjects;
2. Student activities in relation to learning design;
3. Historical data such as students’ Grade Point Average (GPA), previous performance by the student in subjects, and historical performance rates for the subject;
4. Student access of key information (e.g. access to online subject outline within first 3 weeks of session);
5. Student assessment (e.g. failure to submit an assignment, failure of an assignment);
6. Student engagement in compulsory online activities (e.g. introductory forum posting, download essential
readings);
7. Student feedback on university wide surveys (e.g. OES, SES, CEQ, UES and AUSSE);
8. Student participation in online orientation and induction activities;
9. Staff uploading of online subject outlines within required timeframe;
10. Staff posting welcome/expectations message in Week 1;
11. Staff turnaround of assignments within appropriate timeframe (currently within 21 days);
12. Teaching staff participation in online communication strategies, e.g. forum activity, chat activity or online meeting activity; and
13. Teaching staff access to teaching resources.
Note: These factors will be reviewed on a regular basis by relevant parties, including Academic Support and DSL, to ensure continued relevance.
Faculties, schools and teaching staff may develop more specific data collection and response strategies according to
the individual needs and characteristics of their cohorts for instance using online dashboards. Such strategies will
normally be agreed to by the relevant Head of School or Executive Dean, clearly documenting the purpose for data
collection, the method and the organisational responsibilities for such data collection and the strategies that will be
put in place as a result of this data collection. Such strategies should complement rather than duplicate university
wide processes and where possible should integrate existing resources, processes and information.
Predictors of student success will be developed based on the cohort and on national and international good practice.
In addition to predictors of success, further indicators need to be identified that describe deeper student engagement and higher levels of learning, such as critical curiosity; meaning making; creativity; resilience; strategic awareness; and the building of learning relationships (Shum, 2012).2
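To illustrate only, a first-cut predictor of this kind could combine weighted engagement factors drawn from the data areas listed above. Every feature name, weight and threshold in the sketch below is a hypothetical example, not a CSU data definition; a real predictor would be derived from cohort data and validated.

```python
# Illustrative sketch of an at-risk predictor over hypothetical
# engagement features. Weights and thresholds are invented examples.
from dataclasses import dataclass

@dataclass
class EngagementRecord:
    accessed_subject_outline: bool   # within first 3 weeks of session
    assignments_submitted: int
    assignments_due: int
    forum_posts: int
    gpa: float                       # historical Grade Point Average

def at_risk_score(r: EngagementRecord) -> float:
    """Combine weighted risk factors into a score between 0 and 1."""
    score = 0.0
    if not r.accessed_subject_outline:
        score += 0.3
    if r.assignments_due and r.assignments_submitted < r.assignments_due:
        score += 0.4 * (1 - r.assignments_submitted / r.assignments_due)
    if r.forum_posts == 0:
        score += 0.1
    if r.gpa < 4.0:                  # below pass level on a 7-point scale
        score += 0.2
    return min(score, 1.0)

record = EngagementRecord(accessed_subject_outline=False,
                          assignments_submitted=1, assignments_due=2,
                          forum_posts=0, gpa=3.5)
print(round(at_risk_score(record), 2))   # 0.8
```

In practice such a score would feed the alerts described in Section 7C rather than be shown raw.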
B. TECHNICAL ARCHITECTURE
SOURCE SYSTEMS
1. All University information systems should be considered as potential analytics source systems unless
specifically excluded.
2. Learning Management and related systems acquired or built by the University (such as Smart Tools and
other course design and management tools) must capture data relevant to learning and academic analytics.
3. Learning Management and related systems acquired or built by the University must provide a reasonable
level of ‘operational’ level reporting for student and teaching users.
4. Learning Management and related systems acquired or built by the University must provide direct access to
captured learning and academic analytics data for extraction into other systems, such as a data warehouse
or other analytic tools.
ENTERPRISE ANALYTICS SYSTEMS
1. Learning and teaching data should be processed and consumed as per the pipeline below. That is, learning and teaching data may exist in transactional data tables, unstructured files or other sources. All data should be extracted from its source location and transformed into data warehouse structures. In most cases users and tools should consume the learning and teaching data from the data warehouse and its associated reporting functionality where needs are not met by the application’s own reports.
2. Data warehouse structures should ensure that data is stored at the lowest appropriate level of granularity
and that dimensions are ‘conformed’ to other student related data structures i.e. data need to be aligned
structurally with other university data.
3. Where application reports exist, as far as practicable data warehouse reports should be consistent with
results displayed in source system reports.
2 Shum, S.B. (2012). Our learning analytics are our pedagogy. Ascilite webinar, October 2012.
4. Learning Management and related systems could be capable of sophisticated learning analytics that could
parallel some of the stages in the diagram above.
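The extract, transform and load flow described in point 1 above can be sketched in miniature; all table and column names below are invented for the example and are not CSU warehouse definitions.

```python
# Illustrative extract-transform-load sketch using an in-memory SQLite
# database. Table and column names are hypothetical examples.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Source system: raw LMS activity log (transactional, fine-grained).
cur.execute("CREATE TABLE lms_activity (student_id TEXT, subject TEXT, event TEXT, ts TEXT)")
cur.executemany("INSERT INTO lms_activity VALUES (?,?,?,?)", [
    ("s1", "EDU101", "view_outline", "2013-03-01"),
    ("s1", "EDU101", "forum_post",   "2013-03-02"),
    ("s2", "EDU101", "view_outline", "2013-03-05"),
])

# Warehouse structures: lowest appropriate granularity, with a conformed
# student dimension so data aligns structurally with other university data.
cur.execute("CREATE TABLE dim_student (student_id TEXT PRIMARY KEY)")
cur.execute("CREATE TABLE fact_activity (student_id TEXT, subject TEXT, event TEXT, ts TEXT)")

# Transform and load: here a straight copy plus dimension population;
# a real pipeline would also cleanse and conform the data.
cur.execute("INSERT INTO dim_student SELECT DISTINCT student_id FROM lms_activity")
cur.execute("INSERT INTO fact_activity SELECT * FROM lms_activity")

# Consumers read from the warehouse, not the source system.
counts = cur.execute(
    "SELECT student_id, COUNT(*) FROM fact_activity GROUP BY student_id ORDER BY student_id"
).fetchall()
print(counts)   # [('s1', 2), ('s2', 1)]
```

The point of the separation is that reports and tools query the warehouse structures, leaving the source LMS tables untouched.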
ENTERPRISE REPORTS
1. A suite of standard reports should be developed in consultation with relevant groups, and may be
embedded within the Learning Management system, if the system does not provide appropriate reporting
functionality. Additional reports and ad hoc requests will be developed on an as needs basis.
ACCESS TO ANALYTIC DATA
1. Access to data will be controlled by roles, with appropriate access privileges set depending on the sensitivity
of the data and the role of the individual. Policies need to be developed to clarify the approvals required for
access to data at various levels. Access to analytics systems should provide audit trails of what data is
accessed by whom.
2. Where reports do not require the identification of individuals, data will be aggregated and anonymised to unlink personal identification from captured data. The core data set will, however, retain the identification data.
3. A community of practice should be responsible for providing contextual information regarding teaching and
learning data and reports, such as the meaning of and limitations regarding each data element, the use and
inference of analytics and to refer questions to subject matter experts as appropriate.
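Points 1 and 2 above (role-based access, anonymised aggregation and an audit trail) can be sketched as follows; the roles, records and report are hypothetical examples, not CSU access policy.

```python
# Illustrative sketch of role-based access with anonymised aggregate
# reporting and an audit trail. Roles and data are invented examples.
from collections import Counter

RECORDS = [
    {"student_id": "s1", "subject": "EDU101", "grade": "HD"},
    {"student_id": "s2", "subject": "EDU101", "grade": "FL"},
    {"student_id": "s3", "subject": "EDU101", "grade": "PS"},
]
AUDIT_LOG = []   # records which user, in which role, accessed which report

def grade_report(user: str, role: str):
    """Return grades; identified data only for roles that require it."""
    AUDIT_LOG.append((user, role, "grade_report"))
    if role == "subject_coordinator":
        return RECORDS                              # identified access
    # Other roles see an anonymised aggregate with identifiers unlinked.
    return dict(Counter(r["grade"] for r in RECORDS))

print(grade_report("jsmith", "administrator"))   # {'HD': 1, 'FL': 1, 'PS': 1}
print(len(AUDIT_LOG))                            # 1
```

Note that the core data set (here `RECORDS`) retains identification; only the report output is anonymised.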
REQUESTING ANALYTICS SERVICES
This section relates to the creation of reporting channels as well as access to such channels.
1. Source system reporting tools – e.g. LMS dashboards, traffic lights.
2. Faculty or School requests, authorised by the Executive Dean or Head of School, for analytics from DSL.
3. Academic Support requests from P&A.
4. Committee and executive requests.
5. Self-serve access to authorised data using pre-built data cubes and visualisation/reporting tools.
6. Custom requests, which may frequently evolve into standard reporting services.
C. PROACTIVE ANALYTICS STRATEGIES
In addition to the provision of a variety of reports for students, subject coordinators, course directors, heads of
school and teaching support staff relating to individual or collective student interactions and progress and staff
activities, a number of types of alerts will be generated, which will either be delivered via emails or will be
presented inside the LMS. Some categories of alert will be generated automatically in all cases, while other
categories will be configurable either by the student or the subject coordinator.
The following table shows a non-exhaustive list of examples of the types of alerts that could be further developed through close consultation with stakeholders, especially the faculties. In each case policies need to be developed to make clear whether the display of such alerts is governed by opt-in or opt-out decisions.

Trigger for alert | Degree to which configurable | How delivered | To whom
Issues with the design of courses and subjects | Always generated | Through the Smart Tools system | HOS, Subject coordinator
Non reading of student forum postings by subject coordinator for period of 3 weeks | Always generated | Message | HOS, Subject coordinator
Unanswered messages on subject forum for period of one week | Always generated | Message | Subject coordinator
Non access of subject outline by student 2 weeks into the session | Configurable | Message | Student
Non submission of an assignment by the due date | Always generated | Message | Student, International Student Support Officer
Non access by a student to a resource marked as important by the subject coordinator | Configurable | Message within LMS | Student
Non completion of a course or subject survey | Configurable | Message | Student
Learning activity profile indicating that student is at risk of failure | Always generated | Traffic indicator within LMS and perhaps also email depending on severity | Student, Subject coordinator
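One trigger of this kind, non submission of an assignment by the due date, can be sketched as a simple rule check; the function name and message format below are hypothetical examples, not a CSU system design.

```python
# Illustrative sketch of one alert trigger: an always-generated alert
# when an enrolled student has not submitted an assignment by the due
# date. Names, dates and message format are invented examples.
from datetime import date

def non_submission_alerts(due: date, today: date, enrolled, submitted):
    """Return alert messages for enrolled students with no submission."""
    if today <= due:
        return []                    # nothing to alert before the due date
    return [f"Assignment overdue for student {s}"
            for s in enrolled if s not in submitted]

alerts = non_submission_alerts(
    due=date(2013, 4, 1), today=date(2013, 4, 2),
    enrolled=["s1", "s2", "s3"], submitted={"s2"})
print(alerts)
```

A configurable trigger would differ only in consulting a student or subject coordinator preference before delivering the message.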
8. DEVELOPMENT AND REVIEW OF POLICIES
Policies around analytics need to be deeply embedded in the core policies of CSU around all of its learning and teaching functions. Possible relationships between learning analytics and a number of CSU policies are identified below. Critical policy areas include student access to analytics; use of historical data; privacy; access to information; and collection and storage of data.
Content delivered via a CSU owned or controlled service and accessed via a browser is covered by the CSU Web
Policy and is subject to the governance of the Web Management Committee (chaired by the Executive Director,
Marketing and Communications). This policy sets out CSU’s management principles on the development,
maintenance and use of its website, including CSU Interact.
o CSU Web Policy Document (2008)
Other relevant policies that will need to be reviewed and/or reworded:
1. The conditions and obligations associated with authorised use of CSU’s computing and communication
facilities are set out in the CSU Policy for the Use of University Computing and Communication Facilities
(2006)
o The policy does not explicitly refer to the monitoring and capture of online behaviour for the
purpose of L&T analytics. There are several existing sections that could be reworded or
strengthened – for example:
12. OBLIGATIONS – Staff (p. 8)
    12.2 To assist in fulfilling the above obligations, CSU reserves the right to audit and monitor the use by employees of CSU’s computing and communication network and facilities.
18. CONFIDENTIALITY – Students (p. 12)
    20.2 CSU reserves the right to audit and to monitor the use of CSU computing and communication facilities.
2. Policies relating to privacy and confidentiality, including the use of records databases will also need to be
amended to cover the collection, storage and use of data:
o Privacy Management Plan (INF42, 2001)
 Privacy – A Brief Summary (INF41, 2001)
o DIT Privacy Statement
 This page is also referenced on the ‘Changing your Password’ webpage
o It includes text such as ‘What Information is collected’ and ‘Use of Personal Information’ that will
need to be reviewed. There is currently no mention of capturing data for analytics.
o Division of Human Resources - Staff
 “Privacy – Your Responsibility” “Charles Sturt University collects and retains
considerable personal data regarding its students and staff during the course of their
candidature and employment.”
o Several Divisions at CSU maintain specific privacy statements on their websites
 Library Services: Privacy of Information on Online Forms
 Student Services
3. Use of, and linkage to external services (import and export)
o The Policy and Guidelines for the use of External Educational Technologies (under development)
to be updated to investigate and/or facilitate the collection of learner analytics across external
systems
4. Collection, use and storage of data - Students and teaching staff
o Academic Communication With Students Policy
o The CSU Student Charter
 “the expectations students and the University may have of each other”
o Changing your Password (staff and students)
 “By clicking the button below, I am signifying my agreement with the University's Policy
for computing and communications and with the conditions stated above”
 Alternative mechanisms of ‘signifying agreement’ may have to be sought if
students are no longer compelled to regularly change their password
o Access to Student Records - Overview of rights and responsibilities
 Access to Student Records: “The University's policy that details staff and student access to records, the amendment of records and to assessment items and information”
o DIT Policy for the Storage and Transmission of Personal/Private Information
 “This policy covers the storage and transmission of any personal and/or private information within and outside CSU’s network”
5. Stakeholder relationship to other policies and initiatives within the University
o Smart Tools, i.e. to include analytics on the course and subject design process,
o Curriculum Renewal and CSU Degree,
o BUSS,
o STAR.
9. GOVERNANCE
This Strategy informs the CSU Educational Technology Framework, the DIT ICT Strategy in this area and the work of
the Business Intelligence Steering Committee.
1. Role of the ILSC: monitor implementation of the Learning Analytics Strategy.
2. Role of the BI Steering Committee: to set clear and concise guidelines on the collection and use of student related data.
3. Research Ethics Committee: data for research
4. Academic Senate: managing policies related to learning analytics
5. Curriculum Learning and Teaching Committee: summary reports related to the CLT Plan
6. Learning Analytics implementation working party: a multi-disciplinary team that will coordinate the
implementation of the Strategy under the auspices of the ILSC taking into account the needs of diverse
stakeholders.
Note: Interaction between the ILSC and the Web Management Committee is necessary to ensure that developments
pertaining to analytics are properly integrated.
10. SPECIFIC ROLES AND RESPONSIBILITIES
1. Teaching staff: Involvement in the development of the use of learning analytics at CSU. Use the data to
improve learning and teaching.
2. Dean of Students, Executive Deans, Heads of School, Course Directors and Sub-Deans and Associate Deans
Learning and Teaching: User. Identify data types required. Request and act on data.
3. DSL: Contact point for Faculties, Schools and teaching staff for learning and teaching data. Gathering data,
analysis & reporting, as part of DSL’s own sub-data warehouse reporting infrastructure. Resolve data quality
issues relating to systems that DSL is the custodian for. Provision of professional learning activities and
support to users (teaching staff and managers) of analytics of systems that DSL is the custodian for, and
other systems in which learning analytics are provided. Conceptual leadership about learning analytics as it
pertains to teaching staff.
4. Planning and Audit: Location of CSU’s Business Intelligence Competency Centre (BICC) which manages CSU’s
data warehouse and supports the effective use of BI and analytical data at CSU. This area will be involved in
extracting source system data, combining it with other relevant data and processing into reporting
structures, and then providing reports, analysis, tools and guidance to consume the final data sets.
5. DIT: Supply system usage information from activity logs and other very large information sources in a highly
aggregated form. Assess new information systems for ability to supply information to meet analytics needs.
Where possible, facilitate fixing data quality issues in source systems. Provision of integration systems and
services to support the transporting and transforming of information between key systems. In partnership
with P&A and DSL facilitate the building of reports and dashboards and data warehouse feeds of relevant
information. Supply user identity and affiliation information to support access control to analytic information.
Ensure the integrity of the data being stored and extracted for analysis.
6. Academic Support: Contact point. Identifying relevant data types and strategies to address results.
Production and dissemination of relevant resources identified as a result of analytics and integration with
programs such as Student Success Team. Relevant training of students.
7. Student Services: User of analytics; records interactions with students.
8. Smart Tools project: Direct access to analytics on course and subject design.
9. Student Central: User of analytics; records interactions with students.
10. Division of Library Services: User of analytics; records interactions with students.
11. ETHICAL ISSUES AND RISK MITIGATION MECHANISMS
1. Privacy and appropriate use of learning analytics are key ethical issues.
2. Policies will need to be developed on who has access to what data about students’ and teaching staff’s online learning activity, and in what circumstances; these policies will need to be consistent with some broad ethical principles.
3. Policies and guidelines will also need to be developed specifying how analytics data can be used and for
what purposes.
Note: These policies would be developed in collaboration with appropriate parties at CSU such as the CSU
Ombudsman and Legal Office.
12. HIGH LEVEL INDICATORS OF SUCCESSFUL ANALYTICS USAGE AT CSU
1. Increase in the quality and effectiveness of online teaching as per Student Experience Survey due to adaptive
online teaching practice and/or adaptive online systems.
2. Increase in student retention rates through more effective interventions either automated or human.
3. Increase in the quality and effectiveness of online learning as per assessment results.
4. Increase in online engagement due to feedback on learning practices.
5. Increase in the appropriateness of subjects selected by students.