the e-assessment Guide (MS Word 1.4MB)

advertisement
An Australian guide to the risk
management of VET online
e-assessment:
a companion document to the research report into the
veracity and authenticity concerns of stakeholders.
Polytechnic West
30 June 2014
flexiblelearning.net.au
An Australian guide to the risk management of VET online e-assessment
Acknowledgements
This report was produced for the Flexible Learning Advisory Group (New Generation
Technologies - National VET E-Learning Strategy) by Tom Morris (Polytechnic West).
The enquiry benefited from a significant amount of stakeholder input. I am grateful for the
assistance of the Western Australian Industry Training Councils for their distribution of
Assessment e-Risk Surveys and the provision of employer feedback; to ASQA and TAC
auditors for their survey responses, and Dr. Russell Docking for his advice; to a large number
of Polytechnic West online students who responded to the invitation posted on their Learning
Management System; and to the assessors at Polytechnic West and Central who completed
surveys.
I have been fortunate to receive the unwavering support of three people: Greg Martin,
Polytechnic’s Portfolio Manager, Business and Information Technology (and custodian of the
Polytechnics Online Centre of Excellence); Josie Daniele, Advanced Skills Lecturer; and
Sue Morris, my wife and industry skills trainer.
My appreciation also goes to Sue Dowson, eLearning Advisor, for assisting me to embrace
the Survey Monkey learning journey and to David Harris for his exceptional proof reading
and editorial advice.
Disclaimer
The Australian Government, through the Department of Industry, does not accept any liability to any person for
the information or advice (or the use of such information or advice) which is provided in this material or
incorporated into it by reference. The information is provided on the basis that all persons accessing this material
undertake responsibility for assessing the relevance and accuracy of its content. No liability is accepted for any
information or services which may appear in any other format. No responsibility is taken for any information or
services which may appear on any linked websites.
With the exception of the Commonwealth Coat of Arms, the Department’s logo, any material protected by a trade
mark and where otherwise noted all material presented in this document is provided under a Creative Commons
Attribution 3.0 Australia (http://creativecommons.org/licenses/by/3.0/au/) licence.
New Generation Technologies
incorporating E-standards for Training
National VET E-learning Strategy
An Australian guide to the risk management of VET online e-assessment
Table of Contents
Executive summary ............................................................................................................. 5
The Enquiry......................................................................................................................................5
1 Introduction ...................................................................................................................... 8
Ministerial priorities ....................................................................................................................8
This enquiry’s approach .............................................................................................................8
2 A framework for the analysis of risks ........................................................................... 10
E-assessment risk components framework - diagram.............................................................10
3 Risk evaluation ............................................................................................................... 11
Hierarchy of VET (and e-assessment) objectives ...................................................................11
The risk evaluation process .....................................................................................................11
4 Treatment options .......................................................................................................... 13
Addressing the online e-assessment risks ..............................................................................13
Specification of competence ....................................................................................................14
New standards for Training Packages ............................................................................................... 14
Industry communication and consultation, at all levels ......................................................................15
New quality assurance standards for RTOs and assessors .............................................................. 15
Enrolment Process ..................................................................................................................16
Motivation to enrol and marketing practices ......................................................................................16
Unique Student Identifier (USI)..........................................................................................................16
Baseline authentication of identity .....................................................................................................17
Student Integrity Code .......................................................................................................................17
Integrity pledges built into online assessment process ......................................................................18
Learning Process .....................................................................................................................18
Learning Management System..........................................................................................................19
Assessment Process ...............................................................................................................19
Understanding the competency expectations ....................................................................................20
Assessor Competence – designers and developers .........................................................................20
Plagiarism..........................................................................................................................................20
Assessment Resources – national strategy.......................................................................................21
Rethinking online e-assessment........................................................................................................21
Video options for skills and applied knowledge .................................................................................22
Workplace Performance ..........................................................................................................22
Employer feedback – competence in the workplace .........................................................................22
Industry engagement in monitoring VET system performance ..........................................................23
Systematic moderation and validation ............................................................................................... 24
Online proctoring ............................................................................................................................... 25
Lockdown technologies .....................................................................................................................25
A Blob-based presence verification system in summative e-assessment .........................................25
New Generation Technologies
incorporating E-standards for Training
National VET E-learning Strategy
An Australian guide to the risk management of VET online e-assessment
Verification and biometric identification ............................................................................................. 26
Timed assessments and anomaly detection......................................................................................26
Challenge questions – identification and authentication ....................................................................27
Oral contact to verify identity/competence .......................................................................................27
Monitoring the net - social media and other web sites .....................................................................27
Monitoring and Review – identifying high / low risk RTOs ............................................................... 28
5 In conclusion - an integrated approach ........................................................................ 29
COAG ISC/ Advisory Council ........................................................................................................29
Industry Training Councils (ITCs) ..................................................................................................29
Employers and ITCs ......................................................................................................................29
Innovation and Business Skills Australia - ITC ..............................................................................29
Registered Training Organisations (RTOs) ...................................................................................29
Department of Industry ..................................................................................................................30
RTOs/ course developers ..............................................................................................................30
Assessors / RTOs ..........................................................................................................................30
Overarching Priorities .................................................................................................................................30
Risk management for high-stake, high-risk assessments ..........................................................................30
Appendix – a tool to assist risk management ................................................................. 31
Specification of competence ..........................................................................................................31
Enrolment Process.........................................................................................................................31
Learning Process ...........................................................................................................................32
Assessment Process .....................................................................................................................32
Workplace Performance ................................................................................................................32
Monitoring and Review ..................................................................................................................32
References ......................................................................................................................... 33
More Information ............................................................................................................... 38
Research Report - Companion Document
An Australian enquiry into the veracity and authenticity of online
e-assessment: a risk management approach to stakeholder concerns.
Support Document
Assessment e-Risk Survey of key stakeholders 2014: an Australian enquiry
into VET online e-assessment - support document.
Both documents may be accessed through the New Generation Technologies website
New Generation Technologies
incorporating E-standards for Training
National VET E-learning Strategy
An Australian guide to the risk management of VET online e-assessment
Executive summary
This enquiry was commissioned by the Flexible Learning Advisory Group (FLAG) to
address stakeholder concerns regarding the veracity and authenticity of online
e-assessment and the quality of training outcomes (FLAG 2013, p11).
The enquiry involved a wide ranging review of Australian and international research
and policy reports, and includes an Assessment e-Risk Survey of key stakeholders:
employer representatives, ASQA and Western Australian TAC auditors, Polytechnic
West and Central online assessors and online students.
It was found:
That the quality of VET online e-assessment outcomes cannot be treated as
discrete ‘e-assessment’ issues. What is required is a holistic approach
grounded in an understanding of the assessment lifecycle.
The results of the research are presented in three documents.
1. An Australian Enquiry into the veracity and authenticity of online e-assessment: a
risk management approach to stakeholder concerns.
This report, the Enquiry, is a comprehensive research report and analysis of
findings. The intended audiences are researchers and policy developers who are
interested in examining the evidence and detailed logic of the findings.
2.
An Australian Guide to the risk management of VET online e-assessment.
The companion Guide document is essentially an abridged version of the Enquiry.
It extracts from the full report a guide to the findings and treatment options for
practitioners. It is intended that this document may be read independently of the
first report.
The purpose of the Guide is to present an overview of the treatment options within
a risk assessment context. A ‘treatment options’ checklist is included in the
Appendix to assist practitioners to undertake their own context specific risk
assessment.
3. Assessment e-Risk Survey of key stakeholders 2014: an Australian enquiry into
VET online e-assessment - support document.
The third report, like the first, is intended for researchers and policy developers
interested in the detail. It presents the e Risk Survey methodology and collated
responses.
The Enquiry
The overarching approach and structure of the research Enquiry that informs this Guide is
provided by the international standard for risk management (AS/NZS ISO31000: 2009). The
National VET E-learning Strategy
New Generation Technologies
incorporating E-standards for Training
Page 5
An Australian guide to the risk management of VET online e-assessment
research follows the steps of the ISO risk management process from an exploration of the
context and the identification of risks, through the analysis and evaluation of these risks, to
the coherent presentation of treatment options.
The enquiry identified three key contributory components of online e-assessment that
influence its veracity and authenticity: the specification of competence; the quality of the
assessment process; and the integrity of evidence. The analysis demonstrated that the most
significant areas of concern to employers, auditors, assessors and students relate to the
rules of evidence and principles of assessment. The primary areas of concern in these two
areas are the validity and reliability of the assessment process, and the validity, sufficiency
and authenticity of evidence (including plagiarism, inappropriate collaboration, cheating, and
identity fraud).
Three other specific areas of concern were identified: the lack of clarity in the documented
competency requirements; the competence of assessors, both as assessor/practitioners and
assessment designer/developers; and available assessment resources.
In order to evaluate the significance of these concerns for vocational education and training
in Australia, the employment productivity, quality assurance, and efficiency objectives of VET
were explored. This exploration revealed that the job related benefits for attaining a
qualification fell in the four year period to 2012 (COAG 2013, p.41). This finding clearly
supports the importance of enhanced industry engagement in the specification of
competence and determination of qualification requirements. In the words of the Council of
Australian Government (COAG) Ministers: “while the vocational education and training
system has significant strengths, ongoing reform is necessary to ensure it effectively
supports the current and future skills needs of businesses across all sectors of the Australian
economy” (CSIC, 3 April 2014).
The veracity of the relationship between qualifications and employment productivity is
fundamental to the veracity of the VET sector and therefore to the veracity of online
e-assessment. Furthermore, the integrity of the relationship between qualifications and job
related benefits can have an indirect influence on the authenticity and integrity of assessment
evidence.
Research in the area of student integrity has consistently demonstrated that student
outcome expectations, and the ‘intrinsic’ value of the assessment process has a significant
impact on the amount of cheating that students engage in (Airely 2012, Lang 2013, and
McCabe et al., 2012). James Lang concludes his research with the statement that the
“fundamental principle” is that the “best means we have to reduce cheating is to increase
motivation and learning” (p.125). Consistent with this view, the best way to assist the majority
of candidates stay honest is for all stakeholders to veraciously play their part in line with their
professional responsibilities.
This enquiry has highlighted the importance of a multi-level integration of stakeholder
involvement in the management of online e-assessment risks. A ‘holistic’ risk management
approach is required, which “implies adopting a small number of strategies that deal with
many risks at once – preferably strategies that support the organisation’s mission rather than
have risk mitigation as their sole objective” (Kowszun and Struijve 2005 p3).
National VET E-learning Strategy
New Generation Technologies
incorporating E-standards for Training
Page 6
An Australian guide to the risk management of VET online e-assessment
Furthermore, the Enquiry found that many of the causes of concern about online
e-assessment are not directly related to the online e-assessment process itself. As a
consequence the appropriate root cause treatment of concerns about online e-assessment
range from: addressing concerns about the declines in the job related benefits of
qualifications and the marketing practices of RTOs; through to initiatives that are already
under way such as the New Standards for Training Packages and the Unique Student
Identifier.
In conclusion, it may be seen that there are two levels of risk management to be integrated in
seeking to ensure the veracity and authenticity of online e-assessment.
The first, an overarching level that involves the engagement of all key stakeholders in
managing the impact of risk on the key objectives of the VET sector - employment
productivity, quality assurance, and efficiency.
Second, a context specific level that involves the identification and management of
high-risk, high-stake, competencies and online e-assessment contexts. This is
primarily the responsibility of assessors and RTOs.
National VET E-learning Strategy
New Generation Technologies
incorporating E-standards for Training
Page 7
An Australian guide to the risk management of VET online e-assessment
1 Introduction
This enquiry was commissioned by the Flexible Learning Advisory Group (FLAG) to do
two things:
A. to explore the perception among some stakeholders that the use of technologyassisted assessment (online e-assessment) represents a risk to quality training
outcomes; and
B. to seek ways to address these concerns by examining available solutions that
ensure the veracity of online e-assessment and the reliable authentication of
candidate identity.
The enquiry was a wide ranging investigation of Australian and international research
and policy reports, and included an Assessment e-Risk Survey of key stakeholders:
employer representatives, ASQA and Western Australian TAC auditors, Polytechnic
West and Central online assessors and online students.
The enquiry concludes that the way to address concerns about online e-assessment is
through an integrated stakeholder approach to the assessment lifecycle.
Ministerial priorities
The role of the VET system in ensuring that the requirements of employers and the
workplace are addressed was recently confirmed by Australian, State and Territory industry
Ministers. The following three priorities were agreed during the inaugural meeting of the
Council of Australian Governments (COAG) Industry and Skills Council.
1. To examine the standards for providers and regulators to ensure they better
recognise the different level of risk posed by different providers, enable the regulators
to deal more effectively with poor quality in the sector to improve confidence, and
meet the Australian Government’s deregulation objectives.
2. To reduce the burden on the VET sector arising from the constant updates to training
packages; and
3. To ensure that industry is involved in policy development and oversight of the
performance of the VET sector and to streamline governance arrangements and
committees.
(COAG ISC 2014, April 3)
The above communique also explained that industry stakeholders had communicated that
they require an “integrated approach to training, education and employment, and for data
that supports governments and industry to better understand future job needs.” (CISC, 3
April 2014)
This enquiry’s approach
The approach and key findings of this enquiry are consistent and supportive of these
ministerial priorities.
First, this enquiry adopted a risk assessment approach to the research and a risk
management approach to addressing the identified concerns of stakeholders. The
National VET E-learning Strategy
New Generation Technologies
incorporating E-standards for Training
Page 8
An Australian guide to the risk management of VET online e-assessment
recommended solution explicitly recognises the need to ‘deal more effectively with poor
quality’ at two levels:
 An overarching level that involves the engagement of all key stakeholders in
managing the impact of risk on the employment productivity, quality assurance,
and efficiency objectives of the VET sector; and
 A context specific assessment and management of high-risk, high-stake,
competencies and online e-assessment contexts.
Secondly, while the challenges posed by ‘constant updates’ to Training Packages may be of
concern to some stakeholders, it is imperative that the transition to the ‘new’ Training
Package Standards embraces a risk management approach. The process should not be
unduly rushed; it needs to provide industry stakeholders enough consultation and review
time to ensure that the competency standards and qualifications appropriately reflect ‘future
job needs’.
Thirdly, it is evident that an integrated stakeholder approach is needed. Industry must be
involved in ‘policy development’ and ‘oversight of performance’. Through on-the-job
performance management, industry is best placed to judge the employment productivity
benefits of VET, and the integrity of assessment processes. It is also evident that better
feedback and performance data is required in respect to the ‘different level of risk posed by
different providers’; and online e-assessment as a strategy.
In the words of the Ministers: “while the vocational education and training system has
significant strengths, ongoing reform is necessary to ensure it effectively supports the current
and future skills needs of businesses across all sectors of the Australian economy” (CSIC, 3
April 2014).
The purpose of the full research report and this companion document is to contribute to a
holistic understanding of the quality concerns of stakeholders; and a risk management
approach to ensuring the veracity and authenticity of e-assessment.
National VET E-learning Strategy
New Generation Technologies
incorporating E-standards for Training
Page 9
An Australian guide to the risk management of VET online e-assessment
2 A framework for the analysis of risks
Analysis of stakeholder concerns suggests that it is useful to identify three contributory
components of online e-assessment that influence its veracity and authenticity: the
specification of competence, the assessment process, and the integrity of assessment
evidence.
These three components, along with their associated sub-components, are presented
diagrammatically below.
E-assessment risk components framework - diagram
Specification of
Competence
Assessment
Process
Integrity of
Evidence
Industry Input
Rules of
Evidence
Plagiarism
Clarity of
Documentation
Principles of
Assessment
Inappropriate
Collaboration
Relevant and Up
to Date
Assessor
Competence
Cheating
Assessment
Resources
Identity Fraud
This framework was used to review the findings of seven key stakeholder research reports
and stakeholder policy document and to analyse the strength of stakeholder concern as
indicated by stakeholder responses to the Assessment e-Risk Survey.
The analysis demonstrated that the most significant areas of concern to employers, auditors,
assessors and students relate to the rules of evidence and principles of assessment. The
primary areas of concern in these two areas were the validity and reliability of the
assessment process, and the validity, sufficiency and authenticity of evidence (including
plagiarism, inappropriate collaboration, cheating, and identity fraud).
The three other significant areas of concern identified were: the lack of clarity in the
documented competency requirements; the competence of assessors (both as
assessor/practitioners and assessment designer/developers); and available assessment
resources.
National VET E-learning Strategy
New Generation Technologies
incorporating E-standards for Training
Page 10
An Australian guide to the risk management of VET online e-assessment
3 Risk evaluation
The quality assurance risks associated with online e-assessment need to be understood
within the broader context of the workforce productivity role of VET, and the role of online
e-assessment in achieving efficient, cost effective assessment outcomes.
The workforce productivity objective provides the underlying rationale for all assessment
within the VET sector. The undisputed role of VET is to improve the workforce productivity of
individuals and thereby improve the economic well-being of the nation. This VET productivity
role provides the fundamental objective on which quality assured e-assessment sits. If online
e-assessment in a VET context does not validly assess labour force productivity then by
definition the online e-assessment would not be quality assured.
Furthermore, efficiency is an objective that ‘transcends and includes’1 both the quality
assurance and the employment productivity objectives. The diagram below presents this
hierarchy of objectives as a pyramid.
Hierarchy of VET (and e-assessment) objectives
Efficiency
Quality
assurance
Employment and
Productivity
The diagram portrays the employment productivity role of VET as the foundation on which
quality assured e-assessment is built. In common with any structure if the foundation is
removed then the rest of the structure collapses. This holds true for all levels in this and all
hierarchies: remove the layer below and the layer above collapses. The efficiency objective
of e-assessment clearly makes no sense if both the workforce productivity and quality
assurance objectives are not satisfactory.
The risk evaluation process
In one of the few comprehensive risk assessment evaluations of an e-learning project, Jojo
Kowszun and Oscar Struijve include a brief review of the risk management literature (2005,
The phrase ‘transcend and include’ explains the relationship between the levels in a hierarchy. As
Ken Wilber explains: all that is included in the lower level of a hierarchy is included in the higher level,
but not all that is included in the higher level is in the lower level (2000, p.59). For a comprehensive
exploration of the nature of hierarchies see Wilber 2000 (pp.40-85) and 2001 (pp. 24-26).
1
National VET E-learning Strategy
New Generation Technologies
incorporating E-standards for Training
Page 11
An Australian guide to the risk management of VET online e-assessment
p.3-5 & p.21-26). The authors conclude that a holistic approach that supports the purpose of
the organisation is preferable to strategies that have risk mitigation as their sole objective.
“Research in this area (e.g. Smallman, 1996 or Royal Society, 1992) indicates
that holistic approaches to risk management are more likely to be successful
than reductionist methods. In practice, this implies adopting a small number of
strategies that deal with many risks at once – preferably strategies that support
the organisation’s mission rather than have risk mitigation as their sole
objective. In setting objectives for risk mitigation, specification of the process
rather than particular outcomes is more likely to reduce the unintended
consequences of crude target-setting.”
(Cited in Kowszun and Struijve 2005, p.3.)
The above quote reinforces the importance of incorporating consideration of the employment
productivity and efficiency objectives in the prioritisation of strategies to address concerns
about the quality of online e-assessment.
The National Agreement for Skills and Workforce Development between the Commonwealth
of Australia and all States and Territories (2012, April 13) explains that:
The objective of this National Agreement is a VET system that delivers a
productive and highly skilled workforce and which enables all working age
Australians to develop the skills and qualifications needed to participate
effectively in the labour market and contribute to Australia's economic future; and
supports the achievement of increased rates of workforce participation.
(Clause 18, COAG 2012)
Efficiency is a key driver of the national VET e-learning strategy. The strategy’s primary
objective is to develop the Australian training system’s capacity to capitalise on the rollout of
the National Broadband Network (p.3). The vision is a globally competitive training system
underpinned by world class e-learning infrastructure and capability.
The goals are threefold:
1. Develop and utilise e-learning strategies to maximise the benefits of the national
investment in broadband.
2. Support workforce development in industry through innovative e-learning.
3. Expand participation and access for individuals through targeted e-learning
approaches.
It is evident that strategies to address concerns about the veracity and authenticity of online
e-assessment must be based on a risk management approach that explicitly includes a
cost-benefit analysis so as to not undermine the potential employment productivity benefits of
the National Broadband Network.
National VET E-learning Strategy
New Generation Technologies
incorporating E-standards for Training
Page 12
An Australian guide to the risk management of VET online e-assessment
4 Treatment options
The following presents a ‘lifecycle’ approach to addressing concerns about the veracity and
authenticity of online e-assessment. The lifecycle sequence and structure of the following
section is presented below in an ISO 31000 ‘inspired’ diagram.
Addressing the online e-assessment risks
Treatment options for each of the seven elements of this ‘assessment lifecycle’ are
summarised below (the nature of these options are presented in more detail in the
companion research report to this guide; An Australian enquiry into the veracity and
authenticity of online e-assessment).
Each of the above ‘elements’ is introduced with a brief synopsis and a table of treatment
options. The table nominates the key stakeholders responsible for implementing the
treatment options and presents an ‘indicative’ evaluation of the likely influence of the
treatment option on the employment productivity, quality assurance, and efficiency objectives
of VET.
It should be noted that the ratings are offered as a generic guide only. Selection of the
appropriate treatment options at a practitioner, or strategic policy level, will depend on the
nature and context of the online e-assessment being risk managed (a tool that may assist a
National VET E-learning Strategy
New Generation Technologies
incorporating E-standards for Training
Page 13
An Australian guide to the risk management of VET online e-assessment
context specific risk management approach at an RTO/assessor level is included in the
appendix to this Guide).
The tool is designed to highlight the importance of context in the management of risk, and
seeks to ensure a holistic approach to concerns about the veracity and authenticity of online
e-assessment.
The indicative ratings are based on the following schematic and definitions:
 Provides support to the objective.
? Supports the objective – but needs to be monitored/reviewed and/or further
?
consulted/communicated.
Uncertain or unclear relationship to the objective.

No apparent impact on achieving the objective.
Specification of competence
In the Australian competency based environment the fundamental building block is the
specification of competence as set out in the Units of Competence. The follow treatment
options address a range of concerns about the clarity of the specification of competence and
industry’s opportunity to engage with the process.
Treatment Option
4.2.1 New standards for
Training Packages
4.2.2 Industry communication
and consultation, at all levels
4.2.3 The New quality
assurance standards for RTOs
and assessors
Key
Stakeholders
Employment
Productivity
Quality
Assurance
Efficiency
COAG ISC/
Advisory
Council
?
?
?
Industry
Training
Councils (ITCs)

?

COAG ISC/
Advisory
Council
?
?
?
New standards for Training Packages
New training package standards were endorsed in November 2012. These new ‘Standards
for Training Packages’ were required for all new Training Package cases for endorsement
from 1 January 2014, with the aim of all Training Packages meeting the new standards by
31 December 2015. As explained in a recent NSSC policy paper:
“The new Standards have a strong focus on ensuring that assessment
requirements for the achievement of competency outcomes in Training
Packages are clear and specific. This clarity and specificity will give industry,
LTOs2 [RTOs], and regulators confidence in a shared understanding of the
requirements to be met.”
(NSSC, 2013, p.24.)
2
NB: The NSSC proposed a change of nomenclature from RTO to (LTO) Licenced Training
Organisation.
National VET E-learning Strategy
New Generation Technologies
incorporating E-standards for Training
Page 14
An Australian guide to the risk management of VET online e-assessment
It seems apparent that improved clarity in the documented specification of competence
has the potential to have a broad impact on all three objectives: employment
productivity, quality assurance and efficiency. However, while the need for the
specification of competency to be improved is apparent, it is also evident that there is a
need for appropriate communication and consultation, and monitoring and review
processes. This is required to ensure that the ‘new specifications’ do not compromise
longer term efficiency and flexibility in pursuit of a narrower, prescriptive version of
consistency (see Clayton et al. 2001).
Industry communication and consultation, at all levels
The eleven Industry Skills Councils (ISCs) provide a nation-wide mechanism for industry
communication and consultation in regard to the specification of competence. The mandate
of Australia’s Industry Skills Councils is to “bring together industry, educators and
governments and unite them on a common industry-led agenda for action on skills and
workforce development” (http://www.isc.org.au/about.php accessed 21 May 2014).
The COAG ISC Advisory Committee and the network of Industry Skills Councils are
designed to enhance business and industry engagement in the pursuit of efficient, quality
assured, outcomes that enhance employment productivity. The engagement of all levels of
industry across all industry skill sectors is critical to the achievement of the fundamental
employment productivity objective of VET; as is the Ministers’ stated commitment to
enhancing industry engagement and consultation processes.
New quality assurance standards for RTOs and assessors
The quality assurance standards for RTOs were initially prescribed in the Australian Quality
Training Framework (AQTF) in 2002, and with the formation of ASQA in 2011 were redefined
as the NVR. In 2013 the now dissolved NSSC published a standards policy framework
entitled Improving Vocational Education and Training: the Australian vocational qualification
system.
With the dissolution of the NSSC and the transition to the new COAG Industry and Skills
Council, (and Advisory Committee arrangements) incomplete, it is not clear how much of the
NSSC’s proposed “Standards Policy Framework” (2013) will be retained. The Australian
federal Minister, and State and Territory Ministers with responsibility for industry and skills
portfolios, have agreed to support a risk management approach to quality assurance, and an
enhanced focus on employment productivity and efficiency (COAG ISC 2014, April 3). We
can also be confident that the principles of assessment and rules of evidence will continue to
underpin the quality assurance standards for assessment.
As already mentioned the COAG ISC Advisory Committee and the network of Industry Skills
Councils are designed to enhance business and industry engagement in the efficient pursuit
of quality assured outcomes.
National VET E-learning Strategy
New Generation Technologies
incorporating E-standards for Training
Page 15
An Australian guide to the risk management of VET online e-assessment
Enrolment Process
The enrolment process may be seen to play a key role in the ‘base-line’ authentication of the
identity of students and in establishing their ‘code of conduct’ in respect to plagiarism,
inappropriate collaboration, cheating, and identity fraud. The enrolment process is also an
appropriate time to advise students about the range of integrity monitoring and review
practices in place, the procedures that will be followed in the event of a suspected breach,
and the consequences if found guilty of transgressing.
The enrolment process is also where students make their decision as to which learning and
assessment pathway matches their skills, knowledge and motivation. Marketing and
advertising can have a significant influence on the decision to enrol.
Treatment Option
4.3.1 Motivation to enrol and
marketing practices
Key
Stakeholders
Employment
Productivity
Quality
Assurance
Efficiency
RTOs



4.3.2 Unique Student Identifier
(USI)
Department of
Industry

?

4.3.3 Baseline authentication of
identity
RTOs



4.3.4 Integrity pledges built into
online assessment process
RTOs



Motivation to enrol and marketing practices
Students may be motivated to enrol in AQF qualifications with a summative assessment
component for essentially two reasons: to develop and demonstrate competence (through a
learning and assessment pathway) or to demonstrate that they are already competent
(through an assessment/RPL pathway). The motivation to enrol may have a significant
impact on the students approach to both the learning and the assessment processes.
The enrolment process begins with the ‘decision’ to enrol. Research has demonstrated that
‘outcome expectations’ and ‘efficacy expectations’ can have a key impact on the cheating
behaviours of students (Lang 2013, p.146). It may therefore be seen that the veracity and
authenticity of job related outcomes, and the marketing and enrolment process will be
important determinants of the veracity and authenticity of the online e-assessment process.
A recent ASQA (2013c) strategic audit raises some important questions in regard to the
marketing practices of some RTOs. It is evident that the marketing integrity of RTOs needs to
be monitored. The impact of job outcome expectations on motivation (and the cheating
behaviour) of students would also appear to deserve further consideration.
Unique Student Identifier (USI)
While the details of the Unique Student Identifier (USI) proposal are still being finalised, the
Australian Government has re-introduced legislation to support the introduction of the USI
National VET E-learning Strategy
New Generation Technologies
incorporating E-standards for Training
Page 16
An Australian guide to the risk management of VET online e-assessment
and the scheme is scheduled to commence on 1 January 2015. The USI has the potential to
contribute to the authentication of the identity of students in two ways.
A. Combined with appropriately rigorous ‘registration’ and ‘validation’ processes the USI
could play a direct role in the authentication of identity (the current public information
on the proposal does not indicate how rigorous the process is intended to be or what
the responsibilities of RTOs will be).
B. Alternatively, if the USI is not implemented with a rigorous registration and identity
validation requirement the national data base will nevertheless provide RTOs and
employers with access to ‘authentic’ statements of attainment. Such an authentic
national register of attainment will provide a convenient opportunity for individuals to
verify their claims. As it stands, the current process for verifying the claims and
documents produced by individuals for advanced standing or employment purposes
requires direct contact with the issuing RTO. The USI will potentially make the
verification process relatively straight forward. And the USI register will be a current
and authentic record of attainment in the event of confirmed cheating and identity
fraud.
A Unique Student Identifier could efficiently support the quality assurance and employment
productivity objectives of the VET sector. The importance of appropriate safeguards to
protect the privacy of individuals and the integrity of the data is undoubtedly important and
needs to be informed by appropriate consultation and monitoring processes.
Baseline authentication of identity
There is a distinction to be made between the baseline ‘authentication’ of a candidate’s
identity3 and the subsequent ‘verification’ that the individual undertaking the assessment
process is the ‘authenticated’ person (Foster and Layman, 2013, p.7). It may be seen that
without a reliable baseline ‘authentication’, subsequent ‘verification’ processes are reduced
to statements of consistency that do not necessarily validate the identity of the candidate.
The baseline authentication of the identity of an individual needs to be an integral part of the
enrolment process. A minimum authentication protocol and checklist is required for this
purpose. If subsequent assessment processes are to include biometric ‘verification’ of
identity, (see discussion below) then the enrolment process will need to provide for
‘authentic’ baseline biometric identity information to be gathered.
Student Integrity Code
McCabe (et al. 2012) have been long term advocates of what is referred to in the American
context as an ‘honor code’. An honour code system includes a signed pledge and a broader
‘culture-building strategy’ (McCabe et al. 2012, p.194). In an online e-assessment context,
the opportunity for ‘culture building’ may be relatively limited; particularly if the assessment is
an online RPL process, or if e-assessment follows online learning (rather than a face-to-face
It is noted that the terms ‘authentication’ and ‘verification’ are not used consistently in the literature.
The above usage seems to be the most intuitive and is consistent with a significant number of
published reports; even though it is not consistent with Foster and Laymen’s (2013) use of the terms.
3
National VET E-learning Strategy
New Generation Technologies
incorporating E-standards for Training
Page 17
An Australian guide to the risk management of VET online e-assessment
learning process). This being the case, the signing of an ‘integrity code’ may be more
significant in an online e-assessment context than it is in a face-to-face context.
Integrity codes could efficiently support quality assurance and the employment productivity
objectives of VET. These ‘integrity codes’ could be developed to provide nationally consistent
guidelines and the efficient development of ‘integrity’ related teaching and learning resources
for use by all RTOs (see AUTC 2002).
Integrity pledges built into online assessment process
In a recent publication entitled The (honest) truth about dishonesty Dan Ariely (2012) reports
the findings from over ten years of innovative social experiments. One clear outcome of
Ariely’s experiments has been the demonstrated practical value of ‘integrity’ reminders. It
would therefore appear that a well-designed integrity code, backed up by appropriate and
strategic reminders, could have a positive impact on student integrity. It would seem to be a
relatively straight forward process to build ‘integrity reminders’ into online e-assessment
processes and this could occur for the online assessment of skills and knowledge.
Integrity codes and reminders could represent a win-win solution that efficiently supports the
quality assurance and employment productivity objectives of the VET sector.
Learning Process
The focus of the current enquiry is on the risks associated with online e-assessment. For a
significant number of students online assessment follows their formal involvement in a
structured learning process. The following recognises that where the assessment process is
preceded by a learning process there are opportunities to manage the risk of cheating and
identity fraud that are not available in the case of an online Recognition of Prior Learning
(RPL) process, or single unit of competence assessments, like the construction industry
White Card unit (ASQA 2013c, see section 4.5.1 below).
These differential risk levels may be seen to concurrently arise from two sources. First, with
protracted engagement in a learning process there is more opportunity to authenticate and
verify the identity of a candidate. Identity fraud is more difficult if the deception must be
maintained over a longer period of time and over a variety of contexts and circumstances.
The longer and the more varied are the contexts in which an assessor has interacted with a
candidate the more likely they are to identify anomalies in a student’s assessment
submission.
Second, there is evidence that students are less likely to cheat when a sound rapport has
been built between the candidate and assessor (Lang 2013, McCabe 2012, Ariely 2013, and
Driver 2013). The quality of the Learning Management System can have a significant impact
on the quality and efficiency of both the rapport building process, the quality of the learning
and assessment process, and the ease with which a trainer/assessor can monitor and review
a student’s progress, coach and mentor their development, and detect integrity anomalies.
Treatment Option
4.4.1 Learning Management
Systems
National VET E-learning Strategy
New Generation Technologies
incorporating E-standards for Training
Key Stakeholders
RTOs/ course
developers
Employment
Productivity
?
Quality
Assurance
Efficiency
?
?
Page 18
An Australian guide to the risk management of VET online e-assessment
Learning Management System
There are an extensive range of Learning Management Systems (LMS) with a range of
prices and features. These Systems may also include, or can provide access to, a variety of
options to assist in the authentication of a candidate’s summative assessment submissions.
The selection of an appropriate LMS is important for the quality of the learning process
presented to students, and for the monitoring and review options that may be integrated into
the process. (Examples of comparative reviews of LMS include Lewis et al 2005, Machado
and Tao 2007, Pappas 2014.)
In an online environment it is the LMS that links students to assessors, to other students and
to the organisation. It is apparent that the quality of the LMS may therefore have a significant
impact on whether students feel part of an ‘online community’.
Learning Management Systems (LMS) provide a range of features and opportunities to build
rapport and monitor and review student progress that can assist in the identification of
summative assessment anomalies suggestive of cheating. The selection of an appropriate
LMS and monitoring and review options may be an important determinant of the efficiency,
and quality of the learning and assessment processes, and of the ability to integrate learning
and assessment activities relevant to enhanced employment productivity.
Assessment Process
The findings are unequivocal; the quality of the assessment process is the key to the quality
of assessment outcomes. Strategies designed to address cheating, and authentication and
verification issues, will be largely irrelevant if the basic assessment design is inadequate.
Furthermore, and encouragingly from an educational perspective, good assessment design
can result in what James Lang refers to as “cheating-resistant grounded assessments” with a
bit of “creative thinking” (2013 p.76).
The following section explores a broad range of strategies that focus on the nature and
quality of the assessment process: from understanding and interpreting the competency,
through to questions about the qualification and skill requirements of assessors. Monitoring
and review strategies to address authenticity and cheating issues are explored below in
Section 4.7.
Treatment Option
4.5.1 Understanding of the
competency expectations
Key
Stakeholders
Employment
Productivity
Quality
Assurance
Efficiency
Assessors



IBSA Industry
Training
Council


?
4.5.3 Plagiarism
Assessors



4.5.4Assessment Resources –
national strategy
COAG ISC/
Advisory
Council



4.5.2 Assessor Competence –
practitioners and
designer/developers
National VET E-learning Strategy
New Generation Technologies
incorporating E-standards for Training
Page 19
An Australian guide to the risk management of VET online e-assessment
Treatment Option
Key
Stakeholders
Employment
Productivity
Quality
Assurance
Efficiency
4.5.5 Rethinking online
e-assessment
Assessors/
RTO


?
4.5.6 Video options for skills and
applied knowledge
Assessors/
RTO


?
Understanding the competency expectations
It has been observed above that concerns about the interpretation and specification of
competence led to the development of new Training Package Guidelines (see section 4.2.1
above). A recent NCVER stakeholder survey confirmed that “most stakeholders believe that
the language of units of competency and training packages could be made clearer and the
structure of these texts could be simpler” (Hodge 2014, p. 4; also see section 4.7.1 below).
The online assessment issues identified in the recent ASQA audit of the construction industry
White Card (a core unit from the AQF level one construction industry qualification
CPC10111) were not a result of limitations in online e-assessment processes themselves but
the result of limitations in respect to the interpretation of competence as specified in the Unit
(CPCCOHS1001A); particularly as this related to skills and the application of knowledge. It is
evident that there is a need for professional development in this area, including the applied
knowledge development that comes from participating in formal and well-structured
assessment validation/moderation activities (Hodge 2014, p.8).
Assessor Competence – designers and developers
The recent NCVER research by Steven Hodge led him to “suggest that interpreting
competencies is a sophisticated ability and that its development may require different initial
and continuing education and training from that currently provided” (2014, p.27).
As Hodge suggests the design and development of a user friendly and comprehensive
assessment process, in accordance with the rules of evidence and principles of
assessment, may require a level of competence above that of the current Certificate IV
requirement for assessment practitioners: a level of competence that would appear to
equate to the AQF Diploma level.
If this higher level AQF design competence requirement were to be implemented the
specification of competence and the assessment process would need to ensure
appropriate provision for the Recognition of Prior Learning.
Plagiarism
While the use of detection software can identify ‘cut and paste’ plagiarism the more effective
strategy is clearly prevention rather than detection. The 2003 NCVER report, The
development of quality online assessment in vocational education and training, cites the
following conclusion from a UK Joint Information Systems Committee.
“It became clear during the project and subsequent dissemination workshops that
the solution to the problem of plagiarism is prevention and that this solution
should come from within the institution, not from a detection product.”
(Cited in Booth et al 2003, p.76)
National VET E-learning Strategy
New Generation Technologies
incorporating E-standards for Training
Page 20
An Australian guide to the risk management of VET online e-assessment
In the context of online assessment it is important to remind students that ‘cut and paste’
does not demonstrate their applied understanding of the knowledge, even if the source is
acknowledged.
To a significant extent the requirement that students demonstrate understanding and
application is the responsibility of the assessment design process. The use of self-reflection
that engages students in making judgments about the quality of the process and the results
achieved can be a useful tool in reducing plagiarism and in reducing the opportunity for
candidates to plagiarise the work of other students (see Crisp 2011). The skills assessed in
such assessment processes have the potential to positively impact workplace employment
productivity and to efficiently contribute to assuring the integrity of the evidence gathered.
Assessment Resources – national strategy
The National VET E-learning Strategy 2012-2015 seeks to play a key role in enabling the
Australian VET sector to take advantage of the rollout of the National Broadband Network
(NBN) and to make major advances in the achievement of government training objectives
(Executive Summary, p.1). The Strategy and its predecessor, the Australian Flexible
Learning Framework, have commissioned and collated an extensive range of resources in
support of online learning and assessment.
The National VET E-learning Strategy, and the Flexible Learning Advisory Group that has
oversight of the Strategy, have resulted in the production of an extensive range of resources
designed to assist the VET sector to take advantage of the rollout of the NBN and achieve
government training objectives.
The New Generation Technologies initiative manages the E-standards for Training and
provides the VET sector with access to an extensive collection of online learning and
assessment resources (http://ngt.flexiblelearning.net.au/ accessed 2014 May 3).
Rethinking online e-assessment
Reservations about the validity of online e-assessment strategies to assess all four
dimensions of competence appear to arise from two sources. First, an inappropriate reliance
by assessors and RTOs on knowledge based assessment activities. And second, a lack of
awareness of the range of online e-assessment strategies that can be used to assess skills
and all four dimensions of competence.
It is clear that the valid online e-assessment of all four dimensions of competence is possible
as demonstrated in auditor responses to the Assessment e-Risk Survey. It is also clear that a
range of current and emerging technology continues to expand the options available. Options
that move beyond the inappropriate assessment of competence through knowledge based
strategies to include the use of point of view and identity verification video technology (see
discussion below) to the more innovative ‘rethinking’ of online e-assessment (see Crisp
2011, Crisp and Hillier 2014 and Stowell and Lamshed 2011).
While it is evident that the ‘rethinking’ of online assessment has the potential to explicitly
address concerns about the relevance and quality of less innovative approaches, it is also
evident that the design, development and implementation phases of a more comprehensive
National VET E-learning Strategy
New Generation Technologies
incorporating E-standards for Training
Page 21
An Australian guide to the risk management of VET online e-assessment
range of online e-assessment will require a longer-term view; and may not deliver a shortterm efficiency dividend.
Video options for skills and applied knowledge
As explained by Stowell and Lamshed (2011) in their ‘E-assessment guidelines for the VET sector’, point of view glasses, video and image sharing, digital stories, and video streaming may be used to collect direct evidence of candidate performance on real work tasks in real time. This may be accompanied by online self and peer assessments as well as comments from workplace supervisors.
The innovative use of the advances being made in video and audio technologies is opening
up a wider range of skill and applied knowledge assessment possibilities. These possibilities
can address all four dimensions of competence and provide for comprehensive verification of
identity protocols (from the use of mirrors to multiple cameras).
These options address quality assurance issues in a relatively cost effective manner. They
can reduce the need for candidates to travel to assessment centres and may open up a
greater range of workplace evidence gathering possibilities.
Workplace Performance
One of the less tangible yet concerning findings of this enquiry has been the observation that
concurrent with an increase in the number of persons with higher level qualifications, there
has been a drop in the proportion of qualification holders reporting job related benefits from
the qualifications they have acquired (COAG 2013).
Strategies that improve the job-related benefits arising from qualifications have the potential to positively affect all three core VET objectives: the efficiency of the return on public and private expenditure, employment productivity, and quality assurance (by reducing the incentive to ‘cheat’ merely to obtain the credential). The following strategies seek to complement the Ministers’ COAG Industry and Skills Council statement reaffirming the need for enhanced industry engagement in the VET sector (see also the above discussion of the role of Industry Skills Councils, Section 4.2.2).
Treatment Option | Key Stakeholders | Employment Productivity | Quality Assurance | Efficiency
4.6.1 Employer feedback – competence in the workplace | Employers and ITCs | ✓ | ✓ | ✓
4.6.2 Industry engagement in monitoring VET system performance | Employers and ITCs | ✓ | ? | ✓
Employer feedback – competence in the workplace
It would appear to be self-evident (true by definition of the purpose of VET) that employers
are the best and ultimate judges of the employment productivity of an assessment candidate.
Employer responses to the Assessment e-Risk Survey clearly indicate that the vast majority
of employers are of the view that they would ‘soon know if someone is not competent once
they were on the job’, and furthermore, ‘if they found a gap in the training and assessment of
an employee they would like to confidentially register their concern’.
Adding weight to this proposition, over three quarters of students surveyed indicated that
they believe ‘if any student is deemed competent when they are not, this will be identified in
most workplaces’.
ASQA already has a ‘complaints’ process. This enquiry did not examine the efficacy of this
process or explore whether employers were aware of the process. It appears this may be a
feedback process that could be enhanced and further promoted.
Workplace performance monitoring, combined with appropriate feedback and review
mechanisms (such as ASQA strategic audits), has the potential to make a significant
contribution to addressing concerns about the veracity and authenticity of e-assessment.
Industry engagement in monitoring VET system performance
In proclaiming the formation of the COAG Industry Skills Council, national and State/Territory
Ministers agreed to three key priorities. The third of these priorities is to ensure that industry
is involved in policy development and oversight of the performance of the VET sector and to
streamline governance arrangements and committees (COAG ISC, 3 April 2014).
Subject to appropriate privacy and security safeguards, the Unique Student Identifier has the potential to significantly increase not only the capacity of students and RTOs to collate and verify the VET attainment of individuals, but also the capacity to efficiently collate and track the performance of RTOs, industry skill sectors, and the system as a whole. The preceding discussion of employer feedback mechanisms also identified value in the enhanced communication of feedback and performance information, such as ‘complaints’ information.
Monitoring and Review
The following section explores a range of strategies designed to provide for the monitoring
and review of the integrity of assessment evidence. It includes consideration of current and
emerging technologies designed to address concerns about the integrity of evidence
provided by candidates, including: plagiarism, collaboration, cheating, and identity fraud.
While the candidate has the primary responsibility for the integrity of evidence, responsibility
for the integrity of evidence extends to all key stakeholders: RTOs, assessors, employers
and auditors. All stakeholders have a responsibility to behave with integrity in respect to the
assessment process. Key stakeholders also have a responsibility to monitor the integrity of
those aspects of the assessment process they are responsible for, and to take appropriate
action when integrity breaches are identified (see ASQA 2013a).
Treatment Option | Key Stakeholders | Employment Productivity | Quality Assurance | Efficiency
4.7.1 Systematic validation and moderation | RTOs/Assessors | ✓ | ✓ | ✓
4.7.2 Online proctoring | RTOs/Assessors | ? | ? | ?
4.7.3 Lockdown technologies | RTOs/Assessors | ? | ? | ?
4.7.4 A Blob-based presence verification system in summative e-assessment | RTOs/Assessors | ✓ | ✓ | ?
4.7.5 Verification and biometric identification | RTOs/Assessors | ? | ? | ?
4.7.6 Timed assessments and anomaly detection | RTOs/Assessors | ✓ | ✓ | ?
4.7.7 Challenge questions – identification and authentication | RTOs/Assessors | ✓ | ? | ?
4.7.8 Oral contact to verify identity/competence | RTOs/Assessors | ✓ | ✓ | ✓
4.7.9 Monitoring the net - social media and other web sites | RTOs/Assessors | ? | ? | ?
4.7.10 Monitoring and Review – identifying high / low risk RTOs | COAG ISC/Advisory Council | ? | ? | ?
Systematic validation and moderation
NVR Standard 15.5 (d) for Registered Training Organisations states that assessment
including Recognition of Prior Learning must be ‘systematically validated’. Validation is a
required monitoring and review process for RTOs.
The term ‘moderation’ is often used in conjunction with the term ‘validation’. The following
clarification of the difference in the meaning of these terms is derived from Lawlor and Tovey
(2011, p.268-269).
Validation may be seen to be an all-encompassing review process that can occur at any
stage of the assessment lifecycle: from design of the competency standards and the
enrolment process, through the assessment process, into the workplace. It includes a full
consideration of the competency standards, and the assessment lifecycle in accordance with
the principles of assessment and rules of evidence.
Moderation on the other hand, is an aspect of validation that is integral to the
assessment process itself. It may be part of the review of evidence after it has been
collected and the assessment decision made, and it may very usefully occur as part of
‘monitoring’ the assessment process as evidence is collected, and prior to an
assessment decision being made.
Validation, and the associated moderation process, is integral to ensuring the veracity
and authenticity of assessment. The validation process has implications for all five of
the most significant quality concerns identified by stakeholders (Morris 2014a).
There are also a growing number of excellent examples of NVR compliant e-validation, and examples where it has been used to engage geographically remote industry and assessors (Gillis and Clayton 2013).
Systematic validation of assessment is not merely a ‘treatment option’; it is a required NVR compliance standard for all assessors and RTOs. Validation is an efficient quality assurance process designed to ensure the employment productivity of all candidates assessed as competent.
Online proctoring
In a recent international review of online proctoring systems, David Foster and Harry Laymen explain that online proctoring refers to the “monitoring of an exam over the internet through a webcam” and may be extended to include “any automated process that help[s] to secure a test administration event” (2013, p.2).
As Foster and Laymen observe, “there is not a great deal of published research on online
proctoring”, consequently it is not possible to draw reliable conclusions about the
comparative security effectiveness of online proctoring (2013, p.11). They note, however,
that: “Test security, like other types of security, is not effective if done piecemeal”
(2013 p.10).
Discussion of ‘online proctoring’ services in the literature has been restricted to its role
in verifying the online assessment of knowledge under exam type conditions. It is
evident that proctoring alone does not provide adequate security in high-stake
knowledge based assessments. In ‘high-risk’ assessment contexts, online proctoring
needs to be combined with ‘lock down’ and other authentication/verification strategies.
Online proctoring, even before these additional components are added, can be a
relatively expensive exercise.
The value of ‘online proctoring’ technology to the VET sector depends on two things:
first, the importance of a secure test of a candidate’s knowledge, independent of the
application of that knowledge. And secondly, how well online proctoring can be
extended to the observation of skills based assessment tasks. The role for online
proctoring in skills based assessment does not appear to have been explored in the
research literature.
Lockdown technologies
In their review of ‘lockdown’ technologies, Foster and Laymen explain that these technologies may range from simply not allowing the test taker to access other URLs during the assessment task, to taking control of the operating system, detecting the use of peripheral devices or the various computer ports, and detecting inappropriate keystrokes or functions used to print or change screens (2013, p.6).
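To make the range of mechanisms concrete, the sketch below illustrates just one of them: checking for prohibited applications running alongside a computer-based assessment. It is a minimal illustration only, not a description of any vendor’s product; the psutil library and the blocklist names are assumptions introduced for the example.

```python
import psutil

# Hypothetical blocklist of applications considered prohibited during the assessment.
BLOCKLIST = {"skype", "teamviewer", "discord", "anydesk"}

def find_prohibited_processes():
    """Return the names of running processes that match the blocklist."""
    hits = []
    for proc in psutil.process_iter(attrs=["name"]):
        name = (proc.info.get("name") or "").lower()
        if any(blocked in name for blocked in BLOCKLIST):
            hits.append(name)
    return hits

if __name__ == "__main__":
    offenders = find_prohibited_processes()
    if offenders:
        print("Prohibited applications detected:", sorted(set(offenders)))
    else:
        print("No prohibited applications detected.")
```

A production lockdown product would combine checks of this kind with browser and operating-system controls rather than rely on any single test.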
Where the risk of unauthorised collaboration, cheating and fraud is significant, ‘lockdown’ technologies will be a necessary complement to the online proctoring of knowledge and other computer based assessments. Without them, the ‘work around’ opportunities appear to be well known and readily accessible through a quick internet search (Hsu 2013). As with online proctoring, the solution is unlikely to be cost-effective in all but high-risk, high-stake assessments that cannot be assessed more effectively in other ways (such as face-to-face or in the workplace).
A Blob-based presence verification system in summative e-assessment
A 2011 paper by Kikelomo Apampa, based on her 2010 PhD Thesis, explicitly seeks to
“go beyond ensuring the ‘right’ student is authenticated at the initial logon… to verify
the presence of an authenticated student for the duration of the test” (2010, p.1).
Apampa proposes the use of an object tracking approach, ‘blob-based analysis’, in which decisions about the student’s presence are determined by a set of what the author describes as ‘well-defined Fuzzy Logic rules’ (Apampa et al. 2011).
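The sketch below is a much simplified illustration of the general idea, not Apampa’s implementation: background subtraction is used to test whether a sufficiently large foreground ‘blob’ remains in front of the webcam. The OpenCV calls and the area threshold are assumptions for illustration, and the fuzzy logic rules that drive the published system are not reproduced.

```python
import time
import cv2  # OpenCV

PRESENCE_AREA_THRESHOLD = 5000  # assumed minimum foreground area (pixels) to count as 'present'

def monitor_presence(camera_index=0, checks=60):
    """Print a presence/absence verdict roughly once per second for `checks` checks."""
    capture = cv2.VideoCapture(camera_index)
    subtractor = cv2.createBackgroundSubtractorMOG2()
    try:
        for _ in range(checks):
            ok, frame = capture.read()
            if not ok:
                break
            mask = subtractor.apply(frame)            # foreground/background segmentation
            foreground_area = cv2.countNonZero(mask)  # size of the moving 'blob'
            present = foreground_area > PRESENCE_AREA_THRESHOLD
            print("candidate present" if present else "presence check failed")
            time.sleep(1.0)
    finally:
        capture.release()

if __name__ == "__main__":
    monitor_presence()
```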
Blob-based monitoring, or any other form of simple online video monitoring of candidate presence, is inadequate to the task of preventing cheating if it is not associated with an appropriate range of lockdown and authentication/verification strategies (see discussion above). Furthermore, blob-based video monitoring will be incapable of monitoring skills and applied knowledge assessments, where an unpredictable set of movements will make it impossible to create an appropriate set of ‘fuzzy logic rules’.
Verification and biometric identification
While video monitoring and proctoring technology may verify that the person undertaking the assessment did not collaborate or obtain outside assistance during the assessment task, it does not verify that the candidate is who they say they are. As a consequence, a number of the online proctoring services also include a range of authentication options: from the use of Government-issued ID and password protection, to the collation and verification of biometric information.
Some of the advanced proctoring programs offer facial recognition and keystroke analysis (see Foster and Laymen 2013). The technical opportunities for biometric identification and verification have been developed and are available now, and more ‘extravagant’ biometric options are rapidly emerging (Mack 2014, pp.3-4). Where the risk of unauthorised collaboration, cheating and fraud is significant, these technologies may be justified.
Biometric verification need not be limited to assessments conducted in front of a computer. This technology could be used in a variety of innovative ways to verify the candidate’s identity in skills and applied knowledge assessments. It should be noted that cost factors, convenience and security/privacy considerations may be significant (Foster and Laymen 2013, p.10) and will need to be considered as part of the quality assurance process and cost-benefit analysis.
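As a toy illustration of the keystroke-analysis idea mentioned above, the sketch below compares the inter-key timing pattern of a typed passphrase with a stored enrolment profile. The tolerance value and the simple mean-difference score are assumptions for the example; production biometric systems use far richer features and statistical models.

```python
from statistics import mean

def interval_profile(key_times):
    """Convert a list of key-press timestamps (seconds) into inter-key intervals."""
    return [t2 - t1 for t1, t2 in zip(key_times, key_times[1:])]

def matches_profile(enrolled_times, attempt_times, tolerance=0.08):
    """Return True if the attempt's timing pattern is close to the enrolled pattern."""
    enrolled = interval_profile(enrolled_times)
    attempt = interval_profile(attempt_times)
    if len(enrolled) != len(attempt):
        return False  # different passphrase length, reject outright
    deviation = mean(abs(a - b) for a, b in zip(enrolled, attempt))
    return deviation <= tolerance

# Example: timestamps captured when the candidate typed the same passphrase twice.
enrolled = [0.00, 0.21, 0.43, 0.60, 0.85]
attempt = [0.00, 0.23, 0.41, 0.63, 0.88]
print(matches_profile(enrolled, attempt))  # True under the assumed tolerance
```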
Timed assessments and anomaly detection
The use of timed assessments has been found to provide a disincentive to collaboration and, through ‘anomaly monitoring’, an opportunity to detect cheating. Anomaly monitoring involves looking for unusual behaviour patterns in the time taken by students to complete an assessment task. If a student takes ‘less time than anticipated’ for the task and gets the correct answer, the system raises an ‘anomaly’ flag (Young 2010).
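The flagging rule described above can be expressed very simply. The sketch below is a minimal illustration, assuming per-task expected completion times and a 50 per cent cut-off, neither of which is taken from Prichard’s software.

```python
ANOMALY_FRACTION = 0.5  # assumed: flag attempts completed in under half the expected time

def flag_anomalies(attempts, expected_seconds):
    """attempts: list of dicts with 'student', 'task', 'seconds_taken' and 'correct' keys."""
    flagged = []
    for attempt in attempts:
        expected = expected_seconds[attempt["task"]]
        too_fast = attempt["seconds_taken"] < expected * ANOMALY_FRACTION
        if too_fast and attempt["correct"]:
            flagged.append(attempt)
    return flagged

expected_seconds = {"Q1": 300, "Q2": 600}
attempts = [
    {"student": "A", "task": "Q1", "seconds_taken": 90, "correct": True},    # flagged
    {"student": "B", "task": "Q1", "seconds_taken": 280, "correct": True},   # not flagged
    {"student": "C", "task": "Q2", "seconds_taken": 100, "correct": False},  # fast but wrong
]
for attempt in flag_anomalies(attempts, expected_seconds):
    print("Review:", attempt["student"], attempt["task"])
```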
It would appear that the setting of appropriate ‘workplace relevant’ time constraints, and monitoring the time candidates take to complete an assessment task, may be a relatively unobtrusive way to mitigate cheating. The challenge inherent in approaches like this is to avoid the ‘us versus them’ conflict between candidates and assessors that concerned
Professor Prichard when he developed his ‘anomaly detection’ software (Young 2010, p.2).
The solution to this concern, as Professor Prichard suggested, lies in the quality and relevance of the assessment tasks in the first place.
Challenge questions – identification and authentication
A review by Foster and Laymen of online proctoring services (2013) indicates that several of the leading providers include a challenge question application among their authentication/verification options. This approach makes use of public and private data about an individual to construct ‘challenge questions’ that are put to candidates before or during the assessment process. These ‘personal’ questions are designed to provide ‘proof of identity’.
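As a hypothetical illustration of how such questions might be assembled from data an RTO already holds, the sketch below draws a random question from an enrolment record and checks the candidate’s answer. The record fields and question wording are invented for the example; commercial services draw on much larger public and private data sets and handle matching far more robustly.

```python
import random

def build_challenge(candidate_record):
    """Return a (question, expected_answer) pair built from known enrolment data."""
    templates = [
        ("What is the postcode you supplied at enrolment?", candidate_record["postcode"]),
        ("Which employer did you list on your enrolment form?", candidate_record["employer"]),
        ("In which year did you first enrol with this RTO?", candidate_record["first_enrolled"]),
    ]
    return random.choice(templates)

def verify_answer(expected, supplied):
    """Very simple case-insensitive comparison of the candidate's response."""
    return expected.strip().lower() == supplied.strip().lower()

record = {"postcode": "6155", "employer": "Acme Mining", "first_enrolled": "2012"}
question, expected = build_challenge(record)
print(question)
# verify_answer(expected, candidate_response) would then be checked before or during the assessment
```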
The application of this technology is not limited to knowledge based assessments. It could be used to verify the identity of candidates undertaking skills and applied knowledge assessments (although this enquiry did not locate examples where this has occurred).
In a similar vein to the concerns expressed above about the security of biometric data and privacy, the promotion and use of the databases that inform the ‘challenge questions’ raise a range of privacy, data security, and quality assurance issues.
Oral contact to verify identity/competence
The idea that ‘to keep people honest’ a random sample of students assessed online should
be contacted and asked course related questions was supported by the vast majority of
students, assessors and auditors who responded to Assessment e-Risk Surveys. While
employers were not directly asked to comment on this treatment option their responses to
related questions indicate overwhelming support for strategies designed to verify the integrity
of assessment.
This monitoring and review strategy could be implemented in a number of ways and with a wide range of sampling protocols. While the policy and protocols adopted by institutions and assessors will determine the overall efficiency of this approach, it is evident that it is a quality assurance approach that could be cost-effective in high-stake, high-risk assessment contexts.
Monitoring the net - social media and other web sites
Students have been found to post assessment questions online, and students have even been caught using Twitter during exams. This has led exam boards in the United Kingdom, such as the Assessment and Qualifications Alliance (AQA), to start monitoring social media such as Twitter in earnest (Harris 2013, p.1).
In situations where there is a concern that students may be ‘outsourcing’ their work to ‘essay mills’ (like Boost My Grade) or are ‘pooling’ answers through social media (see Course Hero; Young 2010), then in an Australian VET context it would seem that plagiarism detection strategies will be more cost effective than the monitoring of social media.
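As an indication of why text matching can be the cheaper option, the sketch below implements one common low-cost strategy: word n-gram overlap between two submissions. The 5-gram size and the suggested flagging threshold are assumptions for the example, not settings from any particular detection service.

```python
def ngrams(text, n=5):
    """Return the set of word n-grams in a piece of text."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity(text_a, text_b, n=5):
    """Jaccard similarity of the two submissions' n-gram sets (0.0 to 1.0)."""
    a, b = ngrams(text_a, n), ngrams(text_b, n)
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

submission_1 = "the risk management plan identifies hazards controls and review dates for the site"
submission_2 = "the risk management plan identifies hazards controls and review dates for each task"
score = similarity(submission_1, submission_2)
print(f"overlap: {score:.2f}")  # pairs above an assumed threshold, e.g. 0.4, would be flagged for review
```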
Monitoring and Review – identifying high / low risk RTOs
John Ross in The Australian reported that ASQA has been asked to choose about 15 to 25 per cent of Australia’s approximately 4,700 registered training providers (RTOs) to be trusted with “greater autonomy to accredit their courses and introduce new ones” (Ross 2014). The article suggests that this task is proving to be challenging.
It appears evident that what is required is a comprehensive risk management strategy that
ensures that the criteria and risk assessment processes are adequate to the task. This
enquiry has come to the conclusion that there are significant gaps in our understanding of
the current performance of the VET sector and the likelihood, consequences and
onset-visibility risks of online e-assessment. This information gap clearly has implications for
the monitoring and review of ‘more autonomous’ RTOs.
5 In conclusion - an integrated approach
In conclusion, it may be seen that there are two levels of risk management to be integrated in seeking to ensure the veracity and authenticity of online e-assessment.
The first is an overarching level that involves the engagement of all key stakeholders in managing the impact of risk on the key objectives of the VET sector: employment productivity, quality assurance, and efficiency.
The second is a context-specific level that involves the identification and management of high-risk, high-stake competencies and online e-assessment contexts. This is primarily the responsibility of assessors and RTOs.
The identified treatment options are clustered below under the ‘responsible’ key stakeholders. The number before each treatment refers to the section above where the treatment option is briefly discussed (the treatment options are explored in more detail in the full report, An Australian enquiry into the veracity and authenticity of online e-assessment, 2014).
It should be noted that in all cases the responsibilities of key stakeholders include
appropriate consultation/communication and monitoring/review.
COAG ISC/ Advisory Council
4.2.1 New standards for Training Packages
4.2.3 The New quality assurance standards for RTOs and assessors
4.5.4 Assessment Resources – national strategy
4.7.10 Monitoring and Review – identifying high / low risk RTOs
Industry Training Councils (ITCs)
4.2.2 Industry communication and consultation, at all levels
Employers and ITCs
4.6.1 Employer feedback – competence in the workplace
4.6.2 Industry engagement in monitoring VET system performance
Innovation and Business Skills Australia - ITC
4.5.2 Assessor Competence – designers and developers
Registered Training Organisations (RTOs)
4.3.1 Motivation to enrol and marketing practices
4.3.3 Baseline authentication of identity
4.3.4 Integrity pledges built into online assessment process
Department of Industry
4.3.2 Unique Student Identifier (USI)
RTOs/ course developers
4.4.1 Learning Management Systems
Assessors / RTOs
Overarching Priorities
4.5.1 Understanding the competency expectations
4.5.3 Plagiarism
4.5.5 Rethinking online e-assessment
4.5.6 Video options for skills and applied knowledge
4.7.1 Systematic validation and moderation
Risk management for high-stake, high-risk assessments
4.7.2 Online proctoring – knowledge based assessments
4.7.3 Lockdown technologies – knowledge based assessments
4.7.4 A Blob-based presence verification system in summative e-assessment
4.7.5 Verification and biometric identification
4.7.6 Timed assessments and anomaly detection
4.7.7 Challenge questions – identification and authentication
4.7.8 Oral contact to verify identity/competence
4.7.9 Monitoring the net - social media and other web sites
Appendix – a tool to assist risk management
The following tool is designed to assist a context specific review of the treatment options
identified in the research report, An Australian enquiry into the veracity and authenticity of
online e-assessment, and presented above in this companion document. Its purpose is to
provide for each of the identified ‘treatment options’ to be evaluated at an RTO industry skill
area level or qualification level.
Date:
Reference:
Industry Skill Area / Qualification(s):
Risk Management team:
The following table provides for each treatment option to be evaluated so that ‘next step’ actions may be determined and prioritised. For each treatment option, first evaluate the nature and likelihood of the impact of the identified risks on the veracity and authenticity of assessment (e.g. High, Medium, Low). Second, determine whether the risk is ‘acceptable’ (e.g. yes ✓, uncertain ?, no ✗). Then determine the ‘next step’ actions and priorities. Treatment options for high-stake, high-risk assessments are listed separately in Table 2. (NB: If a lack of information is preventing a confident evaluation then the ‘next step’ may need to be further research, consultation, and monitoring and review activity.)

Table 1. (The treatment options are numbered in accordance with the section in the Companion Guide Document where the ‘risk factors’ to be evaluated are presented.)

Treatment Option | Impact on Veracity & Authenticity (H, M, L) | Acceptable risk (✓ ? ✗) | Next Step (who, what and when)

Specification of competence
4.2.1 New standards for Training Packages | | |
4.2.2 Industry communication and consultation, at all levels | | |
4.2.3 The New quality assurance standards for RTOs and assessors | | |

Enrolment Process
4.3.1 Motivation to enrol and marketing practices | | |
4.3.2 Unique Student Identifier (USI) | | |
4.3.3 Baseline authentication of identity | | |
4.3.4 Integrity pledges built into online assessment process | | |

Learning Process
4.4.1 Learning Management Systems | | |

Assessment Process
4.5.1 Understanding of the competency expectations – assessor practitioners | | |
4.5.2 Assessor Competence – designers and developers | | |
4.5.3 Plagiarism | | |
4.5.4 Assessment Resources – national strategy | | |
4.5.5 Rethinking online e-assessment | | |
4.5.6 Video options for skills and applied knowledge | | |

Workplace Performance
4.6.1 Employer feedback – competence in the workplace | | |
4.6.2 Industry engagement in monitoring VET system performance | | |

Monitoring and Review
4.7.1 Systematic validation and moderation | | |

Table 2. Monitoring and Review – treatment options for high-stake, high-risk assessment contexts.

Treatment Option | Impact on Veracity & Authenticity (H, M, L) | Acceptable risk (✓ ? ✗) | Next Step (who, what and when)
4.7.2 Online proctoring | | |
4.7.3 Lockdown technologies | | |
4.7.4 A Blob-based presence verification system in summative e-assessment | | |
4.7.5 Verification and biometric identification | | |
4.7.6 Timed assessments and anomaly detection | | |
4.7.7 Challenge questions – identification and authentication | | |
4.7.8 Oral contact to verify identity/competence | | |
4.7.9 Monitoring the net - social media and other web sites | | |
4.7.10 Monitoring and Review – identifying high / low risk RTOs | | |
References
Ariely, Dan 2012, The (Honest) truth about dishonesty: how we lie to everyone – especially ourselves, Harper Collins, New York.
Apampa, Kikelomo 2010, Presence verification for summative e-assessment, thesis for the
degree of Doctor of Philosophy, University of Southampton, School of electronics and
computer science.
Apampa, Kikelomo; Gary Wills and David Argles 2011, ‘Towards a presence verification
system in summative e-assessments’, International Journal of e-assessment, Vol. 1, no.1.
www.eprints.soton.ac.uk.
AS/NZS ISO 31000:2009, Risk management – Principles and guidelines, Commonwealth of
Australia.
ASQA 2012, Standards for NVR Registered Training Organisations 2012, Australian Skills
Quality Authority, Commonwealth of Australia.
ASQA 2013a, Training for the White Card for Australia’s Construction Industry: A national
strategic review of the registered training organisations offering industry induction training –
the White Card, Australian Skills Quality Authority, Commonwealth of Australia.
ASQA 2013b, Training for aged care in Australia: A national strategic review of the registered
training organisations offering industry induction training, Australian Skills Quality Authority,
Commonwealth of Australia.
ASQA 2013c, Marketing and advertising practices of Australia’s registered training
organisations, Australian Skills Quality Authority, Commonwealth of Australia.
ASQA 2013d, Provider information sessions, PowerPoint presentation, accessed online at …
ASQA 2013, ASQA Strategic Plan 2013-16 and Operational Plan 2013-14, Australian Skills
Quality Authority, Australian Government.
Australian Government 2013, Annual national report of the Australian vocational education
and training system 2011, Department of Industry, Innovation, Climate Change, Science,
Research and Tertiary Education, Australian Government.
AUTC 2002, ‘On-line assessment’, Australian Universities Teaching Committee, Centre for
the Study of Higher Education (CSHE),
http://www.cshe.unimelb.edu.au/assessinglearning/03/online.html
Bahn, Susanne and Llandis Barratt 2011, Determining the Effectiveness of Mandatory
Construction Safety Training in WA, School of Management, Edith Cowan University,
Western Australia.
Blaug, Mark 1976, ‘The empirical status of human capital theory: a slightly jaundiced survey’,
Journal of Economic Literature, Vol. 12 No. 2, pp. 827-855.
Booth, Robin; Berwyn Clayton, Robert Hatcher, Susan Hungar, Patricia Hyde, and Penny
Wilson 2003, The development of quality online assessment in vocational education and
training, Australian Flexible Learning Framework, NCVER.
Callan, Victor and Berwyn Clayton 2010a, E-Assessment and the AQTF: Bridging the divide
between practitioners and auditors, Australian Flexible Learning Framework, Department of
Education, Employment and Workplace Relations; 26 February 2010, Commonwealth of
Australia.
Callan, Victor and Berwyn Clayton 2010b, Bridging the Divide: the challenges and solutions
around e-assessment as voiced by practitioners and auditors, AVETRA, Crow’s Nest, paper
presented at the 13th Australian Vocational Education and Training Research Association
(AVETRA) Conference, viewed 25 November, 2013.
CTCS 2010, 2010 review of market opportunities, Canadian Trade Commission Service,
(CTCS).
Clayton, Berwyn; Robin Booth and Sue Roy 2001, Maximising confidence in assessment
decision-making: a springboard to quality in assessment, Centre Undertaking Research in
Vocational Education (CURVE), www.avetra.org.au.
COAG 2012a, National Agreement for Skills and Workforce Development, Council of
Australian Government, www.federalfinancialrelations.gov.au.
COAG 2013, Skills in Australia 2012: Five years of performance, COAG Reform Council,
Sydney.
COAG ISC, 2014, Communique for the COAG Industry Skills Council Meeting – 3 April 2014
http://www.natese.gov.au/__data/assets/pdf_file/0005/80519/COAG_Industry_and_Skills_Co
uncil_-_Communique_-_3_Apr_2014.pdf.
Crisp, Geoffrey 2011, Rethinking assessment in the participatory digital world – Assessment
2.0 (also referred to as Transforming Assessment), National Teaching Fellowship – final
report. Support for the original work was provided by the Department of Education,
Employment and Workplace Relations, an initiative of the Australian Government.
Crisp, Geoffrey; Mathew Hillier and Shamin Joarder, Transforming Assessment website, http://www.transformingassessment.com, and the Transforming Assessment island in Second Life, http://slurl.com/secondlife/transforming assessment/254/254/23/.
Department of Finance and Administration 2006a, Introduction to Cost-Benefit Analysis and other evaluation techniques, Financial Management Reference Material No. 5, Commonwealth of Australia.
Department of Finance and Administration 2006b, Handbook of Cost-Benefit Analysis, Financial Management Reference Material No. 6, Commonwealth of Australia, http://www.innovation.gov.au
DoC 2011, Code of conduct for agents and sales representatives: Client identification
verification and real estate fraud prevention. Guidance note. Western Australian Department
of Commerce, www.commerce.wa.gov.au/.
Docking, Russell 2013, Effective strategies for the competency-based assessment and RPL:
workshop participant manual, Innovation and Business Skills Australia.
Foster, David and Harry Laymen 2013, Online proctoring systems compared, March 13,
http://startrinity.com.
FLAG 2013, Technology Innovations Applied Research projects: guidelines for applicants
and application form, flexiblelearning.net.au, Australian Government, Department of Industry,
October.
DBIS 2014, Department for Business, Innovation & Skills, Review of Industry Training
Boards, www.gov.uk/government/consultations/review-of-industry-training-boards-itbs,
accessed 21 May 2014.
Harris, Sarah 2013, Students warned Twitter will be monitored by exam boards for signs of
cheating, Mail online, http://www.dailymail.co.uk/.
Halliday-Wynes, Sian and Josie Misko 2013, Assessment issues in VET: minimising the level of risk, National Centre for Vocational Education and Research (NCVER) Issues Paper, Commonwealth of Australia.
Driver, Janine 2013 You can’t lie to me: the revolutionary program to supercharge your inner
lie detector and get to the truth, Harper One.
Hodge, Steve 2014, Interpreting competencies in Australian vocational education and
training: practices and issues, Research Report, National Centre for Vocational Education
and Research (NCVER), Department of Industry, Commonwealth of Australia.
Hsu, Stephen 2013, ‘How to beat online exam proctoring’, Information Processing, April 10,
http://infoproc.blogspot.com.au
Gillis, Shelley and Berwyn Clayton 2013, Industry e-validation of assessment exemplars: an
independent review report, National VET E-learning Strategy, Department of Industry,
Innovation, Climate Change, Science, Research and Tertiary Education, Australian
Government.
Kowszun, Jojo and Oscar Struijive 2005; ‘Risk assessment for the distributed e-learning
regional pilots and Higher Education Academy Subject Centre projects, Report 1, Guidance
on risk, Cogency Research and Consulting Limited, United Kingdom.
http://www.jisc.ac.uk/media/documents
Lang James M. 2013 Cheating Lessons: Learning from academic dishonesty, Harvard
University Press.
Lawlor, Diane and Michael Tovey 2011, ‘Training in Australia’, Pearson.
Lewis, Barbara A.; Virginia M. MacEntee, et al. 2005, ‘Learning management system comparison’, Proceedings of the 2005 Informing Science and IT Education Joint Conference, Flagstaff, Arizona, USA, June 16-19.
Machado, M. & Tao, E. 2007. ‘Blackboard vs. Moodle: Comparing user experience of
Learning Management Systems’, Proceedings of the 37th ASEE/IEEE Frontiers in
Education Conference, California State University, Monterey Bay, Seaside.
Mack, Timothy C. 2014, ‘Privacy and Surveillance Explosion’, The Futurist, Vol. 48, No. 1.
McCabe, Donald L.; Kenneth D. Butterfield, and Linda K. Trevino 2012, Cheating in College: why students do it and what educators can do about it, The Johns Hopkins University Press.
McCabe, Donald L. and Linda K. Trevino 1996, ‘What we know about cheating in college: longitudinal trends and recent developments’, Change, ProQuest Education Journals, Vol. 28, No. 1.
Macfarlane, The Hon Ian 2014, April 3, ‘A new partnership between industry and skills’,
Ministerial Press Release, Government of Australia.
Misko, Josie and Suellen Priest 2009, Students’ suggestions for improving their vocational
education and training experience, National Centre for Vocational Education and Research
(NCVER), Department of Education, Employment and Workplace Relations; 10 August 2011;
Commonwealth of Australia.
Morris, Thomas 1992, ‘Education retention rates and labour market outcomes: towards an
effective investment in workforce productivity’, The Economics of Education, Centre for the
Economics of Education, Monash University, Australian Government Publishing Service,
Canberra.
Morris, Tom 2014b, An Australian enquiry into the veracity and authenticity of VET online
e-assessment: a risk management approach to stakeholder concerns, New Generation
Technologies website http://ngt.flexiblelearning.net.au.
Morris, Tom 2014c, Assessment e-Risk Survey of key stakeholders 2014: an Australian
enquiry into VET online e-assessment - support document to this report, New Generation
Technologies website http://ngt.flexiblelearning.net.au.
National Broadband Network 2014, ‘Government Expectations’, The Hon Malcolm Turnbull,
Minister for Communications and Senator The Hon Mathias Cormann, Minister for Finance,
public letter to Dr Ziggy Switkowski, Executive Chairman, NBN Co Limited, Parliament House,
Canberra, April 8.
National VET E-learning Strategy 2012-2015, Department of Industry, Innovation, Science,
Research, and Tertiary Education, Australian Government, http://ngt.flexiblelearning.net.au.
The New Generation Technologies, Flexible Learning Advisory Group, the National Advisory
for Tertiary Education, Skills and Employment (NATESE), Department of Industry,
Commonwealth of Australia, http://ngt.flexiblelearning.net.au.
NCVER 2013 Australian vocational education and training statistics: financial information
2012, National Centre for Vocational Education and Research (NCVER), Australian
Government, Department of Industry, www.flexiblelearning.net.au.
NSSC 2013, June; Improving Vocational Education and Training: the Australian Vocational
Qualification System - Standards and Policy Framework; National Skills Standards Council,
Commonwealth of Australia.
O’Connell 2013, ‘Proctored vs. unproctored testing: does it really make a difference’, Select
International - articles, accessed online 10 October 2013, www.selectinternational.com.
Pappas, Christopher 2014, ‘The learning management systems, quality evaluation survey’,
eLearning Industry, website accessed 2014, June 2, http://elearningindustry.com.
Priest, Annie 2009; Getting the knowledge-skills mix right in higher-level vocational education
and training qualifications, National Centre for Vocational Education and Research
(NCVER), Department of Education, Employment and Workplace Relations; 10 August 2011;
Commonwealth of Australia.
Ross, John 2014, April 9, College ‘cowboy’ fear as red tape unwinds, The Australian
newspaper.
SNR Standards for NVR Registered Training Organisations 2012, Commonwealth of
Australia.
Stowell, Rob and Reece Lamshed 2011, E-assessment guidelines for the VET sector - final
report, Australian Flexible Learning Framework and National Quality Council; Department of
Education, Employment and Workplace Relations; 10 August 2011; Commonwealth of
Australia.
Western Australian State Training Board 2008, State training board review of industry training board advisory arrangements in Western Australia, April 21, consultations by Quantum Consulting Australia.
Wilber, Ken 2000 Sex, Ecology, Spirituality: the spirit of evolution, second edition,
Shambhala.
Wilber, Ken 2001, A theory of everything: an integral vision for business, politics, science and
spirituality, Gateway - Gill and Macmillan.
Young, Jeffrey R. 2010, ‘High-Tech cheating abounds, and professors bear some blame’, The Chronicle of Higher Education, March 28, https://chronicle.com.
More Information
National VET E-learning Strategy
Email: flag_enquiries@natese.gov.au
Website: flexiblelearning.net.au
New Generation Technologies
incorporating E-standards for Training
Email: e-standards@flexiblelearning.net.au
Websites:
New Generation Technologies: ngt.flexiblelearning.net.au
E-standards for Training: e-standards.flexiblelearning.net.au