Assessment e-Risk Survey of Key Stakeholders 2014:
an Australian enquiry into VET online e-assessment.
Support Document.
Polytechnic West
30 June 2014
flexiblelearning.net.au
Assessment e-Risk Survey of Key Stakeholders 2014
Acknowledgements
This report was produced for the Flexible Learning Advisory Group (New Generation
Technologies - National VET E-Learning Strategy) by Tom Morris (Polytechnic West).
The enquiry benefited from a significant amount of stakeholder input. I am grateful to the
Western Australian Industry Training Councils for their distribution of Assessment e-Risk
Surveys and provision of employer feedback; to ASQA and TAC auditors for their survey
responses, and to Dr Russell Docking for his advice; to the large number of Polytechnic West
online students who responded to the invitation posted on their Learning Management
System; and to the assessors at Polytechnic West and Central who completed surveys.
I have been fortunate to receive the unwavering support of three people: Greg Martin,
Polytechnic West's Portfolio Manager, Business and Information Technology (and custodian
of the Polytechnic's Online Centre of Excellence); Josie Daniele, Advanced Skills Lecturer; and
Sue Morris, my wife and industry skills trainer.
My appreciation also goes to Sue Dowson, eLearning Advisor, for assisting me to embrace
the Survey Monkey learning journey, and to David Harris for his exceptional proofreading
and editorial advice.
Disclaimer
The Australian Government, through the Department of Industry, does not accept any liability to any person for
the information or advice (or the use of such information or advice) which is provided in this material or
incorporated into it by reference. The information is provided on the basis that all persons accessing this material
undertake responsibility for assessing the relevance and accuracy of its content. No liability is accepted for any
information or services which may appear in any other format. No responsibility is taken for any information or
services which may appear on any linked websites.
With the exception of the Commonwealth Coat of Arms, the Department’s logo, any material protected by a trade
mark and where otherwise noted all material presented in this document is provided under a Creative Commons
Attribution 3.0 Australia (http://creativecommons.org/licenses/by/3.0/au/) licence.
New Generation Technologies
incorporating E-standards for Training
National VET E-learning Strategy
Table of Contents
1 Introduction
2 Survey methodology
  2.1 The key stakeholders
    Employers
    Students
    Auditors
    Assessors
  2.2 The survey instrument
  2.3 Analysing the responses
3 Analysis of responses
  3.1 Stakeholder concerns – framework one
  3.2 Treatment options – framework two
  3.3 Collation of responses framework
4 Collated stakeholder responses
  4.1 Specification of competence
    Industry input
    Clarity of documentation
    Relevant and up-to-date
  4.2 Enrolment process
  4.3 Learning process
  4.4 Assessment process
    Rules of evidence
    Principles of assessment
    Assessor competence
    Assessment resources
  4.5 Workplace performance
  4.6 Monitoring and review
    Plagiarism
    Inappropriate collaboration
    Cheating
    Identity fraud
5 Risk concern ratings
6 In conclusion – an integrated approach
Appendix 1 – Profiles of respondents
  1.1 Employer respondents
  1.2 Student respondents
  1.3 Assessor respondents
Appendix 2 – Verbatim comments
  2.1 Employers
  2.2 Students
  2.3 Auditors
  2.4 Assessors
References
More Information
Research Report
An Australian enquiry into the veracity and authenticity of online
e-assessment: a risk management approach to stakeholder concerns.
Companion Document
An Australian guide to the risk management of VET online e-assessment: a
companion document to the research report into the veracity and authenticity
concerns of stakeholders.
Both documents may be accessed through the New Generation Technologies website
http://ngt.flexiblelearning.net.au.
1 Introduction
This document is the third of three reports prepared as part of the enquiry into online
e-assessment in Australia commissioned by the Flexible Learning Advisory Group (FLAG).
The full research report is entitled: An Australian enquiry into the veracity and
authenticity of online e-assessment: a risk management approach to
stakeholder concerns.
The second report is entitled: An Australian guide to the risk management of
VET online e-assessment: a companion document to the research report into
the veracity and authenticity concerns of stakeholders.
The purpose of this third report is two-fold: first, to explain more fully the nature,
scope, and analysis of the Assessment e-Risk Survey; and second, to present the
collated responses of the four stakeholder groups (including verbatim comments) at the
level of individual survey questions.
The survey created a wealth of information. The intention of this support document is to
make this information available in a form that will be useful to other researchers, policy
advisers, and practitioners.
2 Survey methodology
The Assessment e-Risk Surveys were conducted using the online Survey Monkey
application in March/April 2014.
The four key stakeholder groups surveyed were: employers, students, auditors and
assessors.
The stakeholder surveys had two objectives:
i. to gain an indication of the nature and extent of stakeholder ‘concerns’ about online e-assessment; and
ii. to canvass the ‘acceptability’ to these stakeholders of some proposed risk treatment options.
2.1 The key stakeholders
The following provides an outline of the key stakeholder groups surveyed and the method of
contact. Appendix 1 provides more information on the e-assessment profiles of the employer,
student and assessor survey respondents: extent of engagement with online e-assessment,
relevant industry skill areas, and related qualification levels.
The four stakeholder groups received different questionnaires; the specific questions put to
each stakeholder group are presented in Appendix 2, along with their verbatim comments.
Employers
Employers were approached through the Western Australian Industry Training Council (ITC)
executive officers. In total, 28 employer survey responses were received from 9 of the 10
ITCs. Half of the ITCs provided collated survey responses on behalf of their members. The
confidential link to the online survey was made available to ITC employer representatives
either through direct email or the ITC’s social media.
Students
The student survey was made available to Polytechnic West students through the students'
learning management system. Polytechnic West is a Western Australian public training
provider. Responses were received from 137 students who had undertaken some form of
online assessment (4% of the cohort with logon access to the learning management system).
The survey of students covered an expanded range of topics compared with the employer
survey, and covered most of the topics addressed in the auditor and assessor surveys
(although the language was modified, and questions were broached from a student perspective).
The responses were screened to exclude students who did not indicate that they had direct
experience of online assessment. Three questions were used to do this; respondents were
asked: (1) to indicate the number of units in which they had been assessed online; (2) the
qualification level(s) in which they had been assessed online; and (3) to list the main study
areas in which they had been assessed online. Nil responses to these questions were used to
screen out 18 students from the analysis of responses.
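As an illustrative sketch only (the field names and the exact nil-response rule are assumptions for illustration, not the survey's actual export format), the screening step described above might look like this:

```python
# Hypothetical sketch of the nil-response screening rule described above:
# a respondent is excluded when all three screening questions were left blank.
# Field names are invented for illustration.

def is_screened_out(response: dict) -> bool:
    """Return True when all three screening questions were left unanswered."""
    screening_fields = ("units_assessed_online", "qualification_levels", "study_areas")
    return all(not response.get(field) for field in screening_fields)

responses = [
    {"id": 1, "units_assessed_online": "3", "qualification_levels": "Cert IV", "study_areas": "Business"},
    {"id": 2, "units_assessed_online": "", "qualification_levels": "", "study_areas": ""},
]
included = [r for r in responses if not is_screened_out(r)]
```

The same filter, applied to the assessor responses, would implement the parallel screening step described for assessors below.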
Auditors
Auditors from the Australian Skills Quality Authority (ASQA), the national regulator, and from
the Western Australian Training Accreditation Council (TAC) were given the opportunity to
respond to a more extensive survey than was provided to employers. The confidential link to
the online survey was made available through the relevant ASQA and TAC managers.
Responses were received from approximately one third of the auditors in both groups:
18 ASQA auditors and three TAC auditors, a total of 21 responses.
Assessors
The survey of assessors targeted assessors with experience in the use of online assessment
in the full, or partial, assessment of students. Responses were elicited from assessors
employed by Polytechnic West and Central Institute of Technology, both being public
Western Australian training organisations.
The assessor survey was essentially the same as the survey of auditors. Responses were
received from a total of 46 online assessors, 35 from Polytechnic West and 11 from the
Central Institute of Technology.
Three questions were used to screen out assessors who had not conducted online
assessments. Respondents were asked: (1) to indicate student online assessment numbers;
(2) the qualification level(s) they have assessed online; and (3) to list the main study areas
they have assessed online. Nil responses to these questions were used to screen out 12
assessors from the analysis of responses.
2.2 The survey instrument
The Assessment e-Risk Survey used a fixed-choice Likert scale to elicit stakeholder opinions
on e-assessment related risks (their concerns) and on a range of proposed treatment options.
Respondents were also given the opportunity to provide additional comments and clarifying
remarks. The verbatim comments from respondents are presented in Appendix 2.
2.3 Analysing the responses
The survey was part of the information-gathering process for this enquiry and occurred
before the specific nature of the enquiry had been resolved. As a consequence, there is not a
direct and mutually exclusive relationship between the survey questions and the frameworks
presented in the enquiry report and companion risk management guide. The following section
outlines the two frameworks used in the enquiry and how they were brought together to
collate and analyse survey responses.
3 Analysis of responses
The full report and companion document to An Australian enquiry into the veracity and
authenticity of online e-assessment used two related but different frameworks. The purpose
of the first framework was to collate the ‘concerns’ of key stakeholders as identified in
Australian and international research and policy reports along with the feedback arising from
the Assessment e-Risk Survey. The second framework was used to present the range of
identified treatment options in an ‘assessment lifecycle’ framework.
The purpose of the following is to present the collated responses of the four stakeholder
groups to individual questions, in the one ‘collated’ framework for analysis of responses.
3.1 Stakeholder concerns – framework one
The analysis of stakeholder concerns suggests that it is useful to identify three components
that contribute to concerns about the veracity and authenticity of online e-assessment:
the specification of competence, the assessment process itself, and the integrity of evidence.
These three components and the identified sub-components are presented in the diagram
below.
E-assessment risk components framework - diagram
Specification of Competence: Industry Input; Clarity of Documentation; Relevant and Up to Date.
Assessment Process: Rules of Evidence; Principles of Assessment; Assessor Competence; Assessment Resources.
Integrity of Evidence: Plagiarism; Inappropriate Collaboration; Cheating; Identity Fraud.
3.2 Treatment options – framework two
Consideration of strategies to address these concerns led to an ‘assessment lifecycle’ view
of the treatment options. This lifecycle framework is presented below in an ISO 31000
‘inspired’ diagram.
Addressing the online e-assessment risks - diagram
3.3 Collation of responses framework
In order to present the collated responses of each of the four stakeholder groups it is useful
to bring these two frameworks together.
The following ‘collation framework’ integrates the more detailed components (and
sub-components) of stakeholder concerns into an assessment lifecycle view of the treatment
options. This framework supports a more holistic understanding of the responses of
stakeholders to the survey questions.
The ‘collation framework’ is summarised below and used in the following section of this
report to present - at the individual question-proposition level - the collated responses of the
four stakeholder groups (the heading numbers in the summary below refer to the relevant
sub-sections in Section 4 below).
4.1 Specification of competence
4.1.1 Industry input
4.1.2 Clarity of documentation
4.1.3 Relevant and up-to-date
4.2 Enrolment process
4.3 Learning process
4.4 Assessment process
4.4.1 Rules of evidence
4.4.2 Principles of assessment
4.4.3 Assessor competence
4.4.4 Assessment resources
4.5 Workplace performance
4.6 Monitoring and review
4.6.1 Plagiarism
4.6.2 Inappropriate collaboration
4.6.3 Cheating
4.6.4 Identity fraud
4 Collated stakeholder responses
The following presents, at the individual question level, the collated responses of
stakeholders using the combined ‘lifecycle’ and stakeholder ‘concerns’ framework described
above. Where possible, questions are presented at the sub-component level. However, a
number of questions are relevant to more than one sub-component and are, therefore,
presented at the broader category level.
The first column in each table indicates which of the four stakeholder groups the responses
relate to. The second column presents the proposition included in each of the surveys. The
stakeholder surveys are not identical and do not include the same range of issues and
propositions. (The auditor and assessor surveys are the most similar.)
Responses are presented as percentages, with the last column in each row (labelled #)
presenting the actual number of responses received from the relevant stakeholder group for
the specific question-proposition.
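The tabulation described above (whole-number percentages per rating category, plus a raw count column) can be reproduced with a simple tally. The sketch below is illustrative only; the enquiry used Survey Monkey's own reporting, and the rounding convention shown is an assumption:

```python
from collections import Counter

# Rating categories in the order used by the summary tables.
CATEGORIES = ["SD", "D", "N A/D", "A", "SA", "DK"]

def collate_row(ratings):
    """Collate one question's responses into whole-number percentages per
    category, with the raw response count in the '#' column."""
    counts = Counter(ratings)
    total = len(ratings)
    row = {cat: round(100 * counts[cat] / total) for cat in CATEGORIES}
    row["#"] = total
    return row

# A made-up set of 20 ratings for one question.
example = collate_row(["A"] * 10 + ["SA"] * 5 + ["D"] * 5)
```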
The rating scale
The ‘Likert type’ rating scale used in the survey is presented below, along with the key to the
abbreviations used in the following summary tables. (Abbreviations were not necessary, and
were not used, in the administered surveys.)
SD = Strongly Disagree
D = Disagree
N A/D = Neither Agree nor Disagree
A = Agree
SA = Strongly Agree
DK = Don't Know
The ‘Neither Agree nor Disagree’ and ‘Don’t Know’ columns have been shaded to assist in
distinguishing the two levels of response that indicate agreement with the proposition from
the two that indicate disagreement.
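The shading separates the two agreement columns from the two disagreement columns; numerically the split is just a pair of sums. A small sketch (the helper name is illustrative), using the Employers' collated percentages for question 2f reported in Section 4.1:

```python
def agreement_split(row):
    """Sum the two agreement columns (A + SA) and the two disagreement
    columns (SD + D); 'N A/D' and 'DK' are left aside."""
    return row["A"] + row["SA"], row["SD"] + row["D"]

# Employers' collated percentages for question 2f (see Section 4.1).
row_2f = {"SD": 7, "D": 7, "N A/D": 7, "A": 37, "SA": 22, "DK": 19}
agree, disagree = agreement_split(row_2f)
```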
It is important to note that propositions indicating concern are posed in both positive and
negative formats, and in a couple of cases the format is not consistent across stakeholder
groups. Care should therefore be taken in interpreting the ratings.
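One common way analysts handle mixed positive and negative wording is to reverse-code the negatively posed items before comparison, so that a high score always points the same way. The sketch below illustrates that convention only; it is not a procedure used in this enquiry:

```python
# Map the survey's Likert labels to scores 1-5; "DK" is excluded from
# scoring rather than treated as a midpoint.
SCORES = {"SD": 1, "D": 2, "N A/D": 3, "A": 4, "SA": 5}

def score(rating, negatively_worded=False):
    """Score a rating; reverse-code items posed as negative statements so a
    high score consistently indicates a favourable view of e-assessment."""
    value = SCORES.get(rating)
    if value is None:  # e.g. "DK"
        return None
    return 6 - value if negatively_worded else value
```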
4.1 Specification of competence
This question was asked of employers to gain a baseline indication of familiarity with the
Units of Competence.
Stakeholder | Question / Proposition | SD | D | N A/D | A | SA | DK | #
Employers | 2f. I have READ the standards for skill and knowledge relevant to our employees as set out in the Units of Competence. | 7% | 7% | 7% | 37% | 22% | 19% | 27
Industry input
There were no directly relevant survey questions that addressed this issue. Support for the
importance of industry input is evident from many other sources and most notably the COAG
Industry and Skills Council communique (COAG ISC 2014).
Clarity of documentation
Stakeholder | Question / Proposition | SD | D | N A/D | A | SA | DK | #
Employers | 2g. The standards for skill and knowledge relevant to our employees as set out in the Units of Competence are CLEAR. | 7% | 22% | 4% | 44% | 4% | 19% | 27
Auditors | 4f. The way Units of Competence are written is part of the reason for problems with e-assessment. | 14% | 24% | 14% | 29% | 19% | 0% | 21
Stakeholder | Question / Proposition | SD | D | N A/D | A | SA | DK | #
Assessors | 7f. The way Units of Competence are written is part of the reason for problems with e-assessment. | 0% | 19% | 26% | 28% | 28% | 0% | 43
NB: Auditor and Assessor propositions are negative propositions – agreement indicates concern.
Relevant and up-to-date
Stakeholder | Question / Proposition | SD | D | N A/D | A | SA | DK | #
Employers | 2h. The standards for skill and knowledge as set out in Training Package units of competence are USEFUL and RELEVANT. | 4% | 19% | 15% | 48% | 7% | 7% | 27

4.2 Enrolment process
Stakeholder | Question / Proposition | SD | D | N A/D | A | SA | DK | #
Students | 7a. It is important to establish the true identity of a student when they enrol to study ONLINE. | 0% | 0% | 12% | 42% | 46% | 0% | 122
Students | 7b. As a condition of any type of enrolment students should sign a statement committing to not submit ‘false or misleading evidence of their skills and knowledge’. | 0% | 1% | 11% | 48% | 40% | 1% | 122
Auditors | 4a. As a condition of enrolment students should sign a witnessed statement committing to not submit ‘false or misleading evidence of their skills and knowledge’. | 5% | 5% | 19% | 48% | 24% | 0% | 21
Auditors | 3h. It is important to establish the true identity of a student when they enrol. | 0% | 0% | 0% | 33% | 67% | 0% | 21
Auditors | 4g. It is important to establish the true identity of a student when they enrol. | 0% | 0% | 0% | 48% | 52% | 0% | 21
Assessors | 6h. It is important to establish the true identity of a student when they enrol. | 0% | 5% | 2% | 47% | 47% | 0% | 43
Assessors | 7a. As a condition of enrolment students should sign a witnessed statement committing to not submit ‘false or misleading evidence of their skills and knowledge’. | 0% | 5% | 7% | 51% | 33% | 5% | 43
Assessors | 7g. It is important to establish the true identity of a student when they enrol. | 0% | 9% | 5% | 35% | 49% | 2% | 43
4.3 Learning process
Stakeholder | Question / Proposition | SD | D | N A/D | A | SA | DK | #
Students | 4b. I would recommend online learning and assessment to others. | 7% | 1% | 15% | 42% | 35% | 0% | 137

4.4 Assessment process
Stakeholder | Question / Proposition | SD | D | N A/D | A | SA | DK | #
Students | 4a. Overall I was satisfied with the assessments included in my ONLINE study. | 2% | 8% | 6% | 58% | 25% | 1% | 137
Students | 7d. ALL students assessed ONLINE should be contacted and asked course related questions before they are deemed competent. | 10% | 20% | 26% | 30% | 13% | 1% | 122
Students | 5c. Many students deemed competent through ONLINE assessment are not actually competent. | 16% | 36% | 27% | 12% | 3% | 5% | 130
Auditors | 4c. ALL e-assessment students should be contacted and orally asked course related questions before they are deemed competent. | 10% | 29% | 29% | 10% | 24% | 0% | 21
Auditors | 5b. All summative assessment should involve a test of competence under exam conditions. | 5% | 19% | 33% | 29% | 14% | 0% | 21
Auditors | 3a. Many students deemed competent through e-assessment are NOT actually competent. | 0% | 5% | 24% | 48% | 24% | 0% | 21
Assessors | 5h. Organised cheating and fraud is currently a significant issue for the quality of e-assessment outcomes. | 16% | 19% | 21% | 16% | 28% | 0% | 43
Assessors | 5i. Cheating and fraud is a bigger problem in e-assessment than in traditional forms of assessment. | 14% | 30% | 7% | 19% | 26% | 5% | 43
Assessors | 7c. ALL e-assessment students should be contacted and orally asked course related questions before they are deemed competent. | 2% | 28% | 30% | 28% | 12% | 0% | 43
Assessors | 6a. Many students deemed competent through e-assessment are NOT actually competent. | 5% | 26% | 23% | 21% | 21% | 5% | 43
NB: some propositions are presented as negative statements; agreement therefore indicates concern.
Rules of evidence
Stakeholder | Question / Proposition | SD | D | N A/D | A | SA | DK | #
Auditors | 1h. I have seen e-assessment used to validly assess all four dimensions of competence (task skills, task management skills, contingency management skills, job/role environment skills). | 14% | 38% | 5% | 29% | 14% | 0% | 21
Auditors | 1i. I have seen e-assessment used to validly assess specific task skills (NB: this is a narrow question about task skills and does not imply anything about the assessment of the other dimensions of competence). | 19% | 24% | 0% | 43% | 14% | 0% | 21
Auditors | 2a. E-assessment practice tends to be associated with an INAPPROPRIATE reliance on the KNOWLEDGE based assessment of competence. | 0% | 5% | 10% | 38% | 48% | 0% | 21
Auditors | 2b. E-assessment strategies CANNOT be used to validly assess the SKILL aspects of competency. | 10% | 38% | 10% | 33% | 10% | 0% | 21
Auditors | 2c. E-assessment CANNOT be used to collect SUFFICIENT evidence to validly assess competence. | 10% | 38% | 19% | 29% | 5% | 0% | 21
Auditors | 2e. The majority of e-assessment evidence I have seen used was appropriately AUTHENTICATED. | 33% | 33% | 5% | 29% | 0% | 0% | 21
Assessors | 6g. E-assessment is too risky to be used as part of a summative assessment strategy. | 16% | 47% | 14% | 14% | 9% | 0% | 43
Assessors | 4h. I have seen e-assessment used to validly assess all four dimensions of competence (task skills, task management skills, contingency management skills, job/role environment skills). | 4% | 20% | 24% | 39% | 7% | 7% | 46
Assessors | 4i. I have seen e-assessment used to validly assess specific task skills (NB: this is a narrow question about task skills and does not imply anything about the assessment of the other dimensions of competence). | 4% | 4% | 30% | 50% | 7% | 4% | 46
Assessors | 5a. E-assessment practice tends to be associated with an INAPPROPRIATE reliance on the KNOWLEDGE based assessment of competence. | 2% | 38% | 24% | 29% | 7% | 0% | 42
Assessors | 5b. E-assessment strategies CANNOT be used to validly assess the SKILL aspects of competency. | 5% | 44% | 12% | 33% | 7% | 0% | 43
Assessors | 5c. E-assessment CANNOT be used to collect SUFFICIENT evidence to validly assess competence. | 9% | 44% | 9% | 23% | 14% | 0% | 43
Assessors | 5e. The majority of e-assessment evidence I have seen used was appropriately AUTHENTICATED. | 12% | 19% | 9% | 49% | 7% | 5% | 43
Assessors | 8b. All summative assessment should involve a test of competence under exam conditions. | 7% | 28% | 28% | 16% | 21% | 0% | 43
Principles of assessment
Stakeholder | Question / Proposition | SD | D | N A/D | A | SA | DK | #
Auditors | 2d. E-assessment should only be used to SUPPLEMENT a summative assessment decision. | 10% | 52% | 10% | 24% | 5% | 0% | 21
Auditors | 2f. The majority of e-assessment DESIGN that I have audited is valid, reliable, flexible and fair. | 24% | 48% | 14% | 14% | 0% | 0% | 21
Auditors | 2g. The majority of e-assessment PRACTISE that I have audited is valid, reliable, flexible and fair. | 24% | 48% | 14% | 14% | 0% | 0% | 21
Auditors | 4e. It makes sense to modify the principles of assessment to replace the word FLEXIBLE with the word CONSISTENT (ASQA 2013, p44). | 10% | 14% | 33% | 29% | 5% | 10% | 21
Assessors | 5d. E-assessment should only be used to SUPPLEMENT a summative assessment decision. | 5% | 49% | 19% | 19% | 9% | 0% | 43
Assessors | 5f. The majority of e-assessment DESIGN that I have audited is valid, reliable, flexible and fair. | 12% | 17% | 10% | 46% | 7% | 7% | 41
Assessors | 5g. The majority of e-assessment PRACTISE that I have audited is valid, reliable, flexible and fair. | 9% | 19% | 16% | 42% | 5% | 9% | 43
Assessors | 7e. It makes sense to modify the principles of assessment to replace the word FLEXIBLE with the word CONSISTENT (ASQA 2013, p44). | 2% | 12% | 47% | 26% | 9% | 5% | 43
Assessor competence
Stakeholder | Question / Proposition | SD | D | N A/D | A | SA | DK | #
Auditors | 5a. The assessment competence of assessors is part of the reason for problems with e-assessment. | 0% | 19% | 14% | 24% | 38% | 5% | 21
Assessors | 8a. The assessment competence of assessors is part of the reason for problems with e-assessment. | 7% | 33% | 12% | 42% | 5% | 2% | 43
NB: these propositions were presented as negative statements; agreement therefore indicates concern.
Assessment resources
Stakeholder | Question / Proposition | SD | D | N A/D | A | SA | DK | #
Students | 4d. The ONLINE assessments were a good test of my knowledge. | 1% | 4% | 12% | 50% | 32% | 1% | 137
Students | 4e. The ONLINE assessments were a good test of my skills. | 1% | 6% | 15% | 50% | 27% | 1% | 137
Students | 4f. The ONLINE assessments were a good test of my ability to apply the skills in a workplace setting. | 3% | 7% | 23% | 43% | 20% | 4% | 137
Students | 4h. Social media has been used as part of the assessments I have done (e.g. blogs, Facebook and twitter). | 34% | 31% | 12% | 7% | 3% | 13% | 137
Students | 4i. Smart phones and/or tablet type technology has been used as part of the assessments I have done. | 28% | 24% | 11% | 18% | 7% | 12% | 137
Auditors | 1a. The proportion of all summative assessments relying on e-assessment has grown in the last 3-5 years. | 0% | 5% | 14% | 38% | 38% | 5% | 21
Auditors | 1b. The growth in e-assessment is limited to a few industry skill areas. | 5% | 43% | 14% | 24% | 5% | 10% | 21
Auditors | 1c. The growth in e-assessment is predominantly at the higher level of AQF 4 and above. (For level 4 definitions see: http://www.aqf.edu.au/aqf/in-detail/aqf-levels/) | 5% | 48% | 29% | 0% | 5% | 14% | 21
Auditors | 1d. The growth in e-assessment is predominantly at the lower level of AQF 3 and lower. | 0% | 38% | 33% | 19% | 0% | 10% | 21
Stakeholder | Question / Proposition | SD | D | N A/D | A | SA | DK | #
Auditors | 1e. Knowledge based tests, and the online quiz, continue to be the dominant form of e-assessment. | 0% | 10% | 10% | 52% | 24% | 5% | 21
Auditors | 1f. The last 3-5 years has seen an increase in the use of social media in e-assessment (e.g. blogs, Facebook and Twitter). | 0% | 5% | 29% | 33% | 19% | 14% | 21
Auditors | 1g. The last 3-5 years has seen an increase in the use of smart phones and tablet technology as part of e-assessment. | 0% | 0% | 19% | 67% | 5% | 10% | 21
Assessors | 4e. Knowledge based tests, and the online quiz, continue to be the dominant form of e-assessment. | 7% | 17% | 13% | 43% | 17% | 2% | 46
Assessors | 4a. The proportion of all summative assessments relying on e-assessment has grown in the last 3-5 years. | 2% | 7% | 9% | 33% | 46% | 4% | 46
Assessors | 4b. The growth in e-assessment is limited to a few industry skill areas. | 4% | 37% | 17% | 22% | 4% | 15% | 46
Assessors | 4c. The growth in e-assessment is predominantly at the higher level of AQF 4 and above. (For level 4 definitions see: http://www.aqf.edu.au/aqf/in-detail/aqf-levels/) | 7% | 24% | 22% | 17% | 7% | 24% | 46
Assessors | 4d. The growth in e-assessment is predominantly at the lower level of AQF 3 and lower. | 9% | 43% | 17% | 7% | 2% | 22% | 46
Assessors | 4f. The last 3-5 years has seen an increase in the use of social media in e-assessment (e.g. blogs, Facebook and Twitter). | 2% | 20% | 20% | 39% | 4% | 15% | 46
Assessors | 4g. The last 3-5 years has seen an increase in the use of smart phones and tablet technology as part of e-assessment. | 4% | 20% | 22% | 30% | 13% | 11% | 46
Workplace performance

Stakeholder | Question / Proposition | SD | D | N A/D | A | SA | DK | #
Employers | 1c. If we found a gap in the training and assessment of an employee we would like to be able to register our concern. | 0% | 0% | 15% | 56% | 30% | 0% | 27
Stakeholder | Question / Proposition | SD | D | N A/D | A | SA | DK | #
Employers | 1d. We would want to be able to register our concern confidentially. | 4% | 4% | 22% | 48% | 22% | 0% | 27
Employers | 1e. It is NOT the role of an employer to check that an employee who has been assessed actually has the skills and knowledge required. | 44% | 33% | 0% | 7% | 15% | 0% | 27
Employers | 6a. I have found that employees who have undertaken online assessment have skills and knowledge to the same standard as employees assessed in more traditional ways. | 7% | 52% | 11% | 19% | 0% | 11% | 27
Employers | 6c. I am aware of employees who were assessed as competent through an online assessment process but were not competent. | 0% | 15% | 15% | 41% | 15% | 15% | 27
Employers | 6d. It seems to me that online assessment is at least as reliable as other forms of assessment. | 15% | 37% | 7% | 19% | 15% | 7% | 27
Employers | 1a. We monitor the performance of all staff and would soon be aware if an employee lacked the required skills and knowledge. | 0% | 11% | 7% | 56% | 26% | 0% | 27
Employers | 1b. We can quickly determine if someone is COMPETENT OR NOT when we get that person on-the-job. | 0% | 7% | 4% | 67% | 22% | 0% | 27
Employers | 6e. Cheating to complete an online assessment will NOT be useful to a student who does NOT have the skill and knowledge because these gaps will be discovered in the workplace. | 0% | 19% | 4% | 30% | 41% | 7% | 27
Students | 4g. A lot of what I have had to study is NOT relevant in the workplace. | 17% | 47% | 14% | 14% | 2% | 6% | 137
Students | 4c. The ONLINE assessments I undertook were relevant to the workplace. | 1% | 3% | 12% | 50% | 30% | 4% | 137
Students | 5d. If ANY student is deemed competent when they are NOT this will be identified in most workplaces. | 2% | 8% | 15% | 42% | 28% | 5% | 130
Students | 8h. In the majority of workplaces employees who lack the skills and knowledge will be identified and action taken. | 2% | 4% | 17% | 50% | 20% | 6% | 122
Stakeholder | Question / Proposition | SD | D | N A/D | A | SA | DK | #
Auditors | 3b. If a student is deemed competent when they are NOT this will be identified in most workplaces. | 5% | 24% | 14% | 43% | 5% | 10% | 21
Auditors | 5c. In the majority of workplaces performance management will identify employees who were deemed competent when they lack the skills and knowledge required. | 10% | 19% | 29% | 38% | 0% | 5% | 21
Assessors | 8c. In the majority of workplaces performance management will identify employees who were deemed competent when they lack the skills and knowledge required. | 2% | 16% | 16% | 60% | 5% | 0% | 43
Assessors | 6b. If a student is deemed competent when they are NOT this will be identified in most workplaces. | 5% | 16% | 7% | 60% | 12% | 0% | 43
Monitoring and review

Stakeholder | Question / Proposition | SD | D | N A/D | A | SA | DK | #
Employers | 2i. It is to be expected that every now and again someone will get through the assessment process and be deemed competent when they do not have the required skills and knowledge. | 26% | 26% | 7% | 30% | 11% | 0% | 27
Employers | 2j. Even the assessment of ONE person as competent who does NOT have the required skills and knowledge would cause me to be concerned about the quality of the assessment system. | 4% | 19% | 26% | 26% | 26% | 0% | 27
Employers | 6b. I am aware of employees who have indicated that they have cheated when doing online assessments that were supposed to be their own work. | 4% | 4% | 22% | 37% | 11% | 22% | 27
Students | 7c. To keep people honest a RANDOM SAMPLE of students assessed ONLINE should be contacted and asked course related questions. | 3% | 16% | 25% | 40% | 14% | 1% | 122
Students | 8f. EVERYONE should have to do at least one test in each unit under exam conditions to ensure they are doing the work themselves. | 10% | 25% | 24% | 28% | 13% | 1% | 122
Stakeholder | Question / Proposition | SD | D | N A/D | A | SA | DK | #
Students | 8i. There are NOT enough students who copy or cheat to justify inconveniencing ALL students. | 2% | 5% | 21% | 32% | 20% | 19% | 122
Students | 8j. For every technique developed to reduce cheating in ONLINE assessments those who want to cheat will be able to find a way to work around or 'hack' the technology. | 3% | 11% | 22% | 37% | 16% | 10% | 122
Students | 9l. Only students enrolled in HIGH RISK courses should be monitored in some way when they do ONLINE assessments. | 6% | 23% | 27% | 34% | 1% | 10% | 122
Auditors | 3i. The only level of identity risk that will not undermine confidence in the VET system is zero. | 0% | 14% | 29% | 33% | 14% | 10% | 21
Auditors | 2i. Cheating and fraud is a bigger problem in e-assessment than in traditional forms of assessment. | 10% | 19% | 5% | 33% | 19% | 14% | 21
Auditors | 3g. E-assessment is too risky to be used as part of a summative assessment strategy. | 14% | 62% | 10% | 10% | 5% | 0% | 21
Auditors | 4b. As part of internal audit processes a RANDOM sample of e-assessment students should be contacted and orally asked course related questions. | 5% | 10% | 10% | 62% | 14% | 0% | 21
Auditors | 5d. There are NOT enough students who engage in cheating and fraud to justify inconveniencing all students. | 10% | 38% | 24% | 14% | 5% | 10% | 21
Auditors | 5f. There needs to be a greater use of technology such as video, voice and keyboard monitoring to address e-assessment cheating. | 0% | 5% | 14% | 67% | 10% | 5% | 21
Auditors | 5g. A risk assessment approach should be used to determine when the above technology should be used. | 5% | 10% | 10% | 48% | 24% | 5% | 21
Assessors | 8g. A risk assessment approach should be used to determine when the above technology should be used. | 0% | 9% | 26% | 49% | 12% | 5% | 43
Assessors | 6c. I have come across examples of students deliberately seeking to be deemed competent when they know they are NOT competent. | 0% | 16% | 19% | 49% | 14% | 2% | 43
Stakeholder | Question / Proposition | SD | D | N A/D | A | SA | DK | #
Assessors | 7b. As part of internal audit processes a RANDOM sample of e-assessment students should be contacted and orally asked course related questions. | 0% | 19% | 23% | 42% | 16% | 0% | 43
Assessors | 7d. The use of timed assessment activities could be used to assist identify anomalies created by copying and collaboration. | 0% | 5% | 26% | 56% | 14% | 0% | 43
Assessors | 8d. There are NOT enough students who engage in cheating and fraud to justify inconveniencing all students. | 12% | 23% | 23% | 16% | 16% | 9% | 43
Assessors | 8e. For every technological strategy to reduce cheating there will be another technological work around created. | 0% | 14% | 16% | 58% | 9% | 2% | 43
Assessors | 8f. There needs to be a greater use of technology such as video, voice and keyboard monitoring to address e-assessment cheating. | 0% | 19% | 26% | 42% | 12% | 2% | 43
Plagiarism

Stakeholder | Question / Proposition | SD | D | N A/D | A | SA | DK | #
Students | 6f. The meaning of 'plagiarism' is not easy to understand. | 35% | 45% | 8% | 8% | 2% | 2% | 130
Students | 6g. Plagiarism is a bigger problem when doing ONLINE assessments than when you are doing assessments in-class or as written/printed homework. | 16% | 29% | 23% | 22% | 4% | 6% | 130
Auditors | 3d. Plagiarism is a bigger problem in e-assessment than it is in the assessment of face to face students. | 10% | 29% | 5% | 33% | 19% | 5% | 21
Auditors | 3f. For many students plagiarism is a literacy skill issue rather than a deliberate act of cheating. | 14% | 19% | 38% | 19% | 5% | 5% | 21
Assessors | 6d. Plagiarism is a bigger problem in e-assessment than it is in the assessment of face to face students. | 5% | 42% | 12% | 21% | 19% | 2% | 43
Assessors | 6f. For many students plagiarism is a literacy skill issue rather than a deliberate act of cheating. | 7% | 7% | 35% | 37% | 9% | 5% | 43
Inappropriate collaboration

Stakeholder | Question / Proposition | SD | D | N A/D | A | SA | DK | #
Students | 6h. ONLINE assessment students are more likely to inappropriately collaborate, copy and cheat when doing their assessments than students assessed in other ways. | 23% | 31% | 23% | 15% | 2% | 6% | 130
Auditors | 4d. The use of timed assessment activities could be used to assist identify anomalies created by copying and collaboration. | 10% | 5% | 19% | 52% | 10% | 5% | 21
Auditors | 3e. E-assessment students are more likely to inappropriately collaborate on their summative assessments. | 5% | 24% | 19% | 29% | 14% | 10% | 21
Assessors | 6e. E-assessment students are more likely to inappropriately collaborate on their summative assessments. | 7% | 35% | 23% | 26% | 5% | 5% | 43
Cheating

Stakeholder | Question / Proposition | SD | D | N A/D | A | SA | DK | #
Students | 7e. Monitoring students when they do ONLINE assessment activities should be used to help identify students who are copying or 'cheating' and not doing the work themselves. | 4% | 10% | 21% | 43% | 20% | 2% | 122
Students | 8g. There are people and agencies that will help you complete the assessment requirements of a course without having to do the work yourself. | 8% | 21% | 22% | 7% | 4% | 38% | 122
Students | 5a. Students doing ONLINE assessment take advantage of the situation and copy and 'cheat' from other students and other sources. | 25% | 35% | 19% | 8% | 3% | 8% | 130
Students | 5b. This sort of copying and 'cheating' happens in ALL forms of assessment not just online assessment. | 9% | 17% | 20% | 34% | 11% | 9% | 130
Students | 6e. I have come across examples of students deliberately seeking to be deemed competent when they know they are NOT competent. | 19% | 25% | 24% | 14% | 1% | 17% | 130
Stakeholder | Question / Proposition | SD | D | N A/D | A | SA | DK | #
Students | 9k. There needs to be a greater use of technology such as video, voice and keyboard monitoring to address ONLINE cheating. | 7% | 25% | 34% | 25% | 0% | 8% | 122
Students | 9m. The internet has a lot of resources that can help you complete assessments without doing the work yourself. | 4% | 25% | 32% | 24% | 3% | 12% | 121
Students | 9n. Students who are serious about cheating will be able to find ways around any technology designed to reduce cheating (e.g. they can find cheats and 'hacks' from the internet). | 3% | 4% | 24% | 50% | 5% | 14% | 120
Auditors | 5e. For every technological strategy to reduce cheating there will be another technological work around created. | 0% | 14% | 24% | 43% | 5% | 14% | 21
Auditors | 2h. Organised cheating and fraud is currently a significant issue for the quality of e-assessment outcomes. | 5% | 14% | 5% | 43% | 19% | 14% | 21
Auditors | 3c. I have come across examples of students deliberately seeking to be deemed competent when they know they are NOT competent. | 0% | 19% | 24% | 52% | 5% | 0% | 21
Identity fraud

Stakeholder | Question / Proposition | SD | D | N A/D | A | SA | DK | #
Assessors | 6i. The only level of identity risk that will not undermine confidence in the VET system is zero. | 2% | 19% | 30% | 21% | 12% | 16% | 43
NB: a number of respondents indicated in their comments that the meaning of this proposition was not
clear. Other respondents confirmed their understanding of this question by stating that ‘zero risk’ was
an unrealistic goal.
5 Risk concern ratings
In the absence of reliable data on the likelihood and consequences of an identified concern,
an objective risk rating is not possible. In the current context the assessment of risk
‘concerns’ is highly subjective.
One of the treatment options identified in the full report is the need to enhance monitoring
and review processes in order to more effectively address the lack of ‘onset visibility’ (see
section 5.2.2 and 6.7.10 in the full report An Australian enquiry into the veracity and
authenticity of online e-assessment, Morris 2014).
The following criteria were used to translate the stakeholder responses presented above in
section 4 into an ‘extent of concern’ risk rating.
H = Significant (high) degree of concern expressed by stakeholder respondents.
This category indicates that the majority of stakeholders indicated that they
were concerned about the nominated risk. Depending on the way the
proposition was framed this meant that more respondents agreed or strongly
agreed with the statement than disagreed or strongly disagreed.
In the few marginal cases where there was more than 40%, but not a clear
majority of respondents indicating concern, the ‘don’t know’ ratings and ‘neither
agree nor disagree’ responses were removed from the calculation of the
majority view.
C = Concern was expressed by stakeholders but with a lower level of consensus.
This category indicates that more than 20% of stakeholders indicated that they
were concerned about the nominated risk (agreed/strongly agreed or
disagreed/strongly disagreed with the statement).
L = Limited concern. This category indicates that 20% of respondents, or less,
indicated that they were concerned or strongly concerned about this risk.
/ = Indicates that the survey for this stakeholder group did not include questions that
provided useful information in relation to this risk factor.
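The translation rules above can be expressed procedurally. The following is a minimal illustrative sketch (not part of the original analysis), assuming a proposition framed so that agreement indicates concern; for reverse-framed propositions the agree and disagree columns would swap roles.

```python
def concern_rating(sd, d, n, a, sa, dk):
    """Translate a response distribution (whole-number percentages for
    SD, D, N A/D, A, SA, DK) into an 'extent of concern' rating.

    H: a majority indicated concern (agree + strongly agree), with the
       marginal >40% case re-checked after removing DK and neutral.
    C: more than 20% indicated concern.
    L: 20% or less indicated concern.
    """
    concerned = a + sa
    not_concerned = sd + d
    if concerned > 50 and concerned > not_concerned:
        return "H"
    if concerned > 40:
        # Marginal case: drop 'don't know' and 'neither agree nor
        # disagree' responses, then re-check for a majority.
        answered = concerned + not_concerned
        if answered and concerned / answered > 0.5:
            return "H"
    if concerned > 20:
        return "C"
    return "L"

# e.g. the auditor responses to proposition 2h (5, 14, 5, 43, 19, 14):
# 62% agreed or strongly agreed, a clear majority.
print(concern_rating(5, 14, 5, 43, 19, 14))  # prints "H"
```

Note that ratings in the collated table that rest on a composite of several questions cannot be reproduced from a single distribution in this way.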
The following table also provides an indication of the extent to which there is a direct
relationship between the identified area of concern and questions in the Assessment e-Risk Survey.
In the following table 'Direct' indicates that this issue was specifically canvassed in one of the
question-propositions. 'Explicit' indicates that the issue was explicitly canvassed in a number
of questions, and that the rating is based on a composite evaluation of the relevant
responses. It should be noted that different stakeholder groups received different
questionnaires and, therefore, there are differences across groups as to whether issues were
explicitly or implicitly canvassed.
Collated risk concerns – Assessment e-Risk Survey - Table

Concern | Employers | Auditors | Assessors | Students | Overall | Relationship to Assessment e-Risk Survey (2014)
Specification of Competence | | | | | |
- Clarity of Documentation | C | H | H | / | H | Direct
- Relevant and Up to Date | C | / | / | L | C | Direct
- Industry Input | / | / | / | / | | Nil
Assessment Process – Rules of Evidence | | | | | |
- Sufficient evidence | H | C | C | L | C | Explicit & indirect to students
- Valid evidence: all dimensions of competence | H | H | C | / | H | Explicit
- Valid evidence: addressing task specific skills | C | L | L | / | L |
- Current evidence | / | / | / | L | L | Student survey only
- Authentic evidence | H | H | H | L | H | Several questions
Assessment Process – Principles of Assessment | | | | | |
- Valid assessment process | H | H | C | L | H | Explicit - Several questions
- Fair assessment process | / | / | / | L | L | Implicit for students only
- Flexible assessment process | / | / | L | L | L | Explicit auditors and assessors
- Reliable assessment process | H | C | H | L | H | Explicit
Assessor Competence | | | | | |
- Assessor/Practitioner | / | H | H | / | H | Explicit
- Designer/Developer | / | H | H | / | H | Indirect
Assessment Resources | C | H | H | L | H | Indirect
Integrity of Evidence | | | | | |
- Plagiarism | / | H | H | H | H | Explicit
- Inappropriate Collaboration | / | H | H | C | H | Explicit
- Cheating | H | H | H | C | H | Explicit
- Identity Fraud | H | H | H | C | H | Explicit
6 In conclusion - an integrated approach
This report has been compiled to assist researchers, policy developers and practitioners to
gain a deeper understanding of the Assessment e-Risk Survey and the responses provided
by representatives from the key stakeholder groups.
The survey was exploratory, and was administered relatively early in the life of this six month
enquiry. The surveys were designed to gain a measure of the extent and nature of
stakeholder concerns and to canvass some possible solutions. The survey may be seen to
have garnered a wealth of information.
Analysis has demonstrated that the most significant areas of concern to all four stakeholder
groups relate to the rules of evidence and principles of assessment. The primary areas of
concern in these two areas were the validity and reliability of the assessment process, and
the validity, sufficiency and authenticity of evidence (including plagiarism, inappropriate
collaboration, cheating, and identity fraud). The three other significant areas of concern
identified were: the lack of clarity in the documented competency requirements; the
competence of assessors (both as assessor/practitioners and assessment
designer/developers); and available assessment resources.
The survey responses, combined with the Australian and international literature review,
clearly demonstrated that concerns about the veracity and authenticity of online
e-assessment need to be taken seriously. The enquiry found that the concerns being
expressed by stakeholders ranged from genuine and well founded concerns about the
integrity of assessment evidence and an inappropriate reliance on knowledge based
assessment tasks; through to a lack of awareness of the full range of online e-assessment
strategies and technology that can be used to validly assess all four dimensions of
competence.
The enquiry was led to distinguish between strategies designed to deal with an ‘overarching’
concern about the veracity and authenticity of online e-assessment and a need for ‘context
specific’ strategies to deal with high risk, high stake assessment contexts where the primary
concern was the ‘integrity of evidence’.
The overarching strategy level suggests that there is a need for enhanced engagement of all
key stakeholders, within their areas of expertise and responsibility, to address quality
concerns and achieve cost-effective, online e-assessment. Four observations stand out.
1. The surveys clearly indicated an opportunity to more effectively utilise employer
workplace performance feedback in monitoring and reviewing assessment
outcomes.
2. For assessors, there is an opportunity to reinforce the need to address all four
dimensions of competence and to ensure the rules of evidence are satisfied.
3. Increased dissemination of success stories among auditors and assessors would
also appear to be required. While the surveys indicated that a significant number
of auditors and assessors have seen online e-assessment used to validly assess
all four dimensions of competence, a majority of respondents have not had this
experience.
4. Students, too, have responsibilities, which the survey responses indicate
many take seriously. Many students indicated their understanding of the
workplace consequences of being deemed competent if they submit false
evidence. And there was widespread student support for more rigorous enrolment
verification of identity and accountability, including signed integrity pledges and
integrity reminders.
The second level of stakeholder concern may be seen to relate to high-risk, high-stake
assessment contexts. These are areas where the concern is with the authenticity of online
e-assessment, specifically the 'integrity of evidence'. Unfortunately, the nature of this survey
meant that the evidence provided by respondents was anecdotal. While the enquiry identified
that high-risk, high-stake assessment contexts exist, this enquiry was not able to clarify the
nature and extent of such contexts. The treatment options to deal with these high-risk, high-stake
contexts were outlined in the section headed 'monitoring and review'. A significant
range of current and emerging technologies were identified.
As part of this enquiry, a companion document entitled An Australian guide to the risk
management of VET online e-assessment was created. The guide includes a checklist that
could be used to assist in determining the required risk management strategy in a
specific context.
In conclusion, the stakeholder surveys indicated that there is undoubtedly an overarching
need to continue to address concerns about the veracity and authenticity of online
e-assessment. Beyond this, however, the need for, and value of, 'integrity monitoring technology'
in a VET environment is less clear.
Stakeholders raised concerns about the 'integrity of evidence' arising from plagiarism,
inappropriate collaboration, cheating and identity fraud. However, the nature and extent of
integrity breaches are unclear, their relevance in a VET context is not clear, and the veracity of
the technology is itself not self-evident. It is noted that a comprehensive assessment
strategy that validly addresses all four dimensions of competence is in itself one of the best
ways to validate the reliability and authenticity of evidence (as per the rules of evidence). If, in
addition to modifying the assessment strategy, consideration is to be given to incorporating
'integrity monitoring' technology into the process, this should be informed by a context
specific risk-management approach.
The fundamental role of VET is to assist people to take up a productive role in the workplace.
Employer responses to the Assessment e-Risk Survey confirm that they ‘monitor the
performance of all staff and would soon be aware if an employee lacked the required skills
and knowledge’. Strategies that improve the quality and effectiveness of employer feedback
would appear to provide the VET sector with a unique and powerful way to monitor the
veracity and authenticity of its delivery. This would appear to be a strategy that makes good
sense for all stakeholders and all stages of the assessment lifecycle.
Appendix 1 – profiles of respondents
The following presents the collated responses of stakeholders to questions designed to gain
an understanding of the online e-assessment profile of respondents. Profile questions were
not included in the Auditor survey; therefore this appendix does not have a 'profile' section for
auditors.
1.1 Employer respondents
3. To the best of your knowledge, how many of your employees have been assessed
(fully or partly) using some form of online or other computer technology to do their
assessment?
Answer Options | # | %
I am not aware of any employees being assessed in this way (skip to question 5). | 4 | 15%
1 or 2 employees | 6 | 22%
3 to 10 employees | 4 | 15%
10 to 30 employees | 3 | 11%
Over 30 employees | 10 | 37%
Answered question | 27 | 100%
Skipped question | 0 |
Total | 27 | 100%
4. Please indicate the main area of study of your employees who have been assessed online or
using some other form of computer technology (e.g. business management, education and
training, office administration, hospitality, cooking, electrical, automotive etc.).
Verbatim answers from respondents | #
Office administration | 1
White card | 1
business, computer, OSH, first aid | 1
Human Resources | 1
Health - safety; manual handling; basic life support and workplace bullying prevention; hand hygiene | 1
Healthcare related competencies and those of health support workers, frontline management and some administrative competencies | 1
too many to list, public health, clinical services, government | 1
Dip Project Management; Dip Commercial Art (Graphic Design); Third party Enrolled Nurses undertaking our trial VET sector online assessment process combined with face-to-face; skills assessment to reveal the gaps in EN course content. | 1
WHS | 1
Areas which typically are not highly technical or operational in nature (hands on component). | 1
Gas Transmission; Construction Card | 1
Inductions, Safety training, Simple and repeatable processes (manufacturing) | 1
Construction | 1
Hazard identification, Weather, Mentoring, Leadership development | 1
Swim Teachers, Aquatic Rescue Instructors | 1
Training and assessment | 1
Hospitality, education and training | 1
Training and assessment | 1
Hospitality, education and training | 1
Answered question | 19
Skipped question | 8
Total | 27
5. Please indicate the qualification levels relevant to your employees who were assessed
online - you may select more than one level.

Answer Options | # | %
Not applicable | 5 | 19%
Certificate 1 - entry level | 2 | 7%
Certificate 2 | 4 | 15%
Certificate 3 - this level includes Trade Certificates | 10 | 37%
Certificate 4 - this level includes the TAE Certificate IV for trainers | 13 | 48%
Diploma | 4 | 15%
Advanced Diploma | 1 | 4%
Associate Degree | 2 | 7%
Not sure or don't know | 3 | 11%
Total number of indicated qualification levels | 44 | 100%
Answered question | 27 |
Skipped question | 0 |
Total | 27 |

1.2 Student respondents
1. Approximately how many units have you completed that were assessed online, either fully or partially?

Answer Options | # | %
I have NOT done any online assessment. | 0 | 0%
1 or 2 units | 56 | 80%
3 to 5 units | 50 | 73%
5 to 10 units | 22 | 34%
Over 10 units | 9 | 13%
Answered question | 137 | 100%
Skipped question | 0 | 0%
Total | 137 | 100%
2. If you have done online assessment - please indicate the main study areas in which
you have done online assessment (e.g. business & management, office
administration, education & training, hospitality, cooking, electrical, automotive etc.) -
otherwise please skip this question.

Answer Options | # | %
Business & Management | 30 | 24%
Business administration | 25 | 20%
Information Technology (including networking and programming) | 20 | 16%
Financial Services - Accounting | 13 | 11%
Work Health and Safety | 13 | 11%
Electrical - Instrumentation | 9 | 7%
Human Resource Management | 4 | 3%
Education and training | 3 | 2%
Project Management | 3 | 2%
Community Services | 2 | 2%
Total number of nominated study areas | 123 | 100%
Answered question | 121 | 88%
Skipped question | 16 | 12%
Total | 137 | 100%
3. If you have been assessed online - either partially or fully - please indicate the level
of your course; if you have not been assessed online please skip this question.

Answer Options | # | %
Certificate 1 | 2 | 2%
Certificate 2 | 2 | 2%
Certificate 3 | 22 | 17%
Certificate 4 | 71 | 55%
Diploma | 34 | 26%
Advanced Diploma | 2 | 2%
Associate Degree | 7 | 5%
Don't know or not applicable | 3 | 2%
Other (please specify) | 5 | 4%
Answered question | 130 | 100%
Skipped question | 7 |
Total | 137 |
1.3 Assessor respondents
1. Approximately how many students have you - either fully or partially - assessed
through online assessment?

Answer Options | # | %
I have never used online assessment | 0 | 0%
Less than 30 students (but more than zero) | 13 | 28%
Between 30 and 100 students | 13 | 28%
Over 100 students | 20 | 43%
Answered question | 46 | 100%
Skipped question | 0 |
Total | 46 |
The following table was collated from free text answers to the question. The responses
indicated that the 46 assessors undertook online e-assessment in over 15 industry skill
areas. The 45 respondents to this question indicated that they assessed 60 study areas
between them, an average of just over one study area (1.3) per assessor.
2. Please indicate the main study areas in which you have used online assessment
(e.g. business & management, office administration, education & training, hospitality,
cooking, electrical, automotive etc.)

Industry study areas - indicated by respondents | # | %
Business and management | 14 | 23%
Training and Assessment / TAE | 7 | 12%
Information Technology & Multimedia | 6 | 10%
Construction, Plumbing and Gas-fitting | 6 | 10%
Accounting & Business finance | 5 | 8%
Occupational/Work Health & Safety | 4 | 7%
Quality Auditing, environmental monitoring, and sustainability | 3 | 5%
Electrical and Instrumentation | 3 | 5%
Science/Laboratory operations | 2 | 3%
Computer Systems Engineering and Electronics | 2 | 3%
Engineering | 2 | 3%
Range of other areas including: fashion, fitness, community services, maths, special information, information and library services | 7 | 12%
Total nominated Industry Skill Areas | 60 | 100%
Answered question | 45 | 98%
Skipped question | 1 | 2%
Total | 46 | 100%
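The collation of free-text answers into nomination counts and an average per respondent, as described above, can be sketched as follows. The answer strings here are hypothetical stand-ins, not actual survey responses, and a ";" separator between nominated areas is assumed for illustration.

```python
from collections import Counter

# Hypothetical free-text answers; each respondent may nominate
# more than one study area, separated here by ";".
answers = [
    "Business and management; Training and Assessment / TAE",
    "Business and management",
    "Construction, Plumbing and Gas-fitting",
]

# Count every nominated study area across all respondents.
nominations = Counter()
for answer in answers:
    for area in answer.split(";"):
        nominations[area.strip()] += 1

total = sum(nominations.values())   # total nominations (4 in this toy sample)
average = total / len(answers)      # nominations per respondent

for area, count in nominations.most_common():
    print(f"{area}: {count} ({count / total:.0%})")
print(f"Average study areas per respondent: {average:.1f}")
```

In practice the report's figures (60 nominations from 45 respondents, an average of about 1.3) imply manual grouping of variant spellings into the 15-plus industry skill areas before counting.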
3. At what qualification levels have you used online assessment?

Answer Options | # | %
Certificate 1 | 0 | 0%
Certificate 2 | 5 | 11%
Certificate 3 | 24 | 52%
Certificate 4 | 30 | 65%
Diploma | 25 | 54%
Advanced Diploma | 6 | 13%
Associate Degree | 1 | 2%
Other (please specify) | 0 | 0%
Respondents who answered the question | 46 | 100%
Respondents who skipped the question | 0 |
TOTAL | 46 | 100%

Assessment e-Risk Survey of Key Stakeholders 2014
Appendix 2 - Verbatim comments
The following presents the verbatim comments of respondents. Very limited editing of
responses has occurred. The editing took out direct references to individuals or institutions
and comments that did not relate to the topic of this enquiry - the veracity and authenticity of
online e-assessment. Where the spell checker identified errors, corrections have been made.
2.1 Employers
1. Assessment in general. The first part of this survey is seeking your views more
generally on assessment in vocational education and training.
a. We monitor the performance of all staff and would soon be aware if an employee
lacked the required skills and knowledge.
b. We can quickly determine if someone is COMPETENT OR NOT when we get that
person on the job.
c. If we found a gap in the training and assessment of an employee we would like to be
able to register our concern.
d. We would want to be able to register our concern confidentially.
e. It is NOT the role of an employer to check that an employee who has been assessed
actually has the skills and knowledge required.
No employer should just accept qualifications as meaning that a person is skilled
Question 'c' is a bit of a leading question. What if the training and assessment was OK but it
had been some time since the employee had performed the tasks? Consider revising
question to read: 'If we found a gap in the recent training and assessment of nationally
recognised qualifications or UOC for an employee we would like to be able to register our
concern with the relevant authority'.
The Employer needs to be involved in the training of staff and work with the training
organisation to ensure the employee has the skills and knowledge required. If the employee
does not have the skills and knowledge required this needs to be discussed with the
employee and training organisation to work out how to rectify the situation.
2. Assessment in general - continued.
f. I have READ the standards for skill and knowledge relevant to our employees as set out in the Units of Competence.
g. The standards for skill and knowledge relevant to our employees as set out in the
Units of Competence are CLEAR.
h. The standards for skill and knowledge as set out in Training Package units of
competence are USEFUL and RELEVANT.
i. It is to be expected that every now and again someone will get through the assessment process and be deemed competent when they do not have the required skills and knowledge.
j. Even the assessment of ONE person as competent who does NOT have the required skills and knowledge would cause me to be concerned about the quality of the assessment system.
Competency based assessment should mean that nobody should be assessed as competent
when they are not. This means the RTO carrying out the assessment is not fulfilling their
obligations as part of the AQTF.
Child Health Nurses and Aboriginal Health Workers working in Child Health do not have
'units of competence' outlined/ agreed across the state of WA (or Nationally) let alone within
organisations. There is a significant lack of clarity in this area.
Question 'j' needs a bit of context, and is again a leading question.
The problem also lies in the competency package in the first place not meeting the industry's
standards or needs rather than the assessment process.
As an Employer I find the training packages hard to understand.
3. To the best of your knowledge how many of your employees have been assessed
(fully or partly) using some form of online or other computer technology to do their
assessment?
Do not agree with online assessment. It does not show skill competency.
This refers to mandatory corporate training mainly - not necessarily other specific aspects of
how staff conduct their jobs (little online or other technology is available in the Child Health
area)
We use on-line, though not against national units of competence.
Some courses seem to have been made less stringent to allow online learning
Online learning has been quite successful in the area of repetitive theory based inductions
and safety training, though in most cases delivered as blended learning (Online theory + face
to face assessment). The success rate of Pure online training (not blended learning) is quite
low.
6. Specifically about online assessment…
a. I have found that employees who have undertaken online assessment have skills and
knowledge to the same standard as employees assessed in more traditional ways.
b. I am aware of employees who have indicated that they have cheated when doing
online assessments that were supposed to be their own work.
c. I am aware of employees who were assessed as competent through an online
assessment process but were not competent.
d. It seems to me that online assessment is at least as reliable as other forms of
assessment.
e. Cheating to complete an online assessment will NOT be useful to a student who does
NOT have the skill and knowledge because these gaps will be discovered in the
workplace.
A good assessment also provides extra training for the student throughout the process. Most
online assessments do not allow for this type of engagement in the learning process.
Online assessment will only cover the theory aspect of the skills and knowledge, not their practical application; nor does it meet the assessment principles, i.e. flexibility, the use of a range of assessment tools to meet student needs, and the demonstration of competency over a period of time in different contexts. On-line assessment is impractical for trade based UoC where a strong practical skills demonstration is required. For example, the UoC 'install trench support' would not be feasible to deliver or assess on-line; while the theory could be assessed online, the theory is implicit in performance of the trenching outcomes.
Many students don’t want the skills; rather, they want the unit or qualification.
Students who undertake on-line assessment can 'miss' part of the required learning because
it is not always interactive despite having chat boards and you-tube clips and a lot of bells
and whistles. This is particularly the case in human services courses such as aged care and
mental health. The interactive learning and transfer of skills between students or co-worker in
a class or workplace setting cannot be duplicated online. i.e. assessing interpersonal skills
on-line is nearly impossible because this requires direct observation. If interpersonal skills
are an element of the performance criteria, then a written on-line assessments cannot
adequately demonstrate competency. On-line assessments have a place in the cycle of
assessment for competency, but they should not replace face to face delivery in human
service fields to make it "easier" for employers to tick the box and say their staff are qualified
to meet an industry compliance requirement.
I am aware of Advanced diploma level mental health courses being fully on-line, whilst it
meets AQTF compliance I do wonder if it is aimed at giving the students the required
knowledge or the skills required to do their job, or to reduce training costs for the VET sector
or the employer. Drop out rates in this on-line course were above 60% despite it using a
best practice online technology. This degree of attrition (failure to complete) will in reality add
to the demise of VET as a useful alternative to gain a qualification. Improve on the quality of
delivery, materials and facilitators in 'face to face' classes and offer them outside of 9am to
3pm time slots and you may be surprised how many more students enrol.
Working in an adult learning context – it’s important to appreciate that people learn in
different ways. The method of assessing knowledge to ensure understanding should not be
confused with whether you 'cheat' or not.
It's difficult to generalize about the perceived benefits of online versus classroom and
combination delivery formats. In my experience, any education / training delivery format is
potentially going to provide a less than ideal outcome for certain participant cohorts. This
can be due to a mismatch of learning styles with the delivery mode, literacy, computer
literacy, participant anxiety, or functional abilities with the English language.
Question 6b... what is the purpose of that question? It has nothing to do with the purpose of
the survey. Just because someone might cheat has nothing to do with the methodology of
the assessment! I think your research methodology is quite flawed, and littered with leading
questions like this.
Although gaps should be identified in the workplace, issuing Certificates to those not
competent undermines the VET system and hurts its reputation within industry.
I have responded to question 'd' based on what I have seen to date, and on-line assessment
in a competency based domain has a long way to go still (there may be some doing it well,
but they must also have the money and resources to go with it). Question 'e' is very
subjective and would depend on so many factors, such as engaged line managers, other
mechanisms if they even exist.
No form of assessment is fool proof - and online is no different. Face to face assessments
are equally known to have missed gaps in competence and no form of assessment can ever
guarantee that skills in the workplace will consistently demonstrate competence. Assessment
are always a representation of "Acceptable" level of competence on the day of assessment.
Mentoring, and training is part of every day on site works, and a person should only be
assessed when his employer believes he is at the correct level for the assessment.
On line assessment can check for knowledge and understanding but it is difficult to
accurately assess for the competency of technical skills online.
7. OTHER COMMENTS
You may add any other comments or observations regarding online assessment or
this survey in the box below.
Online assessment is not acceptable to employers who require the skills to perform tasks in
the workplace. There are severe OS&H issues here. I believe employers could well come to
the conclusion that the RTO who issues qualification or statements of attainment to a person
who is not competent could take legal action against the RTO. It always surprises me that
this has not already occurred. If saving money is the objective of online assessment then that
money should be placed in a trust for future payouts on litigation. Students have a rightful
expectation that any course they undertake gives them the competencies to perform in the
workplace. Employers have a rightful expectation that they can have confidence in the piece
of paper that states a person is competent.
Employees with language literacy and numeracy issues might find it difficult to communicate
their knowledge and skills via online means. Online learning doesn't address practical skills
which is where the application of knowledge is truly assessable.
It has been my experience with on-line (e-learning/distance learning) that a raft of problems arises in aligning the UoC to meet the rules of evidence: establishing the authenticity of the candidate's work is one issue; fairness, in that assessment must not disadvantage those with LLN issues; and sufficiency, in that competence is demonstrated in all dimensions in different contexts... because a person can articulate and describe a process/procedure does not mean they are competent... it may just mean they are good researchers.
On line training is fine but on line assessment is where the problems arise. On line
assessment on its own can never be accepted 100%. There need to be checks and
balances. Also difficult to assess skills on line. Assessing knowledge on line is easier as long
as there are checks done.
There is so much discrepancy between assessing competence in the VET sector versus
Tertiary sector whether it is online or not - this is a significant issue.
Thanks for the opportunity to comment.
It seems a little idealistic, but I firmly believe that as an industry, we should pre-engage with
participants to establish whether an online delivery format is right for them. Even if it is just a
simple checklist for employers...are you setting up your employee to be deemed 'not yet
competent' and setting yourselves up for months and years of dissatisfaction, performance
management and workplace retraining? If so, be proactive and send your employee to a
more appropriate delivery format in the first place!
Online learning and assessment is a very favourable modality for some learners as it allows
them to learn in their own timeframes in comfort without pressure. I have seen as many
incompetent practitioners/ graduates who have gone through face-to-face assessment in
industry. I personally completed my whole Graphic Design dip via online learning and there is
little point cheating as you will be discovered on entry to industry (though I am sure it
happens!).
Within industry, online assessment is often seen as an easy way to get a qualification
because:
- Students can cheat
- It is difficult for trainers to identify gaps in learning
- It is less of a commitment from students
Online assessments are the only way we will be able to sustain vast amounts of knowledge
that employees are expected to learn in today's world where need for knowledge and data is
increasing and the number of people available to do the task is decreasing. The challenging
question here is not whether or not online learning should be implemented, but more around
the ways to integrate online learning and assessments to blend with traditional forms of
learning.
It is vitally important that HR professionals within organisations do not fall back to the lazy
position of relying entirely on training to decide whether or not an employee is competent to
do their job. This is an easy to administer but totally wrong representation of what training
can achieve, and puts unrealistic expectation and burden on ANY assessment process,
irrespective of the method. Performance Management is SO much more than 'send 'em to a
course'.
On line assessment can check for knowledge and understanding but it is difficult to
accurately assess for the competency of technical skills online.
2.2
Students
3. If you have been assessed online - either partially or fully - please indicate the level
of your course. If you have not been assessed online please skip this question.
post grad studies during semester
Bachelor
partial at this point
electrical licensing upgrade
University
4. Your general observations… Please note: all questions need a response. If you have
not been assessed ONLINE there is a 'don't know or not applicable' option for each
statement, if this is appropriate.
a. Overall I was satisfied with the assessments included in my ONLINE study.
b. I would recommend online learning and assessment to others.
c. The ONLINE assessments I undertook were relevant to the workplace.
d. The ONLINE assessments were a good test of my knowledge.
e. The ONLINE assessments were a good test of my skills.
f. The ONLINE assessments were a good test of my ability to apply the skills in a workplace setting.
g. A lot of what I have had to study is NOT relevant in the workplace.
h. Social media has been used as part of the assessments I have done (e.g. blogs, Facebook and Twitter).
i. Smart phones and/or tablet type technology has been used as part of the assessments I have done.
it is much easier than the paper based system that I had to use for my Advanced diploma
modules. Simply submitting online is far simpler than having to print, sign documents then
get to the post office to send something across to WA which from central Queensland can
take over a week to get to Perth!
Some questions here relate to being in a workplace... I am in this course to gain employment
in the related field. There is no way I can know if the assessments test my skills or my
knowledge in a work place. I wish all of my lecturers would use online assessments
Online study has been great for accessing information needed for assessments but
personally i deal better with hard copies of a course outline or assessment task sheet so it
was useful for access but i had to print off all the information anyway
I consider online assessment to be valuable for testing knowledge, not skills (or how to apply
that knowledge)
The online studies allows people to work full time thus improving the academic knowledge.
Online study seemed difficult to truly implement real world experience and education. The
structure of online learning was a great introduction to classroom life but much of the
necessary skills and applicable knowledge for me was lacking.
I studied through Open Universities and while I learned a lot, very little seemed applicable to
the real world.
I believe online learning is great for the mature age student that can manage their time. I
have completed an undergraduate degree online before this course. The only complaint I
have re my current course is some feedback from assessments conducted by tutors is
difficult to define at times and the difference in marking from one to another is often quite
wide. Overall I am very happy with online learning and would definitely recommend it to
others. Thanks
There is nothing wrong with relevant books and on campus / face to face delivery.
I found it hard when it came to uploading my assessments to Turnitin, it was only allowing 1
Document and once it is submitted you cannot add anything if needed. I also understand tafe
needs Turnitin for plagiarism reasons.
The assessment aspect seemed fine, my dissatisfaction arose from the tuition aspect being
pretty well non-existent.
I have been able to access my study material via iPad and this allows me to study more
frequently without worrying about Internet connectivity away from home.
Online learning is not for everyone; this takes a good amount of self-discipline. Unless the
online assessments are done in a classroom setting / environment rather than uploaded
or filled in elsewhere there is no way to be 100% sure that it is the student’s own work.
Relevancy to workplace is subjective so can't comment.
social media was not used in my assessments as a source but as an example of a marketing
trend therefore i neither agree or disagree and with smart phones i do receive feedback and
look up stuff on it but i do not use it for handing in assessments or anything of that sort
therefore i disagree and agree there too.
in doing full time study for Business and administration i feel like this information has helped
me the assessment were quite easy but some parts were not quite easy to understand.
Readings need to be available on Android smartphones for flexible study. I would like to be
able to study readings away from my PC such as in bed or on the move away from the
house.
This is my first on-line diploma and I still have 2 courses to finish, I like the flexibility and
continuous feedback from the assessments.
Feel restricted in some assessments as they are now multiple choice and therefore no room
for opinion
5. The risks as I see them are…
a. Students doing ONLINE assessment take advantage of the situation and copy and
‘cheat’ from other students and other sources.
b. This sort of copying and 'cheating' happens in ALL forms of assessment not just
online assessment.
c. Many students deemed competent through ONLINE assessment are not actually
competent.
d. If ANY student is deemed competent when they are NOT this will be identified in most
workplaces.
All my studies i do all the researches by myself when i get stuck i ask the lectures.
some cheat but will be found out sooner or later
I think if someone wants to cheat, first of all they will make it happen whether through online
or in-class. also, it will catch up to them either in the workplace or in further studies, and
therefore they are not helping themselves.
Regarding "cheating" and copying I think that it depends on factors such as:
- the age of students: in my opinion, young students (18-25 years old) are more likely to cheat than older students;
- maturity: a person who wants to learn, get knowledge and be competent in his/her future job won’t be interested in cheating and copying.
I agree also with point "d": the best test of our knowledge is the workplace.
Unfortunately some workplaces will not even give a chance to someone without previous
experience as some people they have previously employed have been a failure due to the
fact they were not really competent.
In a work place colleagues, work mates, teams and peers work together and share
information or risk break down in a work place. Plus resources are not hidden to employees
by the superior. Employees attend work place training and are provided every answer in a
work place. Competency in the class room will not necessarily translate to competency in the
work place.
Most students are quite protective of their own work and assessments, and a cheat would
find it difficult to copy from another student. Knowledge alone doesn't make competency; only experience doing the work repeatedly ensures competence.
It's actually hard to copy and cheat as one works/studies alone, there being no groups to
compare notes with or discuss.
Online study was actually just as difficult to "cheat" and copy. Most students are quite
protective of their work. If anything I would say it is harder to cheat and copy in comparison
to a classroom. I find many students here try to look at my work. I feel more comfortable
doing any real study that will be assessed at home so as to protect my work.
I feel that close supervision is necessary in online assessments and if a student is caught
cheating they should automatically receive a fail for the unit. If this is enforced systematically
then surely the rate of cheating would decrease.
Most online learners are working fulltime and use this mode of study to step up the ladder or
further their skills
Depending upon e-learning for delivery of material may lead to further student complacency;
working to pass exams rather than learn material, cheating.
I believe any student who goes to the trouble of doing further education online is not out to
cheat and they are very much aware of the need to complete and understand the information
to succeed in reaching their goals.
I can’t speak for others who may cheat. I have to say for myself working full time and
completing online studies has been a lot of work and I've had to seek out and learn and apply
everything that's involved with these online studies. Cheating I think would prove difficult and
if some do I think it would be obvious.
The first two questions don't really apply to what I am studying. All the information to actually
be learned was required to be found from online sources not affiliated with the course, so
really it's basically asking people to "cheat" as you're obviously going to want to find the
quickest way to find the information you need. It's counter-productive, it's frustrating, and I
think that it is a travesty that PW is asking students to pay tuition and then make them use
resources provided for free by people who are not employed, affiliated or remunerated by
PW.
it is hard to determine competency until you are in the field. people who are not competent
won't last long
Whilst true that if prone to copying and 'cheating' it can and does take place in almost all
forms of assessments, the likelihood of it happening whilst assessments are being
supervised is lower.
with ( a.) i have never seen it happen but i have heard of it though i don’t think it is possible
without it being discovered by Turnitin
if you cheat, you must know well that you cheat yourself not anyone else. If you don't learn
how to do the job so how can you do it?
every aspect of being a student contains some form of computer aspect whether online
assessments or on a word document etc. either way is an easy way of 'cheating' and would
be up to the student and lecturer to ensure this does not happen.
Copying and cheating may happen in an online course but only if you know someone doing the
same course.
The online assessment isn't a true test of practical application
Where I neither agree nor disagree I feel I am not in possession of the appropriate data to
evaluate this question effectively.
We are told we can talk and help each other and that results in 'cheating'.
I have never had any contact with another student and never copied anything.
I stand behind all work i hand in and do not cheat
6. The risks as I see them are…
e. I have come across examples of students deliberately seeking to be deemed
competent when they know they are NOT competent.
f. The meaning of 'plagiarism' is not easy to understand.
g. Plagiarism is a bigger problem when doing ONLINE assessments than when you are
doing assessments in-class or as written/printed homework.
h. ONLINE assessment students are more likely to inappropriately collaborate, copy and
cheat when doing their assessments than students assessed in other ways.
Please include any comments of a specific or general nature in the box below:
Avoiding plagiarism is a bigger problem than understanding it. Not every student will know
how to put referencing into practice as they may have come from a field where research was
not needed and therefore citations were not needed. More guidance on avoiding plagiarism
would not hurt rather than letting students seek out relevant information on the issue.
Most students protect their online tasks and don't share.
As previously mentioned I have completed a degree online and at times felt isolated
regarding my study. All tutors when contacted have always been receptive to any questions I
have had and helpful, so this is reassuring when you face any doubt about the unit being
studied.
Hard to make such a blunt statement on point 'h', but it's not hard to imagine how online
assessment may cultivate that kind of behaviour.
I believe if you feel you need or want to cheat to complete assessments, you'll find a way. It
doesn't matter whether its distance learning or classroom study.
When there is only one way of doing something is it plagiarism if you use that one method
that you saw someone else use? It's not that it's hard to understand, it's that the definition is
outdated and doesn't apply to all study areas. I don't think it would be very easy to get away
with plagiarism on the internet if the marking is done well, if you suspect someone is
plagiarising, copy the offending material, and search for it on the internet, if it's plagiarised it
shouldn't be too hard to find. There are also specific utilities for teachers to check for
plagiarism.
Plagiarism is easy. copying the work of others without citing or referencing appropriately. if
the work is online, then the student is more likely to copy and paste directly, as there is no
one to watch to see how long it takes them to do the work. this also means that chunks of
text that do not fit the language of the rest of the text can be copied in turn by the assessor
and searched in Google to determine such. and unless e-students are in contact with each
other, i would think that inappropriate collaboration, copying and cheating would be more
likely in a class room situation.
I think plagiarism is potentially a bigger problem with a report or essay type assignment
rather than an online multiple question or question and answer type assessment. Written
report and online reports are equally subject to plagiarism if they are not checked by
something like 'Turnitin' before they are submitted for assessment.
it is easier but any competent student won't need to plagiarize.
plagiarism is the biggest thing not to do but in some cases i disagree with it because i think
we should have a right to copy it but clarifying it what it really means and i think its good idea
provide examples so we can put all the evidence together before we write it.
Where I neither agree nor disagree I feel I am not in possession of the appropriate data to
evaluate this question effectively.
I think people need to remember that there are no new ideas and the same ideas are put
forward all the time in the work place i.e. when I go to remove a bolt I do not quote the
person who invented the technique. Unless you are conducting specific, unique research, is
there really a point to reference in a similar fashion to university levels?
7. To address the risks…
a. It is important to establish the true identity of a student when they enrol to study
ONLINE.
b. As a condition of any type of enrolment students should sign a statement committing
to not submit ‘false or misleading evidence of their skills and knowledge’.
c. To keep people honest a RANDOM SAMPLE of students assessed ONLINE should
be contacted and asked course related questions.
d. ALL students assessed ONLINE should be contacted and asked course related
questions before they are deemed competent.
e. Monitoring students when they do ONLINE assessment activities should be used to
help identify students who are copying or 'cheating' and not doing the work
themselves.
I think it all depends with the student for example now in diploma there are some quizzes so i
think they can also help to see if students are cheating.
I strongly disagree with most of these statements because, in my opinion, the
university/college role is not to be a "police". I would not feel well and comfortable in this kind
of environment.
7C, only if there is a suggestion of cheating or plagiarism.
Unfortunately this would be very hard to police and be time consuming. It may even negate
the benefit for the honest among us to do online study
Rather than contacting people out of the blue, which would be a bit random and unfair, there
should always be a final supervised written exam for online courses.
During my online studies, prior to submitting any work, all students had to agree to a
statement regarding plagiarism and cheating and that all work was their own.
No further checking up on individual students occurred to my knowledge regarding phone
call or person-person assessment.
As a mature age student, there is no advantage cheating or short cutting a subject. I have
undertaken the study to further my knowledge. To grasp the content it makes no sense to me
to cheat. Delving into the topic enables an online learner to understand the information
which is the whole reason for studying.
Don't annoy people by asking them stupid questions, if you're that worried, set up an online
word processor (or otherwise) that they have to use to do the work and log their work as they
do it.
I think all online students should be contacted but any of the above actions would need quite a
large number of resources and be hard to manage if there are a large volume of online
students.
If every student is going to be contacted, why do an online course to start with?
i agree with that students that drag and paste it should not competent because it might not
be the right answer but in some cases we should copy it so you look it up its history then
make it in your own words people who do it online i think it’s a great idea but when you try to
submit you have to find the other folders they could be up the top or bottom each week they
should place the latest assessments on the bottom so it’s not confusing
All above suggestions to address risks seem a good way to establish authenticity.
Go through questions with students individually and prompt them to correct written activities as well.
I think that asking students course-related questions should be at the discretion of the
lecturer, particularly if they can see the student is struggling when they submit written
answers, or if they have a suspicion that the student is not doing the work themselves; and
definitely contact with that student should be made before commencement of the course.
8. To address the risks…
f. EVERYONE should have to do at least one test in each unit under exam conditions to
ensure they are doing the work themselves.
g. There are people and agencies that will help you complete the assessment
requirements of a course without having to do the work yourself.
h. In the majority of workplaces employees who lack the skills and knowledge will be
identified and action taken.
i. There are NOT enough students who copy or cheat to justify inconveniencing ALL
students.
j. For every technique developed to reduce cheating in ONLINE assessments, those
who want to cheat will be able to find a way to work around or 'hack' the technology.
8f, exam conditions? As in timed? If so then yes, agree.
8j, possible but unlikely.
Questions marked with 'I don't know' either make no sense to me or I cannot relate to them.
I agree you can only address "opportunistic" cheating. Hardened cheats deliberately
manipulate any preventative systems in place.
There were exams as part of my degree. I had to attend a physical address and sit under
supervision among other students from other courses in a large function room. If you did not
attend that exam you failed, unless you had prior arrangements, in which case you would sit
a different exam the following study period. Cheating in exams was and is impossible.
During my online study, I have had a mixture of assessments, both exams undertaken at a
central venue and others where I have been assessed by work online. Both options worked
for me. It is more about grasping the content, and I believe both options assess this and
have worked for me.
Is it cheating to ask a friend for help when your question goes unanswered for a week by
your lecturer? If you don't want people to look to other sources for information, don't direct
them there (as is the case for the course I have been doing), and respond to questions
promptly or have FAQs for certain sections of the course. In my opinion most of these
problems could be minimised if you improve the quality of service to the students.
Everyone should do at least one classroom/supervised test, i.e. under exam-like conditions,
but I think the test should be, for example, a multiple choice or similar quiz-like test
(competency assessment) with 5 or so questions relating to each topic covered in class,
rather than an actual exam (some students find exams stressful and may be competent but
feel under too much pressure). So more like an in-class competency assessment than an
exam.
My final test, wiring a switchboard, is done on campus.
People who lack the knowledge and skills are really not learning. I think teachers should be
doing a lot more; they should be giving more information to the students before
addressing the assessment.
Cheating is a serious offence; one cheater compromises the integrity of the whole student
body and course. There will always be people trying to profit from others' laziness.
Exams should be open book, though, as I personally refer to notes to answer questions.
Where I neither agree nor disagree, I feel I am not in possession of the appropriate data to
evaluate this question effectively.
I started my course in NSW and they had tests for every unit. I found this really hard to
organise, and it inconvenienced the people who had to supervise me. I think that for a TAFE
qualification the assessments are fair, but possibly a different testing system would be
necessary for a university degree.
While there is a minority of students who would attempt to circumvent test situations,
once in a workplace, skills, knowledge and competency are quickly identified, and if a person
is lacking they won't make it very far. In such situations it is in an individual's best interests
not to cheat and to successfully complete the course of study they have enrolled in.
The purpose of an online study course is that it is available to all students who can't
physically attend classes themselves, due to work and other outside commitments or
distance from the colleges, and to people such as myself who have worked in the industry
for years and want to formalise their skills by studying for a certificate or diploma. Making
online students attend classes for exams as part of the course may cause a lot of
inconvenience for each student; they may have to take unpaid time from work (if they are
indeed allowed), travel long distances to attend etc., and this may discourage people from
enrolling in the first place.
9. Addressing the risks continued...
k. There needs to be a greater use of technology such as video, voice and keyboard
monitoring to address ONLINE cheating.
l. Only students enrolled in HIGH RISK courses should be monitored in some way
when they do ONLINE assessments.
m. The internet has a lot of resources that can help you complete assessments without
doing the work yourself.
n. Students who are serious about cheating will be able to find ways around any
technology designed to reduce cheating (e.g. they can find cheats and 'hacks' from
the internet).
o. Other (please specify)
9l, high risk? A better definition of high risk is needed.
Going online to do research and find information for the benefit of applying the knowledge to
achieve grades is work in itself. I'm unsure if my course is "HIGH RISK". I don't know
anything about hacking so I wouldn't know how others do it.
Using online resources in programming is a given. The course encourages it, but it's not so
much cheating as plagiarism that can be a problem.
Most online assessments were not monitored; I don't think it is available yet. Online
assessments shouldn't have to be. During an exam, the student would fail if they haven't
done any or enough work independent of the internet. I guarantee it; I was a student who
fell into that hole.
IT is great for developing and researching so you get a better understanding of the
information being studied. As a learner resources and reading greatly assist and allow you to
gain a broader understanding of a subject. Provided you highlight where your information
comes from (referencing) you can't really be labelled a cheat.
For any level of monitoring to be performed without consent would be unlawful. Appropriate /
relevant monitoring levels need to be clarified to students if this is occurring or was to occur.
I don't know about 'hacks' or companies that help with assessments so can't answer if they
exist or not. I think if you are prone to cheating or taking a short cut you will find a way to do
it.
People who cheat should be helped out by the teachers so they have the skills and
knowledge to redo it. As for technology, I think technology gives you a lot more information.
All students need to be monitored the same way. I firmly believe that cheaters always get
caught; it'll come out sooner or later.
Where I neither agree nor disagree, I feel I am not in possession of the appropriate data to
evaluate this question effectively.
Unsure of why people would cheat... I study to further my knowledge. It's disappointing to
hear that cheating is such a huge issue.
There is information on the internet, but you still have to read through everything to
search for the information that you need (in my opinion this only adds to your knowledge).
Also, if you are employed you have people to ask questions of to help you learn, so I don't
think that using the internet is cheating, just helping to expand your understanding, except if
you are hacking into a system to steal answers, which I know I don't have the skills to do!
It may be useful for lecturers to assess each student before commencement of an online
course by conducting an interview via phone, video etc. to obtain a basic understanding of
the individual's competency level, such as language skills, background, education and work
experience related to the course outline, before they begin studying. This will enable
them to somewhat determine whether a student is working at that level or not. Also,
information and resources on the internet can be obtained by ALL students, not just those in
online courses.
10. If you have any other comments or suggestions you would like to make in respect
to online assessment (or the use of any other computer technology in assessment)
please type your comments into the box below. Thank you again for taking the time to
complete this survey.
I find the learning materials and support provided very helpful. I feel that a big focus
should be placed on plagiarism; this should be detected in marking assessments and a
warning should be issued to individual students.
I do like doing online rather than in-class because it takes time pressure off a bit when
completing assessments. However, I think some units need to still have in-class
assessments to ensure they are done by the right person and make sure you know your stuff
and are completing the work.
I think that most lecturers have the knowledge and the electronic back-up to check for
cheating, and they should be able to check references for essays in an easier way, as some
who copy from others would be reluctant to correctly identify sources of references.
Another way to stop cheating online is to have a series of questions instead of a stock-standard
question that is used time after time.
It would be great if more lecturers used eCampus and Moodle to host quizzes; they are much
easier to fill out than using the conventional pen and paper method.
Plus, there is an electronic record when students access or submit assignments. That can
only be a good thing for lecturers.
I'm sure there are people / students out there trying to short cut the system. This will always
be the case. I completed my undergraduate degree two years ago in my late 40s and loved
every minute of studying. Hence my continuation of WHS today. To me, there is no reason
for cheating, and I believe I probably speak for a lot of other mature age students. It is the
challenge, knowledge and ability to keep the mind active which motivates me. Regardless of
the type of assessments that take place, judging all online learners as incompetent or cheats
just does not make sense to me, or why they would want to do this anyway. Online learning
enables people like myself, who have young families and are tight for time, the flexibility to
learn at a time that is suitable for me. This is a lot more convenient than having to attend
lectures. Regards.
In general, during the tests I did … I hardly had time to read, answer and check all questions.
So if I was not well prepared I would have failed anyway, as it is not possible to find answers
online within the given time frame.
Online assessment provides the means to achieve my goals within my working environment
and in my own time. I don't believe I can cheat, because the course is laid out in a way that you
have to complete each section. I can ask work mates for help, but to ask for that help I'm
admitting I don't know how to do it myself. If I did ask someone, I have to understand the
question and the answer anyway, which is still achieving the objective of the course.
There needs to be a way to show the assessment grades to the prospective employer.
I think online students are usually mature working students who would not otherwise have a
chance to attend college full time and are committed to gaining a qualification through hard
work
If someone wants to cheat they will. No amount of blocks or surveys will stop that. The
computers allow for honest students to research to help them with assignments, projects etc.
Perhaps an in-depth referencing course should be conducted in week 1 as a mandatory
component of high risk subjects.
This is all about cheating; people will always find a way to cheat if that is what they need to do.
This survey has really helped me to understand why people cheat by copying and pasting
information that doesn't belong to them. Online assessment makes it a lot easier for
students, but teachers need to inform us what we are doing this week; I find it pretty hard
trying to keep up to date with everything.
I would like to say how grateful I am that I can study online around work, and also that I can
finish my course sooner if I'm motivated, which does actually motivate me. I think the 'spot
quiz' idea is the most promising. You could also have a time limit, so once students start the quiz
they have to complete the questions in 5-10 minutes; this would stop students calling a friend,
internet searching etc., as they won't have time if they want to answer all the questions.
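The timed 'spot quiz' suggestion above could, in principle, be enforced server-side by timestamping when a quiz is opened and rejecting late submissions. The following is only an illustrative sketch under stated assumptions (the class name, the 10-minute limit, and the injectable clock are all hypothetical, not part of any existing LMS):

```python
import time

# Assumed example limit: the 10-minute window suggested by the respondent.
QUIZ_LIMIT_SECONDS = 10 * 60


class QuizAttempt:
    """Records when a student opens a quiz and rejects late submissions.

    The `clock` parameter is injectable purely so the behaviour can be
    tested without waiting in real time.
    """

    def __init__(self, limit=QUIZ_LIMIT_SECONDS, clock=time.time):
        self.limit = limit
        self._clock = clock
        self.started_at = clock()  # timestamp taken when the quiz is opened

    def submit(self, answers):
        """Accept answers only if submitted within the time limit."""
        elapsed = self._clock() - self.started_at
        if elapsed > self.limit:
            raise TimeoutError(
                f"submitted after {elapsed:.0f}s; limit is {self.limit}s")
        return answers
```

In practice an LMS such as Moodle offers a built-in quiz time limit that achieves the same effect without custom code; the sketch only shows the underlying check.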
I would like readings to be available to save onto my computer or smart phone to read offline.
I recognise there are many risks attached to this, but in past education a hard copy module
was supplied with study resources. It seems that the main risk with supplying a PDF copy of
readings would be copyright issues.
Thank you for providing the opportunity to provide comments relative to online study. I am
not sure the information I have provided was the intent behind the survey; however, any
feedback is good feedback.
I think that students should be able to print off all of their lectures/units to be able to study at
home and not have to constantly click the forward and back buttons to view the lectures
online.
I have been completing a Diploma of Management over the past 14 months. The course
material is relevant to my needs and I have found this course interesting. However, the
feedback I receive from my lecturers has been minimal … I feel the online student is not
getting the level of service that is necessary to gain the most from this method of learning.
Text books should still be a part of online study and should be made available for purchase.
Providing links to websites or You Tube clips as a replacement for text book information is
not enough. This is what sends students to the net in search of information in the first place.
If a student is cheating they are only really cheating themselves and I believe if they are not
competent in their qualification this will be evident in a workplace.
Internet resources are good for research and can be used without cheating.
it’s easy to cheat but still you have to study and prepare your own paperwork, it’s all same
about everywhere cheating ,if you sit in class and study. online study is more complicated
and takes long time. very less help. cheating somebody work no big help at all, student
needs to read through then only can answer questions.
and at work or anywhere else no other people have time for you do your assignments.
Online system is good way to doing course and people like me working full time can study
too in slow pace. Thanks
The screen is not user friendly... I would prefer to look at a whole screen than have to keep
scrolling every two sentences. Assessments should be thoroughly edited and completed by
more than one person to eliminate any mistakes...
It would be a lot easier if you were able to print out the information pages, or if they had
sub-headings, as I found it very time consuming and hard having to go back and forth from
my course to the information pages and then having to click through the numbered pages to
find the page that was relevant to my question.
Also, I found that if you stay logged in for more than an hour and are still typing your work, when
you go to save it, it just logs you out, making you have to re-type all of the work you have
already done. This has happened quite a few times to me already and is very frustrating and
time consuming!
The survey focuses on the issue of competency of some students due to the nature of
delivery, but I have studied both in class at TAFE and now online at TAFE. TAFE education
allows for a more open delivery of content than, say, university would. The only difference I
have encountered between online and face-to-face delivery of TAFE material is finding
appropriate time to complete assessments. Face-to-face assessments provide a time frame
within which the assessment should be completed, whereas the online assessments do not,
and you can feel you have been led blindly into an attempt which you thought was only going
to take 60 minutes to complete, but then takes three or four times that amount of time due to
the vagueness of some questions included. A guideline of how long the TAFE believes the
assessment may take would assist online students in setting aside an appropriate amount of
time to complete assessments.
There are a number of plagiarism-checking programs which universities use when students
hand in assignments. These programs scan assignments and compare submitted work to
journals, books and other students' work to pick up copying and cheating. TAFE should be
able to utilise the same type of software to stop a student from submitting work that is not
theirs, has incorrect referencing or breaks copyright law.
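Plagiarism checkers of the kind described above typically work by comparing overlapping word sequences between a submission and a reference corpus. As a crude illustrative sketch only (commercial tools use far larger corpora and more sophisticated fingerprinting; the function names and 3-word window here are assumptions, not any vendor's API), an n-gram overlap check might look like:

```python
def ngrams(text, n=3):
    """Return the set of lower-cased word n-grams in a text."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}


def similarity(submission, source, n=3):
    """Jaccard similarity between the n-gram sets of two texts.

    Returns 1.0 for identical texts, 0.0 for texts sharing no n-grams.
    A real checker would run this against every document in a corpus
    and flag submissions whose score exceeds a threshold.
    """
    a, b = ngrams(submission, n), ngrams(source, n)
    if not a and not b:
        return 0.0
    return len(a & b) / len(a | b)
```

A submission that lifts whole sentences from a source will share many 3-word sequences with it and score close to 1.0, while independently written work on the same topic scores near 0.0.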
2.3 Auditors
1. Assessment in general. The first part of this survey seeks your views more
generally on assessment in vocational education and training.
From my observations…
a. The proportion of all summative assessments relying on e-assessment has grown in
the last 3-5 years.
b. The growth in e-assessment is limited to a few industry skill areas.
c. The growth in e-assessment is predominantly at the higher level of AQF 4 and above.
(For level 4 definitions see: http://www.aqf.edu.au/aqf/in-detail/aqf-levels/)
d. The growth in e-assessment is predominantly at the lower level of AQF 3 and lower.
e. Knowledge based tests, and the online quiz, continue to be the dominant form of e-assessment.
f. The last 3-5 years have seen an increase in the use of social media in e-assessment
(e.g. blogs, Facebook and Twitter).
g. The last 3-5 years have seen an increase in the use of smart phones and tablet
technology as part of e-assessment.
h. I have seen e-assessment used to validly assess all four dimensions of competence
(task skills, task management skills, contingency management skills, job/role
environment skills).
i. I have seen e-assessment used to validly assess specific task skills (NB: this is a
narrow question about task skills and does not imply anything about the assessment
of the other dimensions of competence).
j. I have seen e-assessment used to validly assess knowledge, including knowledge of
skills required to undertake a task, but not assessment of the tasks themselves.
1b,c,d,e,f,g these questions are too broad to be answered in the context of this survey. If you
had asked "in my experience etc." I may be able to provide a response; however, I would still
question the applicability of the result to any specific argument.
1h,i - This is more reflective of the organisations I have seen using this form of assessment
rather than any specific flaw in e-assessment as an assessment method.
The information provided depends on the training package requirements at the time of the
audit
There is anecdotal evidence of online e-learning assessment increasing with the Cert III and
Cert IV hospitality (cooking) qualifications, not only in regard to the knowledge based
components but some of the practical aspects as well; the evidence being assessed is
made up from photographs and video film.
The only assessment that I have seen available online has been knowledge based
assessment. The RTO's own lack of understanding of the requirements for valid assessment
has resulted in them believing this to be the only form of assessment required to cover off on
both knowledge and skills based assessment.
2. Assessment in general - continued.
a. E-assessment practice tends to be associated with an INAPPROPRIATE reliance on
the KNOWLEDGE based assessment of competence.
b. E-assessment strategies CANNOT be used to validly assess the SKILL aspects of
competency.
c. E-assessment CANNOT be used to collect SUFFICIENT evidence to validly assess
competence.
d. E-assessment should only be used to SUPPLEMENT a summative assessment
decision.
e. The majority of e-assessment evidence I have seen used was appropriately
AUTHENTICATED.
f. The majority of e-assessment DESIGN that I have audited is valid, reliable, flexible
and fair.
g. The majority of e-assessment PRACTICE that I have audited is valid, reliable, flexible
and fair.
h. Organised cheating and fraud is currently a significant issue for the quality of e-assessment outcomes.
i. Cheating and fraud is a bigger problem in e-assessment than in traditional forms of
assessment.
There are some text based skills that could be assessed online.
A lot depends on the context of assessment and critical aspects of assessment, not just skills
and knowledge of the training package
E-assessment can be used to assess SOME aspects of skills, but not as the only form of
assessment.
Points B & C: disagree, as I include the ability to video and upload via websites as
e-assessment. This would be valid evidence collection for assessment purposes and is
electronic.
3. To the best of your knowledge, how many of your employees have been assessed
(fully or partly) using some form of online or other computer technology to do their
assessment?
a. Many students deemed competent through e-assessment are NOT actually
competent.
b. If a student is deemed competent when they are NOT this will be identified in most
workplaces.
c. I have come across examples of students deliberately seeking to be deemed
competent when they know they are NOT competent.
d. Plagiarism is a bigger problem in e-assessment than it is in the assessment of face to
face students.
e. E-assessment students are more likely to inappropriately collaborate on their
summative assessments.
f. For many students plagiarism is a literacy skill issue rather than a deliberate act of
cheating.
g. E-assessment is too risky to be used as part of a summative assessment strategy.
h. It is important to establish the true identity of a student when they enrol.
i. The only level of identity risk that will not undermine confidence in the VET system is
zero.
Whether a graduate would be identified in their workplace as not being competent in the
areas completed would greatly depend on the competency of the people within that
workplace to actually know the difference. Therefore, some would be identified within the
workplace and some would not.
g. It depends on the nature of the qualification. A sound technician could be observed lighting
a theatre show in Brisbane by an assessor in Adelaide. It requires bandwidth
Depends on the training package requirements
'Zero risk' does not exist. If the value of the outcome is higher than the cost of circumventing
measures to prevent fraud, there will always be fraud.
Point B: some workplaces are complicit in the practice of fraudulent assessment. As for the
rest, if employers/supervisors don't know exactly what the qualification specifics are, they
would not know whether they are being satisfied. They only care whether their specific
workplace needs are met, which are not always aligned with the qualification/s. Point I: this is
confusing and I cannot understand what you are trying to ascertain, hence the response of 'don't know'.
4. Please indicate the main area of study of your employees who have been assessed
online or using some other form of computer technology (e.g. business management,
education and training, office administration, hospitality, cooking, electrical,
automotive etc.).
a. As a condition of enrolment students should sign a witnessed statement committing
to not submit ‘false or misleading evidence of their skills and knowledge’.
b. As part of internal audit processes a RANDOM sample of e-assessment students
should be contacted and orally asked course related questions.
c. ALL e-assessment students should be contacted and orally asked course related
questions before they are deemed competent.
d. The use of timed assessment activities could help to identify anomalies
created by copying and collaboration.
e. It makes sense to modify the principles of assessment to replace the word FLEXIBLE
with the word CONSISTENT (ASQA 2013, p44).
f. The way Units of Competence are written is part of the reason for problems with
e-assessment.
g. It is important to establish the true identity of a student when they enrol.
The unit of competency is the outcome, and currently RTOs use this unit of competency as
the observation checklist, the questions etc. We need to look at the role and assess the role
to which the competency is linked; then map what we can see if we observe or simulate
that role; then create additional instruments to gather sufficient evidence to make a
judgement. So no, it is not the way in which units are written; it is the way in which
under-skilled Trainers/Assessors read them.
4B and C. If the intent of these questions is to state that there is concern that e-assessment
is not gathering sufficient, reliable and valid evidence of competency, then the
issue should be addressed within the assessment tool and associated process rather than
simply bolstered with other activities. If, however, you are asking whether a validation
process should be undertaken on assessment materials and methods used, then my answer
would be "agree".
All students should be randomly contacted, not just e-assessment students.
It needs to be made clear that there is no difference between e-assessment, and distance
delivery/assessment. Distance as a mode has been undertaken successfully for many
years. Providing an RTO manages the student in a similar way, there are no issues.
Employers are telling us, directly or via reports such as the Productivity Commission's, that
applicants for work who have 'achieved' a qualification do not have skills. This is largely
because of poor assessment by RTOs, and that includes reliance on e-assessment.
These same risks should be applied to all assessment
Identity verification is not an e-assessment issue; a significant number of RTOs do not verify
the identity of students in any way or at any stage of their training or assessment, even
where students attend physically.
Point A: this is already done and has been shown to be dysfunctional in preventing plagiarism.
Point C: this could only work where random sampling occurs of the works completed, and not
the same each time; students would know. In addition, the identity of the student would need
to be satisfied, as it would still be very easy for the student to have a stand-in pretend to be
them on a telephone.
I have met people that receive payment from students to complete their e-assessment for
them - typically in RSA and accounting type courses. Identity confirmation is a serious
problem.
5. To address the risks…
a. The assessment competence of assessors is part of the reason for problems with e-assessment.
b. All summative assessment should involve a test of competence under exam
conditions.
c. In the majority of workplaces performance management will identify employees who
were deemed competent when they lack the skills and knowledge required.
d. There are NOT enough students who engage in cheating and fraud to justify
inconveniencing all students.
e. For every technological strategy to reduce cheating there will be another
technological work around created.
f. There needs to be a greater use of technology such as video, voice and keyboard
monitoring to address e-assessment cheating.
g. A risk assessment approach should be used to determine when the above technology
should be used.
c - this would entirely depend on how well the performance management process is used at
the workplace, and whether they have a performance management process at all.
5A. This is a very broad statement. How could any response beyond "don't know" be
appropriate?
5B. Presuming that by "exam conditions" you mean appropriate rigour.
5C. Again, depends on the competency of the person conducting the performance
management process. Some would be identified and some would not.
These technologies must address real time assessment e.g. through Skype.
Assessment of skills needs to be in the workplace, as simulated workplace conditions are
usually inadequate.
Point D: we already have some employers being selective about students for work placement
based on the RTO they have attended. Best business practice should not be seen as an
inconvenience by RTOs but rather as a protection of their brand.
6. If you have any other comments or suggestions you would like to make in respect
to e-assessment, the risks, and possible ways to address these risks, please add these
comments in the box below (your time would be much appreciated) or email me directly at
Thomas.Morris@polytechnic.wa.edu.au
The amount of cheating and fraud that goes on within traditional assessment methods
remains unaddressed. We have been doing distance education for years; the only difference
was the hard copy medium meant that evidence took more time to come in.
I am finding that more and more organisations are realising that fully online is not the
panacea they thought it would be, and are tending more towards a blended approach, as
they find most students, even the most technologically advanced, require a portion of the
traditional approach to be blended with the e-assessment to achieve success; and they are
finding, themselves, that the blended approach is more administratively manageable.
Like all assessment tools, when developed and implemented correctly, e-assessment is a
fantastic mode of assessment. However, development includes understanding student
needs, understanding the skill of the assessor, utilising other tools to ensure validity, and
constant reviewing. It's much more expensive (to undertake appropriately) than some
RTOs and auditors think.
In considering the risks related to e-assessment, it is important to consider these relative to
what risks exist in relation to 'traditional' forms of assessment. It is my view that there is far
more actual risk in these more traditional forms of assessment, due to a combination of this
being by far the most predominant method of delivery and the lack of verification of identity in
general. Many opinions about lack of verification of identity are uninformed, based both on misunderstanding of how technology can address this and on dramatic overstatement of its prevalence, while ignoring the fact that there are significant failures in this area in more traditional delivery methods.
Whenever you allow a student to complete assessment outside the direct view of the assessor you increase the risk of plagiarism and fraudulent assessment practices. The only way you can minimise these practices is to implement strategies such as random checking of knowledge by the RTO; videos with strict conditions to be met; and use of online video conferencing for verbal examinations that match the student against the photograph (licence or passport) maintained on file. RTOs also need to be more innovative with their assessment design when they make the decision to remove the assessor from direct input in the process. At present they have little or no understanding of assessment design, which results in invalid, unreliable and insufficient assessment practices. They are ruled by quick turnover driven by funding availability rather than by quality training and assessment.
I honestly don't know how you get around the problems. The same problems are identified with face-to-face delivery: students turning up in place of an enrolled student; plagiarism; assessors substituting old assignments from other students; and assessments saved under the unit code rather than in the student file, so that at audit, if a student file is missing an assessment, the assessor just pulls one from the previous group.
My personal experience with auditing organisations that use e-learning and e-assessment is
that most organisations are not using them for quality educational outcomes. The primary
motivator for organisations using this delivery and assessment method is financial due to
perceived reduced overheads.
Most organisations do not actually understand sound pedagogical delivery and assessment
in the first instance (face-to-face or otherwise), and when combined with an 'e' environment
the problem is amplified. Most organisations do not seem to ask themselves the question of
"Just because we can do it, should we do it?"
I regularly see assessment mapping where organisations indicate that skills are assessed through written tests, written assignments, and so on. I believe the training packages need to
be much more specific regarding what mode of assessment is acceptable. Unless certain
things (often common sense things) are specifically written, ASQA auditors are unable to
stop unsound practices. This is due to being asked the question "Where does it say that an
organisation must..." when we are in the Administrative Appeals Tribunal. Again, I must
stress, this is even if the organisation's practices do not make educationally sound sense.
Tighter guidelines are required if educational quality is to be improved, and exploitation of the
system is to be reduced.
2.4 Assessors
4. From my observations…
a. The proportion of all summative assessments relying on e-assessment has grown in
the last 3-5 years.
b. The growth in e-assessment is limited to a few industry skill areas.
c. The growth in e-assessment is predominantly at the higher level of AQF 4 and above.
(For level 4 definitions see: http://www.aqf.edu.au/aqf/in-detail/aqf-levels/)
d. The growth in e-assessment is predominantly at the lower level of AQF 3 and lower.
e. Knowledge-based tests, and the online quiz, continue to be the dominant form of e-assessment.
f. The last 3-5 years have seen an increase in the use of social media in e-assessment (e.g. blogs, Facebook and Twitter).
g. The last 3-5 years have seen an increase in the use of smart phones and tablet technology as part of e-assessment.
h. I have seen e-assessment used to validly assess all four dimensions of competence
(task skills, task management skills, contingency management skills, job/role
environment skills).
i. I have seen e-assessment used to validly assess specific task skills (NB: this is a
narrow question about task skills and does not imply anything about the assessment
of the other dimensions of competence).
I believe e-assessment can validly assess all four dimensions of competence, but to do so a
range of technologies and methods needs to be used, and I am yet to see this being done. If e-assessment includes conferencing, authenticated collection of evidence via video, photos etc. in workplace scenarios, and use of simulations, then there seems no reason why e-assessment could not be valid.
E-assessment is a far better control for auditing purposes.
I do not personally use social media etc. or smart phones but I can see that they may have
valid uses.
I use e-learning as a blended delivery mode, i.e. face to face in the classroom with a computer. The units delivered have a significant practical component, so skills can be demonstrated on site; however, I am aware that students could send in videos of their practical skills.
Using e-assessment carries a much higher risk in terms of passing audits as it is not yet
widely accepted and is foreign to many so it is therefore excluded from validation and
moderation activities.
These are very broad questions and I can't answer with confidence in other areas, so this only relates to what I do.
E-learning is being misused and underused because of the ignorance and paranoia of ICT and the e-learning “helpers”. ICT are more concerned with securing their own jobs than with providing a good service.
e-assessment dominantly assesses the ability to comprehend written English in a
technologically complex environment, but in a conceptually simplistic way. True or false?
If you are unsure of what I mean - please note that you have no way of immediately checking
with me face to face, and please imagine that you have a lower level of education and
English is not your first language.
Most online assessment that I have seen has reduced assessment to low-order thinking skills that encourage surface rather than deep learning.
I have used industry provided and self-generated e-assessment of both knowledge and
skills, both formative and summative, since 1998.
5. The risks as I see them are…
a. E-assessment practice tends to be associated with an INAPPROPRIATE reliance on
the KNOWLEDGE based assessment of competence.
b. E-assessment strategies CANNOT be used to validly assess the SKILL aspects of
competency.
c. E-assessment CANNOT be used to collect SUFFICIENT evidence to validly assess
competence.
d. E-assessment should only be used to SUPPLEMENT a summative assessment
decision.
e. The majority of e-assessment evidence I have seen used was appropriately
AUTHENTICATED.
f. The majority of e-assessment DESIGN that I have audited is valid, reliable, flexible
and fair.
g. The majority of e-assessment PRACTICE that I have audited is valid, reliable, flexible and fair.
h. Organised cheating and fraud is currently a significant issue for the quality of e-assessment outcomes.
i. Cheating and fraud is a bigger problem in e-assessment than in traditional forms of
assessment.
The context of the assessment method varies between online areas. As I need a lot of practical work, it is authenticated through alignment with a project production pipeline.
There have been flaws appearing in the sufficiency of evidence. Also, when online assessment is combined with classroom delivery, validity is much easier to ensure.
All of these issues come back to the quality of e-learning and e-assessment resources and
methodologies. I believe f-2-f assessment is less prone to cheating due to lecturer scrutiny,
student authenticity and knowledge/relationship with students, but it is still a very valid
concern in the class. While both authenticity and practical skill demonstration can be met by e-assessment, this costs more time and money, and I haven't seen either issue addressed properly in my institution.
Traditional forms of assessment are tedious and time consuming.
I use e assessments but also bring each student in to sit an in class supervised test. This
way I know they have attempted the assessment and feel confident that they have the
required skills.
E-assessment is superior to paper-based assessment as it can deliver similar questions with different answers. Tests are more difficult to copy as no two tests are the same, unlike some paper-based versions that have been in use for decades.
Provided reasonable authentication methodologies are used, e-assessment would have the same validity as face-to-face assessment. Practical tasks can be assessed through e-assessment if the practical task is achievable within that medium, e.g. spreadsheet use. It is also valid when appropriate simulation devices are used, in the same way that simulated practical tasks are valid within a classroom environment.
We often would not know a student by sight.
Per-item global moderation against 250,000+ students in the program that I have used provides a significant degree of confidence in the assessment.
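The per-item global moderation described above can be illustrated with classical item statistics. The sketch below is illustrative only (the survey does not describe the vendor's actual moderation process): it computes each item's facility (proportion correct) and a simple top-half/bottom-half discrimination index, the kind of measures that flag leaked or miskeyed items across a large cohort.

```python
from statistics import mean

def item_statistics(responses):
    """responses: list of per-student lists of 0/1 item scores (equal length).
    Returns one (facility, discrimination) pair per item, where facility is
    the proportion of the cohort answering correctly and discrimination is
    the difference in facility between the top- and bottom-scoring halves."""
    n_items = len(responses[0])
    ranked = sorted(responses, key=sum)          # weakest to strongest totals
    half = len(ranked) // 2
    bottom, top = ranked[:half], ranked[-half:]
    stats = []
    for i in range(n_items):
        facility = mean(r[i] for r in responses)
        discrimination = mean(r[i] for r in top) - mean(r[i] for r in bottom)
        stats.append((facility, discrimination))
    return stats

# Items nearly everyone gets right (very high facility) or that stronger
# students get wrong more often than weaker ones (negative discrimination)
# are candidates for review.
```

Run over a large pool, a sudden rise in an item's facility is one signal that the item may have leaked.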
6. The risks as I see them are…
a. Many students deemed competent through e-assessment are NOT actually
competent.
b. If a student is deemed competent when they are NOT this will be identified in most
workplaces.
c. I have come across examples of students deliberately seeking to be deemed
competent when they know they are NOT competent.
d. Plagiarism is a bigger problem in e-assessment than it is in the assessment of face to
face students.
e. E-assessment students are more likely to inappropriately collaborate on their
summative assessments.
f. For many students plagiarism is a literacy skill issue rather than a deliberate act of
cheating.
g. E assessment is too risky to be used as part of a summative assessment strategy.
h. It is important to establish the true identity of a student when they enrol.
i. The only level of identity risk that will not undermine confidence in the VET system is zero.
In the online learning / assessment I have used there is virtually no opportunity for student
collaboration, or for students to form the kind of relationship that leads to sharing of
information, so in my experience this is more of an issue in the class.
The risk of summative e-assessment varies dependent on the unit being delivered. Some units may never be appropriate for fully online assessment, while others would be well suited.
Most people accept that where there is a system there will be some people who will thwart /
circumvent it. People have fraudulent Passports and Drivers licences - this doesn't
undermine our confidence in these systems. However, in all cases, a range of checks should
be employed to reduce fraud as much as is feasible.
It just needs proper management. Do not assume things.
It is more important to establish the identity of the student when they attend TAFE to sit an in
class test rather than at the enrolment time.
If students' assessments are completely online then there is a significant risk of cheating and fraud.
E-assessments can only be utilised as an additional form of assessment.
The AQTF level of assessment is open to too much interpretation.
E-assessment can also be face to face! The question could have been better.
There is less ability to plagiarise collaboratively within online systems than in traditional delivery, due to less interaction between individual students. Tools such as Turnitin can further reduce the likelihood of this happening. I believe plagiarism would be greater within traditional modes of delivery than online delivery due to the collaborative nature of traditional delivery models.
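Text-matching tools of the kind mentioned above broadly work by comparing overlapping word sequences between submissions. A minimal, hypothetical sketch of the idea (not Turnitin's actual algorithm) uses Jaccard similarity over word-trigram "shingles":

```python
def shingles(text, n=3):
    """Set of overlapping n-word sequences ('shingles') from a text."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity(a, b, n=3):
    """Jaccard similarity of two texts' shingle sets, from 0.0 to 1.0."""
    sa, sb = shingles(a, n), shingles(b, n)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)
```

A pair of submissions scoring above a chosen threshold would be flagged for review; the tool only finds overlap, and an assessor judges whether it is plagiarism.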
i. is an odd question/statement - not clear
Feedback from employers and lecturers both here and overseas is that employers are
increasingly not employing graduates who have completed qualifications online.
"Leaking" of some assessments by the 1,000,000 students and 32,000 instructors in 170
countries is a continuous challenge for the industry vendor that sponsors the program that I
deliver.
7. To address the risks…
a. As a condition of enrolment students should sign a witnessed statement committing
to not submit ‘false or misleading evidence of their skills and knowledge’.
b. As part of internal audit processes a RANDOM sample of e-assessment students
should be contacted and orally asked course related questions.
c. ALL e-assessment students should be contacted and orally asked course related
questions before they are deemed competent.
d. The use of timed assessment activities could assist in identifying anomalies created by copying and collaboration.
e. It makes sense to modify the principles of assessment to replace the word FLEXIBLE
with the word CONSISTENT (ASQA 2013, p44).
f. The way Units of Competence are written is part of the reason for problems with e-assessment.
g. It is important to establish the true identity of a student when they enrol.
b. and c. Oral course-related questions should be built into the teaching and assessment strategies. There are other useful techniques, such as video conferencing, recording and photographic evidence, that could also be used. If these were being used in delivery and assessment then I think statement (b) would be sufficient.
e. Reliable assessment is consistent. Flexible assessment addresses a completely different aspect of assessment, one that I believe is needed to enable assessment of people from all sectors of society.
It is important to establish that it is the enrolled student doing the assessments, not just their identity when they enrol. At least one assessment could be done under supervision (e.g. in the workplace or elsewhere).
A one-off confirmation when students enrol is not a bad idea.
Units of competence are currently being re-written.
E-learning students should not be assessed more than non-e-learning students; given that many assessments are done outside the classroom, any student has the potential to cheat.
There is no evidence to suggest issues with a person enrolling under another person's name. Why would this type of fraudulent activity be of great concern at VET-level qualifications, given the possible legal ramifications? What would they have to gain, as it would soon be evident in any job if the skills were not acquired, with those ramifications falling on those students? What would be the point for the student? Why do we victimise so many with unnecessary authentication and other challenges simply due to a belief that something may be happening at a high scale, when logic would say this is unlikely?
8. To address the risks…
a. The assessment competence of assessors is part of the reason for problems with e-assessment.
b. All summative assessment should involve a test of competence under exam
conditions.
c. In the majority of workplaces performance management will identify employees who
were deemed competent when they lack the skills and knowledge required.
d. There are NOT enough students who engage in cheating and fraud to justify
inconveniencing all students.
e. For every technological strategy to reduce cheating there will be another technological workaround created.
f. There needs to be a greater use of technology such as video, voice and keyboard
monitoring to address e-assessment cheating.
g. A risk assessment approach should be used to determine when the above technology
should be used.
b. Depends on the unit being delivered.
d. That would be the same as saying there are not enough people who drink-drive to justify inconveniencing everyone with breathalyser testing. If there is not a 'visible' range of effective strategies employed to reduce cheating and fraud, you can't expect people to have trust in qualifications. This is already an area of concern, as some states refuse to acknowledge some qualifications that have been obtained fully online.
e. Of course; just as every time something is regulated there will be people who devise workarounds. This doesn't mean the regulation should not be passed; it should ensure the majority comply.
For quiz-based assessments, I think it is better having the student in front of you.
The assessment should take place under the supervision of a lecturer or invigilator.
A qualification without the knowledge has no long-term purpose. Far too much emphasis is presently placed on cheating to get a qualification when there are much easier and cheaper ways to simply purchase the certificate: almost any university or TAFE certificate can be bought on the web for $50 to $100, and an employer would not know, nor do they check, whether it is authentic. Why would a student just wanting the piece of paper go such a long route, and to such expense, to enrol in a qualification simply to cheat and obtain a piece of paper falsely saying they can do something, when they can purchase the piece of paper much more easily and cheaply? I can understand small, strongly regulated areas such as the white card having the attraction to cheat, but in the main there is simply no real gain for a student in attempting this type of action.
Why rely on technology to monitor? Keep it simple rather than build complexity on
complexity.
9. If you have any other comments or suggestions you would like to make in respect of online assessment (or the use of any other computer or information technology in assessment), the risks, and possible ways to address those risks, please add them in the box below. Thank you again for taking the time to complete this survey.
Practical competencies require the observation of skills; this is difficult with online assessment. Options could be video recording or video-based observation of the skill being performed. Video monitoring of students conducting tests is used in the commercial certification environment (CompTIA, Adobe, Cisco et al.), where students sit in a certified test centre and a consistent testing environment is maintained. Commercial certification is done mainly via knowledge-based assessment. There are issues with the validity of this
approach, but it does provide an extensive check of the candidate's underpinning knowledge. Commercial certification also sets "competent" levels of between 70 and 85% correct for its assessments, the rationale being that no expert knows everything; instead, they know how to find the information.
I believe online learning and assessment is a key area for further development and utilisation worldwide. It offers many benefits. However, the methods we use to deliver and assess need appropriate resources and design, as well as verification of effectiveness to enable continuous improvement. Too often online is used for cost-cutting and time-saving, and the quality and effectiveness of the resources and assessments suffer because the appropriate effort hasn't been invested in their design or in verifying their effectiveness. These concerns are far broader than just e-assessment!
Online assessment spares a lot of time that may be used more effectively. Staff need to be fast-tracked in the use of online assessment. There is more control over assessments when online, and mapping is easier.
Consistency is important and as this task is being left to the lecturers with varying levels of IT
skills, you will always have differences and discrepancies.
This task should be undertaken by the college as a whole... (surely this would satisfy audit
requirements around holistic/same assessments for units across courses, campuses and
areas).
Appropriate software and technological equipment, as well as sufficient time to create online content and assessment, are necessary to successfully use e-assessments. Also, sufficient practice, time and support for the technology should be given to students.
Good that you are addressing this. I have found that a mix of assessments is best; formative assessment is OK online, but I am wary of relying too much on summative online assessment.
I have little or no confidence in online testing. What I have seen has been of a low standard
and open to cheating. Additionally it has only reflected lower order thinking skills and in most
cases only pertains to knowledge and not skills. Practical demonstration in front of a lecturer
or supervisor is the most appropriate method of assessment.
Cheating and plagiarism have become the norm in our society, due to the lack of support for teachers who make complaints about their occurrence and because penalties are not given to students who cheat or plagiarise.
This is exacerbated by competency-based learning, which has taken the onus off students to work hard to achieve a high standard of work. Instead, the majority of students do barely enough to be deemed competent, and if they don't achieve competence they can have a second chance; after completing the first attempt they know what to expect in the supervised assessment or test.
At least with supervised assessments the lecturer has some chance of picking up on
cheating and plagiarism.
References
AS/NZS ISO 31000:2009 Risk management – Principles and guidelines, Commonwealth of
Australia.
ASQA 2012, Standards for NVR Registered Training Organisations 2012, Australian Skills
Quality Authority, Commonwealth of Australia.
COAG ISC 2014, Communique for the COAG Industry Skills Council Meeting – 3 April 2014, http://www.natese.gov.au/__data/assets/pdf_file/0005/80519/COAG_Industry_and_Skills_Council_-_Communique_-_3_Apr_2014.pdf.
Crisp, Geoffrey 2011, Rethinking assessment in the participatory digital world – Assessment
2.0 (also referred to as Transforming Assessment), National Teaching Fellowship – final
report. Support for the original work was provided by the Department of Education,
Employment and Workplace Relations, an initiative of the Australian Government.
Docking, Russell 2013, Effective strategies for the competency-based assessment and RPL:
workshop participant manual, Innovation and Business Skills Australia.
FLAG 2013, Technology Innovations Applied Research projects: guidelines for applicants
and application form, flexiblelearning.net.au, Australian Government, Department of Industry,
October.
Kowszun, Jojo and Oscar Struijive 2005, ‘Risk assessment for the distributed e-learning regional pilots and Higher Education Academy Subject Centre projects’, Report 1: Guidance on risk, Cogency Research and Consulting Limited, United Kingdom, http://www.jisc.ac.uk/media/documents.
Morris, Tom 2014a, An Australian enquiry into the veracity and authenticity of VET online
e-assessment: a risk management approach to stakeholder concerns, New Generations
Technology website http://ngt.flexiblelearning.net.au.
Morris, Tom 2014b, An Australian guide to the risk management of VET online
e-assessment: a companion document to the research report into the veracity and
authenticity of stakeholder concerns, New Generations Technology website
http://ngt.flexiblelearning.net.au.
National VET E-learning Strategy 2012-2015, Department of Industry, Innovation, Science,
Research, and Tertiary Education, Australian Government, http://ngt.flexiblelearning.net.au.
The New Generation Technologies for Learning, Flexible Learning Advisory Group,
the National Advisory for Tertiary Education, Skills and Employment (NATESE), Department
of Industry, Commonwealth of Australia, http://ngt.flexiblelearning.net.au.
SNR Standards for NVR Registered Training Organisations 2012, Commonwealth of
Australia.
More Information
National VET E-learning Strategy
Email: flag_enquiries@natese.gov.au
Website: flexiblelearning.net.au
New Generation Technologies
incorporating E-standards for Training
Email: e-standards@flexiblelearning.net.au
Websites:
New Generation Technologies: ngt.flexiblelearning.net.au
E-standards for Training: e-standards.flexiblelearning.net.au