External Assessment and Certification

Certification and Examinations
Report, 2007
Trim: 2008/18516
Contents

Acknowledgements ... ii
List of tables ... iii
List of figures ... v
Overview of the external assessment and certification functions ... 1

Sections
1  Data collection ... 4
2  Registering students ... 6
3  Data trends and statistics ... 9
4  Setting the examination ... 15
5  Access for candidates with disabilities ... 17
6  Conduct of the examinations ... 21
7  Marking the examination scripts ... 26
8  Special considerations for candidates ... 29
9  Evaluation of the examination papers ... 31
10 Statistical process to achieve comparability ... 36
11 General achievement test ... 39
12 Certification of student achievement ... 40
13 Acknowledging excellence ... 43
14 Public relations ... 46

Appendices
A  Committee membership in 2007 ... 51
B  Consolidation costs in 2007 ... 53
C  Comments on the examination papers ... 54
D  Year 12 state statistics in 2007 ... 57
E  Statistical report on the examinations – enrolments ... 60
F  Statistical report on the examinations – performance ... 69
G  Statistical report on the WACE ... 81
H  Student numbers – WSA subjects: 2000–2007 ... 82
I  Practical components in examinations ... 85
J  Marks management in the 2007 Engineering Studies examination ... 90
K  Scaling procedure for Arabic and Hebrew ... 94
L  Relativities of scaled marks in Mathematics subjects ... 96
M  Structured Examination Designs – the use of linkage to place non-common assessments on a common scale ... 98
Acknowledgements
This report has been compiled by the following people who were involved in
the external assessment and certification of Western Australian students
during 2007:
Jenny Morup, Manager, Certification and Examinations
Lauren Miles, Administration Assistant
Marlene Hall, Administration Assistant
Rachel Schollum, A/Administrative Assistant
Examinations Team
Alan Honeyman, Senior Consultant, Measurement and Research
Barrie Chick, Coordinator Examinations (Written)
Beryl Bettell, Coordinator Examinations (Practical)
Carolyn Hackett, Coordinator Examinations (Special Provisions)
Con Coroneos, Senior Consultant, Humanities
Cristina Caruso, Exams Support Officer (Supervisors)
John Van Wyke, A/Measurement and Research Consultant
Kelly Hourston, Assessment Officer (Exams)
Kerrie Ward, Administration Assistant
Kerry Tarrant, Exams Development Officer (Papers)
Lyn Sadleir, Coordinator Examinations (Marking)
Selina Mosbergen, Exams Support Officer (Marking)
Data Management Team
Andrea Schwenke, Administration Assistant (RSLA)
Chris Lee, Data Administrator
Jenny Offer, Certifications Officer
Kathy Pilkington, Senior Consultant, Certification and Examinations
Lynn Galbraith, Project Officer (RSLA) – from 22 September 2007
Marie Parker, Data Coordinator
Ron Grimley, Project Officer (RSLA) – until 21 September 2007
Veronica Wimmer, Clerk/Typist
Tables

Report
Table 1: School registrations, 2007 ... 6
Table 2: Participation of students born in 1991 (16 years old in 2007) at school and in non-school programs, 2007 ... 8
Table 3: Year 12 Aboriginal/Torres Strait Islander enrolments, 2003–2007 ... 10
Table 4: Change in the number of students enrolling for the examinations, 2001–2007 ... 10
Table 5: Examination enrolments, as at October, 2003–2007 (one examination or more) ... 10
Table 6: Examination enrolments, as at October, 2005–2007 (four or more examinations) ... 11
Table 7: Change in the number of students who sat the TEE/WACE examinations, 2003–2007 ... 12
Table 8: Year 12 enrolments in at least one unit of competency, 2003–2007 ... 13
Table 9: Units of competency studied by Year 12 students, 2003–2007 ... 14
Table 10: Distribution of special examination arrangements applications, 2006–2007 ... 18
Table 11: Special examination arrangements by disability category, 2007 ... 19
Table 12: Marking of practical examinations, 2007 ... 26
Table 13: Sickness/misadventure applications by sector and gender, 2007 ... 29
Table 14: Outcome of sickness/misadventure applications, 2004–2007 ... 30
Table 15: Sickness/misadventure applications by location, 2004–2007 ... 30
Table 16: Distribution of evaluation comments on examination papers, 2007 ... 34
Table 17: School/subject cohorts, 2000–2006 ... 37
Table 18: Achievement of a WACE, 2001–2007 ... 40
Table 19: Special consideration for a WACE, 2007 ... 41
Table 20: Number of exhibition and award winners, 2007 ... 45

Appendices
Table A1: 2007 TEE/WACE examinations costs consolidation ... 53
Table A2: Examination statistics ... 58
Table A3: Wholly school-assessed subjects ... 58
Table A4: VET studies ... 58
Table A5: WACE course units ... 59
Table A6: Number of candidates sitting the TEE/WACE examination in each subject/course, 2004–2007 ... 60
Table A7: Candidates in each TEE/WACE examination, shown as a percentage of total candidature, 2003–2007 ... 61
Table A8: Total number and percentage of full-time and part-time enrolments in TEE subjects/WACE courses, 1983–2007 ... 62
Table A9: Age of enrolled students in 2007 TEE subjects/WACE examination courses by enrolment type and gender ... 63
Table A10: Private candidature and absent private candidates in TEE subjects/WACE examination courses, 2006–2007 ... 64
Table A11: Number and percentage of background candidates in TEE subjects, 2003–2007 ... 65
Table A12: Number and percentage of candidates sitting for a specific number of TEE subjects/WACE examination courses, 2004–2007 ... 65
Table A13: Enrolments, absentees and non-examination candidates in each TEE subject/WACE examination course, 2007 ... 66
Table A14: Number of anomalous performers identified in each TEE subject/WACE examination course, 2006–2007 ... 66
Table A15: Subject/course absentee and anomalous performer rate in relation to date of examination, 2007 ... 68
Table A16: Summary statistics on examination papers, 2007 (2006 statistics in parentheses) ... 70
Table A17: Mean and standard deviation of moderated school assessments, raw examination marks and correlation coefficients for the TEE subjects/WACE examination courses, by gender, 2007 ... 71
Table A18: Mean and standard deviation of combined marks and scaled marks for TEE subjects/WACE examination courses, by gender, 2007 ... 74
Table A19: Relationship between raw examination marks and standardised examination marks for TEE subjects/WACE examination courses, 2007 ... 77
Table A20: Relationship between combined marks and scaled marks for TEE subjects/WACE examination courses, 2007 ... 78
Table A21: Subject loading for each TEE subject/WACE examination course, 2006–2007 ... 79
Table A22: Ranges of scaled marks corresponding to decile places in TEE subjects/WACE examination courses, 2007 ... 80
Table A23: ‘Typical’ school students achieving a WACE, 2006–2007 ... 81
Table A24: Number and percentage of students who sat the Curriculum Council English language competence test, 2007 ... 81
Table A25: Student numbers – wholly school-assessed subjects, 2004–2007 ... 82
Table A26: Concurrent validities for examinations, 2005–2006 ... 86
Table A27: Summary of differences in combined marks resulting from inclusion versus exclusion of practical component in external examination mark ... 87
Table A28: Differences when external examination excludes versus includes practical component ... 88
Table A29: Data structure – Engineering Studies, 2007 ... 92
Table A30: Calculation of scaled scores for Hebrew, 2007 ... 94
Table A31: Calculation of scaled scores for Arabic, 2007 ... 95
Table A32: Relativities of scaled marks in mathematics subjects: summary data ... 96
Figures

Report
Figure 1: Total registrations, 2006–2007 ... 6
Figure 2: Student registrations by sector, 2006–2007 ... 7
Figure 3: Student registrations by gender, 2006–2007 ... 7
Figure 4: Status of Notice of Arrangements received, 2007 ... 8
Figure 5: Number of Year 10 students who enrolled in subject/course units, 2001–2007 ... 9
Figure 6: Number of students enrolled for a specific number of examinations, 2003–2007 ... 11
Figure 7: Special examination arrangements applications as a percentage of enrolments, 2006–2007 ... 18
Figure 8: Special examination arrangements applications by location, 2006–2007 ... 18
Figure 9: Number of applications per school/college by sector, 2007 ... 19
Figure 10: Participation in the English language competence test, 2007 ... 22
Figure 11: Marks adjustment process ... 36
Figure 12: Post-results counselling, 2007–2008 ... 47

Appendices
Figure A1: Concurrent validity, 2000–2006 ... 86
Figure A2: Data analysis for Engineering Studies, 2007 ... 90
Figure A3: Raw score to ability conversion ... 93
Figure A4: Engineering Studies 2007, summary of mean scores ... 93
Figure A5: Hebrew 2007 ... 95
Figure A6: Arabic 2007 ... 95
Figure A7: Music: structure of written examination ... 100
Figure A8: Music: structure of practical examination ... 101
Overview of the certification and examinations functions
Outcome 6: Assessments of student achievement for the senior secondary certificate are valid and credible.
Outcome 7: Accurate information on student achievement is provided to inform the community and to facilitate post-school choice.
Curriculum Council Annual Report
Major legislative and education policy changes have occurred over recent years. The
raising of the school leaving age, the introduction of new course examinations and the
move to compulsory examinations were predominant focuses for certification and
examination activities in 2007, and these will continue in 2008.
In 2007, the Curriculum Council certified the achievement of approximately 45,000
students studying for the Western Australian senior secondary certificate. There were
nearly 14,000 candidates who were enrolled to sit at least one TEE/WACE
examination. There were thirty-eight different examinations, of which four were for WACE courses (three of them examined for the first time), four were interstate language examinations, and the remainder were for tertiary entrance subjects.
Significant achievements for 2007 included:
 Introduction of WACE examinations for English, Engineering Studies and Media Production and Analysis.
 Introduction of two new language subjects, Arabic and Hebrew, and the development and successful application of appropriate manual scaling procedures in collaboration with TISC.
 Introduction of a practical component to the Aviation examination via computer simulation.
 Introduction of innovative scaling techniques to successfully equate examination performances in the six different combinations of stage and context in the Engineering Studies examination.
 Continued development and refinement of the collection of data relating to student registration and student demographics, enrolment and results on the Student Information Record System (SIRS).
 Development and maintenance of student records for Year 8–10 students in light of the raising of the school leaving-age legislation. In 2007, 141,000 students were registered with the Curriculum Council.
 Maintenance of data relating to the participation of nearly 30,000 Year 11 age students in all programs.
 Monitoring of significant data trends:
o the number of Year 10 students enrolled in senior school programs increased markedly in 2007
o a gradual decrease in the number of government school students sitting the examinations
o the number of students who completed at least one unit of competency decreased in 2007.
 Continued monitoring of Aboriginal student enrolments, with regular meetings held with system/sector personnel.
 Provision of special examination arrangements for 276 candidates with disabilities and consideration of 342 applications for sickness/misadventure during the examinations.
 Consideration of 173 applications for students who were at risk of not achieving the WACE due to the implementation of the new course units.
 Coordination and analysis of the General Achievement Test for 14,659 students who were studying new WACE course units.
 Finalising of the WACE requirements and the awards and exhibition policy for implementation in 2008 and beyond.
 Recording of 3,337 (2006: 2,344) VET qualifications on 2,401 (2006: 1,782) Year 12 students’ statements of results. Of these, 55 (2006: 77) qualifications were achieved through a traineeship.
 Granting of 997 exhibitions and awards to 713 students.
Specific actions were undertaken during 2007, as recommended in the 2006 Certification and External Assessment Report:
 A series of workshops was held with schools to work through the SIRS procedures.
 The policy and guidelines for special provisions for examination students were refined and communicated to schools.
 Features of the SIRS–External Assessment database were developed to enable more functions relating to the conduct of the examinations to be carried out using this database.
 Country examination supervisors were encouraged to attend the training workshop for chief supervisors.
 A new form for the reporting of alleged breaches of examination rules was introduced.
 All optical mark reader (OMR) marks collection sheets were successfully replaced by the new teleform technology.
 On-line marking was used for the marking of the Media Production and Analysis scripts.
 Scaling methodologies were researched and the most appropriate method used to produce scaled scores for Hebrew and Arabic.
 Aspects of the exhibition and awards policy and guidelines were reviewed to incorporate achievement in course units.
In meeting these achievements the Curriculum Council was supported by the following panels and committees:
 Examining panels – one for each Western Australian TER subject/course (34 panels)
 Special Examination Arrangements Committee and Appeals to Special Examination Arrangements Committee
 Sickness/Misadventure Committee and Appeals to Sickness/Misadventure Committee
 Examination Breaches Committee and Appeals to Examination Breaches Committee
 Awards Working Party and Awards and Exhibitions Committee
 Special Provisions Committee
In addition to this, over 1,400 casual staff were employed to:
 provide quality assurance for the examination papers
 provide directed analysis and research of psychometric issues
 assist with the various enrolment, results and examination dispatches
 supervise the examinations
 mark the practical components of examinations
 sort the examination scripts
 mark the scripts from written papers
 identify exhibition winners.
Positive feedback, both written and verbal, has been received by many staff members
regarding their efficient and friendly service. Schools, parents, examination
candidates and the general public have commented on the clear, concise and accurate
advice they have received from the Secretariat regarding examinations and
certification.
Priorities for 2008
In reflecting on the 2007 examination and certification process together with the
move towards compulsory examinations, the Certification and Examinations branch
will undertake the following during 2008:
 Develop 71 sample and final examination papers, compared with 34 in 2007.
 Streamline the development of sample papers and associated materials, and the final examination papers.
 Facilitate regular SIRS training sessions for school personnel to accommodate the variety of software at schools and the transience of school staff.
 Extend the monitoring of people involved in school and non-school programs to include those with 1992 birth dates (17 year olds).
 Introduce strategies to encourage schools to submit data by the required dates, as outlined in the WACE activities schedule.
 Extend the on-line marks collection process and on-line marking to more subjects/courses.
 Explore alternative formats for examining the practical components of courses.
 Engage an independent person to review the development of the Engineering Studies and Media Production and Analysis examination papers.
 Engage an independent person to review the on-line marking of the Media Production and Analysis scripts.
 Review the process for approving special examination arrangements for candidates with specific learning difficulties.
 Explore the feasibility of developing an on-line application and response process for special examination arrangements.
 Explore the feasibility of using barcode reader technology for the sorting of examination scripts.
 Explore ways to improve understanding of marks adjustment processes in schools and the general community.
 Certify Year 12 students to accommodate the new WACE requirements and the new English language competence requirement.
 Refine the SIRS printouts to allow schools to monitor whether Year 12 students have met WACE requirements.
 Review the policy for the granting of exhibitions and awards in order to acknowledge outstanding achievement of VET, Aboriginal and Torres Strait Islander students, and students with ESL/ESD background.
Section 1: Data collection
The process for collecting student registrations, enrolments and results was comprehensive,
thereby enabling accurate information on student achievement to be reported.
In 2007, the Student Information Record System (SIRS) became the Curriculum
Council’s main database for the collection, storage and reporting of student data. This
database was designed and built by independent companies in consultation with the
Curriculum Council. Work on this database commenced in 2004.
Data collection process
During 2007, the Curriculum Council modified the student information record system
(SIRS) to enable the collection of data relating to the WACE course units as well as
subjects and VET data, including endorsed programs. Schools uploaded their student
registrations and demographics, course units, subject, and VET enrolments, as well as
course, subject and VET achievements into SIRS.
Following feedback from 2006 training workshops in SIRS, it was decided to run a
series of ‘hands-on’ sessions for school administrators. Five workshops were
conducted at the Curriculum Council and nine workshops in country locations –
Albany, Bunbury, Geraldton, Kalgoorlie and Narrogin. Forty-seven metropolitan
schools and fifty-two country schools attended the seminars.
The workshops were interactive and two-and-a-half hours in length. Metropolitan
participants used simulated school data to complete activities in SIRS that enabled
them to use the system, as various aspects of the software were demonstrated.
Participants in the country workshops used their own school data, which was highly
beneficial in identifying issues that needed to be addressed and certainly made the
workshops relevant. In addition, all schools received a CD: ‘Student Information
Record System a how–to guide’.
Feedback from participants indicated that these training workshops were useful and
had assisted their understanding of SIRS. Schools requested that the workshops be
held again in 2008 to enable new staff to access the training. The cost of these
training workshops (covering fax stream use, hire of laptops, catering, travel, CDs
and postage) was $6,294.
Over the year, through visits to schools, telephone conversations and emails, various
issues were raised. Meetings were held with school representatives in an effort to
identify problems, discuss possible solutions and identify strategies for improvement,
both for schools and for the Curriculum Council.
The Curriculum Council help desk was expanded to assist schools that were having
problems with the upload of their data onto SIRS. A number of new reports were
added into the system, enabling schools and Curriculum Council staff to access
information easily.
Despite this additional training, additional help desk support and the refinement of processes, data collection issues persisted. These issues included:
1. Accuracy of the student number
Duplicate student numbers (students who have been issued with more than one student number) again caused problems in 2007 by preventing students from meeting WACE requirements, because their results cannot be accumulated across numbers. Students need to be made aware of the importance of knowing their student number and of advising schools when they transfer or move to another provider. Schools need to be reminded of the consequences of students having more than one number.
In addition to duplicate numbers, there were a small number of instances when one
student was issued with the same number as another student (who had left school in
previous years).
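The kind of check involved is straightforward to automate. The sketch below is illustrative only: it assumes a hypothetical registration upload in CSV form with 'student_number', 'surname', 'given_name' and 'date_of_birth' columns, since the actual SIRS upload format is not described in this report. It flags any student number that appears against more than one distinct student identity; the inverse check (one student holding several numbers) can be run the same way by grouping on the identity instead.

    import csv
    from collections import defaultdict

    def conflicting_student_numbers(path):
        """Return student numbers attached to more than one distinct student
        identity in a registration upload (hypothetical CSV layout)."""
        identities = defaultdict(set)
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                number = row["student_number"].strip()
                identity = (row["surname"].strip().upper(),
                            row["given_name"].strip().upper(),
                            row["date_of_birth"].strip())
                identities[number].add(identity)
        return {n: people for n, people in identities.items() if len(people) > 1}

    if __name__ == "__main__":
        for number, people in sorted(conflicting_student_numbers("registrations.csv").items()):
            print(number, "is shared by:", sorted(people))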
2. Missing/incorrect demographic data
The past problems of missing demographic data seem to have been largely overcome
in 2007. However, there were several instances of incorrect demographic information
uploaded by schools. The system will give an error if a suburb is incorrectly spelt or
does not exist, but some students, particularly from overseas, have been uploaded
with an address that does not exist. This caused problems for the delivery of their
Statement of Results and Certificates. Schools need to be reminded to ensure that all
this data is correct and any changes to demographic data are also uploaded with the
registration file on a regular basis.
3. Failure to meet data collection deadlines
A number of schools continue to have trouble meeting the due date for their data.
Schools will be reminded to plan to meet the due dates through the WACE activities
calendar and at the SIRS training workshops. Follow-up telephone calls, emails and
faxes are used to inform schools that they have missing data. This is time-consuming and
can be stressful for schools and Council staff.
4. Failure to provide the VET qualifications achieved by students
The upload of VET qualifications from schools is still an issue. Many schools fail to do
this and the students then miss out on getting this information recorded on their
Statement of Results. This information also impacts on the school performance tables
which give statistics on VET qualifications. Schools have indicated that it is
sometimes difficult to get the RTO or training provider to give them the information in
time to meet Curriculum Council deadlines.
5. Impact on school resources
Modifications have been made to SIRS, and to SIS and MAZE, to reduce the number of separate data uploads required. This should reduce the time schools need to spend on these uploads.
Continuing and future improvements
Regular SIRS training sessions for school personnel to accommodate the variety of
software at schools and the transience of school staff will be facilitated.
Strategies to encourage schools to submit data by the required dates, as outlined in
the WACE activities schedule, will be introduced. In particular, work will continue with
system/sectors and schools to ensure all VET data is submitted in time for certification
of Year 12 results.
The WACE Procedures Manual has been developed and was distributed to schools at the beginning of the 2008 school year. In future, this manual will be updated and
distributed at the beginning of each year.
Section 2: Registering students
In accordance with the Acts Amendment (Higher School Leaving Age and Related Provisions)
Act 2005, the Curriculum Council registered all students from Year 8 to Year 12.
During 2007, 141,054 students from Year 8 to Year 12 were registered with the
Curriculum Council. Of these, there were approximately 52,000 students enrolled in
at least one Curriculum Council subject, WACE course unit or VET unit of competency.
Enrolments were received from all registered Western Australian senior high schools,
senior colleges, some remote community schools, some district high schools, some
Education Support Centres, the School of Isolated and Distance Education, four
Malaysian schools, one Singaporean school, one Indonesian school, one Vietnamese
school and two Chinese schools.
In accordance with the legislation, the Curriculum Council established a record of Year
8 students and continued to maintain a register of all students attending secondary
education. Students were registered from Western Australian senior high schools (government and non-government), senior colleges, some remote community schools, some district high schools, some Education Support Centres, the School of Isolated and Distance Education, eight overseas schools and children who were home schooled. Table 1 summarises these registrations.
Table 1: School registrations, 2007

                    Year 8    Year 9   Year 10   Year 11   Year 12     Total
Government          17,086    17,629    18,080    17,280    15,171    85,246
Non-Government      12,026    11,643    11,660    10,255     9,195    54,779
Other                  130       119       111       217       452     1,029
Total               29,242    29,391    29,851    27,752    24,818   141,054
Figure 1 below shows that 2,308 more students were registered in 2007 compared to
2006, an increase of 1.6%. There was a 3% increase in the number of students
registered from non-government schools.
Figure 1: Total registrations, 2006–2007
Figure 2 compares registrations by sector for all year levels from 2006 and 2007. This
figure illustrates that the registrations have remained relatively constant for the two
years. The only noticeable difference is in Year 12 where there was an 8% increase
from government schools and a 6% increase from non-government.
Figure 2: Student registrations by sector, 2006–2007
A further point of interest is shown in the ratio of male to female students in each
year group for 2006 and 2007. In 2007, the number of males decreased in Year 11
by 354 (2.45%) but increased in Year 12 by 825 (6.91%). Similarly, for 2007, there
was a decrease in female Year 11 students of 360 (2.56%) and an increase in female
Year 12 students in 2007 of 834 (6.46%). The only year group showing more female
students than male students attending school, was the 2007 Year 12 cohort.
Figure 3: Student registrations by gender, 2006–2007
In 2007, students were required to either remain at school or participate in other
approved programs, including apprenticeships/traineeships, TAFE/RTO courses or
employment until the end of the year they turned 16 years of age. For 2007, this
involved students with a birth date between 01/01/1991 and 31/12/1991.
Data collected regarding the participation of students in programs during 2007 are
shown in the following table. When comparing these statistics with the 2006 data, the
level of participation in the various programs is similar. It is worth noting that only
2.5% (747) of the children were not participating in a program in 2007 compared to
3.4% (997) in 2006.
Table 2: Participation of students born in 1991 (16 years old in 2007) in school and in non-school programs, 2007

                                   School   TAFE/RTO   Apprenticeship/   Employment   No program     Total
                                                       traineeship
Full year participation (1)
  School (2)                       24,666                                                            24,666
  TAFE/RTO                            556        347                                                    903
  Apprenticeship/Traineeship          494         16              495                                 1,005
  Employment                          476         74               32          225                      807
  Sub-total                        26,192        437              527          225                   27,381
Part year or no participation (3)
  School                               21                                                                21
  TAFE/RTO                              8         20                                                     28
  Apprenticeship/Traineeship            1          5               11                                    17
  Employment                            2          1                0           56                       59
  No program                    1,195 (4)          0                0            0          747       1,942
  Sub-total                         1,227         26               11           56          747       2,067
Total                              27,419        463              538          281          747      29,448

1 Includes students who have participated in a program or programs for more than 9 months.
2 Includes 14,713 students at government schools, 9,893 students at non-government schools and 60 students in home education programs.
3 Students whose record shows left provider/left secondary education, or who were not re-registered at all in 2007, are included in these categories.
4 Includes 3 deceased students, 2 students who have left Western Australia and 1,190 who have left school with no other provider recorded.
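As an illustrative cross-check (not part of the Council's own processing), the sub-totals and grand total in Table 2 can be reproduced by summing the participation cells shown above; the lists below hold only the non-blank cells of each row.

    # Non-blank cells of each row of Table 2, in column order
    # (School, TAFE/RTO, Apprenticeship/Traineeship, Employment, No program).
    full_year = {
        "School":                     [24666],
        "TAFE/RTO":                   [556, 347],
        "Apprenticeship/Traineeship": [494, 16, 495],
        "Employment":                 [476, 74, 32, 225],
    }
    part_year = {
        "School":                     [21],
        "TAFE/RTO":                   [8, 20],
        "Apprenticeship/Traineeship": [1, 5, 11],
        "Employment":                 [2, 1, 0, 56],
        "No program":                 [1195, 0, 0, 0, 747],
    }

    full_total = sum(sum(cells) for cells in full_year.values())
    part_total = sum(sum(cells) for cells in part_year.values())
    assert full_total == 27381               # full-year sub-total (Total column)
    assert part_total == 2067                # part-year sub-total (Total column)
    assert full_total + part_total == 29448  # all monitored students born in 1991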
Young people undertaking alternative programs to full-time school are required to
apply for a Notice of Arrangement through the Participation Directorate at the
Department of Education and Training. Once the Notice of Arrangement has been
approved by the Participation Directorate, the Curriculum Council is notified and
details are placed on the student’s record.
The following chart shows the number of students who applied for a Notice of
Arrangement in 2007 (4,040), and how many were approved, cancelled, rejected or
currently pending (information current at December 2007).
Figure 4: Status of Notice of Arrangements received, 2007
Continuing and future improvements
The age for participation in approved programs will increase to 17 years in 2008. This
is in accordance with the Acts Amendment (Higher School Leaving Age and Related
Provisions) Act 2005. This means that monitoring in 2008 will need to cover both
1991 and 1992 birth dates.
A close association will be maintained with the system/sectors and TAFE colleges
throughout 2008 to further streamline the monitoring of students.
Section 3: Data trends and statistics
Trend data and statistics are provided in the public interest and to assist the Curriculum
Council, system/sectors and schools in their planning.
During the year, over eighty requests were made from a wide range of sources for
data on a variety of areas including performance of schools, subject enrolments and
trends in student achievement.
For the first time ever, schools were able to produce reports, via the internet (SIRS),
that related to their students’ current registration, enrolments, examination
arrangements and results. Considerable work was undertaken to ensure that these
reports were designed in a format that was accessible and useful to schools.
Previously, schools received some of this information in large computer printouts.
This information was accurate only at the time of printing, whereas now schools are
able to generate accurate reports when required.
Enrolment trends
There has been an upward trend in the number of students completing the Western
Australian Certificate of Education. In 2007, there were 216 Year 12 students who
achieved a WACE over three consecutive years (from their studies in Year 11 and Year
12).
Year 10 student enrolments
Figure 5 below shows the number of Year 10 students who enrolled in Year 11 (D
code) Curriculum Council subjects from 2001 to 2006 and in Year 11 (D code)
subjects and at least two WACE course units in 2007. There were 28 Year 10
students who were enrolled in E code subjects in 2007. Of these, 13 students were
enrolled in at least one TEE subject as an examination candidate.
Figure 5: Number of Year 10 students who enrolled in
subject/course units, 2001–2007
Year 12 Aboriginal/Torres Strait Islander student enrolments
Table 3 below indicates the number of Year 12 Aboriginal and Torres Strait Islander
students who enrolled with the Curriculum Council in 2007. The number of
enrolments continued to increase each year until 2006, when there was a decrease.
This may be a reflection of the change in the method of collecting data relating to
Aboriginal and Torres Strait Islander students. In 2007, the number of enrolments
increased. This may be attributed to the communication with schools explaining the
importance of data provided to the Curriculum Council.
Table 3: Year 12 Aboriginal/Torres Strait Islander enrolments, 2003–2007 (number of students)

Ethnicity                                  2003    2004    2005    2006    2007
Aboriginal                                  321     351     382     337     384
Torres Strait Islander                       22      25      30      10       5
Both Aboriginal and Torres Strait
Islander                                      5      13      10       7       8
Total                                       348     389     422     354     397
External examinations enrolments
Over recent years the enrolment numbers (as at June) to sit the examinations have
fluctuated, with a change from a progressive 1-2% increase to a 2-5% decrease in
2005 and 2006 and then a 33.5% increase in 2007. The 2007 increase is unusual and
could be explained by the enrolment process.
Table 4: Change in the number of students enrolling for the examinations, 2001–2007 (June enrolments)

                                          2001      2002      2003      2004      2005      2006      2007
Total Year 12 enrolments                20,322    21,022    21,441    21,588    21,832    21,096    21,875
Number of students enrolled to sit
at least one TEE subject/WACE
course examination                      13,768    14,061    14,353    14,585    14,269    13,533    18,068
Percentage change from previous
year (at the same time)                   +0.4      +2.1      +2.1      +1.6      -2.2      -5.2     +33.5
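The percentage changes in Table 4, like the other year-on-year comparisons quoted in this report, are simple proportional changes on the previous year's figure. A minimal illustrative calculation using the June enrolment row of the table:

    def percent_change(current, previous):
        """Change from the previous year, as a percentage of the previous year."""
        return (current - previous) / previous * 100

    june_enrolments = {2001: 13768, 2002: 14061, 2003: 14353, 2004: 14585,
                       2005: 14269, 2006: 13533, 2007: 18068}

    for year in range(2002, 2008):
        change = percent_change(june_enrolments[year], june_enrolments[year - 1])
        print(year, f"{change:+.1f}%")   # 2007 prints +33.5%, as shown in Table 4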
The number of students who enrolled (October) to sit for one or more TEE subject/WACE course examinations increased (by 9.4%) from 12,663 in 2006 to 13,855 in 2007. Of the 13,855 students, 6,436 (2006: 5,881) were male and 7,419 (2006: 6,782) were female; 61 (2006: 65) were Aboriginal/Torres Strait Islander students; 6,485 (2006: 5,988) attended a government school and 6,938 (2006: 6,161) attended a non-government school; 11,517 (2006: 10,254) attended schools in the metropolitan area, 1,906 (2006: 1,895) attended country schools and 419 (2006: 492) students studied overseas.
Table 5 provides details of the students who enrolled to sit at least one TEE
subject/WACE course examination in the years 2003 to 2007.
Table 5: Examination enrolments, as at October, 2003–2007 (one examination or more)

Enrolments                 2003      2004      2005      2006      2007
Gender
  Male                    6,439     6,353     6,322     5,881     6,436
  Female                  7,498     7,393     7,085     6,782     7,419
System/Sector
  Government              7,654     7,282     6,725     5,988     6,485
  Non-government          5,932     6,046     6,261     6,161     6,938
  Overseas                  293       339       391       492       419
  Private candidates         58        79        30        22        13
Location
  Metropolitan           11,476    11,378    11,012    10,254    11,517
  Country                 2,110     1,950     1,974     1,895     1,906
  Overseas                  293       339       391       492       419
  Private candidates         58        79        30        22        13
Table 6 provides details of the students who enrolled to sit at least four TEE
subject/WACE course examinations in the years 2005 to 2007.
Table 6: Examination enrolments, as at October, 2005–2007 (four or more examinations)*

Enrolments                 2005      2006      2007
Gender
  Male                    5,167     4,694     5,183
  Female                  5,895     6,782     6,015
System/Sector
  Government              5,127     4,581     4,824
  Non-government          5,534     5,397     5,952
  Overseas                  391       492       419
  Private candidates         10         8         3
Location
  Metropolitan            9,028     8,407     9,313
  Country                 1,633     1,571     1,463
  Overseas                  391       492       419
  Private candidates         10         8         3
*Data were analysed in this way from 2005.
The figures in tables 5 and 6 show:
 a small percentage change in the number of male and female students enrolled to sit the examination.
 more females sat the examinations than males.
 a decrease in the percentage of students attending government schools and an increase in the percentage of non-government students enrolled to sit the examinations.
 a decrease in the number of students enrolling to sit the examinations from overseas schools in 2007. Prior to this, the number of these students had been increasing.
 80% of the students who enrolled in an examination enrolled to sit four or more.
As indicated in figure 6, the most popular number of TEE subject/WACE course
examinations enrolled in over the last five years was five subjects/courses.
Figure 6: Number of students enrolled for a specific number of
examinations, 2003–2007
Of the 38 examinations, English had the highest number of enrolments with 10,489
(2006: 8,457, 2005: 9,026, 2004: 9,146, 2003: 9,258, 2002: 9,171 and 2001:
8,533). Discrete Mathematics had the next highest with 7,739 (2006: 7,425, 2005:
7,546, 2004: 7,702, 2003: 8,714, 2002: 8,607 and 2001: 8,002). Hebrew had the
lowest number of enrolments with four students. Previously, the lowest enrolments
had occurred in Japanese: Advanced and Modern Greek. In 2006, five students had
enrolled in Japanese: Advanced and eleven students had enrolled in Modern Greek.
In 2007, WACE course examinations were held for the first time in English,
Engineering Studies and Media Production and Analysis. The number of students
enrolled to sit each of the course examinations was 10,489, 168 and 1,055
respectively.
Applications were also received from 13 people who enrolled to sit subject/s and
WACE courses in the external examinations as private candidates. That is, they had
no school assessment included in their combined mark.
External examination attendance
In 2007, the number of students who sat at least one TEE subject/WACE course
examination (11,765) increased when compared with 2006 and 2005 (2006: 10,953,
2005: 11,610). Although a corresponding increase may be expected in the number
who sat for each examination, this increase was not distributed equally across
subjects. Increases were recorded in 15 of the 38 examinations with conspicuous
increases (more than 15%) being recorded in the following examinations: Chinese:
Second Language (29.6%), English (24.0%); Geology (31.0%) and Modern Greek
(83.3%).
There was a decrease in the percentage of students sitting for 18 of the 38
examinations. There were large decreases in the proportion of candidates who sat the
examinations in Ancient History (29.5%), Chinese: Advanced (22.6%), Indonesian:
Advanced (28.3%) and Malay: Advanced (34.9%). The number who sat the
Indonesian: Advanced TEE has decreased for eight successive years.
Table 7 shows the change in the number of candidates who sat the TEE/WACE course
examinations from 2003 to 2007.
Table 7: Change in the number of students who sat the TEE/WACE examinations, 2003–2007

                                               2003      2004      2005      2006      2007
Candidates who sat at least 1 TEE subject    12,426    11,652    11,610    10,953    11,765
Percentage change from previous year            1.3      -6.2      -0.4      -5.7       7.4
Candidates who sat at least 4 TEE subjects   10,998    10,273    10,437     9,989    10,757
Percentage change from previous year            1.3      -6.6       1.6      -4.3       7.7
Candidates for TEE/WACE examinations         56,490    51,537    51,897    49,273    52,625
These figures show that:
 There has been a gradual decrease in the number of candidates who sat for at least 1 TEE subject/WACE course examination each year between 2003 and 2006. This pattern changed between 2006 and 2007, when the number of candidates who sat for at least 1 TEE subject increased (by 812) to 11,765.
 There have been fluctuations in the number of students who sat at least 4 TEE subject/WACE course examinations between 2003 and 2007. The fluctuations appear to have been cyclic, with an overall decrease of 2.2% in candidates sitting 4 or more examinations over these years.
 Of the 13,855 candidates who were enrolled to sit 1 or more TEE subject/WACE course examinations, only 11,765 actually sat these examinations. This represents a 15.1% absentee rate, compared to 13.5% for 2006, 13% for 2005, 15% for 2004 and 11% for 2003 (an illustrative recalculation follows this list).
 Of the 11,198 candidates who were enrolled to sit 4 or more TEE subject/WACE course examinations, 10,757 actually sat these examinations. This represents an absentee rate of approximately 4 per cent, which is about the same as the figures for 2006 (5%) and for 2005, 2004 and 2003 (6%).
 Of the 57,353 candidate/subject enrolments, only 52,625 were present. This represents an absentee rate of 8.2%, compared to 8.2% for 2006, 8.4% for 2005, 9.1% for 2004 and 7.4% for 2003.
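The absentee rates quoted above are the proportion of enrolments that did not result in a sitting. An illustrative recalculation of the 2007 figures:

    def absentee_rate(enrolled, sat):
        """Percentage of enrolments for which the candidate did not sit."""
        return (enrolled - sat) / enrolled * 100

    print(round(absentee_rate(13855, 11765), 1))  # at least one examination: 15.1%
    print(round(absentee_rate(11198, 10757), 1))  # four or more examinations: 3.9% (about 4%)
    print(round(absentee_rate(57353, 52625), 1))  # candidate/subject enrolments: 8.2%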
Enrolments: Vocational Education and Training (VET)
An enrolment in a unit of competency can lead to one of the following:
 competency achieved;
 competency not achieved;
 withdrawn; or
 continuing enrolment.
Enrolments in units of competency are represented in tables 8 and 9. In addition, in
2007, there were 139 students (226 in 2006; 84 in 2005; 152 in 2004; 110 in 2003)
given recognition of prior learning (RPL) in 812 competencies (1242 in 2006; 238 in
2005; 589 in 2004; 317 in 2003). Students given RPL are not included in the
statistics in the following tables.
Table 8: Year 12 enrolments in at least one unit of competency, 2003–2007

                                                        2003            2004            2005            2006            2007
Number of students who enrolled in at
least one unit of competency                           5,255           5,889           6,329           6,411           6,192
Number and percentage* of students who
studied at least one unit of competency        4,986 (94.9%)   5,527 (93.8%)   6,147 (97.1%)   6,158 (96.1%)   6,071 (98.1%)
Number and percentage* of students who
withdrew                                       1,313 (25.0%)   1,052 (17.9%)     897 (14.2%)      443 (6.9%)     648 (10.5%)
Number and percentage* of students who
achieved at least one unit of competency       4,507 (85.8%)   5,106 (86.7%)   5,689 (89.9%)   5,742 (89.6%)   5,662 (91.4%)
*Percentage is calculated of the number of students who enrolled in at least one unit of competency.
Table 9: Units of competency studied by Year 12 students, 2003–2007

                                                         2003             2004             2005             2006             2007
Number of units of competency in which
students were enrolled                                 53,310           61,643           59,713           61,822           68,479
Number and percentage* of units of
competency which students studied             46,426 (87.1%)   50,082 (81.2%)   56,235 (94.2%)   59,976 (97.0%)   65,202 (95.2%)
Number and percentage* of units of
competency from which students withdrew        6,884 (12.9%)     4,648 (7.5%)     3,478 (5.8%)     1,846 (3.0%)     3,277 (4.8%)
Number and percentage* of units of
competency in which students achieved
competency                                    38,334 (71.9%)   39,668 (64.4%)   46,975 (78.7%)   57,574 (93.1%)   56,706 (82.8%)
*Percentage is calculated of the number of units of competency in which students were enrolled.
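As the footnotes to tables 8 and 9 indicate, the bracketed percentages are expressed against the number of enrolments rather than against the whole Year 12 cohort. A minimal illustrative calculation reproducing the 2007 column of Table 9:

    units_enrolled_2007 = 68479   # units of competency in which Year 12 students were enrolled, 2007
    studied, withdrew, achieved = 65202, 3277, 56706

    def share(count, base=units_enrolled_2007):
        """Percentage of enrolled units, rounded as in the table."""
        return round(count / base * 100, 1)

    print(share(studied), share(withdrew), share(achieved))   # 95.2 4.8 82.8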
7% of the students who studied four or more TEE subjects/WACE examination courses also completed at least one VET unit of competency. This compares to 6%, 7% and 8% of students who studied at least four TEE subjects/WACE examination courses and completed competencies in 2004, 2005 and 2006 respectively.
School performance data
The Year 12 performance data was released at a media conference on Tuesday 8
January 2008. The heads of the school sectors and systems made comments on the
data and answered media questions. This approach provided a balanced perspective
to the community.
The data released were in the same format as for 2006. The tables listed the top 50 schools in TEE/WACE course examinations, WSA, VET and WACE. Additional data provided information on a school-by-school basis.
Although the media still constructed ‘league tables’ from these data, the release of the
data at a media conference resulted in a balanced coverage by a broader range of
media outlets.
The data were also published on the Curriculum Council’s website. The tables were
accompanied by comments cautioning the reader about the interpretations to be
made from the information.
Statistics
The following statistics can be found in Appendices to this report.
Year 12 state statistics – Appendix D, Sections 1–5
Enrolments in examinations – Appendix E, Tables A6–A15
Performance in examinations – Appendix F, Tables A16–A22
Achievement of WACE – Appendix G, Tables A23–A24
Continuing and future improvements
During 2008, the Curriculum Council will continue to develop programs in SIRS to provide schools with various reports relating to their students’ achievement.
Section 4: Setting the examinations
High-quality examination papers were produced as a prerequisite for valid and credible
assessment of student achievement.
In 2007, tertiary entrance examinations were held in 34 subjects, and WACE
examinations in four courses. The WACE courses included Aviation and three new
courses: English, Engineering Studies and Media Production and Analysis. In addition
to a written paper, some examinations had a practical component (oral interview,
visual diary or performance).
Thirty tertiary entrance examination papers and the four WACE examination papers
were written in Western Australia by panels appointed by the Curriculum Council.
Each panel consisted of a minimum of three examiners nominated by the universities
and the schools sector. The remaining four papers were imported from other states—
Hebrew and Arabic from the Victorian Curriculum and Assessment Authority, Modern
Greek from the Senior Secondary Assessment Board of South Australia and Japanese:
Advanced from the New South Wales Board of Studies.
In addition, a two-hour test of English language competence was set. This test was
made available to students who had completed their final year of senior secondary
schooling without obtaining a grade of C (or average of level 4) or better in an English
language area. It provides students with a safety net for meeting the English
language competence standard required for secondary graduation.
Quality control
Tertiary entrance and WACE examination papers prepared by the Curriculum Council’s
examiners were checked by an independent reviewer and by the following Curriculum
Council staff and contracted employees:
 a curriculum officer
 an assessment specialist
 an examinations development officer
 the manager of the certification and examinations branch
 a proof reader
 a final checker (nominated by the appropriate ARM panel).
The production of several examination papers did not meet the planned timeline. Recruiting examiners, independent reviewers and final checkers who met the eligibility criteria proved more difficult than in previous years. This resulted in some checking processes being completed under extreme pressure. As a consequence, four errata notices had to be issued to examination centres: the deletion of the third ‘Instruction to candidates’ on the English paper; the inversion of a rhythm notation symbol in one question on the Music paper; the modification of one question in the Media Production and Analysis paper; and a change to one word in the Art History paper.
A CD-ROM containing the stimulus material for the Media Production and Analysis
examination was sent to candidates during week 7 of term 3.
Production of examination materials
To accommodate a number of candidates with specific requirements, sixty-one special examination papers were produced according to the following specifications:
Number   Format change
4        Printed on pink paper
12       Printed on green paper
2        Printed on blue paper
5        Printed on sand-coloured paper
4        Printed in size 18 font, Verdana, on A3 paper
10       Reformatted according to subject-specific directions, printed on A4 paper, single-sided, bound
9        Enlarged to A3, printed on one side only
9        Enlarged to A3
1        Braille
2        Music sight-reading enlarged
1        Prepared with additional space to write answers
2        Modified Geography broadsheet
Officers from the Vision Education Service of the Department of Education and
Training produced the Braille paper and the special Geography broadsheets.
As in previous years, some examination questions needed to be modified for certain
candidates because their format (e.g. graphing) may have prevented a candidate from
demonstrating his or her achievement. This also changed arrangements for the
marking process, as additional instructions and marking guides were required for the
marking of these modified papers.
The total number of modified papers required in 2007 was significantly lower than in 2006: 61 were required, compared with 80 in 2006 and 102 in 2005.
All the TEE sound recordings were made at a professional recording studio, which also
took responsibility for the multiple copying of compact disks. The use of digital
recording and editing, and the use of compact disks in examination centres, continues
to be well received.
The significant increase in the cost of producing the Western Australian examination
papers for 2007 compared with 2006 was due to the introduction of the three new
WACE courses and the increase in payment rates for examinations. The cost for 2007
was $365,418, compared with $256,825 for 2006. A breakdown of costs is given in
Appendix B.
Continuing and future improvements
The process of producing examination papers and the sample examination papers for
new courses (together with their associated material) continues to be refined and
streamlined, and will include more coordinated involvement from the Curriculum
Officers.
From 2009, examinations for the new WACE courses will be specifically developed to
recognise the great variation in the abilities of students and their post-school
destinations. These course examinations will not be exclusive to tertiary-bound
candidates. Rather, these examinations will recognise that a standard of education
has been achieved. There will be two separate WACE examinations tailored for
students at two different stages of study and development.
Follow-up action
An independent review of the production of the Engineering Studies and Media
Production and Analysis examinations will be undertaken.
Section 5: Access for candidates with disabilities
Candidates who cannot adequately demonstrate the full extent of their academic achievement
under standard examination conditions are allowed to take external examinations under
special conditions. By providing these special conditions, more accurate information on
student achievement is obtained.
Examination candidates with long-term physical or learning disabilities, or special
medical needs, which would prevent them from being assessed accurately if they were
examined under standard conditions, may apply for special examination
arrangements.
Over the past three years the Australasian Curriculum Assessment Certification
Authorities (ACACA) agencies have been working together to establish a consistent set
of guidelines for considering candidates with special needs. These guidelines include
the evidence used to assess the applications for special arrangements and the nature
of the arrangements made for each category of disability. Two Curriculum Council
representatives have been part of this consultative process.
Candidates applying for special provisions completed a form that included relevant
details about the impact of the student’s disability on timed assessments.
The use of standardised essays was continued for all students seeking additional
working time in the examinations. These essays were analysed by an expert English
marker prior to the applications being presented for assessment. The information
received through the essays was most useful in assessing the applications.
Initially, the applications were categorised according to disability, then a panel of
disability experts made a recommendation to the Special Examination Arrangements
Committee. The Special Examination Arrangements Committee generally endorsed
those applications that were recommended for approval. Those applications that were
not recommended for approval by the panel were discussed by the committee prior to
a final decision being reached.
In 2007, a two-stage appeals process was established. Schools with additional
evidence to submit to support an application were able to apply for the decision to be
reviewed by the Special Examination Arrangements Committee in light of the
additional evidence. Those applicants without additional evidence to support an
application were able to lodge an appeal against the decisions of the Special
Examination Arrangements Committee. An independent appeals group was
established to consider the appeals.
The procedures adopted continued to contribute to the consistency and fairness of the
decision-making process.
Access
The number of applications for special examination arrangements decreased slightly in
2007, as did the proportion of applications to candidates. A total of 327 applications
(370 in 2006) were received for 2007. This represents 2.36% (2.9% in 2006) of the
number of candidates who were enrolled to sit at least one TEE/WACE examination.
Table 10 shows the changes in application figures between 2006 and 2007. There has
been little change in the proportion of candidates attending schools within the
different systems/sectors. The number of applications for special provisions coming
from the independent schools remains disproportionate. Applications from
independent schools still represent almost half of all applications, yet this sector has
only 27% of TEE/WACE candidates.
Table 10: Distribution of special examination arrangements applications, 2006–2007

                                      2006                                          2007
School System      Applications     %    Enrolments     %       Applications     %    Enrolments     %
Government                  103  27.8         5,988  47.3                 88  26.9         6,415  46.6
Catholic                     82  22.2         2,821  22.3                 85  26.0         3,214  23.3
Independent                 183  49.5         3,340  26.4                154  47.1         3,716  27.0
Overseas                      0   0.0           492   3.9                  0   0.0           409   3.0
Private                       2   0.5            22   0.1                  0   0.0            13   0.1
Total                       370   100        12,663   100                327   100        13,767   100
Figure 7 below illustrates the difference in application rate between the systems/sectors, shown as
a percentage of their TEE/WACE candidate enrolment.
Figure 7: Special examination arrangements applications as a
percentage of enrolments, 2006–2007
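Figure 7 expresses each sector's applications as a percentage of that sector's TEE/WACE candidate enrolment. An illustrative recalculation of the 2007 rates from the columns of Table 10 (the overall rate comes out near 2.4% on the Table 10 enrolment base of 13,767, slightly different from the 2.36% quoted earlier, which uses the enrolment count of 13,855):

    applications_2007 = {"Government": 88, "Catholic": 85, "Independent": 154}
    enrolments_2007 = {"Government": 6415, "Catholic": 3214, "Independent": 3716}

    for sector, applications in applications_2007.items():
        rate = applications / enrolments_2007[sector] * 100
        print(f"{sector}: {rate:.1f}% of enrolled candidates applied")

    print(f"All sectors: {327 / 13767 * 100:.2f}%")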
Figure 8 below shows the percentage of candidates from each category of location
who applied for special examination arrangements. The proportion of metropolitan
students who made application (3.27%) was more than double that of country
students (1.5%). No overseas or private candidates made application in 2007.
Figure 8: Special examination arrangements applications
by location, 2006–2007
The number of applications received from each school/college is shown in figure 9.
Applications were received from 86 schools. Most schools submitted one or two applications. There were 20 schools that submitted 5 or more applications and 8 schools that submitted 10 or more applications. Three schools have submitted 19 or more applications in each of the past three years.
Figure 9: Number of applications per school/college by sector, 2007
Thirteen eligible schools have been identified as not having submitted any applications during the six years 2002–2007. These schools comprise nine metropolitan schools (7% of 124) and four country schools (9% of 45). During 2008, these schools will be encouraged to make use of the special examination arrangements provisions, should they have eligible students.
Twenty-seven applications (8.2%) were not supported (17 in 2006 – 4.6%). While
this represents a higher proportion of applications not approved than last year, it is
consistent with the historical rate (11.2% in 2003, 8.5 % in 2002, 7.9% in 2001).
Unsuccessful applications included those where the request was outside the special
provisions policy or where there was insufficient evidence of diagnosis or the impact of
the disability on the student’s performance in external assessment. There were 23
appeals, of which 18 were upheld due to the provision of further evidence.
Statistics for each category of disability are displayed in table 11.
Table 11: Special examination arrangements by disability category, 2007

                       Government               Non-government                           Not
Category           Male  Female  Total      Male  Female  Total    Approved   approved   Withdrawn   Total
ADD/ADHD              3       1      4        19       8     27          27          0           4      31
Hearing               1       2      3         4       3      7          10          0           0      10
Illness               6      14     20        15      26     41          54          3           4      61
Fine Motor            6       6     12         7       8     15          22          3           2      27
Physical              3       3      6         5       5     10          14          0           2      16
Psychological         9      10     19         6      11     17          28          2           6      36
SLD*                 10       5     15        67      49    116         107         19           5     131
Vision                3       6      9         3       3      6          14          0           1      15
Total                41      47     88       126     113    239         276         27          24     327
* SLD means specific learning disability
The 276 successful applications for special examination arrangements covered a total
of 1,189 examinations, an average of 4.3 examinations per candidate.
Communicating the access to special examination arrangements
In 2007, the committee again emphasised the importance of evidence of the impact of
the disability on the student’s performance in timed assessments. It achieved this by:
 conducting a series of seminars for school-based personnel involved with case management of students with disabilities. The seminars were held at the invitation of the Catholic Education and AISWA systems/sectors, a regional office of the Department of Education and Training and the Independent School Counsellors Association. All sessions were well attended. The Department of Education and Training undertook to provide its own education sessions for relevant personnel.
 responding to the specific needs of candidates with severe disabilities whose requirements and circumstances are unique in an examination situation; and
 conducting consultations with school personnel, health professionals, parents and Year 12 students regarding the application requirements and procedures.
Resourcing the arrangements
A total of 71 examination centres catered for the candidates who had approval for
special examination arrangements at a cost of $24,161 for the supervision. The
arrangements granted included allowing the use of a scribe for candidates unable to
write or type and a clarifier for a hearing-disabled candidate.
To provide for students from all areas of the Perth metropolitan area, four centres
were again used as regional centres catering for students with special examination
needs. As in previous years, Tuart College, Canning College, Guildford Grammar
School and Kolbe College provided locations for students with a diverse range of
special examination conditions to sit examinations.
Continuing and future improvements
A number of issues were identified in 2007 to improve the special examination
arrangements process. Schools have reported that they are informed of decisions too
late in the year. It has also been reported that, in some instances, decisions do not
seem to be cognisant of school factors or that decisions may not be consistent.
Early in 2008, a high-level review will be undertaken to revise the process used to
grant special provisions in examinations. The aim of this review is to improve the
efficiency of approving applications and further enhance the consistency of decisions.
Continued involvement with the ACACA special provisions group will further enhance national consistency.
Schools with poorly supported applications in 2007 will be contacted with a view to improving the quality of their applications in 2008.
The feasibility of developing an on-line application and response process for special
examination arrangements will be explored.
Section 6: Conduct of the examinations
For students who do not require special examination conditions, valid and credible assessment
of student achievement requires that they all take the examinations under standardised
conditions of time, resources and rules of conduct.
In 2007, the Curriculum Council conducted 34 tertiary entrance examinations and four WACE course examinations. Each subject or course had a two-and-a-half-hour or three-hour written examination. Additionally, candidates in language subjects had an oral interview, while Aviation, Music and Drama Studies each had a performance examination. Art candidates submitted a visual diary.
In order to ensure standardised examination conditions, integrity and fairness, the
Curriculum Council employs supervisors to administer the examinations according to
Council guidelines. Examination supervision included the language orals, Aviation,
Music and Drama Studies performance examinations, supervision of candidates who
have special examination arrangements as well as supervision of the written
examinations.
Most of the examination organisation was carried out using the software database
program SIRS–External Assessment which was introduced in 2006.
Practical examinations
Practical examinations were conducted in eleven subjects and one course for
approximately 2,307 candidates. Also, 832 Art visual diaries were submitted for
marking.
During the October school holidays, 846 Drama Studies candidates were examined in
the performance component of the course at two metropolitan and six country
venues, ranging from Port Hedland in the north to Albany in the south. Candidates
from other country areas travelled to Perth for the practical examination, and
examinations were also conducted in Singapore for three candidates at St Francis
Methodist School.
In 2007, for the first time, practical examinations in Aviation were conducted using a
flight simulator. The examinations were held at Kent Street Senior High School,
during the October school holidays, with 44 candidates being examined via computer
simulation.
Art candidates had until the first week of Term 4 to complete their visual diaries
before they were submitted for marking, a process that took just over two weeks.
Seven students from Indonesia submitted Art visual diaries for assessment, an
increase of four from 2006.
Indonesian: Second Language and Chinese: Second Language were the first oral
interview examinations, held on 20 October. Oral examinations for the other language subjects (French, German, Japanese: Second Language and Italian) were conducted during the last week of October at The University of Western Australia. Oral
examinations were also held in Albany (French), Bunbury (French and Japanese:
Second Language) and Kuala Lumpur (German). Telephone interviews were
conducted for non-metropolitan candidates in French (2), Indonesian: Second
Language (3), Italian (2) and Japanese: Second Language (2). Candidates in Modern
Greek (6), Arabic (14) and Hebrew (4) were all examined by telephone as these
examinations are set by interstate agencies.
All country schools with language subject candidates where examination centres were
not established were offered the opportunity for their students to be examined by
video conference rather than telephone interview. Only one school took up the offer,
with one candidate in Japanese: Second Language residing in Kununurra being
examined in this manner. The feedback was extremely positive and it is hoped to
extend this method of oral examination to more country candidates in 2008.
In addition to TEE subjects, the Curriculum Council conducts examinations in Year 12
language subjects based on the Collaborative Curriculum and Assessment Framework
for Languages (CCAFL). In 2007, three candidates were examined in Polish, using the
examination set by SSABSA, and their oral interviews were conducted by telephone.
The Music performance examinations commenced on Saturday 27 October and were
conducted at the Western Australian Academy of Performing Arts, Hale School and
The University of Western Australia for metropolitan area candidates. Some country
students were examined in Geraldton, Bunbury and Albany, whilst others travelled to
Perth for their examinations. More than 360 students were examined on twenty-eight
different instruments, with ten candidates choosing to do two half-electives and seven
submitting projects as their electives option.
The use of three metropolitan venues for the Music performance examinations
required careful planning and management. Overall the process ran much more
smoothly than in 2006, apart from a concern raised about the use of a digital piano
for accompaniment at one of the centres.
English language competence test
The English Language Competence Test is designed to give students who do not
achieve a C grade in a Year 12 English subject (English, English Literature or English
as a Second Language) the opportunity to demonstrate their capacity to fulfil the
language competence component of the Western Australian Certificate of Education.
Scripts are marked either pass or fail using performance criteria and annotated
student samples that represent the minimum level of English language competence
necessary to interact effectively in the broader community. Any script that receives a
fail mark from one marker is re-marked. The chief marker re-checks scripts that are
failed by both markers to confirm the rating.
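The re-marking rule described above can be summarised procedurally. The Python sketch below is an illustration only; the function and argument names are invented and are not part of any Council system.

def elc_rating(marker1_pass, marker2_pass, chief_confirms_fail):
    """Illustrative sketch of the English Language Competence Test
    double-marking rule: a script failed by the first marker is re-marked,
    and a script failed by both markers is re-checked by the chief marker."""
    if marker1_pass:
        return "pass"
    if marker2_pass:          # second marker re-marks a script failed once
        return "pass"
    # Failed by both markers: the chief marker re-checks to confirm the rating.
    return "fail" if chief_confirms_fail else "pass"

# Example: failed by both markers and confirmed by the chief marker.
print(elc_rating(marker1_pass=False, marker2_pass=False, chief_confirms_fail=True))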
Students participating
The Curriculum Council English language competence test was held on Wednesday 31
October 2007. The test was administered by 101 schools (106 in 2006) to 358 students, four more participants than in 2006. Of these students, 268 (75%) passed.
Figure 10: Participation in the English language competence test, 2007 [chart not reproduced: number of participants per year, 1994–2008]
In 2007, optical scanning forms (teleforms) were used for the recording of pass/fail by
markers. These forms eliminated many of the student number errors that had been
reported in previous years as they were pre-printed with student identification
numbers.
Written examinations
The written examinations were held over a fifteen-day period between Thursday 1
November and Wednesday 21 November. At least two subjects were conducted on
each day except the last day of the examination period when only one examination
was held.
Three language examinations, Japanese: Advanced (NSW), Hebrew (Victoria) and
Arabic (Victoria) were held prior to the written examinations set by the Curriculum
Council to conform to the relevant examination schedule in the state in which they
were set. Modern Greek was held during the Curriculum Council examination period
at 12 noon which was outside the normal examination commencement times of
9:20am and 2:00pm.
The written TEE/WACE examinations timetable was published in July. There were
approximately 57,400 candidate/examination enrolments for the 38 examinations.
There were a large number of combinations of subjects/courses for the 13,927
candidates but the examination timetable ensured that every student could sit each of
the examinations for which they were enrolled.
Supervision
There were 589 people employed as supervisors (129 chief supervisors and 460
supervisors) at 134 examination centres throughout the state. As in past years, candidates from eight overseas schools located in Singapore, Indonesia, Malaysia, Vietnam and China sat examinations in the more than 24 subjects offered by these institutions.
Eight individual students sat a total of 21 examinations outside of Western Australia.
Three students sat their examinations interstate. Five students sat their examinations
overseas in the following locations: USA, China, Iraq, Sri Lanka and the United Arab Emirates.
All metropolitan and some country chief supervisors attended a three-hour meeting on
23 October at which examination protocol, conduct and specific duties of the chief
supervisor were discussed. A focus of the meeting was the introduction of new WACE
courses, the proposed growth in the number of examinations and new multiple choice
sheets. Similar supervisor training sessions were held in China and Vietnam. All
costs associated with the overseas training were paid for by the respective schools.
All supervisors of examinations were required to complete a Working-with-Children
Check. This nationally accredited check is compulsory for all people who carry out
child-related work in Western Australia. The cost of these checks ($65) was met by
the Curriculum Council.
Examination visits
During the written examinations, Curriculum Council staff made 111 examination
centre visits to 60 examination centres (59 visits to 37 centres in 2006). All
metropolitan centres and one country centre with a new chief supervisor received a
visit on the first day of the examinations and special consideration was given this year
to visiting centres that had not received a visit in the last two years. Few problems were identified by visiting staff. Overall, they reported that examination accommodation was satisfactory, supervisors were well organised and proactive within the examination room, and centres were very well run.
An officer from the Curriculum Council attended all examination centres at which an
examination with a sound component was being conducted. Members of the
examining panel for each of these subjects were also invited to attend. This practice, which has become a regular feature of the examination timetable, was implemented to
provide greater feedback to examiners on the sound component of the examination
and to the Curriculum Council on conditions within a venue.
There was a complaint from one school regarding the quality of the recording of the French examination at St Mary's Anglican Girls' School. There were no complaints about the recording quality from the other four schools, or their students, who also sat their French examination at the same centre.
Candidate identification
Personalised examination timetables were produced in the same format as in the
previous two years. The document is a compact A5 statement showing the candidate’s
Curriculum Council identification number, name, date of birth and the examination
timetable. The back of the timetable contains important information relating to the
conduct of the examinations. In 2007, schools were able to reprint whole school or
individual student personalised examination timetables through SIRS. When this
occurred, the students would typically be given the reprint on A4 paper including a
second sheet with the information that was printed on the back of the original A5
timetable. All chief supervisors were made aware that the A4 sheets were legitimate
personalised examination timetables.
Breaches of examination rules
During the 2007 TEE/WACE examinations, 15 candidates were reported by the
examination centre supervisor as having breached the examination rules. All were
found to have breached the examination rules. Thirteen of the candidates had a mobile telephone in their possession and one candidate had an iPod in their possession during an examination. One candidate was found to have breached the examination rules due to his behaviour in the examination room.
In reaching its decision, the committee, chaired by the Chief Executive Officer,
considered whether the evidence indicated the candidates had breached the
examination rules. The committee agreed that in each case the candidate had breached the examination rules and therefore should lose 5% of their examination
marks. The candidate who was found to have breached the examination rules due to
his behaviour appealed the decision and his appeal was upheld.
Following a recommendation in 2006, a new breach of examination rules form was
developed and introduced. All chief supervisors were made aware of this new form in correspondence and at the chief supervisors' meeting. The form proved successful, as the reporting of alleged breaches was more comprehensive.
Resourcing the examinations
The cost of ensuring efficient and effective conduct of the examinations for all
candidates in 2007 was $444,667. This is a 38% increase on the figure for 2006
($321,026). Refer to Appendix B for a breakdown of costs. This increase in overall
cost was due to the introduction of four WACE examinations and two further interstate language examinations, and the increase in the payment rate for supervisors. It also includes the Working-with-Children Check ($65 per applicant) that is now required. The average cost per candidate increased from $34.38 in 2006 to $48.42 in 2007.
Continuing and future improvements
During 2008, further refinements and additions will be made to the SIRS-External Assessment database for examination conduct processes.
A process for training new supervisors (metropolitan) will also be explored.
Video-conferencing is encouraged for all country candidates in language examinations.
Section 7: Marking the examination scripts
Valid assessment of student achievement requires that reliability be high, which in turn
requires markers to apply uniform standards.
Over 670 teachers and university lecturers were engaged in the marking of the written examination scripts in 2007. The chief markers, nominated by the examining
panel, directed the marking process for each subject. Marking began on 2 October
2007 with Drama Studies performance examinations and concluded on 7 December
2007 with the completion of reconciliation for all written examinations and final checks
by chief markers.
Practical examinations
Details of marking of practical examinations in 2007 are set out in table 12. Thirteen
TEE subjects have a practical component and 166 teachers and university lecturers
under the direction of eleven chief markers were employed to ensure that candidates’
performances, interviews or visual diaries were marked fairly and accurately according
to the prescribed standards.
Applications for markers’ positions were called for through the Curriculum Council
Circular and past markers were notified by mail. For some language subjects, the
number of applicants was the bare minimum required to conduct the examination. In
some subjects, additional markers had to be found to replace applicants who were not
able to mark for the whole examination period or who withdrew after accepting a
position. One marker flew to Singapore to examine candidates in Drama Studies and
another to Kuala Lumpur for the German oral interviews at KBU International College.
Oral interviews for Modern Greek, Arabic and Hebrew were conducted by telephone
and assessed by markers appointed by SSABSA and VCAA which set these
examinations.
Table 12: Marking of practical examinations, 2007

Subject | Centres* M | Centres* C | Centres* O | Candidates** | Supervisors | Exam days | Markers: applied | Markers: appointed | Markers: total***
Art | - | - | - | 832 | N/A | N/A | 29 | 18 | 20
Music | 3 | 3 | - | 363 | 9 | 8 | 46 | 33 | 36
Drama Studies | 2 | 6 | 1 | 846 | 13 | 16 | 34 | 23 | 25
Aviation | 1 | - | - | 44 | 1 | 2 | 5 | 4 | 5
Chinese: 2nd L | 1 | - | - | 31 | 1 | 1 | 4 | 2 | 4
French | 1 | 2 | - | 378 | 4 | 5 | 27 | 22 | 25
German | 1 | - | 1 | 83 | 2 | 3 | 10 | 8 | 10
Indonesian: 2nd L | 2 | - | - | 103 | 2 | 2 | 18 | 10 | 13
Italian | 1 | - | - | 239 | 2 | 3 | 14 | 14 | 15
Japanese: 2nd L | 1 | 1 | - | 191 | 3 | 3 | 12 | 11 | 13
Arabic | 1 | - | - | 14 | - | 1 | N/A | N/A | N/A
Hebrew | 1 | - | - | 4 | - | 1 | N/A | N/A | N/A
Modern Greek | 1 | - | - | 11 | - | 1 | N/A | N/A | N/A
Total | 16 | 12 | 2 | 3,211 | 37 | 46 | 199 | 145 | 166

* M=Metro, C=Country, O=Overseas
** Figure indicates number who attended examination.
*** Total includes examining panel members who marked and chief marker ('Applied' does not include panel members or chief marker)
Written examinations
Public confidence in the tertiary entrance and WACE examinations is of utmost
importance and, to ensure this, the chief marker, subject curriculum officers and
officers from the certification and examinations branch closely monitored the marking
process of each examination. In each subject or course, the marking panel consisting
of the chief marker and selected markers held a pre-marking meeting. A common
understanding of the marking guidelines was established through a discussion of the
examination questions and sample marking. This process was overseen by Curriculum
Council subject or course curriculum officers. The chief marker monitored the
reconciliation of marks, aided by the provision of bundle statistics that showed the
marking performance of individual markers and each marking pair in each
examination.
2007 saw the introduction of on-line marking of scripts for 933 candidates who sat the
Media Production and Analysis examination. An external agency provided the
technical support necessary for this operation and despite some problems with the
technology, the marking of the scripts was completed successfully. The on-line marking process took 31 days, compared to the average of 21 days for equivalent-sized subjects or courses, and cost $73,762, or $79.06 per candidate.
Officers of the Certification and Examinations and the Information Services branches
carried out the processing of marks, employing a series of comprehensive checking
processes. Following the input of all marks to the database, these officers undertook
a series of internal integrity checks to ensure the accuracy and completeness of this
data.
To further ensure the accuracy of marks, chief markers were required to re-check
examination scripts where the raw score varied greatly from the school mark. The
scripts of possible subject/course exhibition winners were also re-marked to validate
this award.
The marking of the 2007 examinations involved the marking of 52,551 written scripts
(49,273 in 2006, 51,821 in 2005) by 672 markers. As in previous years, some chief
markers were faced with the problem of finding the necessary number of qualified
markers due to late withdrawals and the difficulty of finding suitable replacements at
short notice. In every case, a suitable replacement was found. The payment for
marking in 2007 was significantly increased to ensure markers received a sum commensurate with the teacher-relief rate.
Curriculum Council policy requires that, with the exception of subjects with fewer than five markers, marking teams for written examinations should include at least 5% new
markers. Aviation and Art History had the largest percentage of new markers (67%
each). Three new courses were examined in 2007: English, Engineering Studies and
Media Production and Analysis. Eight markers were employed to mark Engineering
Studies and sixteen to mark Media Production and Analysis. Overall 138 (21%)
written markers were new in 2007, compared with 12% in 2006.
In 2007, examination marks in all subjects and courses with the exception of
Information Systems and Media Production and Analysis were collected using teleform
technology. Markers wrote marks on a purpose-designed form that was read by an optical scanner. The process was faster and led to a more accurate recording of
student results. There were some issues due to the large volume of scanning to be
completed in a very short time-frame; however, the process was completed
successfully.
Checking of marks
After receiving their statement of results, candidates have the opportunity to have
their scripts checked to ensure that the marking guide has been correctly applied.
The chief marker conducts this check and also confirms that each question attempted has been awarded a mark and that this mark has been recorded correctly.
A total of 342 candidates requested a results check in 632 examinations. This
represents 1.2% of the 52,551 examinations undertaken. Eight errors were detected.
By comparison with previous years, 2006 had 431 checks (0.9%) and 2 errors while
2005 had 229 checks (0.8%) and 3 errors. The increase in requests for results checks
reflects the uncertainty created by media coverage of issues associated with the
examination of the new courses.
To date (4 February, 2008), 129 candidates have submitted applications seeking a
breakdown of their examination scores in 343 examinations (117 candidates in 288
examinations in 2006). Applications for statements of raw examination marks close
on 14 March 2008.
Requests for scripts
Candidates who sat the examinations in 2007 were given the opportunity to purchase
copies of their examination scripts. Scripts are made available to candidates when all the processes connected with the TEE have been completed. A total of 304 scripts were requested by 99 candidates, raising revenue of $3,157 ($3,071 in 2006 from 102 candidates).
The cost of marking the examinations
The total cost of marking the 2007 examinations was $1,580,561 (written) and
$139,590 (practical), an increase of 61% on 2006. This large increase was due to the
introduction of new examinations, an increase in the number of scripts marked, a substantial increase in payment rates awarded to markers and the introduction of on-line marking for Media Production and Analysis.
The cost per candidate for the marking of an examination in 2007 ranged from $26.43
for Economics to $117.80 for Aviation. See Appendix B for a break-down of the
marking costs for examinations.
Continuing and future improvements
An independent person will be engaged to undertake a review of the on-line marking
of the Media Production and Analysis scripts.
It is planned that on-line marking trialled with Media Production and Analysis in 2007
will be extended to other subjects in 2008.
For the 2008 marking process, all marks collection will be by teleform technology.
While most subjects or courses will continue to record their marks on a paper form, it
is planned that for some subjects or courses marks will be collected onto an electronic
teleform. On-line marks collection would have the advantage of increased security
and accuracy and save processing time.
Section 8: Special considerations for candidates
One of the strengths of our hybrid internal/external assessment system is that it is possible to
make accurate assessments of students’ academic achievements if they are absent or
handicapped by sickness or misadventure immediately prior to, or on the day of, the tertiary
entrance and WACE examinations.
Sickness/misadventure procedure
Every year, it is usual for some students to take the TEE/WACE examinations under
adverse circumstances that cannot be anticipated and for which they are not
responsible. In these cases, it is reasonable to expect that they may be
disadvantaged and that their examination marks may not give an accurate indication
of their level of achievement. The Curriculum Council has procedures for ensuring
that such students are not disadvantaged.
Candidates who suffered from a temporary sickness, non-permanent disability or
event close to or during the TEE/WACE examinations that they believed may have
resulted in performance below expectations or non-attendance in particular
examinations were given the opportunity to apply for assessment consideration.
Candidates were responsible for the lodging of sickness/misadventure application
forms within one week of the last TEE/WACE examination. Private candidates could
not make application for sickness/misadventure.
A committee comprising representatives of the secondary and tertiary sectors and a
medical practitioner met to consider the applications. If an application was approved,
the Curriculum Council calculated an examination mark using the applicant’s school
assessment as a basis. Normally, the derived mark is an estimate of the raw
examination mark obtained from the school assessment by regression. If this
estimated mark was higher than their actual mark, it replaced the actual mark for the
purpose of subsequent stages in the adjustment of marks.
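The report states only that the derived mark is a regression estimate of the raw examination mark from the school assessment. The Python sketch below, with invented cohort data, illustrates that general idea and is not the Council's actual calculation.

import numpy as np

# Invented cohort data: school assessments and raw examination marks.
school = np.array([45.0, 52.0, 60.0, 63.0, 70.0, 78.0, 85.0])
exam = np.array([40.0, 50.0, 58.0, 60.0, 72.0, 75.0, 88.0])

# Fit exam = slope * school + intercept by least squares.
slope, intercept = np.polyfit(school, exam, 1)

def derived_mark(school_mark, actual_exam_mark):
    """Return the mark used in later processing: the regression estimate
    replaces the actual examination mark only if it is higher."""
    estimate = slope * school_mark + intercept
    return max(estimate, actual_exam_mark)

print(derived_mark(school_mark=68.0, actual_exam_mark=51.0))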
Letters informing applicants of the outcome of their application were sent at the same
time as their statement of results. If an applicant considered there was a breach in
the process followed by the committee, they could lodge an appeal in writing.
Applications received
Tables 13, 14 and 15 provide statistics on sickness/misadventure applications
received over the past four years.
Table 13: Sickness/misadventure applications by sector and gender, 2007

School type | Female | Male | Total | Enrolments*
Government | 105 (47.5%) | 47 (38.9%) | 152 (44.4%) | 6,415 (46.6%)
Non-government | 112 (50.7%) | 73 (60.3%) | 185 (54.1%) | 6,930 (50.4%)
Overseas** | 4 (1.8%) | 1 (0.9%) | 5 (1.5%) | 409 (3.0%)
Total | 221 | 121 | 342 | 13,754

* Number of school candidates enrolled to sit TEE/WACE (as of 4/12/07) – excludes totally private candidates.
** Overseas schools
Note: The proportion of female applicants decreased slightly in 2007. The ratio of females to males is 1.8:1 (2.2 in 2006, 1.54 in 2005, 2.1 in 2004, 1.97 in 2003, 2.1 in 2002, 1.8 in 2001).
In 2007, the number of applications received was 342, which represents a slight
decrease (6%) on the number of applications received in 2006. Fluctuations from
year to year seem to be the result of unpredictable and random events.
Table 14: Outcome of sickness/misadventure applications, 2004–2007

Outcome | 2004 | 2005 | 2006 | 2007
All subjects accepted | 369 (87.6%) | 413 (88.1%) | 317 (86.6%) | 292 (85.4%)
No subjects accepted | 29 (6.9%) | 39 (8.5%) | 35 (9.6%) | 31 (9.0%)
Some subjects accepted | 23 (5.5%) | 17 (3.6%) | 14 (3.8%) | 19 (5.6%)
Total | 421 | 469 | 366 | 342
Table 15: Sickness/misadventure applications by location, 2004–2007

Location | 2004 | 2005 | 2006 | 2007
Metropolitan area | 310 (73.6%) | 403 (85.9%) | 284 (77.6%) | 253 (74.0%)
Country | 105 (25.0%) | 63 (13.4%) | 79 (21.6%) | 84 (24.5%)
Overseas | 6 (1.4%) | 3 (0.6%) | 3 (0.8%) | 5 (1.5%)
Total | 421 | 469 | 366 | 342
Outcomes for 2007
Of the 826 applications (relating to individual examination performances rather than
candidates) approved for written papers, 371 (43.6%) were from candidates who did
worse in the TEE/WACE examinations than expected from their school results. These
candidates were therefore assisted by the procedure.
There were eight group claims considered by the committee. The committee
approved a claim from the two students at St Francis Methodist School, who were not
able to undertake the aviation practical due to complications getting a marker to the
overseas examination venue. One claim for the Drama Studies written examination
was approved based on the school (Mandurah Catholic College) teaching the incorrect
text. Another group claim was approved for all candidates at the Goldfields Arts
Centre when an evacuation disrupted the Political and Legal Studies examination.
Four schools (Shenton College, St Stephen’s School – Duncraig, St Stephen’s School –
Carramar and Carine Senior High School) had applications approved for the French
examination, based on problems encountered with the sound component. The
approved procedure for adjusting the scaled mark was carried out for all relevant candidates and the schools were advised in writing of the outcome.
Appeals
There were three appeals lodged in 2007. These were considered by a specially
convened appeals panel. Two appeals on medical grounds were accepted. One
appeal related to misadventure was dismissed, with the committee’s original decision
upheld.
Section 9: Evaluation of the examination papers
Valid and credible assessment requires publicly acceptable examination papers with good
psychometric properties when used by candidates who have just completed the appropriate
subjects in Year 12 in Western Australia.
The 2007 examination papers have been evaluated statistically and by an analysis of
public comments on them.
Summary statistics on examination papers
The following comments on specific subjects/courses and their examinations are based on statistics presented in Appendix F (table A16, pages 68–69).
Full use of the marking scale
Examiners are expected to make full use of the marking scale. A restricted range of
marks increases the risk of mis-ranking candidates. In nine subjects/courses (14 in
2006) the marks spanned 90 or more percentage points and in a further fifteen
subjects/courses (6 in 2006) the range was 80–89. Subjects/courses with a range of
70 to 80 were: Chinese: Advanced, Drama Studies, Media Production and Analysis, Music, and Physical Science. No subjects/courses with a candidature greater than 100
had a range of less than 70 percentage points.
Subjects/courses with candidatures of less than 100 can be expected to have a
restricted range of ability relative to the range of possible examination scores and
therefore a restricted range of examination marks. This was true of Aviation, Chinese:
Second Language, Geology, Indonesian: Advanced, and Malay: Advanced, which all
had a range of less than 70 percentage points. Most notably, the range for Chinese: 2nd Language was the lowest, at 41 percentage points.
Level of difficulty
TEE/WACE examiners were asked to set examinations that would result in mean raw
marks in the range 55–60 percent, with 58 being the ideal because it is the mean
score of the scaled mark scale employed by the Tertiary Institutions Service Centre.
Nineteen examining panels (15 in 2006, 13 in 2005, 12 in 2004) achieved the desired
level of difficulty.
Three subjects (4 in 2006) had comparatively easy papers with mean raw marks
above 65: English as a Second Language (69.37), Geology (67.02) and German (65.51). English Literature (64.81) and Music (63.78) also had high mean marks.
Examiners of these subjects will be asked to make their papers more difficult for
2008.
Aviation (44.54), Discrete Mathematics (48.57), Information Systems (49.38), Media Production and Analysis (46.77) and Engineering Studies (43.18)1 had comparatively difficult papers, and examiners of these subjects/courses will be asked to make their papers a little easier for 2008.

1 The Engineering Studies examination comprised six separate examinations which have been averaged to produce this score. Refer to Section XXX and Appendix YYY for more detail.
Reliability
Overall, the reliabilities of all the examinations in 2007 were high, considering that
they were untrialled tests. They ranged from 0.63 to 0.98 (0.46 to 0.96 in 2006).
Fifteen subjects had a higher reliability in 2007 than in 2006, while twelve had lower reliabilities. The largest increase in reliability (from 0.46 in 2006 to 0.79 in 2007) occurred in Drama Studies. This increase was significant because the number of candidates (862 in 2007, 909 in 2006) exceeds the smaller candidatures usually associated with volatility in this statistic. The largest decrease in reliability in 2007 occurred in Physical Science, for which the reliability of 0.63 was 0.32 below the 2006 value (which appears to have been unusually high) and close to the 2005 value of 0.58.
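The report does not state which reliability coefficient is quoted. Purely as an illustration of an internal-consistency estimate, the Python sketch below computes Cronbach's alpha from an invented candidate-by-item score matrix.

import numpy as np

def cronbach_alpha(item_scores):
    """Cronbach's alpha for a matrix with one row per candidate and one
    column per examination item (the data below are invented)."""
    x = np.asarray(item_scores, dtype=float)
    k = x.shape[1]
    item_variances = x.var(axis=0, ddof=1).sum()
    total_variance = x.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

scores = [[3, 4, 2, 5], [2, 3, 2, 4], [5, 5, 4, 5], [1, 2, 1, 2], [4, 4, 3, 4]]
print(f"alpha = {cronbach_alpha(scores):.2f}")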
Evidence for concurrent validity
The Pearson correlation coefficients between external examination marks and school
marks, after adjustment to a common scale, provide evidence of concurrent validity
for the examinations. They represent the extent to which the two measures (the
external assessment and the internal assessment) measure the same construct. If
the external examination measures something markedly different from what teachers
measure with their assessments, the correlation will be substantially lower than 1.00.
It can be seen in table A16 in Appendix F that the correlations between school-based
marks and the external examination are generally high, averaging 0.85 (0.85 in 2006)
and ranging from 0.67 to 0.94 (0.69 to 0.93 in 2006). An overall judgement
that the external examinations assessed the same achievements as the school-based
assessments seems reasonable.
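As a small illustration of the statistic being reported, the Python sketch below computes a Pearson correlation between invented school-based and examination marks that have already been placed on a common scale.

import numpy as np

# Invented marks for one group of candidates, already adjusted to a common scale.
school_marks = np.array([55.0, 61.0, 48.0, 72.0, 66.0, 80.0, 58.0])
exam_marks = np.array([52.0, 64.0, 45.0, 70.0, 69.0, 83.0, 55.0])

# Pearson correlation coefficient as evidence of concurrent validity.
r = np.corrcoef(school_marks, exam_marks)[0, 1]
print(f"Pearson r = {r:.2f}")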
Appendix J contains a study of the effects on concurrent validity which might arise
from the inclusion/exclusion of a practical component in an external examination of a
subject/course for which a practical component of the examination might be indicated
by the syllabus.
Summary of new examinations
Aviation
This year was the second year in which this course was externally examined. As for
2006, the 2007 Aviation examination produced reliable and valid assessments, and
summary statistics which generally fell within the range of other subjects.
Interestingly, the course loading for 2007 was -7.98, the same as for 2006 (see table
A21 in Appendix F). This indicates that the average ability of the cohort of Aviation
students has remained stable over the first two years of its introduction, even with the
almost 50% increase of the cohort size to 44 candidates (from 29 in 2006).
In contrast to the mean examination score of 36.24 in 2006, the 2007 mean
examination score was significantly higher at 44.54, though it was still lower than all
other examination mean scores. This increase was due in part to the introduction of a
practical assessment via a computer simulation. Marks on this part of the test,
contributing 28.6% of the total, were well above the marks in the written part of the
test.
In view of the similar ability of the cohort of students, the continuing, relatively low
examination scores on the written section of the test were perhaps not surprising.
English
This year was the first year in which this course was externally examined. As shown in
Table 1 on page 69 of Appendix F, the summary statistics of this course are almost
the same as the subject ‘English’ statistics in 2006 and previous years.
One significantly different statistic was the size of the candidature, which increased by
17%, from 7,874 in 2006 to 9,109 in 2007. This increase of 1,235 students could be attributable to:
 natural population growth
 the inclusion of some students who in previous years had studied the non-TEE subjects of Vocational English and Senior English, which are no longer offered but catered for a population of more than 5,000 students in 2006.
 the inclusion of students from the marginally reduced English Literature cohort
(approximately 140 students fewer than in 2006).
While it might have been expected that the average ability of the enlarged 2007
cohort of “English” students would have been lower than in previous years, in fact the
opposite was the case. After the usual scaling procedures, the resulting data was
almost identical to previous years (see table A21 in Appendix F).
A potential inference from these observations is that somewhere in the region of 1000
students in 2007, who in previous years would not have studied English as a TEE
subject, have performed at a level which is equal to the average of previous TEE
English cohorts. This suggests that one of the targets for the introduction of the new
WACE courses is being met – the extension of the possibility of tertiary entrance to a
wider range of students.
Media Production and Analysis
This year was the first year in which this course was externally examined. With a
mean examination score of 46.77, it was evident that a majority of students found it
more difficult to attain examination marks in this course than in most other
subjects/courses. Nevertheless, with a range of 84 marks and a maximum score of
91%, it was evident that the most able candidates were able to score very high
marks. Examiners for 2008 will be asked to try to set an examination for which the
mean score will be 58, the same as the recommended score for all subjects/courses.
In terms of scaled scores, candidates for this course achieved an average score of
53.79 (course loading of -4.21), which exceeded the average scaled scores for four
TEE subjects, and two of the other three new courses.
Engineering Studies
This year was the first year in which this course was externally examined. The mean
examination score of 43.18 was the lowest mean score of all the subjects/courses. As
for Media Production and Analysis, a range of 81 marks and a maximum score of 83%
allowed the most able candidates to score high examination marks. However,
examiners for 2008 will be asked to try to set an examination for which the mean
score will be 58.
Judging from the number of complaints about the examination content, it is clear that
some schools did not completely share the examination panel’s interpretation of the
syllabus. This was a significant contributor to the low mean examination score, and
will need to be addressed in 2008 through teacher professional development.
The examination paper for this subject was unique in that it contained two separate examinations and each paper had significant choice, one for each of the six stage and context combinations. To ensure no candidates received unfair advantage or disadvantage because of the different difficulties of the questions within the different sections, raw examination results for each of the sections were rescaled to place them all on the same scale before conducting the usual processes of standardisation, moderation and scaling. Appendix K contains more detail about the marks management of Engineering Studies examination data.
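The actual rescaling method is documented in the appendix referred to above. The Python sketch below is only a generic illustration of placing sections on a common scale by matching each section's mean and spread to common target values; the target values used here are invented.

import numpy as np

def to_common_scale(section_marks, target_mean=58.0, target_sd=13.0):
    """Illustrative sketch only: linearly transform one section's raw marks
    so the section's distribution has the chosen common mean and spread."""
    m = np.asarray(section_marks, dtype=float)
    if m.std() == 0:
        return np.full_like(m, target_mean)
    return (m - m.mean()) / m.std() * target_sd + target_mean

section_a = [35, 42, 50, 55, 61]   # invented raw marks, one stage/context combination
section_b = [48, 52, 60, 70, 77]   # invented raw marks, another combination
print(to_common_scale(section_a))
print(to_common_scale(section_b))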
Candidates for this course achieved an average scaled score of 53.79 (course loading
of -6.95), which exceeded the average scaled scores for three TEE subjects, and one
of the other three new courses.
Arabic and Hebrew
For the first time in 2007, Arabic and Hebrew were included in the range of
subjects/courses used to contribute to the formation of tertiary entrance scores.
As for Japanese: Advanced and Modern Greek, students of Arabic and Hebrew
completed examinations produced in other states. The successful production of scaled
scores for these subjects was achieved through the application of a process endorsed
by the TISC Scaling Policy Committee, and conducted by the re-convened Scaling
Implementation Committee. Appendix L contains details of this procedure. There were
four candidates for Hebrew and fourteen for Arabic.
Evaluation by the public
Copies of papers for evaluation were made available at examination centres and
recording scripts of recorded texts were available to teachers who requested them.
The on-line examination evaluation service was activated at the beginning of the
examination period and closed on 28 December.
Appendix C is a compilation of the on-line comments about the 2007 examination
papers. Copies of these comments have been sent to executive officers of ARM
panels/reference groups for discussions about the examination. Chief examiners, who
are ex officio observers on each ARM panel/reference group, will convey this feedback
to the rest of their panels. A summary of the frequency of comments by
subject/course is given in table 16.
The summaries of comments presented in Appendix C are taken directly from the on-line submissions. No attempt has been made to verify either the correctness of comments or whether they are representative of general views.
Table 16: Distribution of evaluation comments on examination papers, 2007

Subject | No. of comments
Applicable Mathematics | 2
Calculus | 2
Chemistry | 8
Discrete Mathematics | 2
Drama | 1
Engineering Studies | 12
English | 2
English Literature | 1
Geography | 4
History | 4
Human Biology | 1
Information Systems | 1
Media Production and Analysis | 3
Physical Science | 1
Total | 44
Conclusions
The low level of adverse public comment for all subjects must be regarded as positive.
Furthermore, of the 44 comments received:
 twelve (27%) contained comments (in reference to 8 subjects) which were only
positive or were largely positive with minor particular issues being raised.
 eighteen (40%) contained comments (in reference to 10 subjects) which were
negative about papers.
Statistical evidence from the 2007 examination papers indicates that the overall
quality of the examinations remains at the high level to which we have become
accustomed in recent years. The high reliability statistics would not be possible
without a high level of comparability in the marking process. Strong evidence for
validity comes from:
 the methodology of content control;
 the statistical evidence for concurrent validity;
 the evidence of internal consistency implied by the generally high reliability
statistics; and
 the overall public acceptability of the examinations.
In a few cases, the statistics can be used to point out to examiners specific areas for
improvement in the future.
In particular, the Engineering Studies examination was problematic for reasons largely related to different interpretations of the syllabus, which needs to be elaborated and supported by another sample examination and further meetings with teachers.
It may be concluded from the evidence that the 2007 examination papers provided
valid and credible assessment of the appropriate Year 12 subjects/courses.
Section 10: Statistical processes to achieve comparability of
assessment
The final results (scaled marks) in all TER subjects/courses are expressed on the same scale so
that comparability between students is possible, even though they may have gone to different
schools and may have studied different combinations of subjects/courses.
Fairness requires that students’ marks for achievement in Year 12 must have the
same unit value, for the purpose of admission to university, from whatever subject or
course they are derived or from whatever school a student attends. This is also a
mathematical requirement, since the final marks in subjects and courses must be
capable of aggregation into a tertiary entrance score. In the process of adjusting raw
school marks and raw examination marks onto a common scale, several Curriculum
Council statistical procedures are used. These are outlined in figure 11.
Figure 11: Marks adjustment process [diagram not reproduced: stages include school mark, standardised moderated school mark/assessment, raw examination, standardised examination, combined mark, scaled mark and decile place]
Manual calculations are routinely carried out to check on the processing of
computerised marks. These integrity checks in 2007 confirmed that the adjustments
were correctly made.
Standardisation
Standardisation is a process used to adjust the distribution of raw external
examination marks to a distribution that is constant from year to year for every
subject. The process removes excessive skew and bimodality and adjusts
distributions that are too peaked (leptokurtic) or not peaked enough (platykurtic).
The result of standardising marks is a distribution of marks that has an approximately
linear relationship to achievement.
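The Council's standardisation algorithm is not specified in this report. The Python sketch below illustrates one generic approach, mapping raw marks via their percentile ranks onto a target distribution that is held constant from year to year; the target points are invented.

import numpy as np

def standardise(raw_marks, target_percentiles, target_values):
    """Illustrative quantile-mapping sketch: each raw mark is replaced by the
    value of the fixed target distribution at the same percentile rank."""
    raw = np.asarray(raw_marks, dtype=float)
    ranks = raw.argsort().argsort()                   # 0 .. n-1
    percentiles = 100.0 * (ranks + 0.5) / len(raw)
    return np.interp(percentiles, target_percentiles, target_values)

raw = [34, 41, 47, 52, 55, 58, 63, 71, 76, 88]        # invented raw marks
target_p = [0, 25, 50, 75, 100]                       # percentile points
target_v = [20, 48, 58, 68, 95]                       # invented target marks
print(standardise(raw, target_p, target_v))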
In terms of the processes leading to the calculation of a TER, it is immaterial how the
numerical scale is calibrated because it is the students’ ranking that determines their
prospects of entering university. However, there is still a perception that a mark of
50 represents a pass and that any mark below 50 is a fail. Although neither the
Curriculum Council nor its predecessors have certified a ‘Fail’ in a subject for over
twenty years, this perception still persists.
Statistical moderation
Statistical moderation is the process which ensures that school assessment marks in a
subject/course are placed on the same scale as marks in the same subject/course at other schools, so that these assessments contribute fairly to calculations of combined scores and ultimately to scaled scores and tertiary entrance ranks.
Parity with other schools is the key issue. An incidental and highly valuable adjunct to
the process has been the production of informative data concerning the necessary
adjustment for each subject at a school. The Curriculum Council has provided schools
with summaries of mean school assessments and mean moderated assessments
which they have used to refocus their standards and to avoid unrealistically raising students' expectations in the external examinations.
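One simple and widely used form of linear moderation is sketched below in Python for illustration only; it adjusts a school group's assessments to the mean and spread of that group's standardised examination marks, and is not presented as the Council's exact procedure.

import numpy as np

def moderate(school_marks, group_exam_marks):
    """Illustrative linear moderation: rescale a school group's assessments so
    their mean and standard deviation match the group's examination marks."""
    s = np.asarray(school_marks, dtype=float)
    e = np.asarray(group_exam_marks, dtype=float)
    if s.std() == 0:                                  # guard against a uniform cohort
        return np.full_like(s, e.mean())
    return (s - s.mean()) / s.std() * e.std() + e.mean()

school = [62, 70, 75, 81, 90]                         # invented school assessments
exam = [48, 55, 61, 66, 78]                           # invented standardised exam marks
print(moderate(school, exam))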
Small group moderation
Schools and colleges offering subjects/courses where it was anticipated that there would be fewer than ten examination candidates in a school/subject/course cohort were
required to combine their school-based assessments with those from another cohort
in the same subject/course. The purpose of combining distributions of numerical
school assessments is to obtain a partnership that has a larger population than the
individual cohorts. This increases the accuracy of statistical moderation. It is known
as small group moderation.
Based on information shown in Table 17, it is evident that around fifty percent of
schools/subjects enter into small group partnerships.
Table 17: School/subject cohorts, 2000–2007

Year | Total no. of school/subjects | No. of school/subjects in small group partnerships | Percentage of school/subjects in small group partnerships
2007 | 2,744 | 1,462 | 53.3
2006 | 2,618 | 1,405 | 53.7
2005 | 2,637 | 1,444 | 54.8
2004 | 2,621 | 1,332 | 50.8
2003 | 2,648 | 1,414 | 53.4
2002 | 2,592 | 1,261 | 48.6
2001 | 2,590 | 1,236 | 47.7
2000 | 2,559 | 1,034 | 40.4
In 2007, the post hoc analysis of the operation of small group partnerships was
conducted and the splitting of groups undertaken according to policy and without
incident.
Those conducting the small group partnership reviews reported that there seemed to
be a significant increase in the number of partnerships which were split by the
Curriculum Council due to obvious failures to report their school marks on the same
scale.
Scaling
Aggregation of marks is straightforward if all candidates take the same
subjects/courses. However, if choice is allowed (for a tertiary entrance score to be
calculated it is necessary for typical school students to obtain final marks in at least
four subjects or courses out of a choice of 38), adjustments must be made between
subjects/courses; otherwise, candidates taking difficult subjects/courses would be
disadvantaged. This between subject/course adjustment is known in Western
Australia as scaling.
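The average marks scaling method referred to below is not described in detail in this report. Purely as a hedged illustration of the general idea, the Python sketch that follows repeatedly shifts each subject's marks towards the average overall performance of that subject's own candidates; the actual Council/TISC procedure will differ in its details, and all data here are invented.

def average_marks_scaling(results, iterations=20):
    """Illustrative sketch of an iterative average-marks scaling procedure.
    results maps candidate -> {subject: standardised mark}; data are invented."""
    scaled = {cand: dict(marks) for cand, marks in results.items()}
    subjects = {subj for marks in scaled.values() for subj in marks}
    for _ in range(iterations):
        shifts = {}
        for subj in subjects:
            takers = [c for c in scaled if subj in scaled[c]]
            subj_mean = sum(scaled[c][subj] for c in takers) / len(takers)
            # Mean of each taker's personal average across all their subjects.
            cand_mean = sum(sum(scaled[c].values()) / len(scaled[c]) for c in takers) / len(takers)
            shifts[subj] = cand_mean - subj_mean
        for cand in scaled:
            for subj in scaled[cand]:
                scaled[cand][subj] += shifts[subj]
    return scaled

results = {
    "cand1": {"SubjectA": 70, "SubjectB": 55},
    "cand2": {"SubjectA": 60, "SubjectB": 50, "SubjectC": 65},
    "cand3": {"SubjectB": 45, "SubjectC": 58},
}
print(average_marks_scaling(results))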
Scaling was completed without incident using the average marks scaling method as in
the previous year. The results in Modern Greek and Japanese: Advanced were
manually scaled as usual, following established procedures. The results in the new
subjects of Arabic and Hebrew also were manually scaled following approved
procedures (see Appendix M). The outcomes of scaling, for all subjects and courses,
are detailed in tables A21 and A22 contained in Appendix F. The scaling required for Engineering Studies marks is described in Appendix K.
Continuing and future improvements
The implementation of the revised standardisation distribution points continues to
have implications for schools’ interpretations of examination statistics and, in
particular, moderation statistics. It may be appropriate to review these points when
courses are fully implemented in 2009.
Post-examination counselling suggests that a number of schools are providing school
assessment marks on scales which are potentially misleading to students and parents.
It appears that these schools are attempting to provide marks on the standardised
scale (centred on 66). This practice may be the result of a misunderstanding about
the new standardised distribution and such practice is not recommended.
Recommended practice is that school assessment marks for a subject/course are on a
scale so that a student of average ability in the state would be awarded 58 marks.
The rationale for this recommendation is that the average scaled mark of all
candidates in all subjects is centred on 58, with individual subject/course averages
varying from 58 by an amount reflecting the ‘difficulty’ of the subject (see Table 6 of
Appendix F which details subject loadings). The consequence of a school centring
school marks for a subject/course on an average of 66, particularly when the average
ability of the cohort is lower than the state average, is that students are given an
inflated impression of their abilities. This does not affect the ultimate accuracy of the
scaled score, but experience in post-examination counselling shows that this can and
does result in a number of students being very disappointed with their scaled mark in
a subject/course that is much lower than their expectations. These students find it
difficult to understand the discrepancy between what may be a high school mark and
a relatively low scaled mark.
The scaling of TEE mathematics subjects (Applicable Mathematics, Discrete
Mathematics and Calculus) has been a concern for some of the mathematics fraternity
for a number of years, with anecdotal evidence suggesting that more able students
are able to acquire higher scaled scores through studies in Discrete Mathematics than
they can acquire in the more demanding Applicable Mathematics. This issue was
investigated, and Appendix N contains a report which seems to provide some support for this suggestion. Further investigation of this topic is required, particularly in view of
the introduction of the new Mathematics courses, to be examined for the first time in
2010.
Conclusion
The entire suite of marks-adjusting programs was executed without error. Manual
integrity checks and an absence of errors reported by the public confirmed this.
Section 11: General Achievement Test Report
Comparability of the achievement of students in their school assessments is important in the
early years of the introduction of the new WACE courses.
The 2007 General Achievement Test was conducted on June 14 for all Year 12
students enrolled in at least one unit of the following WACE courses:
 Aviation
 Engineering Studies
 English
 Media Production and Analysis
The test was set by the Australian Council for Educational Research (ACER) for the
Victorian Curriculum and Assessment Authority (VCAA).
Every secondary school in the state, with Year 12 students enrolled in any of the four
WACE courses, made arrangements to conduct the test for their eligible students.
More than 20,000 test booklets and answer booklets were distributed to schools.
Sixty-two markers were employed, under the direction of the chief marker, to mark
the 14,659 scripts.
The course-based reports were discussed with teachers at consensus meetings held
during August. A CD-ROM and the student reports were sent to principals during the
first week of September. Students, through their schools, were provided with a report stating their achievement in the three components of the test.
It cost $230,000 to administer the GAT in 2007.
In 2007, state-wide outcome achievements in the courses Aviation, Engineering
Studies, English and Media Production and Analysis were modelled through use of a
multiple-regression analysis of GAT subscale data. The model was used to predict
outcome achievements in these courses and identify schools with reported
assessments which differed significantly from model predictions.
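As a hedged illustration of the analysis described above (not the actual model), the Python sketch below fits a least-squares regression of reported course marks on GAT subscale scores and flags schools whose mean residual is large; all data, variable names and the flagging threshold are invented.

import numpy as np

# Invented data: three GAT subscale scores per student, the school-reported
# course mark, and a school identifier.
gat = np.array([[60, 55, 58], [72, 70, 66], [50, 48, 52],
                [65, 63, 60], [80, 75, 79], [58, 61, 57]], dtype=float)
reported = np.array([55, 70, 47, 62, 81, 59], dtype=float)
school_id = np.array([1, 1, 2, 2, 3, 3])

# Multiple regression of reported marks on the subscales (with an intercept).
X = np.column_stack([np.ones(len(gat)), gat])
coef, *_ = np.linalg.lstsq(X, reported, rcond=None)
residuals = reported - X @ coef
print(np.round(residuals, 1))

# Flag schools whose reported assessments depart markedly from predictions.
for school in np.unique(school_id):
    mean_resid = residuals[school_id == school].mean()
    if abs(mean_resid) > 5.0:                         # invented threshold
        print(f"School {school}: mean residual {mean_resid:+.1f} flagged for review")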
The results of the 2007 study were inconclusive, due to continuing suspicions that
the test was not taken seriously by a large number of students, and the suggestion
that a more sophisticated analysis was required than the initially recommended
regression analysis. Further work is required to evaluate the potential usefulness of
the GAT testing programme. This is particularly relevant in view of the reduction of
the importance of the accuracy of the level and band ratings which has resulted from
the renewed primacy of ‘marks’ in the production of scaled scores for tertiary
entrance.
An independent review of the GAT is to be undertaken early in 2008.
Section 12: Certification of student achievement
Year 12 students received accurate and credible certification of their academic achievements
according to the agreed timeline. They were also able to access their results on a joint web
site (established by the Curriculum Council and TISC).
In 2007, 19,121 (18,817 in 2006) Year 12 students were eligible for the Western
Australian Certificate of Education (WACE). Of these 18,357 (18,041 in 2006)
achieved the WACE. Table 18 indicates that there has been a steady increase in the
percentage of students who achieved a WACE since 2003.
Table 18: Achievement of a WACE, 2001–2007

 | 2001 | 2002 | 2003 | 2004 | 2005 | 2006 | 2007
Eligible for a WACE | 16,450 | 18,457 | 18,883 | 18,697 | 19,243 | 18,817 | 19,121
Achieving a WACE | 15,385 | 17,202 | 17,576 | 17,671 | 18,300 | 18,041 | 18,357
Percentage of eligible cohort | 93.5 | 93.2 | 93.1 | 94.5 | 95.1 | 95.9 | 96.0
In 2007, there were 286 (295 in 2006, 358 in 2005, 315 in 2004, 296 in 2003, 281 in
2002) Aboriginal and Torres Strait Islander students who were eligible for a WACE. Of
these, 251 (87.8%) achieved a WACE. The corresponding figures for 2002, 2003,
2004, 2005 and 2006 achieving a WACE were 224 (79.7%), 222 (75.0%), 264
(83.8%), 305 (85.2%) and 265 (89.8%) respectively.
Schools were able to download two SIRS reports: one that lists the number of full-time Year 12 students who would receive a WACE and one that identifies the full-time Year 12 students who would not. Schools were able to run these reports
after they had uploaded data relating to their students’ results. The reports could be
run as many times as necessary to confirm that the uploaded Year 12 results were
accurate.
A special provisions committee was established to review cases in which students may
be disadvantaged as a result of the transition to the new WACE arrangements.
Schools were invited to submit applications where they considered that the recent
changes to the requirements for the achievement of the WACE had disadvantaged one
or more of their Year 12 students. Situations where students may not achieve the
WACE requirements due to change in rule include:
 not meeting the English language competence requirements; or
 not achieving the standard of a C grade average.
There were 173 applications received, with details summarised in table 19. Of these applications, 40 (23.1%) were granted the WACE because the students were considered to be
disadvantaged by the algorithm to equate levels to grades, or they were unable to
study either Vocational English or Senior English and achieve a C grade. Schools were
informed of the outcome of applications.
Table 19: Special consideration for a WACE, 2007

Round | Approved: 10 subjects | Approved: C grade average | Approved: ELC | Rejected | Applications reviewed
1st round | 0 | 0 | 5 | 121 | 126
2nd round | 0 | 16 | 4 | 8 | 28
3rd round | 2 | 13 | 0 | 4 | 19
There were 749 Year 12 students in 2007 who used VET subject equivalents1 to achieve a WACE. The corresponding figures for the Year 12 cohorts of 2003, 2004, 2005 and 2006 are 263, 315, 509 and 538 respectively.
Qualifications achieved in full were recorded on students’ statement of results. There
were 3,337 (2,344 in 2006, 2,726 in 2005, 1,840 in 2004) qualifications recorded on
2,401 (1,782 in 2006, 2,066 in 2005, 1,569 in 2004) Year 12 students’ statements of
results. Of these, 55 (77 in 2006, 275 in 2005, 169 in 2004) qualifications were
achieved through a traineeship.
Certification changes for 2007
Changes made to the 2007 certificates included reporting achievement in endorsed
programs and updating the explanatory notes for the reverse side of the statement of
results.
During 2007, the information to be reported on students’ certificates was negotiated with industry, parent groups and system/sector representatives.
Publication of Year 12 results
On 26 December 2007 (from 3pm onwards) Year 12 students were able to access
their results from the web. From 3pm on this day to midnight 27 December 2007, the
site was visited by 5,664 (5,631 in 2006) students who accessed their results and
viewed 12,505 (12,980 in 2006) results pages. The busiest hour was 4.00–5.00 pm on 26 December, with 1,381 hits by 772 visitors.
The www.year12results.wa.edu.au website was a joint venture between the Tertiary
Institutions Service Centre and the Curriculum Council. Results on the website were
similar to those printed on the statement of results. The TISCLine was not available.
In addition, all Year 12 students who completed at least one Curriculum Council
subject, course unit or unit of competency were issued with a statement of results
dated 27 December 2007. There were 20,330 (20,018 in 2006, 20,577 in 2005,
20,517 in 2004; 20,407 in 2003; 19,806 in 2002) statements of results produced for
the cohort of Year 12 students.
The statement of results was accompanied by:
• the Western Australian Certificate of Education, for those who met the requirements
• the information paper for the Western Australian Certificate of Education and statement of results 2007
• order forms for the 2007 TEE/WACE examinations, including an application for results checks, an order form for TEE/WACE examination scripts and an order form for a statement of raw examination marks
• the ‘Your marks’ brochure
• an Access Careers 2007 brochure (produced by the Department of Education and Training).
To ensure delivery on Friday 28 December 2007, the certificates for WA country and north-west addresses were lodged at the Perth Mail Centre at 7.30 am on Thursday 27 December 2007, and the certificates for the Perth metropolitan area were lodged at noon the same day.
Associated costs
Statements of results and Western Australian Certificates of Education were printed in-house, with collation and despatch outsourced. The base stock for these certificates was designed and printed in the corporate colours in a manner that minimises fraudulent copying. Quality-assurance procedures were put in place to ensure the validity and accuracy of the statement of results and the Western Australian Certificate of Education. These procedures were more involved than in previous years because the certificates were printed from the new database, SIRS.
There were 85 requests for reprints of certificates or statements up until the third week of January 2008. This represents 0.42% of the Year 12 students who were issued with statements in December 2007. The corresponding numbers (and percentages) of reprints for the Year 12 cohorts of 2006 and 2005 were 95 (0.47%) and 84 (0.41%) respectively. Some schools continue to supply incorrect results for some of their students: 65% of the reprints were due to school error.
The cost to produce the Western Australian Certificate of Education and statement of
results in 2007 was $41,686.82 (for 2006 the amount was $55,459.57).
Continuing and future improvements
In 2008, the certification of Year 12 students will accommodate the new WACE
requirements and new English language competence requirement.
SIRS printouts will be refined to allow schools to monitor whether Year 12 students
have met WACE requirements.
Section 13: Acknowledging excellence
In accordance with the Curriculum Council Act 1997 Part 3 s.9 (h), exhibitions and awards
were granted to post-compulsory students in recognition of educational excellence.
A total of 997 awards and exhibitions were granted to students who achieved
academic excellence. The awards recognise general educational excellence as well as
subject-specific excellence.
Process for deciding the winners
Examination-based awards
Chief examiners or their nominees reviewed the ranked list of raw examination marks
to identify possible subject/course exhibition winners.
A Curriculum Council award score, based on the average of five TEE/WACE scaled marks with at least two from each of List A and List B, was calculated for all eligible candidates. The top-ranked candidate was recommended for the Beazley Medal: TEE and the top forty candidates for General Exhibitions. The Beazley Medallist: TEE achieved a Curriculum Council award score of 99.08.
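The award score calculation lends itself to a simple illustration. The sketch below (Python, illustrative only) uses hypothetical candidate data and an exhaustive search over five-mark combinations, which the report does not specify; it shows only the stated rule of averaging five scaled marks, with at least two from each of List A and List B, and ranking candidates on the result.

# Illustrative sketch only: average of five scaled marks, at least two from List A
# and two from List B, as described above. Candidate data and the search over
# combinations are assumptions for the example.
from itertools import combinations

def award_score(marks):
    # marks: list of (subject_list, scaled_mark) pairs, e.g. ('A', 78.4)
    best = None
    for combo in combinations(marks, 5):
        lists = [subject_list for subject_list, _ in combo]
        if lists.count('A') >= 2 and lists.count('B') >= 2:
            average = sum(mark for _, mark in combo) / 5
            best = average if best is None else max(best, average)
    return best  # None if the candidate has no qualifying combination of five marks

# Candidates would then be ranked by this score: the top-ranked candidate is
# considered for the Beazley Medal: TEE and the top forty for General Exhibitions.
candidates = {
    'candidate_1': [('A', 92.1), ('A', 88.7), ('B', 95.3), ('B', 90.0), ('B', 85.4)],
    'candidate_2': [('A', 81.0), ('A', 79.5), ('A', 84.2), ('B', 77.8), ('B', 80.1)],
}
ranking = sorted(candidates, key=lambda c: award_score(candidates[c]) or 0, reverse=True)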
WSA awards
There were 54 (55 in 2006) wholly school-assessed subjects listed for the awards in
2007. Included in this number were two courses which for the first time had awards
available for non-examination candidates. For these subjects, schools nominated their
outstanding students. One hundred (110 in 2006, 105 in 2005, 114 in 2004) different
schools nominated 655 students (796 in 2006, 768 in 2005, 644 in 2004) for awards
in 51 (53 in 2006, 50 in 2005, 48 in 2004) subjects. Some schools nominated students for more than one subject. There were 56 (90 in 2006, 84 in 2005, 28 in 2004) students nominated for at least two subjects, with five of these nominated for three subjects. No student was nominated in more than three subjects. Appendix H contains subject nominations and an historical perspective.
There were 5 (6 in 2006, 6 in 2005, 6 in 2004) subjects which either had no
nominations or in which no students were short-listed for interviews. Of the 655
nominations received, 478 students were short-listed.
To continue to address the short time frame available for the processing and
timetabling of student nominations, teleforms were used to register student
nominations. In 2007, the wholly school-assessed subject awards process was refined
to clarify the communication of selection criteria for award recommendations and to
put the onus of application and provision of all documents on the student.
Panels, representing the systems/sectors, invited students to demonstrate their
understanding of, and achievements in, the subject through an interview,
performance and/or submission of a portfolio of work. There were 45 (43 in 2006)
subjects where interviews (including performance) were required by the students.
Another four subjects (Art and Design, English, Senior Science and Music in Society)
required students to only submit work.
Interviews and review of work were held over a five-day period during the week
commencing Monday 22 October 2007 at six (6 in 2006) different venues. This
involved 107 (107 in 2006) panel members selecting award recipients from the 478
(623 in 2006, 574 in 2005, 439 in 2004) students whose nominations were shortlisted.
Country students were given the choice of attending an interview in Perth or being interviewed via videoconference. The number of videoconferences decreased, with 29 (52 in 2006, 33 in 2005, 19 in 2004) set up for country students in nine (13 in 2006) different locations. Six of the country students interviewed via videoconference were recommended for certificates of distinction, with one of these also recommended for a subject exhibition.
The decision
A five-member committee, chaired by the chairperson of the Curriculum Council,
reviewed the recommendations made by the Secretariat for the granting of each
award and exhibition in accordance with the criteria.
The committee considered a shortlist of students for the Beazley Medal: VET and granted the medal to the student with the highest Curriculum Council school assessment award score. The first-ranked student had a coherent VET program, with all units of competency contributing to the same AQF qualification, had achieved a Certificate II in Business and had maintained a consistently high standard throughout her senior secondary schooling.
The exhibition and awards policy and guidelines state that where the enrolment in an examination is below 100, the Exhibition and Awards Committee may decide to award a subject/course exhibition or certificates of distinction if the achievement is of an exceptionally high standard. Although the number of candidates was below 100 in German, Indonesian: Second Language and Geology, the committee granted a subject exhibition and certificate of distinction in these subjects because of the high raw examination and combined marks. The committee also granted certificates of distinction in the new WACE course Aviation, and in Chinese: Second Language, Indonesian: Advanced and Malay: Advanced, because of the high combined marks.
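The policy summarised above amounts to a simple decision rule: where examination enrolment is below 100, a subject/course exhibition or certificate of distinction may be granted only if the committee judges the achievement to be of an exceptionally high standard. A minimal sketch of that rule follows (Python, illustrative only; the judgement of an ‘exceptionally high standard’ rests with the committee and is represented here simply as a flag).

# Illustrative sketch of the small-enrolment rule in the exhibition and awards
# policy and guidelines. The 'exceptional_standard' flag stands in for the
# committee's judgement, which the policy leaves to its discretion.
def exhibition_may_be_granted(enrolment, exceptional_standard):
    if enrolment >= 100:
        return True              # the usual case: awards are available in the subject/course
    return exceptional_standard  # below 100 candidates: only for exceptionally high achievement

# In 2007, German, Indonesian: Second Language and Geology each had fewer than 100
# candidates, but exhibitions and certificates of distinction were granted on the
# strength of the raw examination and combined marks.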
Reviewing the policy and guidelines
The Awards Working Party met on two occasions during 2007. The group reviewed
the policy and guidelines for 2008 and has made significant progress in developing the
policy and guidelines for 2009 and beyond.
The algorithm for equating a level and band in a course unit to an ‘A’ grade for the
achievement of a certificate of excellence was reviewed and updated following the
submission of semester one results.
Announcement of the winners
The Minister for Education and Training announced the winners of the Beazley Medals
at Kings Park on Sunday 30 December 2007. The full list of award winners was
published in The Sunday Times on Sunday 30 December 2007. The announcements
were brought forward one week in 2005 and 2006, and this time-line was repeated for
2007.
Recipients of the awards and exhibitions are to be presented with the certificate
and/or prize at the Curriculum Council’s Awards Ceremony to be held on Tuesday 12
February 2008 at The University of Notre Dame. Pre-ceremony entertainment and
post-ceremony refreshments will be provided.
The five Western Australian universities jointly agreed to sponsor the Beazley Medal: TEE, and Westscheme sponsors the Beazley Medal: VET. Sponsorship totalling $41,000 has been committed by fifteen different organisations to assist with the conduct of the awards ceremony, and in-kind sponsorship has been committed by two further organisations.
Summary of award winners
A Curriculum Council award was received by 713 students. A total of 997 (1,086 in 2006, 1,069 in 2005, 1,043 in 2004, 1,071 in 2003, 1,029 in 2002, 1,046 in 2001 and 999 in 2000) awards were made in the following categories (see Table 20).
Table 20: Number of exhibition and award winners, 2007

Award                                          Number awarded
Beazley Medal: TEE                                          1
Beazley Medal: VET                                          1
General Exhibitions                                        40
Subject Exhibitions                                        54
   TEE/WACE subjects/courses                               29
   WSA subjects                                            25
Special Subject Awards                                      4
   TEE/WACE subjects/courses                                3
   WSA subject                                              1
Certificate of Distinction                                369
   TEE/WACE subjects/courses                              259
   WSA subjects                                           110
Special Certificate of Distinction                         15
   TEE/WACE subjects/courses                               13
   WSA subjects                                             2
Certificate of Excellence                                 513
Total                                                     997
Continuing and future improvements
The Exhibition and Awards committee recommended that a review of the background
speaker and language eligibility criteria be undertaken during 2008.
The Awards Working Group will continue to meet to ratify the policy and guidelines for
2009 and beyond, including the provision for further recognition of VET achievement.
Section 14: Public relations
Providing information to the public about our role as an organisation and our specific processes, particularly those relating to the TEE/WACE, plays a major part in maintaining our organisation’s credibility.
The Curriculum Council has an important role in keeping schools and the public
informed of the processes involved in certifying student achievement and the integrity
of the tertiary entrance examinations. Many telephone calls are made and received
by staff in the Certification and Examinations Branch clarifying issues and gathering
and explaining information. Other aspects of public relations include media liaison, school presentations, post-results counselling and responding to complaints about the examinations and the conduct of the awards ceremony.
Media reports
Media coverage of the 2007 examinations began in August with The West Australian’s TEE Extra feature, which included frequently asked questions and study tips from subject experts and former high-achieving students. Media coverage of the examinations and results continued throughout January and will continue into February, when the official Curriculum Council awards ceremony is held.
Media enquiries throughout the exam period primarily focused on the new courses
being examined for the first time – Engineering Studies, English and Media Production
& Analysis.
This year’s Curriculum Council awards and exhibitions coverage was coordinated by The Sunday Times and featured an eight-page lift-out. Articles profiled the two Beazley medallists, as well as Christ Church Grammar School’s dominance of the list of general exhibition award winners.
Following the wider distribution of the awards data, local and regional papers eagerly
publicised their local winners. By mid-January, more than 40 stories had appeared in
local and regional newspapers.
The release of school performance data at a media conference on 8 January 2008
attracted widespread media coverage, most notably the attendance of all four TV
stations.
A summary of the media coverage for the 2007 TEE period follows:
• A TEE Extra colour lift-out in The West Australian (17/8/07).
• Coverage by the four TV channels and The West Australian at the start of the TEE written examinations (1/11/07).
• An eight-page colour lift-out in The Sunday Times honouring Curriculum Council award winners (30/12/07).
• Coverage by the four TV stations at the announcement of the Beazley Medal winners following a media conference in Kings Park on Sunday 30/12/07.
• Extensive coverage by metropolitan and regional print media throughout January highlighting general Curriculum Council award winners.
• The school performance data media conference held on 8/1/08, attended by all four TV stations and The West Australian newspaper. Coverage included an eight-page feature on ‘How your school rates’ (9/1/08), as well as a page one story on how girls’ schools dominated the TEE rankings.
TEE/WACE examinations hotline
As in previous years, examination candidates had access to a telephone Hotline during
the examination period until 9.00 pm. There were forty calls recorded, a significant increase compared with 2006, when only five calls were received. Enquiries ranged from a candidate who had had an accident to one querying what type of dress could be worn to the examination centre. Many problems can appear trivial, but they are of major significance to the candidate at the time.
Post-examination results counselling
This year the post-results counselling commenced immediately after the Christmas break, on the same day as the results were published on the web (Thursday 27
December 2007). The individual student results were available on the Curriculum
Council website prior to students’ statements of results being despatched.
Twelve staff from the certification and examinations branch were rostered to take
telephone calls, reply to emails and interview students and/or their parents. Many of
the early callers sought information that was available to them in the envelope
containing the statement of results. Five hundred and thirty-eight enquiries were
made by telephone, email and interview compared with 386 following the 2006 TEE.
Twenty-one (15 for 2006) of these enquiries were through interviews and thirty-two
(10 for 2006) were emails. The main area of concern, accounting for nearly one third of all enquiries, was the process used by the Curriculum Council to adjust marks (moderation, standardisation and scaling); there was particular concern about the effect of the scaling procedures.
Number of contacts
Figure 12 illustrates the number of calls received during the seven days of the counselling period. Calls, emails and requests for an interview continue to be received after this period but are not logged.
[Figure 12: Post-results counselling – number of contacts received on each of the seven days of the counselling period, 27 December 2007 to 7 January 2008 (scale 0–180 contacts per day)]
Complaints concerning the TEE/WACE examinations
Complaints concerning the examinations and their administration were received by telephone, mail, email, facsimile or through the Council’s website. Many more complaints were received in 2007 than in any of the previous ten years. The majority of these complaints related to the new courses Engineering Studies (20) and Media Production and Analysis (15).
Awards ceremony
Academic excellence is recognised by the Curriculum Council through the awards it offers to senior secondary students. In 2007 there were 997 awards, granted to 713 students. Recipients of the awards are to be presented with them at the Curriculum Council’s awards ceremony to be held on Tuesday 12 February 2008 at The University of Notre Dame. More than 2,500 people have been invited to attend the ceremony.