GSA & HMC
Report on the 2015
Cambridge International Examinations (CIE)
IGCSE First Language English (0500) results
in GSA & HMC schools
Dr Peter Mason
April 2016
CONTENTS

SECTION 1   INTRODUCTION & BACKGROUND                 Pages 2 – 4
            1.1  Structure of the report
            1.2  Background to the problem
            1.3  Executive Summary
SECTION 2   ANALYSIS OF SCHOOLS’ DATA                 Pages 5 – 15
SECTION 3   ANALYSIS OF CIE DOCUMENTATION             Pages 16 – 25
            3.1  CIE briefing papers for schools on 2015 IGCSE English
            3.2  CIE technical paper on 2015 IGCSE English
            3.3  CIE Code of Practice
            3.4  CIE’s published results on IGCSE English
SECTION 4   SCHOOLS’ FEEDBACK & COMMENTS              Pages 25 – 28
            4.1  Comments from schools’ communications with CIE
            4.2  Outcomes and comments from schools’ appeals to CIE
SECTION 5   FINAL COMMENTARY & EVALUATION             Pages 28 – 29
SECTION 6   APPENDICES                                Pages 30 – 32
SECTION 7   ATTACHMENTS                               Pages 33 – 62

GSA & HMC report (April 2016) v2: 2015 CIE IGCSE English (0500)    page 1
SECTION 1: INTRODUCTION & BACKGROUND

1.1  Structure of the Report

This report is in seven sections:
Section 1 is this introduction.
Section 2 presents a detailed analysis of the 2015 CIE IGCSE 0500 results data received from GSA & HMC schools.
Section 3 presents a detailed evaluation and critique of CIE’s own published documents relating to the 2015 IGCSE English examinations.
Section 4 considers schools’ own submissions, including their correspondence with CIE and reports of appeals.
Section 5 is a summary commentary and evaluation of the evidence.
Sections 6 & 7 contain appendices and attachments.
1.2  The problem

For a number of years many GSA and HMC schools have opted to enter their students for IGCSE examinations rather than the GCSE equivalent, especially in maths, English (including English literature) and the sciences. 2015 was no exception.
A majority of those GSA and HMC schools that entered for CIE IGCSE English¹ opted for syllabus 0500, which has speaking and listening separately endorsed, rather than syllabus 0522, in which speaking and listening form an integral part of the assessment and so contribute to the overall IGCSE grade. This element of the assessment satisfied Ofqual’s requirement for 0522 to be available to maintained schools in England and for its results to be included in DfE performance tables.
GSA, HMC and ASCL² monitor their member schools’ concerns about GCSE and GCE examinations each August. Although the quality of English examinations and standards of marking have been a prominent concern of schools for some time, prior to 2015 concerns were not noticeably greater for IGCSE English than for GCSE. 2015, however, was very different; a very large number of schools, both maintained and independent, expressed alarm about their CIE IGCSE English results. For GSA and HMC schools this focused principally on results for syllabus 0500 and very largely, though not exclusively, on the higher grades: A*, A & B. For ASCL the focus was on 0522. Independently, CIE must also have been aware of the extent of the alarm since they report³ an increase of almost 200% in Enquiries About Results (EARs) for the two English syllabuses combined: 87% on syllabus 0500 and a staggering 225% on syllabus 0522.
HMC’s policy is to write to the chief executives of awarding bodies if concerns are received from at least three schools for any given examination. Concern at or beyond this threshold is common each year, but there is no recent precedent for the level of concern expressed last summer over grades awarded by CIE for IGCSE English 0500. In 2015 HMC reported⁴ that over 80 of its member schools had serious concerns about their results. In the light of the extent of this
¹ This is strictly entitled First Language English but for simplicity in this report it will be referred to as English. The GCSE equivalent examination (English Language) will also be referred to as English unless there is any risk of ambiguity.
² ASCL – the Association of School and College Leaders – represents the majority of headteachers in England’s maintained secondary schools and colleges.
³ Data provided to HMC by CIE.
⁴ HMC briefing note to members, 05 November 2015.
unease, and following face-to-face meetings with CIE staff, GSA & HMC commissioned a detailed report to receive and analyse data from schools in order to answer the questions:
•  was there really a problem?
•  if yes, how did it arise and what was the extent of the problem?
•  if yes, what has been CIE’s response to the problem?
•  if yes, what, if anything, can be done to protect the interests of the 2015 candidates and to ensure that there is no repetition of the problem in 2016?
This is that report.
1.3  Executive Summary

•  Cambridge International Examinations (CIE) offer two IGCSE English syllabuses, denoted 0500 and 0522. The syllabuses are closely linked.
•  Syllabus 0500 is intended for overseas centres and UK independent schools; it is not accredited by Ofqual for use by maintained schools in England.
•  Syllabus 0522 is accredited by Ofqual for use by maintained schools.
•  In recent years growth in the number of June entries has been modest for syllabus 0500, reaching just 17,619 candidates in June 2015. Schools in membership of GSA and HMC provided a significant proportion of these candidates.
•  By contrast, syllabus 0522 has seen extraordinary growth in entries, reaching 194,469 entries in June 2015. Such dramatic growth presented its own problems to CIE, particularly in setting the standards for a vastly changed candidature.
•  GSA and HMC schools did not report any major or widespread problems with the June 2014 0500 examination, nor has any evidence been received that schools that entered candidates for syllabus 0522 in June 2014 were dissatisfied with the results.
•  CIE report that, in May 2015, as part of their post-results analysis, they judged the 2014 results in both syllabuses to have been too lenient. The decision was taken to tighten standards in the June 2015 examinations at grades A, C and E. The basis for this decision is complex and not wholly convincing. It was not conveyed to schools, but CIE assert that Ofqual was kept informed.
•  There was widespread alarm among schools when results were published in August 2015. This was true for both syllabuses. GSA, HMC and ASCL all conveyed a high level of concern to CIE. HMC records up to 80 of their member schools being affected, an unprecedented number for a single examination. CIE itself reports an enormous increase in EARs compared with 2014 for both syllabus 0500 (87% increase) and syllabus 0522 (225% increase), surely sufficient evidence of schools’ widespread concerns.
•  In response CIE published two general briefing documents and a technical document for schools and headteacher associations in an attempt to explain how they had set the standard for the June 2015 examinations. There was no admission that the standard was wrong.
•  However, setting the standard was complex. The two syllabuses (0500 and 0522) are linked by common components: papers 1, 2, 3 and the coursework (paper 4), all of which must have a common standard between the two syllabuses. These papers provide different pathways through the assessment, with the coursework being an alternative to the written paper 3. Also, grade C is common between the core and extended tiers, so the grade C standard must be aligned between these two sets of candidates. Add to this the huge rise in new candidature for syllabus 0522, a large proportion of whom (according to CIE) were C/D candidates, and the combination of all these factors makes reliable standard setting exceptionally difficult.
•  GSA and HMC schools in this study entered candidates only for the extended tier, covering grades A*–E. Their concerns centred on grades at the top end, i.e. grades A* and A.
•  Analysis of the detailed data provided by 53 GSA & HMC schools, covering around 5,000 candidates entered for syllabus 0500, showed a major discrepancy in the qualification grade profile between candidates entered for papers 2 and 3 and those taking the alternative route of paper 2 and the coursework component, paper 4. The 2/4 combination was taken by the majority of candidates and produced a grade profile well below expectations; the 2/3 candidates were also affected, but to a lesser extent. CIE should publish national data for each of syllabuses 0500 and 0522 showing how the grade profiles of the 2/3 candidates and the 2/4 candidates differ.
•  Closer scrutiny of paper 2 showed a unit grade profile completely out of line with the abilities of the candidates. The problem encountered by schools could be attributed, in the main, to this paper. CIE revealed in the technical paper that the mark range between unit grade A and unit grade C on this paper was just 4 raw marks. Schools inevitably questioned whether the marking of an English paper could ensure the accuracy and precision required, especially with the large number of additional examiners recruited to mark the large number of additional entries.
•  Schools reported that their candidates had done worse than they had predicted.
•  More in-depth scrutiny of the results revealed that candidates had done far worse in IGCSE English 0500 than in their other subjects combined (on average), and than in English literature and History, two subjects which require similar skills to English. A further detailed comparison with all GCSE candidates nationally, and with candidates from a basket of 25 selective maintained schools, also showed that the IGCSE English 0500 candidates had performed much worse than would have been expected.
•  CIE’s own Code of Practice provides for a number of checks that must be made before results are released. Schools appear to have little evidence that these checks were done.
•  As already indicated, many schools submitted Enquiries About Results (EARs), resulting in over 2,800 grade changes for the two syllabuses (0500 & 0522) combined.
•  A significant number of schools remained dissatisfied after their EARs and therefore lodged appeals.
•  No school’s appeal had been upheld at the date of release of this report.
•  Despite overwhelming evidence that there was a major problem with the June 2015 0500 examination, CIE maintain that the standards they set were correct.
•  It is recommended that GSA and HMC further engage with CIE about the evidence in this report and, if felt appropriate, also with Ofqual, with the aim of achieving justice for the 2015 candidates and ensuring no repetition for candidates in June 2016.
SECTION 2: ANALYSIS OF SCHOOLS’ DATA
Those GSA & HMC schools that had concerns about their CIE 0500 results were asked to submit their results data for analysis. Data requested were:
(a) unit grades for each unit entered, and overall (qualification) grade for each candidate
(b) unit raw marks and overall raw mark for each candidate
(c) estimated IGCSE English grade for each candidate
(d) grades for all candidates for all subjects.
By 31 January 2016 returns had been received from 53 schools covering almost 5,000 candidates. A wide range of schools was represented, from highly (academically) selective city day schools to less selective rural day and boarding schools.
A number of schools had chosen to pursue their concerns through CIE’s EAR process and subsequent Stage 1 and Stage 2 appeals. Schools were asked to make available their appeal submissions and any subsequent CIE correspondence as part of this analysis.
Comments in this report about marking are restricted to information gained from CIE itself or from schools’ EAR feedback.
Data tables supporting the analyses that follow are integrated into the text for ease of reading, rather than added as separate appendices, which would necessitate a degree of cross-referencing.
The various detailed analyses that have been carried out are, in summary:
2.1  candidate entries for the different papers and their combinations
2.2  qualification raw marks and grade profiles
2.3  unit grade profiles
2.4  consistency between the written papers (2 & 3)
2.5  comparison with schools’ estimated grades
2.6  grades compared with candidates’ average grades in all other subjects, and with English Literature and History in particular
2.7  comparison with results for all schools in England
2.8  comparison with results of maintained selective schools
2.9  comparisons 2.7 and 2.8 separated by the different option routes (i.e. papers 2/3 or papers 2/4)
2.1  ANALYSIS 1: papers (units) and their combinations
Candidates entering for CIE 0500 must take papers (units) 1 & 3 or 1 & 4 (core tier, maximum
grade C) or papers (units) 2 & 3 or 2 & 4 (extended tier, grades A* – E). All candidates in the
schools in this analysis were entered for extended tier thus paper 1 will not be discussed in detail
in this report.
Papers 1 & 2 (both written) assess performance on Reading Passages.
Paper 3 (written) assesses performance on Directed Writing and Composition.
Paper 4 is a coursework alternative to paper 3.
Of the 4,903 candidates⁵ for whom data were available:
•  1,202 candidates from 15 schools were entered for papers 2 & 3
•  3,701 candidates from 39 schools were entered for papers 2 & 4
It is not known how these proportions compared to those in the total entry.
For brevity, in this report candidates will be referred to as either 2/3 or 2/4 candidates. Some schools adopted a mixed entry pattern.
2.2  ANALYSIS 2: overall (qualification) raw marks and grades

Each paper has a maximum raw mark of 50. Thus the overall raw mark total for the qualification is 100, since candidates must be assessed on two written papers or one written paper plus the coursework component.
Raw mark grade boundaries at unit grades A and C are arrived at for papers 1, 2 and 3 by a mixture of senior examiner judgement and statistical modelling. Raw mark grade boundaries for paper 4 (coursework) are fixed year on year. The raw mark grade boundaries for the qualification as a whole are therefore determined by summing the raw mark grade boundaries of papers 2 & 3 or of papers 2 & 4. Table 1 below shows the 2015 raw mark grade boundaries and Table 2 gives the distributions of grades achieved in the schools in this survey, expressed as cumulative percentages.
TABLE 1

Papers 2 & 3
Grade                          Paper 2¥    Paper 3¥    Qualification
                               (Max 50)    (Max 50)    (Max 100)
A* (notional at unit level)       33          36           69
A                                 31          31           62
B                                 29          26           55
C                                 27          21           48
D                                 24          18           42
E                                 22          15           37
F                                  –          10            –
G                                  –           5            –
U                                  –           –            –

⁵ Whilst most schools provided all the information requested, a small number of returns lacked some data; hence the candidature of the more detailed analyses that follow may be slightly less than this total of 4,903.
Papers 2 & 4
Grade                          Paper 2¥    Component 4    Qualification
                               (Max 50)    (Max 50)       (Max 100)
A* (notional at unit level)       33           50              83
A                                 31           44              75
B                                 29           38              67
C                                 27           32              59
D                                 24           26              50
E                                 22           20              42
F                                  –           14               –
G                                  –            8               –
U                                  –            –               –

¥ These strictly apply to the papers that are denoted by CIE as paper 2.1 and paper 3.1, which are the papers taken by schools in this report.
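Because no UMS conversion is involved, the qualification boundaries in Table 1 are simple sums of the two component boundaries for the route taken. A minimal sketch of that arithmetic (boundary values transcribed from Table 1; this is an illustration, not CIE’s own code):

```python
# Component raw-mark boundaries at grades A*-E, transcribed from Table 1.
paper2 = {"A*": 33, "A": 31, "B": 29, "C": 27, "D": 24, "E": 22}
paper3 = {"A*": 36, "A": 31, "B": 26, "C": 21, "D": 18, "E": 15}
component4 = {"A*": 50, "A": 44, "B": 38, "C": 32, "D": 26, "E": 20}

def qualification_boundaries(first, second):
    """Qualification boundary at each grade = sum of the two component boundaries."""
    return {grade: first[grade] + second[grade] for grade in first}

print(qualification_boundaries(paper2, paper3))
# {'A*': 69, 'A': 62, 'B': 55, 'C': 48, 'D': 42, 'E': 37}  (papers 2 & 3 route)
print(qualification_boundaries(paper2, component4))
# {'A*': 83, 'A': 75, 'B': 67, 'C': 59, 'D': 50, 'E': 42}  (papers 2 & 4 route)
```

The sums reproduce the qualification columns of Table 1 exactly for both routes.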
TABLE 2: actual grade distribution: cumulative percentages

Papers/components     A*       A        B        C        D        E        U       entry
2 & 3               42.3%    69.1%    88.4%    97.8%    99.5%    99.8%    100%     1,202
2 & 4               16.3%    50.6%    81.2%    95.2%    99.5%    99.9%    100%     3,701
There is clearly a problem, though the huge disparity (especially at grades A* and A) between the two separate entries might suggest that the problem is confined to the 2/4 candidates. That is not the case; schools that entered candidates for papers 2 & 3 also expressed serious concern at their candidates’ outcomes. CIE should be asked to provide the same data for the whole 0500 entry to see if this disparity is a feature of all entries.
2.3  ANALYSIS 3: unit grade profiles

Is any one paper or component responsible for the grade profiles above? Table 3 below provides the details.

TABLE 3: paper/component grade distributions: cumulative percentages

Papers/components     A*       A        B        C        D        E        U
2 (written)         33.1%    48.0%    62.6%    75.9%    90.3%    94.7%    100%
3 (written)         50.6%    78.1%    95.6%    99.4%    99.7%    99.9%    100%
4 (coursework)       5.8%    62.7%    91.1%    99.4%    100%     100%     100%
Note: for individual papers/components the A* grade is a notional grade. The (notional) A*/A boundary is derived by extrapolating upward from the grade A boundary by the same number of raw marks as the A–B interval. It is worth noting at this point that for component 4 the (notional) A*/A boundary is the maximum raw mark available (50 marks).
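That extrapolation can be written out directly: notional A* boundary = A boundary + (A − B interval). A quick check against the Table 1 values (an illustrative sketch, not CIE’s code):

```python
def notional_a_star(a_boundary, b_boundary):
    """Extrapolate upward from grade A by the width of the A-B interval."""
    return a_boundary + (a_boundary - b_boundary)

# Table 1 boundaries: paper 2 (A=31, B=29), paper 3 (A=31, B=26),
# component 4 (A=44, B=38, out of a maximum of 50).
assert notional_a_star(31, 29) == 33  # paper 2 notional A*
assert notional_a_star(31, 26) == 36  # paper 3 notional A*
assert notional_a_star(44, 38) == 50  # component 4: full marks needed for A*
```

The component 4 case makes the problem described above concrete: the narrow A–B interval pushes the notional A* boundary up to the maximum available mark.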
Brief observations are useful here.
The paper 3 profile is more characteristic of the overall grades expected from candidates at
the schools which used the 2/3 pattern of entry. A greater proportion of those schools were
highly (academically) selective. The full mark range was used and marks above the (notional)
A*/A boundary compensated for any (apparent) underperformance on paper 2.
The component 4 (coursework) profile is not unusual at cumulative grades A*-C. Schools with
well-motivated students and largely experienced and stable English staffing will prepare their
students well for the coursework assignment. It should be expected that all these candidates
would achieve at least grade C. However, the (notional) A* and cumulative A*/A percentages
(5.8% and 62.7% respectively) are much lower than would be expected, a result, in part, of
candidates needing to gain full marks on this component in order to achieve the (notional) A*
grade.
The paper 2 profile above is an amalgam of the two different entry populations: the 2/3
candidates and the 2/4 candidates. Disaggregating the two populations gives greater insight
as detailed in Table 4 below.
TABLE 4: paper 2 grades disaggregated

Papers/components     A*       A        B        C        D        E        U
2/3 candidates      32.8%    47.5%    61.2%    75.3%    89.3%    93.6%    100%
2/4 candidates      33.2%    48.2%    63.1%    76.1%    90.6%    95.1%    100%
There is very close alignment suggesting that the ability profile of the two cohorts is very similar.
Although the alignment is close, what is particularly alarming is that only marginally over three
quarters of the candidates achieved unit grade C or above on paper 2.
Given that the paper 3 profile more accurately represents the expected qualification grade profile, it is safe to conclude that
(a) paper 2 is responsible for lowering the qualification grade profile for the 2/3 candidates
(b) paper 2 and component 4 together have the combined effect of significantly reducing the qualification grade profile for the 2/4 candidates, especially at grades A* and A.

2.4  ANALYSIS 4: consistency between papers 2 & 3
Paper 2 caused schools greater concern than paper 3. Some schools reported that (in their view) the marking on paper 2 had been erratic. Whilst scrutiny of marking is not part of this study, a simple comparison of paper 2 and paper 3 marks for each candidate will reveal whether there has, in fact, been a problem. Erratic marking usually leads to a scrambling of the rank order: the best candidates do not necessarily get the highest marks; there is inconsistency. By contrast, severe marking, whilst lowering the marks awarded, tends to do so for all candidates and retains the rank order.
How do the marks on papers 2 and 3 correlate?
The chart below displays the extent of correlation between candidates’ paper 2 and paper 3
marks. The correlation coefficient = +0.435, i.e. the correlation can be described as fair, at best.
If the two papers are assessing similar or related skills and abilities (and it would be peculiar if
that were not the case) then a much higher correlation coefficient would be expected.
2015 CIE 0500: Paper 3 marks v Paper 2 marks
[Scatter chart: paper 3 raw marks (10–50) plotted against paper 2 raw marks (10–50) for each candidate]
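The coefficient quoted above is an ordinary Pearson correlation between each candidate’s two raw marks. A sketch of the calculation with invented mark pairs (not the study’s actual data):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length lists of marks."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = math.sqrt(sum((x - mean_x) ** 2 for x in xs))
    sd_y = math.sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (sd_x * sd_y)

# Invented paper 2 / paper 3 raw-mark pairs, purely to show the mechanics.
p2_marks = [28, 31, 25, 35, 22, 30, 27, 33]
p3_marks = [34, 30, 28, 40, 25, 29, 36, 38]
print(round(pearson_r(p2_marks, p3_marks), 3))
```

A value near +1 would indicate that the two papers rank candidates almost identically; the +0.435 reported above falls well short of that.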
How do marks on paper 2 correlate with those on component 4?
The distribution of marks on component 4 is very different from that of paper 2 (see appendices 2 & 3, pages 30 & 31), therefore no reliable correlation can be derived.
What role do UMS scores have in determining the overall (qualification) grade?
Fortunately, and unlike many unitised examinations, CIE does not convert unit raw marks to UMS scores before determining the (overall) qualification grade. The qualification grade is determined from the total raw mark scores as shown in Table 1 above; that these raw mark totals can be converted to total UMS scores has no effect on the grade. It seems strange, therefore, that in providing feedback to centres on changes resulting from EARs, CIE reported mark changes as UMS, not raw marks. Schools were understandably confused; what they really needed to know, and could readily understand, was how the raw mark(s) had changed.
What role do paper/component grades have in determining the overall (qualification) grade?
Paper grades play no part in determining the qualification grade, but the fact that they are reported led (in part) to schools’ alarm and reinforced their view that something was seriously wrong with the 2015 examination. Tables 5 & 6 below give the details.
TABLE 5
Grade pairs on papers 2 & 3 respectively

AA  368    BA  101    CA  100    DA   86    EA   21    UA   38
AB   20    BB   20    CB   34    DB   35    EB   21    UB   18
AC    3    BC    4    CC    5    DC    4    EC    2    UC    6
AD    1    BD    2    CD    0    DD    1    ED    0    UD    1
AE    0    BE    0    CE    0    DE    0    EE    1    UE    1
AF    0    BF    0    CF    0    DF    0    EF    0    UF    0
AG    0    BG    0    CG    0    DG    0    EG    0    UG    0
AU    0    BU    0    CU    0    DU    0    EU    0    UU    0

TABLE 6
Grade pairs on papers 2 & 4 respectively

AA  968    BA  231    CA  188    DA  182    EA   53    UA   31
AB  258    BB  123    CB  156    DB  147    EB   59    UB   60
AC   36    BC   41    CC   47    DC   82    EC   26    UC   47
AD    2    BD    1    CD    3    DD    6    ED    3    UD    8
AE    0    BE    0    CE    0    DE    0    EE    0    UE    0
AF    0    BF    0    CF    0    DF    0    EF    0    UF    1
AG    0    BG    0    CG    0    DG    0    EG    0    UG    0
AU    0    BU    0    CU    0    DU    0    EU    0    UU    0
Across the two options (2/3 and 2/4) over 200 students were ungraded in paper 2. Many of
these students would never have achieved less than grade A in any examination they had ever
taken. It is unsurprising that schools, candidates and parents were alarmed and challenged the
results.
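The “over 200” figure can be tallied from the U columns of Tables 5 and 6, reading the first letter of each pair as the paper 2 grade. A sketch of that tally (counts transcribed from the tables above):

```python
# Counts of candidates ungraded (U) on paper 2, transcribed from the
# U columns of Tables 5 and 6 (first letter of each pair = paper 2 grade).
table5_u = {"UA": 38, "UB": 18, "UC": 6, "UD": 1, "UE": 1, "UF": 0, "UG": 0, "UU": 0}
table6_u = {"UA": 31, "UB": 60, "UC": 47, "UD": 8, "UE": 0, "UF": 1, "UG": 0, "UU": 0}

ungraded_paper2 = sum(table5_u.values()) + sum(table6_u.values())
print(ungraded_paper2)  # 211
```

The tally comes to 211 candidates across the two options, consistent with the “over 200” stated above.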
2.5  ANALYSIS 5: overall (qualification) grades compared with schools’ estimates

Schools’ first action when something appears to be wrong with examination results is to go back to their own forecast grades. Such forecasts are not flawless, but they do represent the judgement of experienced professionals about the expected performance of their students. They are an important check, so important in fact that CIE’s Code of Practice (CoP) uses schools’ forecast grades as an early indicator of problems with results [see CIE CoP, paras 5.3(f) and 5.8(b)].
From the schools’ returns in this study, 3,913 matched actual/estimated grades were obtained covering data from 43 schools. The analysis is shown in Tables 7a & 7b below.
TABLE 7a: actual (qualification) grades compared to schools’ estimated grades

Grade profile         A*      A*/A     A*-B     A*-C     A*-D     A*-E     A*-U    Av subject pts score
Actual (cum %)      24.2%    58.0%    85.2%    97.3%    99.8%    99.9%    100%          6.64
Estimated (cum %)   37.1%    71.7%    93.5%    99.6%    100%     100%     100%          7.02
[Chart: CIE 0500 actual & estimated grade profiles (cumulative percentages) plotted by grade range (A*, A*/A, A*-B, A*-C, A*-D, A*-E, A*-U), comparing the Actual and Estimated series]
Schools’ concerns are largely focused on the top grades (A* and A*/A), where there is
significant discrepancy between estimated and actual grades, though even at A*-B there is a
difference of over 8 percentage points.
More in-depth analysis helps to identify the main source of the problem.
TABLE 7b

Grade difference (actual – estimate)    +3     +2      +1      +0      -1      -2     -3     -4
% of results                           0.1%   1.6%   11.6%   43.0%   35.1%   8.2%   0.4%   0.0%
Almost 90% of actual grades either match or are within one grade of the estimate. However, it should be noted that there are over three times as many actual grades one grade below the estimate as there are one grade above. One in 12 actual grades is two grades below the expected.
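These figures follow directly from Table 7b; a quick check of the arithmetic (percentages transcribed from the table):

```python
# Percentage of results at each (actual - estimated) grade difference,
# transcribed from Table 7b.
diff = {3: 0.1, 2: 1.6, 1: 11.6, 0: 43.0, -1: 35.1, -2: 8.2, -3: 0.4, -4: 0.0}

within_one = diff[1] + diff[0] + diff[-1]
print(round(within_one, 1))          # 89.7 -> almost 90% match or within one grade
print(round(diff[-1] / diff[1], 2))  # 3.03 -> over three times as many one grade below
print(round(100 / diff[-2], 1))      # 12.2 -> roughly one in 12 two grades below
```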
Further analysis breaks these data down so that the accuracy for each estimated grade can be considered. Table 7c gives the details.

TABLE 7c

Grade difference
(act – est)      est A*    est A    est B     est C
+3                  –        –        –       0.84%
+2                  –        –       4.9%     8.37%
+1                  –      15.9%    19.6%    29.71%
+0                47.4%    40.0%    40.9%    41.42%
-1                40.7%    35.4%    29.3%    19.67%
-2                11.2%     8.1%     5.3%      –
-3                 0.7%     0.5%      –        –
-4                 0.0%      –        –        –
The apparently poor accuracy of estimated grades compared to actual grades has been criticised by UCAS, though UCAS analysed A-level estimates submitted by schools as part of students’ UCAS applications, many of which were submitted 6 – 8 months before the students actually sat their A-level examinations. These were not estimated grades in the same sense as analysed here; they are best considered as potential grades. Even so, UCAS estimates from independent schools were shown to have the greatest accuracy, no doubt a result of many independent school students being on target for A* or A grades, where accuracy of prediction was much greater than in the middle range of grades.
2.6  ANALYSIS 6: overall (qualification) grades compared with candidates’ grades in other subjects

The second comparator that schools use when results in a given subject seem awry is comparison with candidates’ results in other subjects. This takes two forms: (a) comparison of the subject’s results with results in all the candidates’ other subjects, and (b) comparison of the subject’s results against those in similar subjects. In this analysis the similar subjects used are English literature and History.

(a) comparison of schools’ English results with candidates’ results in all other subjects

A large body of data was available, as shown in Table 8 below.
TABLE 8: grade distributions of results in this study expressed as cumulative percentages: English and all other subjects except English

                          A*      A*/A     A*-B     A*-C     A*-D     A*-E    ENTRY    Av subject pts score‡
English IGCSE
(CIE 0500)              27.4%    61.0%    86.6%    97.5%    99.8%    100%     4,836          6.72
All subjects
except English          48.7%    76.2%    91.6%    97.8%    99.5%    99.9%   43,293          7.14

‡ for the purpose of this analysis the average subject points score uses A* = 8, A = 7, B = 6, etc.
The residual is defined as the average subject points score for English minus the average subject
points score for all other subjects (except English) taken by the same candidates.
This equates to 6.72 – 7.14 = -0.42, i.e. on average students’ grades were almost half a grade
worse in English than in their other subjects. This, plus the huge disparity at grades A* and A*/A,
makes schools’ anxieties fully understandable.
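The residual calculation uses the points mapping given under Table 8 (A* = 8, A = 7, B = 6, and so on; continuing the pattern below C is an assumption here). A sketch of the mechanics with hypothetical grade lists, not the study’s data:

```python
# Points mapping from Table 8's note: A* = 8, A = 7, B = 6, etc.
# (values below C are assumed to continue the same pattern).
POINTS = {"A*": 8, "A": 7, "B": 6, "C": 5, "D": 4, "E": 3, "F": 2, "G": 1, "U": 0}

def average_points(grades):
    """Average subject points score for a list of grades."""
    return sum(POINTS[g] for g in grades) / len(grades)

# Hypothetical candidate: English grade versus grades in other subjects.
english_grades = ["A"]
other_grades = ["A*", "A*", "A", "A", "B"]

# Residual = English average minus the average across the other subjects;
# a negative value means English came out worse.
residual = average_points(english_grades) - average_points(other_grades)
print(round(residual, 2))  # -0.2
```

Applied to the study’s cohort averages the same subtraction gives 6.72 − 7.14 = −0.42, the residual quoted above.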
(b) comparison of schools’ English results with candidates’ results in English literature and in History

TABLE 9: grade distributions expressed as cumulative percentages: English, English literature and History

                          A*      A*/A     A*-B     A*-C     A*-D     A*-E    ENTRY    Av subject pts score
English IGCSE
(CIE 0500)              27.4%    61.0%    86.6%    97.5%    99.8%    100%     4,836          6.72
English literature      49.7%    76.9%    92.8%    98.5%    99.7%    100%     4,776          7.18
History                 49.9%    79.9%    94.4%    98.3%    99.5%    99.9%    2,915          7.22

 Entries in these subjects depend on the schools’ own entry policies. They may be GCSE or IGCSE; no distinction is made here.
Again, the data speak for themselves. On average, results in CIE 0500 English are half a grade
worse than the same candidates gained in English literature and in history.
2.7  ANALYSIS 7: is achievement in English nationally worse than achievement in all other subjects (on average), worse than in English literature and History in particular, and to the same extent as seen in the above grade profiles? Table 10 below provides the details.
TABLE 10: all schools in England: 2015 JCQ GCSE data – cumulative percentages

                          A*      A*/A     A*-B     A*-C     A*-D     A*-E      ENTRY      Av subject pts score
English                 3.10%   14.40%   37.10%   65.30%   86.40%   94.40%     459,027          4.97
All subjects
except English          6.97%   21.69%   43.73%   69.17%   85.18%   92.74%   4,376,685          5.15
English literature      4.80%   21.50%   50.50%   75.10%   89.60%   95.80%     399,281          5.35
History                 9.50%   28.60%   50.40%   68.80%   81.60%   89.70%     227,619          5.22
Whilst it is true that national results in English are lower than the other comparators, the differential, especially at the top grades, is nothing like that in this CIE 0500 analysis. The problems identified in this study cannot be explained by attributing them to a national trend.
2.8  ANALYSIS 8: comparison with results from candidates in selective maintained schools (grammar schools)

It can be argued that the ability profile of students in the GSA & HMC schools in this study is skewed significantly towards the more able. Given that English is a subject taken at GCSE (or IGCSE) by the whole population of 16 year olds, the national English results shown in Table 10 above may not be a fair comparison. A more relevant comparison would be the results from candidates in selective maintained schools. These are shown in Table 11 below.
TABLE 11: 2015 GCSE (IGCSE) results from a sample of 25 selective (grammar) schools in England (cumulative percentages)

                          A*      A*/A     A*-B     A*-C     A*-D     A*-E    ENTRY    Av subject pts score
English                 24.3%    63.1%    92.7%    99.2%    99.9%    100%     3,568          6.79
All subjects
except English          35.0%    70.2%    91.4%    98.3%    99.7%    99.9%   35,368          6.96
English literature      27.8%    69.0%    93.9%    99.3%    99.9%    100%     3,562          6.90
History                 34.9%    71.8%    91.0%    97.5%    99.1%    99.8%    2,023          6.94

 Results in this analysis were taken from the schools’ own websites.
Three key points emerge.
(i) This population of candidates more closely matches that in this CIE 0500 study.
(ii) By cumulative A*-B, selective schools’ results in English are very closely aligned with those of all other subjects.
(iii) The differentials at A* and at A*/A between English and the other comparators are much smaller than for the CIE 0500 students (Table 9).

2.9  ANALYSIS 9: how do the grade profiles in these subject-pair comparisons change if results are analysed separately according to the options taken, i.e. the 2/3 candidates and the 2/4 candidates? Tables 12a and 12b below have the details.
TABLE 12a: grade distributions expressed as cumulative percentages for the 2/3 candidates

                          A*      A*/A     A*-B     A*-C     A*-D     A*-E    ENTRY    Av subject pts score
English                 46.1%    75.5%    91.7%    98.9%    99.6%    99.9%    1,194          7.11
All subjects
except English          52.8%    77.1%    91.5%    97.9%    99.7%    99.9%    9,458          7.19
English literature      52.8%    79.6%    94.1%    99.0%    99.7%    100%     1,179          7.25
History                 53.5%    83.2%    94.0%    98.0%    99.4%    99.9%      804          7.28
TABLE 12b: grade distributions expressed as cumulative percentages for the 2/4 candidates

                          A*      A*/A     A*-B     A*-C     A*-D     A*-E    ENTRY    Av subject pts score
English                 18.6%    54.3%    84.2%    96.9%    99.9%    100%     3,437          6.54
All subjects
except English          45.4%    74.3%    90.9%    97.5%    99.5%    99.9%   30,713          7.07
English literature      47.1%    75.0%    92.0%    98.3%    99.7%    100%     3,392          7.12
History                 47.5%    78.1%    94.5%    98.4%    99.6%    99.9%    2,033          7.18
The differentials in the 2/3 candidates’ results (Table 12a) are more consistent with those of the grammar school candidates. By contrast, those of the 2/4 candidates (Table 12b) are so far out of line at A* and A*/A that a school could almost be accused of negligence if it were not alarmed.

Interim conclusion

The analyses above have involved dissecting and objectively comparing the results of several thousand GSA & HMC candidates in the 2015 CIE IGCSE (0500) English examination. On many fronts the results are significantly out of line with legitimate comparators, especially for the candidates taking papers 2 & 4. The immediate conclusion is that there appears to have been a major problem with the examination. Before stating anything more definite it is important to assess CIE’s commentary on the examination. This assessment follows in Section 3 below.
SECTION 3:
ANALYSIS OF CIE’S DOCUMENTATION
How has CIE responded to date?
CIE made three explanatory documents available to schools:
(i) Cambridge IGCSE First Language Briefing for school groups (attachment 1)
(ii) Supporting document on IGCSE First Language English – setting the standard, marking and grading (attachment 2)
(iii) Technical Briefing Paper: IGCSE First Language English in June 2015 (October 2015)ψ (attachment 3)
† Accessed from www.ascl.org.uk/2015exams. These papers were produced following issues surrounding the quality of marking in the 2015 summer CIE 0522 English examination. ASCL had been in discussion with CIE about the problems experienced by schools in 2015, as well as seeking reassurance that similar issues would not recur in 2016. As a result, CIE produced these papers outlining their awarding process for the summer examination.
ψ Document sent to HMC following discussions with CIE about IGCSE English (0500)
There are three further sets of relevant documents published by CIE and available from the CIE
website (www.cie.org.uk)
(iv) CIE’s Code of Practice (see relevant extracts at Appendix 4)
(v) CIE’s grade thresholds (June & November 2014 & 2015)
(vi) CIE’s results documents (June 2012 – June 2015)
CIE’s Code of Practice (para 1.59a) states that CIE will publish a report to centres from the senior examiners for each externally examined component after each examination session. However, a search of the CIE website on 10 March 2016 revealed only the June 2014 report for specification 0500 and a statement that there are no examiner reports for syllabus 0522 [see http://www.cie.org.uk/programmes-and-qualifications/cambridge-igcse-english-first-language-0500/past-papers/ and http://www.cie.org.uk/programmes-and-qualifications/cambridge-igcse-english-first-language-uk-0522/past-papers/ ]. The apparent absence of the June 2015 senior examiners’ report for syllabus 0500, in which a number of the concerns expressed by schools as recorded in this report could have been addressed, is an opportunity missed by CIE. It also appears to contravene their Code of Practice.
(vii) EAR data for syllabuses 0500 & 0522 (CIE private communication to HMC) – attachment 4
Key points made by CIE in their documents
It is relevant to note that CIE took the very unusual step of producing three documents to try to address schools’ concerns about CIE IGCSE English in 2015. That they did so confirms that they were aware of a problem at a relatively early stage and took steps to head it off.
3.1
Documents (i) and (ii) (see attachments 1 & 2 on pages 33 – 39): although not specified, these documents refer principally to CIE 0522, which was Ofqual-approved for use by maintained schools in England. Key points to note are:
• CIE produced the documents in response to “enquiries received from schools on a number of different issues”. This suggests that there was an increased volume of enquiries from schools in 2015.
• CIE emphasise that they had ensured the standards of IGCSE syllabuses (does this imply syllabus 0500 also?) are comparable with those of the GCSE equivalents.
• CIE state that the same grading procedures had been employed as in previous years.
• CIE state that, in ensuring standards comparable with previous years, a small tightening of the standard at grade A and at grade C was needed: “We tightened by about 1 mark at grade A and by less than a mark at grade C”. Assuming this refers to raw marks rather than UMS, it is not clear what “about 1 mark” means or how the grade C standard can be tightened by less than a mark.
• CIE assert that the decision (to tighten) was proposed in May 2015 and approved (by whom?) in June 2015, based on retrospective comparison with the 2014 results. There is no evidence that this decision was conveyed to schools.
• CIE assert that 2.3% fewer candidates were awarded grade A (in 2015) than would have received grade A in 2014. At grade C the equivalent percentage was 1.7%.
• CIE assert that most schools’ results were “broadly similar to last year” but acknowledge that some schools achieved much higher results and some much worse. This statement suggests that the 2015 results presented a similar pattern to previous years, which is at odds with the evidence: 2015 obviously generated much greater concern than previous years, otherwise there would not have been a challenge by ASCL and by GSA & HMC.
• CIE acknowledge the significant (60%) growth in the entry (for 0522), which resulted in the need to recruit, train and monitor a large number of new examiners. Whilst they assert that these processes were carried out to a high standard, this is bound to raise suspicions. Poor, inconsistent and erratic English marking has been a concern of schools for many years. To suggest that the recruitment of such a large number of new examiners had no impact on the quality of marking stretches credibility.
• CIE report an increase in EARs but state that the proportion of grade changes was in line with previous years. The increase in EARs is not quantified but is indicative of the level of schools’ concerns. The number of grade changes (obviously increased if the proportion has stayed the same) may simply reflect a very tight marking review régime imposed by CIE which militates against changing marks and hence changing grades. For example, one school reported a student being judged to have achieved one extra mark on paper 2 following an EAR, which would have resulted in the student gaining overall qualification grade A* rather than A. However, the one mark was within the tolerance set by CIE so the original mark stood; the student was denied the A*. Given that A* in English at (I)GCSE is a requirement for some very competitive university courses, this could have far-reaching consequences for that student.
• With respect to subject (syllabus) pair comparisons, something that schools are quick to refer to, CIE state that they ensure consistency at cohort level but that there will be cases where individual candidates achieve different grades in English language and English literature. It is not clear, however, how CIE accessed all candidates’ English literature results in order to make such comparisons. The evidence from the GSA & HMC schools’ results (collectively) in this study suggests that there is a big discrepancy between the candidates’ English language (CIE 0500) and their English literature results. That was, in part, the cause of schools’ concerns.
• CIE refer to volatility in individual schools’ results as being entirely normal. It is true that Ofqual’s research, Variability in GCSE Results for Individual Schools and Colleges 2012 to 2015 (Ofqual/15/5767; August 2015), analyses such variability (volatility), but that does not justify it. HMC has, for a number of years, presented results data to Ofqual from schools with very stable staffing and student cohorts where results have inexplicably varied massively from year to year. That Ofqual have found this to be true does not justify its happening.
• CIE report that Ofqual were kept informed of the intention to tighten standards, and Ofqual duly reported (see attachment 8) that they had monitored CIE IGCSE English, partly because of the large increase in entry from maintained schools in recent years and especially in 2015. Ofqual’s report, though not specific about CIE 0522, must surely refer only to this specification since it is the only one that comes within Ofqual’s regulatory jurisdiction.
However, Ofqual’s brief report is not totally consistent with the CIE documents. In particular:
• Ofqual states that CIE informed them in early 2015 (my emphasis) that their (CIE’s) routine analysis had identified some leniency in the grading of IGCSE English in summer 2014. But CIE report that it was May and June 2015 (see above).
• Ofqual states that CIE told them, as a result, it intended to tighten its grade standards at grade C and, to a lesser extent (my emphasis), at grade A in summer 2015. But we know from the CIE reports that grade A was tightened more than grade C.
• Ofqual acknowledges (somewhat sympathetically) that CIE faced particular challenges in 2015, not just because of the large increase in entries, many from centres that had not entered for IGCSE English before, but also because many of the additional entrants were grouped disproportionately around the C/D borderline. But Ofqual is the regulator and should be concerned with how awarding bodies ensure the maintenance of standards when such perturbations in entry occur.
• Ofqual required CIE to provide evidence of how they had come to their grading decisions, and report that CIE had used cohort predictions based on KS2 results (so-called comparable outcomes) and comparisons of the results of “benchmark centres”, i.e. schools that had results in both 2014 and 2015. Ofqual further report that CIE informed them that evidence from the benchmark centres suggested that tightening 2015 standards to the extent that KS2 predictions required would have been too severe. But the CIE reports make no reference to benchmark centres, and give the reason for the tightening of standards as post hoc comparison with 2014.
• Finally, Ofqual report that not only was there bunching of candidates around the C/D borderline but there was a significant bunching of marks around the C/D boundaries. That is a result of the examination and mark scheme, not the candidature. Ofqual report that on one paper (it is actually paper 2) the difference between grade C and grade A was only 4 (raw) marks. It is astonishing that Ofqual seemingly fails to challenge this or analyse it further but says, rather meekly: “we concluded that Cambridge International had carried out its grading appropriately. Cambridge International has carried out further analysis since the summer and is confident that their grading was appropriate”. We should note that this Ofqual report was being written at a time when there was unprecedented concern about the CIE English IGCSE results from schools, supported by three national headteacher associations.
3.2
Document (iii) (attachment 3) is a longer and more detailed technical briefing paper which,
according to the paper itself, was made available to headteacher associations and to Ofqual.
It covers the general ground of the two shorter documents (i) and (ii) and adds more quantitative detail. Important additional points are:
• A variety of analytical tools were deployed to try to align the 2014 and 2015 standards. These included (a) determination of grade profiles from matched KS2 results (comparable outcomes) and (b) post-award screening, which uses the average grade achieved by a candidate in all of their GCSE subjects taken in the same exam series. CIE report that the screening method is usually considered (by whom?) to be more robust than the KS2 method.
• The screening method, plus a third analysis (subject pairs), suggested that the June 2014 examination had been leniently graded, especially at grade C.
• The report attributes the differences between the KS2 predictions and the screening (in the 0522 entry) to an atypical ability profile skewed towards grades C and D. CIE informed Ofqual in May 2015 that they judged KS2 predictions to be systematically lenient and would therefore aim to be 2% more severe (than matched candidate predictions) at grades A, C and E. In the event this was revised to 1% for grade E but retained at 2% for grades A and C.
• The report describes two further preliminary analyses: one (for 0500) estimating the effects of any changes in the entries from different countries, the second using forecast grades supplied by schools, which the report says were of limited value for 0522 (but presumably were of use for 0500). The report does not give details of these analyses.
• In addition to the significant change in candidature for 0522, the report makes reference to the difficulties associated with standardising the multiple routes through the qualification (an option to take written paper 3 or coursework component 4), with aligning standards at the common grade C through papers 1 and 2, and with ensuring that standards in the inextricably linked 0500 and 0522 are aligned when their candidatures are so different. Even a mildly informed observer might consider this an impossible task; it is certainly exceptionally difficult. Further, since grade boundaries for the coursework component (paper 4) are fixed year on year, the only way of influencing outcomes is through papers 1 or 2; and since the primary objective seems to have been tightening the grade C standard (because of the large influx of candidates presumably viewing grade C in CIE English as more accessible than in other GCSE English examinations), there was always bound to be a knock-on effect at grade A.
• The report helpfully discloses the difficult task of fixing the raw mark grade boundaries for grade C on papers 1 and 2 in order to tighten standards. Raw mark C/D boundaries of 34/50 (paper 1) and 27/50 (paper 2) were agreed; at grade A (on paper 2) the A/B boundary was set at 31/50, resulting in a marginally smaller tightening of the 2014 standard than had been planned.
• However, the report continues (para 6.5.6): “the agreed grade threshold still produced large drops in outcome in the benchmark baskets for the options including components 2 and 4 and these were unexpected in the sense that they were not matched by drops in forecast grades. There was also an appreciable drop in the overall percentage of UK candidates achieving grade A or above on syllabus 0500; this was not seen among non-UK centres.” Unfortunately, the term “appreciable drop” is not quantified, but it is of absolutely crucial importance to the issues behind this report.
• The latter comment regarding non-UK centres may be a result of paper 3 being popular outside the UK. In fact, given the tightening on paper 2 at grade A, CIE decided not to tighten grade A on paper 3 “to avoid creating an unnecessarily large adjustment to last year’s standard for those (mostly overseas) centres which had taken papers 2 and 3 together.” The report comments that paper 3 continues to be slightly easier than paper 4 at grade A.
• Grading decisions were formally approved (presumably by the accountable officer) in August 2015. The grade outcomes are shown graphically (p10 in the report; see page 49) and confirm that 0522 grades were heavily concentrated around grades C and D, as previously mentioned. However, the report states that the entry for 0500 was heavily concentrated at the top grades, “so that a movement of a single mark in the grade A threshold can have a significant impact on the percentage of candidates achieving the grade”. This statement is again of critical importance when considering the problems encountered by the GSA and HMC schools in this study.
• EARs and appeals: the CIE report correctly points out that schools have the opportunity to make an enquiry about results (EAR) if they have concerns and, if still dissatisfied, may appeal. The level of unease at this year’s results, already commented on above, is clear from the CIE report, which says: “we have seen an unusually large number of EARs as well as contacts from centres that feel that their results were lower than expected”. That the CIE report does not express alarm about this is astonishing; instead it attributes the dramatic rise in schools’ concerns, in part, to the increase in entries alongside a wider trend for increases in EARs. HMC reports an unprecedented level of concern among its member schools which entered candidates for 0500.
The further option of appeal is open to centres which remain dissatisfied with the EAR process but, as the CIE report explains, such appeals are only for situations where schools believe that CIE has not followed its processes properly or has failed to abide by its own Code of Practice. Appeals based on a disagreement with CIE’s judgement are rejected. Schools in this study report very few changed grades resulting from an EAR; not one reports a successful appeal.
• Section 9 (p11) of the CIE report lists the findings of further analyses by CIE as part of their post-results checks. The initial bullet point in this section, “we see no evidence of any reduction in the reliability of the components compared to June 2014”, hardly inspires confidence in centres which feel that their candidates have been treated unfairly. Schools would argue that there is plenty of evidence, as shown in this report.
• The post-results checks show that candidates entered for a combination that included coursework (paper 4) suffered a greater tightening of standards.
• However, the most revealing statement comes in bullet point 5 of this section:
“the effect of the tightening at grade A has been greater on 0500 candidates from the UK because of the concentration of candidates close to the grade A threshold. Changes of a single mark in a threshold can have a disproportionate impact on schools where the ability profile is untypical of the wider cohort (specifically, where many of the candidates are to be found close to the threshold). As such, the combination of the tightening at grade A and the ability profile of the schools largely explains the drop in the percentage of candidates achieving grade A in 0500 within large stable centres in the UK”
This is exactly the experience of schools in this study, but it is a post hoc discovery by CIE, expressed with almost a degree of surprise. There is no disclosure of the extent to which the tightening has had a greater effect on 0500 UK candidates, no attempt to assess whether that tightening was fair on those students, no attempt to explain whether this greater tightening effect was intended and consistent with CIE’s attempts to better align the 2015 standards with those of 2014, and no details or confirmation that the prospective 0500 results of any of the schools in this study were scrutinised and checked before results were issued. Given that CIE admit that the schools at high risk of disproportionate tightening at grade A were the 0500 UK schools, of which the schools in this study must have made up a large proportion, it must have been apparent before results were released that the prospective results differed significantly from those forecast by the schools. CIE’s Code of Practice (para 5.8(b)) requires CIE to check a sample of results from such centres before results are issued. CIE should produce written evidence that this was done.
3.3
Enquiries about Results (EAR) data (attachment 4)
CIE provided HMC with the number of EARs for candidates in the June 2014 and June 2015 examinations. The data make alarming reading, showing an increase of 125% across all subjects, from 20,394 (2014) to 45,984 (2015). Of these, syllabuses 0500 & 0522 combined accounted for 30,768 EARs (an increase of 197% on 2014), 67% of the 2015 all-syllabuses total. Such was the volume of EAR requests that 7,700 (17% of the total EARs received) could not be completed within the planned 30 days. Whilst it is accepted that there was an increase (66%) in candidature (0500 & 0522 combined) between 2014 and 2015, this does not explain the enormous increase in EAR requests. These data alone must have indicated to CIE that there had been a serious problem with the results; yet, very surprisingly, the percentage of grades that changed following an EAR was not significantly greater than that for all other subjects (9.3% compared to 9.0%). Contrast this with Ofqual’s data, where the percentage of GCSE grades changed following EARs was 18.0% (see the reference in the footnote on page 54).
3.4
The CIE Code of Practice – what does it require? – see the extracts in attachment 5
CIE’s own Code of Practice (CoP) sets out both the standards and the procedures it must follow in administering its examinations (see CoP para 1.2(i)). Failure to comply with the CoP would present strong grounds in any centre appeal.
Each paragraph in the CoP is important, but comments here will be restricted to those most relevant to the issues in this report.
Paras 1.3(e) and 1.3(f) give assurance about transparency and a clear audit trail of decisions: what was decided, why, by whom, and to whom those taking the decisions were accountable. Schools that took their EAR through to a Stage 2 appeal do not report that this was their experience. In fact, they report that they had to be persistent to get answers to routine questions.
Para 4.7(d) confirms that marks will not be changed following a marking review if the mark awarded by the senior examiner performing the review is within the pre-established tolerance. Whilst this is common practice among all awarding bodies, there is a particular issue with the tolerance on paper 2, where the reviewing senior examiner’s mark had to differ from the original by more than 2 raw marks before the original mark was changed. Setting aside any instructions that may have been given to senior examiners in this process, the implications for paper 2 are profound. It has already been shown that grades A to C on paper 2 are separated by only 4 raw marks. The pre-established tolerance of 2 raw marks means that the senior examiner must differ in his/her judgement by at least 1½ unit grades before a raw mark is changed. Put another way: if a senior examiner judges a candidate’s paper to be of top grade B standard, but the original examiner thinks it only low grade C, the original examiner’s mark will remain. Can CIE really defend this as excellence and best practice in assessment (CoP para 1.2(a))?
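The arithmetic of the tolerance rule can be made explicit. The sketch below uses the published June 2015 paper 2.1 thresholds (A = 31, C = 27, out of 50); the `review_outcome` function is a hypothetical rendering of the decision rule inferred from CoP para 4.7(d) as described above, not CIE’s actual implementation:

```python
# June 2015 paper 2.1 raw-mark thresholds (out of 50), from CIE's published data.
GRADE_A, GRADE_C = 31, 27
TOLERANCE = 2   # review difference must exceed this before the mark changes

grade_width = (GRADE_A - GRADE_C) / 2   # A/B and B/C boundaries each 2 marks apart

def review_outcome(original: int, reviewer: int) -> int:
    """Mark that stands after a marking review under the tolerance rule."""
    return reviewer if abs(reviewer - original) > TOLERANCE else original

# A reviewer judging a script 2 marks higher (a full grade width) changes nothing:
assert review_outcome(27, 29) == 27
# Only a difference of 3+ marks, i.e. 1.5 grade widths, alters the mark:
assert review_outcome(27, 30) == 30
```

With each grade spanning only 2 raw marks, the 2-mark tolerance means a full grade’s worth of disagreement between examiners leaves the original mark untouched, which is the point made in the text.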
Para 4.7(e) commits CIE, before the issue of results and after the grading process (my emphasis), to “targeted re-marking by senior examiners of the work of candidates who are most at risk of receiving inappropriate results because they are close to a grade boundary”. Only one school that appealed, either at Stage 1 or Stage 2, reports evidence that CIE actually did this for some of their borderline candidates. Evidence from CIE that such a process had taken place would be a strong defence in an appeal. Since only one school reports it, it can only be assumed that it was not in fact done routinely, though the transparent audit trail (see paras 1.3(e) and 1.3(f) above) should be a swift way of confirming this. Further evidence that this was not done to any widespread extent is clear from the example quoted on page 17. The candidate was one raw mark below qualification grade A*; that mark was awarded by the senior examiner on review of paper 2 but denied because of the tolerance rule. That candidate should certainly have been re-marked (note: re-marked, not reviewed) by a senior examiner before results were issued. If so, did the senior examiner at that stage not judge the candidate to have achieved the grade A* mark? No explanation was provided to the centre on this stage of the marking process.
One final observation on this para is relevant. The wording does not specify what is meant by “most at risk of receiving inappropriate results”. To any reasonably informed teacher, candidate or parent this carries the immediate message of being very close to a qualification grade boundary. CIE has already confirmed that, in their massively increased candidature for 0522, a large number of candidates were at the C/D borderline. Again, in the absence of any clear evidence to the contrary, it is most unlikely that CIE would have had unallocated senior examiner capacity to re-mark this volume of borderline scripts, even if they prioritised those candidates who fell just below a (qualification) grade boundary and gave those who were just above the benefit of the doubt. This aspect of CIE’s processes appears to be far from transparent.
Paras 4.9(a) and 4.9(b) commit CIE’s principal examiners to producing, and CIE to publishing, examiners’ reports. This has not been done for syllabus 0522 and does not appear to have been done for syllabus 0500 for the June 2015 examination. See http://www.cie.org.uk/programmes-and-qualifications/cambridge-igcse-english-first-language-uk-0522/past-papers/ and http://www.cie.org.uk/programmes-and-qualifications/cambridge-igcse-english-first-language-0500/past-papers/
The paragraphs in section 5 of the CoP deal with the crucial aspect of the grading process.
CIE’s own technical paper (attachment 3) gives details about how grade boundaries were
finally arrived at by a mixture of judgemental and statistical means.
Para 5.2(f) requires the endorsement of the grading process by CIE’s Chief Executive, “if necessary after the production of further evidence to support it”. Given the problems with the paper 2 grade boundaries identified in this report and in CIE’s own technical paper, and the fact that Ofqual commented on the narrow grade A to C raw mark range (4 marks), it would be extraordinary if the Chief Executive did not question the proposed grading and ask for further evidence. No confirmation of this was reported by schools at the appeal stage.
Para 5.3(b) commits CIE to seeking the views of teachers about the difficulty of question papers and to considering them “when making grading decisions”. It is not clear how, or if, this was done. The only practical route is via teacher associations, but neither GSA nor HMC reports having been canvassed in this way. CIE needs to clarify whether this process was actually carried out as required by the CoP.
A number of paras in section 5 promise further safeguarding checks to confirm the accuracy of results before they are issued.
Para 5.3(e) states that, wherever possible, an independent measure of the ability of the cohort will be used to compare with previous years’ cohorts. There is evidence from CIE’s technical report that this was done for matched pairs on syllabus 0522, but there is no reference to candidates on syllabus 0500. CIE needs to confirm whether this was done.
Para 5.3(f), linked to 5.3(e) above, commits CIE to considering changes in the aggregates of forecasts from centres as an indicator of whether centres perceive their candidates to be of the same ability as those in previous years. Again, it is crucial to ask whether this was done; if so, was it done for all centres (of which many would be new to CIE), or only for those centres which had previously entered candidates with CIE? And if it was done, where is the evidence for it and for the impact it had on grading decisions?
Para 5.8(a) commits CIE to further checks on the likely accuracy of results by reference to benchmark centres, a control group of centres or other cohorts of candidates. The technical paper gives no information about whether any of these checks were done. CIE needs to specify which were done and provide details.
Para 5.8(b) commits CIE to yet another pre-release check where a centre’s results differ markedly from previous years’: a sample of candidates’ scripts will be checked before results are issued. All of the many centres that registered alarm at their 2015 IGCSE 0500 results fall into this category, yet not one reports CIE having confirmed that their results were checked before release. It is important to ask (yet again) whether this was done for any centre.
In summary, CIE’s CoP sets out clear procedures to establish standards year on year. It is especially detailed in setting out several measures to safeguard against incorrect grading; in fact it goes well beyond the equivalent Ofqual CoP and could therefore be viewed as a model of best practice. It is particularly alarming, therefore, that with all the safeguards required by the CoP in place, the 2015 CIE IGCSE English examinations (both syllabuses 0500 and 0522) generated such a storm of complaints, i.e. that so many centres were so outraged by the results awarded to their candidates. It is further alarming that CIE should appear so dismissive of centres’ concerns and complaints when their CoP commits them to close engagement with centres.
These points, and the questions raised above, deserve further discussion between CIE and GSA
& HMC.
3.5
June 2015 grade thresholds compared to June 2014, November 2014 and November 2015
CIE publishes grade thresholds on its website, so it is instructive to see how the raw mark grade thresholds for June 2015 compare with those for syllabus 0500 in June 2014 and in the November 2014 & 2015 papers. The data are shown below in table 13. All raw marks are out of 50.
Table 13: raw mark grade thresholds

Paper 2.1        A* (notional)   A    B    C    D    E
June 2014        31              29   27   24   21   18
November 2014    34              30   26   21   18   15
June 2015        33              31   29   27   24   22
November 2015    31              28   25   23   20   18

Paper 3.1        A* (notional)   A    B    C    D    E
June 2014        40              35   30   26   22   19
November 2014    35              32   29   27   23   19
June 2015        36              31   26   21   18   15
November 2015    36              32   28   24   21   18
One particular point to note is that only 4 raw marks span grades A to C in June 2015 for paper 2.1, compared to 5 raw marks in each of June 2014 and November 2015, and 9 raw marks in November 2014. The C/D boundary (27/50) in June 2015 was also much higher than in the other three comparator sessions, 3 marks higher than in June 2014 (though CIE’s technical paper quoted a smaller differential).
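The A-to-C spans can be read directly off table 13. A small sketch verifying them (the A and C thresholds are transcribed from the paper 2.1 rows of the table; the dictionary layout is illustrative):

```python
# Paper 2.1 raw-mark A and C thresholds, transcribed from table 13.
paper_2_1 = {
    "June 2014":     {"A": 29, "C": 24},
    "November 2014": {"A": 30, "C": 21},
    "June 2015":     {"A": 31, "C": 27},
    "November 2015": {"A": 28, "C": 23},
}

# Raw marks separating grade A from grade C in each session.
spans = {session: t["A"] - t["C"] for session, t in paper_2_1.items()}
print(spans)
```

The June 2015 span of 4 raw marks against 5, 5 and 9 in the comparator sessions reproduces the anomaly noted above.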
3.6
CIE IGCSE English June 2015 grade profiles (0500 and 0522 combined) compared to June 2012
to June 2014
               %A*     %A*/A   %A*-B   %A*-C   %A*-D   %A*-E   Entry numbers
June 2012      16.0%   37.0%   57.8%   80.2%   91.8%   97.1%   n/a
June 2013      8.4%    20.9%   37.5%   65.8%   86.7%   95.6%   65,441
June 2014      6.8%    18.2%   36.2%   66.0%   87.5%   95.8%   121,981
June 2015      4.3%    14.1%   33.4%   64.4%   86.4%   95.0%   212,028
The grade profiles above are consistent with a changing cohort population containing a decreasing proportion of very able (A*/A) students, which is what CIE assert; hence the dramatic reduction in the %A* and %A*/A across the four examination years.
And whilst it is certainly true that the %A*-C (66.0%) in CIE IGCSE English in June 2014 was greater than in GCSE English (61.4%, England only), which led CIE to judge their 2014 standards as being too lenient, it is notable that the %A*-C in GCSE English returned to a higher level (65.3%) in 2015. The volatility at A*-C lies with the GCSE results. Further, the other large-scale GCSE English (AQA), with over 200,000 entries, records an A*-C percentage of 70.2% for 2014, well above that of CIE: why then did CIE consider they had been too lenient in 2014? Was this due to pressure from Ofqual or the DfE6, concerned about the growing popularity of IGCSE as an alternative to GCSE?
GCSE English June 2012 – June 2015 (taken from JCQ published results for England)

               %A*     %A*/A   %A*-B   %A*-C   %A*-D   %A*-E   Entry numbers
June 2012      3.4%    15.0%   35.5%   64.2%   85.1%   94.4%   611,996
June 2013      3.3%    14.2%   34.5%   63.7%   85.2%   94.4%   666,288
June 2014      3.6%    14.3%   34.3%   61.4%   84.7%   93.8%   453,350
June 2015      3.1%    14.4%   37.1%   65.3%   86.4%   84.4%   459,027
SECTION 4:
SCHOOLS’ FEEDBACK AND COMMENTS
This section considers outcomes of Enquiries about Results (EARs) and comments from individual
schools, including any comments from Stage 1 or Stage 2 appeals or other ad hoc communications.
All the schools are members of GSA and/or HMC but their identities have not been revealed in this
report.
4.1
EARs
EARs were reported by 44 schools for 1,068 candidates, resulting in 176 (16.8%) changed grades. It is probable that the remaining 9 schools also engaged with the EAR process, but this could not be confirmed from the data submitted.
Of the 44 schools, 10 had no grades changed at all; the highest proportion of grade changes
from any of the schools was 33%. In the light of the data set out in Section 2 above, it is
unsurprising that most schools requested EARs on paper 2 results.
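The overall change rate follows directly from the counts reported above; as a minimal sketch, using the figures given in this section:

```python
# Change rate implied by the EAR figures reported by the 44 schools:
# 1,068 candidates enquired about, 176 grades changed.
candidates_enquired = 1_068
grades_changed = 176

change_rate = grades_changed / candidates_enquired * 100
print(f"{change_rate:.1f}% of enquired-about grades were changed")
```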
4.2
Stage 1 & Stage 2 appeals
14 schools reported going to appeal, together with a further three schools that had entered
candidates for CIE IGCSE English (0522). Not one reports having its appeal upheld. CIE
publishes guidance to centres on Stage 2 appeals (see attachment 7):
At all stages of the appeal every effort will be made to establish whether CIE has
used procedures which were consistent with its Code of Practice and whether it
applied its procedures properly and fairly in arriving at its judgements.
The hearing of the appeal will take the form of a re-examination of the evidence,
documents, comments and reports seen at Stage 1 of the appeals process,
presented in writing by the appellant and the CIE representatives.
While it is not usual for the appellant to attend the hearing they may do so.
Two observations:
6
See letter from Nick Gibb, Education Minister, to examination boards (16 January 2015) [attachment 6], in which there
is the clear message about IGCSE (level 1/2 certificates) being a threat to the new (more demanding) GCSEs.
“Including level 1 / 2 certificates in performance tables risks undermining the Government’s national curriculum and
could lead to a less demanding curriculum for some students. Simply put, the Regulator believes that the biggest
market opportunity for awarding organisations would be to create level 1 / 2 certificates that are less demanding
than new GCSEs”.
•	Without full access to all CIE documentation, para 1 above makes it virtually impossible for
an appeal to succeed. It would be interesting to know how many CIE Stage 2 appeals
(across all subjects) have been upheld this year and last.
•	It is clear that CIE does not expect the appellant to appear to present their case. This
might explain some schools’ comments following their Stage 2 appeal.
CIE’s Stage 2 appeals document also states:
The CIE Appeals Committee is composed of independent members, who are not CIE
employees.
which gives the impression that all members of the appeals panel (committee) are
independent. However, in the Stage 2 appeal reports submitted by three schools in this study,
only one member was categorised as independent. Does this contravene CIE’s own
regulations? Further, and to add to the confusion, one panel member is listed as
independent on two occasions but as a committee member on another. Clearly he
cannot be both.
4.3
Schools’ own comments
Many schools provided detailed commentary about their attempts to rectify what were, in their
view, wrong results awarded to their candidates. A number provided detailed exchanges
between themselves and CIE. It is not possible in this report to attach all the comments, which
are inevitably lengthy, detailed and specific to each school’s own context, but the common
themes have been identified.
(a)
The Quality of Marking
Inevitably, the quality of marking dominated most comments. Equally inevitably, CIE
rejected such challenges, largely on the grounds that (i) it recruited, trained and
monitored examiners (including the large number of additional examiners) to a high
standard, and (ii) whilst schools were entitled to re-mark retrieved scripts themselves, their
teachers had not been part of the standardisation process and therefore the schools’ own
marking had no authority in the examination process.
Some schools went into considerable detail to defend the high-calibre answers
provided by their candidates, which had not been credited by the original examiner or
allowed for by the mark scheme. In some cases it appeared that sophisticated answers,
at an intellectual level well above IGCSE, were simply not recognised.
There was also criticism from some schools that marking was inconsistent, a feature
identified in Section 2.4 above. To quote from one school:
We also recalled eight scripts from CIE to study the marking process and our
findings were that the marking of the papers was inconsistent and seemed, in
some cases, to have been completed by non-specialists.
This latter charge, that marking was done by non-specialists, is something that CIE will
want or need to respond to: how many of the large number of examiners employed
by CIE in June 2015 to mark IGCSE English 0500 and/or 0522 were not specialist English
teachers?
(b)
Detailed critiques of the question papers
Schools used their expertise to evaluate and analyse the questions, identifying weaknesses
in style or failures to adhere to the specification. Their comments appear to have been
rejected, though the CoP does state (para 5.3b) that “CIE will seek the views of teachers
about the difficulty of question papers and consider them when making grading
decisions”.
(c)
Frustration with the Appeals process
There was extensive detailed correspondence between schools and CIE about appeals,
both at Stage 1 and Stage 2. Schools’ concerns and CIE responses followed a general
pattern as outlined below.
School’s complaint/challenge → CIE’s standard response
•	Poor exam questions → Paper was subject to usual quality assurance
•	Poor or inconsistent marking, or lack of adherence to the mark scheme → Examiners were recruited, trained and monitored to a consistently high standard
•	Procedures not followed correctly → All procedures followed correctly
•	Failure to adhere to the Code of Practice → All aspects were Code of Practice compliant
Although CIE generally appeared to respond in reasonable detail, and occasionally
sympathetically, to the concerns expressed (“despite your slightly lower forecast grades at
grades A*, A and B, I can see that the results you obtained at grades A* and A have
dropped by more than you expected, and to this extent I can understand your concern”),
schools uniformly remained dissatisfied and frustrated by the appeals process. Many
schools gave up after Stage 1; one school wrote: “despite the scale of our issues with CIE,
we decided not to appeal as we deem it a pointless process, which they will ultimately
conclude on the point that they followed their processes so can’t possibly be wrong!”.
A significant number of schools did, however, proceed to Stage 2, to the extent that CIE fell
behind its required schedule for arranging and hearing the appeals. In more than one case
CIE wrote to a school apologising for the delay, saying: “This delay has been caused by an
unexpected volume in Stage 2 appeals”. The fees were waived as a result. That there has
been such an “unexpected volume” must surely cause CIE to reflect on and review the
reasons behind it.
Frustration was not just with the outcome but, as in the case of the school quoted above,
with schools feeling that there were no grounds on which a Stage 2 panel could or would
uphold an appeal. The standard response following a Stage 2 appeal appeared to be:
The panel reviewed the evidence submitted and heard the cases presented by
Cambridge and the centre. Following the hearing, the panel discussed the
appeal and found that Cambridge had (1) used procedures consistent with the
Code of Practice, and (2) applied these procedures properly and fairly in
arriving at their judgements. The appeal was therefore REJECTED.
Thus, despite evidence of manifestly poor marking (one centre had a candidate’s paper 2
mark raised from 24/50 to 36/50; the original examiner was identified as 03.05), CIE would
not acknowledge that poor or inconsistent marking existed, and it was clearly outside the
remit of the Appeals Panel to evaluate it. It is not known whether CIE reviewed all scripts
marked by examiner 03.05 as a result of this gross error.
Failure to adhere to the Code of Practice was a major line of challenge from many
schools, but one school that pursued this line in detail at a Stage 2 appeal reports
that: “The panel seemed to think it (i.e. scrutinising whether CIE had adhered to its CoP)
was just far too much to deal with at the hearing”.
The result of all this is a large number of schools feeling aggrieved that their candidates
have been denied their correct grades, and a major loss of confidence in CIE as an
examination board and in its procedures.
The one piece of good news, as a postscript, is that two schools report the results of
candidates who re-sat in November: in one school, four students who had been awarded
grade D in the summer all gained grade C, with nil or minimal preparation; in the other, the
one re-sit candidate’s grade improved from C to A, again with no additional preparation.
SECTION 5: FINAL COMMENTARY & EVALUATION
The detailed analysis and commentary above attempts to answer four questions about the 2015 CIE
IGCSE First Language English (0500) examination results:
1. Was there really a problem?
2. If yes, how did the problem arise and what was its extent?
3. If yes, what has been CIE’s response to the problem?
4. If yes, what, if anything, can be done to protect the interests of the 2015 candidates and to
ensure that there is no repetition of the problem in 2016?
On 1 the evidence is indisputable: there was a major problem with the CIE 0500 results in 2015,
especially at the top grades of A* and A. CIE’s own reports acknowledge this but not in sufficient
detail. The detailed analysis in this report shows this to be largely, but definitely not exclusively,
among those candidates who were entered for papers 2 and 4.
On 2 there were a number of contributory factors, including:
•	CIE’s judgement that the 2014 standards were too lenient, especially at grade C but also
at grades A and E
•	CIE’s attempt to correct this for 2015, which also saw a significant increase in entry,
especially in the tied syllabus 0522
•	The need to recruit, train and monitor a very large number of additional examiners for a
subject in which marking quality has been of serious concern for some years; it is no
surprise that schools reported poor and inconsistent marking
•	The exceptionally difficult task of adjusting the grade C standard between two tied
syllabuses, between core and extended tiers, and between candidates who had taken
different options (i.e. papers 2/3 or papers 2/4); this had a major knock-on effect at grades
A* and A which CIE themselves have identified
•	The very small grade A-C raw mark range that ensued on paper 2
•	The capping of raw marks on paper 4 (coursework), such that a notional unit A* could only
be achieved by candidates gaining full marks
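The sensitivity created by a very small raw-mark range between grades, combined with bunched marks, can be illustrated with a short sketch. The distribution below is hypothetical and purely illustrative (it is not CIE data); it simply places a peak of candidates around a notional grade C borderline at 30 out of 50, as the report and Ofqual’s extract describe:

```python
from collections import Counter

# Hypothetical, illustrative distribution only (NOT CIE data): raw marks out of 50,
# bunched around a notional C/D borderline at mark 30.
counts = Counter({m: max(0, 200 - 12 * abs(m - 30)) for m in range(51)})
total = sum(counts.values())

def share_at_or_above(threshold: int) -> float:
    """Proportion of the cohort at or above a given raw-mark grade threshold."""
    return sum(c for m, c in counts.items() if m >= threshold) / total

# With marks bunched near the boundary, moving the grade C threshold by one
# raw mark reclassifies every candidate sitting exactly on the old threshold.
for t in (29, 30, 31):
    print(f"C threshold {t}: {share_at_or_above(t):.1%} at grade C or above")

shifted = counts[30] / total  # cohort share moved by a one-mark change (30 -> 31)
print(f"A one-mark shift moves {shifted:.1%} of the cohort across the grade boundary")
```

Under this assumed bunching, a single-mark movement in a threshold reclassifies several percent of the cohort, which is why narrow A-C mark ranges and clustered scores make grade outcomes so volatile.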
However, even when these contributory factors are acknowledged, the question remains why CIE
undertook such a realignment of standards in the first place. One very plausible reason is that it
was under pressure to do so from Ofqual (for syllabus 0522), which was itself under pressure from
the DfE to take action on the perceived easier standard of IGCSE (level 1/2 certificates) at a time
when large numbers of maintained-sector schools were opting for it. GSA & HMC might consider
testing this by making a Freedom of Information (FoI) request to Ofqual to gain access to its
communications with CIE about IGCSE English between January and September 2015.
On 3, it is to be welcomed that CIE produced explanatory documents once the scale of the
schools’ concerns became apparent, but we suspect that this was done only because of the
volume of complaints about the results; it would be extraordinary if CIE did this routinely for every
subject. However, aspects of CIE’s documents played down the scale of the problem and failed
to provide the necessary detailed breakdown of grades between option routes that would have
shown the nature of the problem. The documents only hint at problems associated with the top
grades; CIE should publish the full details so that the extent of the A* and A problem is clear to
everyone, including Ofqual as regulator. In its Code of Practice, CIE commits to being fully
transparent; this is its opportunity to make that commitment a reality.
On 4 CIE should, as a matter of urgency, consider what action can be taken to protect the interests
of the 2015 candidates who have been subject to the IGCSE English 0500 problems detailed in this
report. GSA and HMC could consider sending this report to Ofqual both to inform them of the
report’s findings and to ask them to monitor any retrospective action taken by CIE. For the June 2016
examination CIE should ensure that none of the fundamental flaws seen in June 2015 are repeated,
and Ofqual should monitor them.
Finally, it is essential to return to the impact all of this has had on the candidates themselves, large
numbers of whom will have an IGCSE English grade that does not reflect their ability or performance,
and the impact that this could have on their career aspirations. It is also important to remember their
teachers and parents, whose confidence in the examination system will have further decreased as a
result.
SECTION 6: APPENDICES
Appendix 1: distribution of raw marks on paper 2
2015 CIE 0500: Paper 2 raw mark scores in HMC & GSA schools
[Bar chart: frequency distribution of Paper 2 raw mark scores (max 50); bins from <10 to 49.]
Appendix 2: distribution of raw marks on paper 3
2015 CIE 0500: Paper 3 raw mark scores in HMC & GSA schools
[Bar chart: frequency distribution of Paper 3 raw mark scores (max 50); bins from <10 to 49.]
Appendix 3: distribution of raw marks on paper 4
2015 CIE 0500: Paper 4 (coursework) raw mark scores in HMC & GSA schools
[Bar chart: frequency distribution of Paper 4 (coursework) raw mark scores (max 50); bins from <10 to 49.]
Appendix 4: distribution of raw marks on papers 2+3
2015 CIE 0500: Paper 2 + 3 raw mark scores in HMC & GSA schools
[Bar chart: frequency distribution of total raw marks on Papers 2 + 3 (max 100); bins from <40 to 99.]
Appendix 5: distribution of raw marks on papers 2+4
2015 CIE 0500: Paper 2 + 4 raw mark scores in HMC & GSA schools
[Bar chart: frequency distribution of total raw marks on Papers 2 + 4 (max 100); bins from <40 to 99.]
SECTION 7: ATTACHMENTS
Attachment 1: CIE briefing paper for schools (i)
Attachment 2: CIE briefing paper for schools (ii)
Attachment 3: CIE’s Technical Paper
Attachment 4: CIE EAR data June 2015
Enquiries about Results closed

                                             June 2015   June 2014   % increase
All syllabuses                               45,984      20,394      125%
0522: regulated IGCSE English Language       26,858      8,263       225%
0500: non-regulated IGCSE English Language   3,910       2,090       87%
0500/0522 combined                           30,768      10,353      197%

Grade changes (June 2015)

                                             Enquiries closed   EAR grade      Grade changes as %
                                                                changes %      of subject cohort
All syllabuses                               45,984             9.0%           0.4%
0522                                         26,858             9.4%           1.3%
0500                                         3,910              8.6%           0.7%
0500/0522 combined                           30,768             9.3%           1.2%

Ofqual report on 2015 examinations7: 18.0% (June 2014)

7
See https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/512899/2016-03-31-enquiries-about-results-summer-2015.pdf
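The year-on-year percentage increases in the EAR table above follow directly from the June 2014 and June 2015 enquiry counts; as a minimal sketch recomputing them from the figures given:

```python
# Recompute the year-on-year increase in EAR volumes from the enquiry
# counts given in Attachment 4 (June 2015 vs June 2014).
ear_counts = {
    "All syllabuses":     (45_984, 20_394),
    "0522":               (26_858, 8_263),
    "0500":               (3_910, 2_090),
    "0500/0522 combined": (30_768, 10_353),
}

def pct_increase(new: int, old: int) -> int:
    """Percentage increase from old to new, rounded to the nearest whole percent."""
    return round((new - old) / old * 100)

for name, (y2015, y2014) in ear_counts.items():
    print(f"{name}: +{pct_increase(y2015, y2014)}%")
```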
Attachment 5: CIE Code of Practice – extracts relevant to this Report [The full Code of Practice can
be downloaded from http://www.cie.org.uk/images/7881-code-of-practice.pdf]
Para 1.2(a) CIE will remain committed to excellence and best practice in assessment
Para 1.2(d) CIE will ensure that all candidates receive the results that their performance merits
when judged against the relevant syllabus content and assessment criteria.
Para 1.2(g)
CIE will use expert judgements and statistical evidence to set and maintain
internationally recognised performance standards.
Para 1.2(h)
CIE’s assessments will be criterion-referenced or standards-referenced rather than norm
referenced.
Para 1.2(i)
CIE will comply with its customers’ regulatory and procedural requirements and act in
accordance with this Code of Practice to ensure that assessment standards are
maintained.
Para 1.2(j)
CIE will take steps to encourage the appropriate use of the outcomes of its assessments.
Para 1.3(a)
CIE will use customer feedback and self assessment to target areas for development,
innovation and continual improvement.
Para 1.3(e)
CIE will ensure that for each process in its assessment system there is an audit trail that
sets out the key evidence that supports the decision taken.
Para 1.3(f)
CIE’s assessment system will be transparent, in that the evidence on which decisions are
based will be clear, it will be possible to audit the decision-making process and it will be
possible for those making decisions to be held accountable.
Para 1.4(e)
CIE will assess all candidates for what they show that they know and can do, not for
what they might have achieved had circumstances been different.
Para 1.5(a)
To help teachers prepare students for future examinations, CIE will publish a report to
Centres from the senior examiners for each externally examined component after each
examination session.
Para 4.2(b)
Principal Examiners will be responsible to the Product Manager, who will ensure that the
examination as a whole meets the requirements of the syllabus and maintains
standards year on year.
Para 4.7(d)
When a senior Examiner awards a mark that is different from the original Examiner’s, it
will be the senior Examiner’s mark that prevails, subject to any pre-established tolerance
within which it is agreed that marks will not be adjusted.
Para 4.7(e)
Before the issue of results and after the grading process (Section 5) has been
completed, there will be a targeted re-marking by senior Examiners of the work of
candidates who are most at risk of receiving inappropriate results because they are
close to grade thresholds.
Paras 4.9(a) & (b)
Principal Examiner’s Report
Para 5.2(b)
CIE’s grading processes will use a combination of professional judgement and statistical
evidence. The statistical evidence used will relate to more than one previous
examination session.
Para 5.2(d)
CIE’s grading processes will ensure that the standard of a qualification is maintained
from one year to another, so that CIE’s results will be standards-referenced. Due
consideration will be paid to the need for alignment with any equivalent qualification
taken in the UK.
Para 5.2(e)
The Product Manager will be responsible for the initial grading of a syllabus, but at least
one other person will be involved. Where appropriate, representatives of Ministries or
other partner assessment organisations will be involved in the grading process.
Para 5.2(f)
The outcome of the grading process for each syllabus will be endorsed by CIE’s Chief
Executive, if necessary after the production of further evidence to support it.
Para 5.3(a)
If the assessment tasks for a component are unchanged or are believed to be of equal
inherent demand, the grade thresholds will remain unchanged from one year to
another. Otherwise grade thresholds will be raised or lowered from one session to
another.
Para 5.3(b)
CIE will seek the views of teachers about the difficulty of question papers and consider
them when making grading decisions.
Para 5.3(e)
Wherever possible, CIE will use an independent measure of the ability of the cohort
(e.g. their results in previous examinations or in control/ reference tests) to confirm any
belief that they are better than some other cohort and should therefore do better in an
examination.
Para 5.3(f)
Consideration will be given to changes in the aggregates of forecasts from candidates’
Centres as an indication of whether teachers perceive the candidates to be of the
same ability as those in previous years.
Para 5.4(c)
The aim in the determination of grade thresholds will be that alternative options within
papers, alternative papers within syllabuses, alternative syllabuses in the same subject,
and alternative subjects within the same qualification will be equally demanding of
candidates.
Para 5.5(b)
Each Principal Examiner will make a report to the Product Manager for their paper
stating and explaining the minimum mark that they recommend should be taken for
each of those thresholds designated as ‘key’ thresholds. They will do so at the end of
the marking period and before the grading process itself.
Para 5.6(c)
Those responsible for decisions about grade thresholds will sign a record of the
outcome.
Para 5.7(a)
A candidate’s syllabus grade will be calculated directly from the total of their marks on
the components that they took (weighted in accordance with the syllabus), not from
the component grades calculated for them.
Para 5.8(a)
Checks on the likely accuracy of prospective results will be conducted by reference to
benchmark Centres, a control group of Centres or other cohorts of candidates.
Para 5.8(b)
Centres where the prospective results in a subject differ greatly from the previous year
or from the forecast grades will be identified, and a sample of their results will be
checked before issue.
Para 5.8(c)
Particular attention will be paid before the issue of results to candidates around critical
borderlines that are commonly used for the determination of progression to the next
stage of education or where failure negates or reduces candidates’ success in other
examinations (e.g. for a Group Award).
Attachment 6: DfE letter to awarding bodies
Attachment 7: CIE’s guidance on Stage 2 Appeals
Attachment 8: extract from the Ofqual Report on the 2015 examinations
Monitoring Cambridge International IGCSE First Language English (0522)
Entries for this qualification have increased in recent years and in summer 2015 they almost doubled
to just over 200,000. As a result we have monitored this qualification closely in recent years.
Cambridge International informed us early in 2015 that their routine analysis had identified some
leniency in grading IGCSE First Language English in summer 2014. Cambridge International told us
that, as a result, it intended to tighten its grade standards at Grade C and, to a lesser extent, at
Grade A in summer 2015.
Cambridge International had particular challenges in setting standards in this qualification. This was
partly due to the increased entry: about half of the increase was from schools new to the syllabus,
while the other half was from existing schools entering more students. Cambridge International told
us that the increase in entry was also disproportionately focused on students who might be expected
to achieve C/D, potentially exacerbating a clustering of students around the C/D borderline.
We asked Cambridge International to provide us with the evidence for how they had come to its
awarding decisions in 2015. In setting standards in IGCSEs, Cambridge International use very similar
evidence to the GCSE exam boards. Cambridge International considered predictions based on KS2
prior attainment and comparisons of the results for 'benchmark centres' - schools with stable entries
for this syllabus in 2014 and 2015. Evidence from the benchmark centres suggested that to tighten
grade standards as far as Cambridge International had intended (in relation to KS2 predictions)
would have been too severe.
The other factor that made awarding more challenging was the bunching of marks, particularly
around the C/D boundaries. On one paper, the difference between C and A was only 4 marks. We
concluded that Cambridge International had carried out its grading appropriately. Cambridge
International has carried out further analysis since the summer and is confident that their grading was
appropriate.