Advanced Users –
MidYIS, Yellis & ALIS
Durham 2013
Understanding the Students
Introduction to the Test Data
Underlying Principle
If we measure a student's ability we can determine 'typical progress' for the individual, use this to inform likely outcomes, and measure the performance of individuals and groups against those outcomes.
Q. How does this work?
Q. How do we measure and interpret 'ability'?
Q. How do we interpret the data fairly and reliably?
Measuring and Interpreting
Ability
Options
1) Use pre-existing qualification data
– Post-16: Average GCSE
2) Use a baseline test
– Post-16 and Pre-16: Computer Adaptive Baseline Test (CABT)
Note: Issues regarding the use of the CABT alongside Average GCSE at Post-16 will be examined later in the day with the predictive information.
Adaptive approach
[Diagram: the test adapts its question difficulty (Low, Average, High) to the student's responses]
Baseline Test Standardisation
• Test scores are standardised; Mean = 100, SD = 15

Standardised Score | National Percentage | Comment
>130 | Top 2.5% | Traditional classification of 'mentally gifted'
>120 | Top 10% |
>100 | Top 50% |
<80 | Bottom 10% |
<70 | Bottom 2.5% | Potential special educational needs??
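The rows of the standardisation table follow directly from the normal curve. A minimal sketch, assuming standardised scores are modelled as Normal(100, 15); the function name is illustrative:

```python
import math

def national_percentile(score, mean=100.0, sd=15.0):
    """Fraction of the national cohort scoring below `score`,
    assuming standardised scores follow a Normal(100, 15) curve."""
    z = (score - mean) / sd
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# A score of 130 (+2 SD) leaves roughly the top 2.5% above it,
# matching the ">130 = Top 2.5%" row of the table.
print(round(100 * (1 - national_percentile(130)), 1))  # ≈ 2.3
print(round(100 * national_percentile(100)))           # 50
```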
[Distribution diagram: the normal curve of Standardised Test Scores (50–150) divided into stanines, bands and percentiles]
Stanines 1–9 cover 4%, 7%, 12%, 17%, 20%, 17%, 12%, 7% and 4% of the cohort respectively.
Bands each cover 25% of the national cohort: Band D (Below Average; possible SEN at the lower extreme), Bands C and B (Average), Band A (Above Average; possible G&T at the upper extreme).
Percentiles marked: 1, 5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99.
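The stanine and band scales in the diagram can be computed from a standardised score. A hedged sketch, assuming the conventional half-SD stanine widths and quartile band cut-offs (roughly 90, 100 and 110); the helper names are our own:

```python
import math

def stanine(score, mean=100.0, sd=15.0):
    """Map a standardised score onto the 1-9 stanine scale.
    Stanine 5 is centred on the mean; each stanine is half an SD wide."""
    z = (score - mean) / sd
    return max(1, min(9, math.floor(z / 0.5 + 5.5)))

def band(score, mean=100.0, sd=15.0):
    """Quartile bands A-D as in the diagram, 25% of the cohort each.
    Quartile cut points of Normal(100, 15) fall near 90, 100 and 110."""
    q = 0.6745 * sd  # z = +/-0.6745 bounds the middle 50%
    if score < mean - q:
        return "D"
    if score < mean:
        return "C"
    if score < mean + q:
        return "B"
    return "A"

print(stanine(100), band(100))  # 5 B
print(stanine(130), band(130))  # 9 A
```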
Cohort Ability
Intake Profiles
Intake Profiles
Intake Profiles (Historical)
Student Ability
IPRs
Individual Pupil Record Sheets (IPRs)
Look for sections that
are inconsistent
Two students
Same Ability
Different Profiles
General IPR Patterns
• Pupils with high scores across all components
• Pupils with low scores across all components
• Pupils with significant differences between one or two components:
– Vocab lower than others
– Vocab higher than others
– Maths higher than others
– Maths lower than others
– Non-Verbal higher than others
– Non-Verbal lower than others
– Low Skills
– High Skills
www.cem.org/midyisiprbooklet
Vocab significantly lower than other sections
• English Second Language?
• Understanding language used in learning and assessment?
• Language enrichment?

Vocab significantly higher than other sections
• Good communicator?
• Work in class may not be to this standard:
→ Weak Non-verbal
→ Weak Maths
→ Weak Skills (speed of working?)
• Many benefit from verbal descriptors?
Maths significantly higher than other sections
• Strong Maths ability
• Not 100% curriculum free
• May depend on prior teaching effectiveness
• Far East influence?
Maths significantly lower than other sections
• Implications not just for Maths but for other numerate or data-based subjects
• General poor numeracy?
• Remedial Maths?
Non-Verbal significantly higher than other sections
• Good spatial and non-verbal ability
• May have high specific skills
• Low Vocab, Maths & Skills may indicate difficulty communicating
• Frustration?
Non-Verbal significantly lower than other sections
• Difficulty understanding diagrams or graphical instructions?
• Verbal explanation?
• Physical demonstration?
• Physical models?
Low Skills Scores
• Skills = Proof Reading and Perceptual Speed & Accuracy
• Speed of working
• Work well in class / homework but underachieve in exams?
• Problems checking work or decoding questions?
• Low Skills + Low Vocab:
→ Poor written work in class (unable to work quickly)
→ Dyslexia? Further specialist assessment required
High Skills Scores
• Skills = Proof Reading and Perceptual Speed & Accuracy
• Can work quickly and accurately
• Difficulty communicating and expressing ideas?
• May perform poorly in areas using numeracy skills and subjects needing 3D visualisation and spatial concepts?
• May struggle in most areas of the curriculum
Working with Individual
Pupil Records (IPRs)
Objectives
• To gain understanding of interpreting IPRs
• To share strategies for supporting individual pupils
Strategy
• Look first at the interpretation of MidYIS IPRs
• Exercises with MidYIS IPRs
• Apply the generic patterns to exercises on ALIS, Yellis and INSIGHT IPRs, though there are slight differences
What does the MidYIS test measure?
• Vocabulary
– Most culturally linked. Affects all subjects but most important in
English, History and some Foreign Languages. Measures fluency
rather than knowledge.
• Maths
– The Maths score is well correlated with most subjects but is
particularly important when predicting Maths, Statistics, ICT, Design
& Technology and Economics.
• Non-verbal
– Tests 3D visualisation, spatial aptitude, pattern recognition and
logical thinking. Important when predicting Maths, Science, Design &
Technology, Geography, Art and Drama.
• Skills
– Tests proof reading skills (SPG) and perceptual speed and accuracy
(e.g. matching symbols under time pressure). Measures fluency and
speed necessary in exams and in the work place. Relies on a pupil’s
scanning and skimming skills.
Using MidYIS IPRs to Inform Teaching and Learning
The IPR on its own simply tells us about the relative performances of the pupil on the separate sections of the test: where the pupil is strong, where performance has been significantly above or below national averages, or where the pupil has significantly outperformed in one section or another.
It is when the IPR is placed in the hands of a teacher who knows that pupil
that it becomes a powerful tool.
It is what teachers know about individual pupils (what has happened in the past, how they respond to given situations, and how they work in the teacher's specific subject) that informs the interpretation of the IPR.
If the IPR data from MidYIS and the teacher's personal and subject-specific knowledge and experience of the pupil can be shared, the result is a much more powerful instrument for supporting pupils' learning needs.
Examples of Individual Pupil Profiles
For each example look at
– the information contained in the graph
– the issues that may arise for this pupil in your
subject
– strategies you could employ to support that pupil
(either for the whole class or for that specific
individual)
Student A
Strategies for Student A
• Word banks for each topic
• Practise writing with words rather than symbols e.g.
To find the common denominator, first of all you ...
• Discussion groups (although ensure pupils with low
vocabulary scores do not all congregate)
• Wider reading
• Visits/trips etc. to enrich language and cultural
experience
Student B
Strategies for Student B
• May struggle to understand diagrams – use spoken
and written explanations, paired work or group work
to interpret
• Physical/practical/kinesthetic explanations may
help (e.g. modelling solar system with clay/string or
demonstrating distance between planets on football
pitch etc.)
• Use drama/active methods to demonstrate difficult
concepts
Student C
Strategies for Student C
• Pupil may seem more able than is the case, e.g.
‘talks a good talk’
• Allow paired work or group discussion to
communicate answers orally
• Describe maths problems
• Encourage leadership roles as well as
debates/drama
• Support with scaffolding/writing frames etc.
Student D
Strategies for Student D
Analysis
A pupil like this may:
• struggle to proof read his work, therefore achieve a lower grade
than he seems capable of
• struggle to interpret or understand exam questions
• either work slowly with more accuracy OR work quickly with less
accuracy – result is similar i.e. lower test score than expected
Strategies:
• allow extra time
• practise timing e.g. clock on IWB
• use a range of question words to develop ability to understand
instructions
• develop proof-reading technique e.g. spotting common errors
• consider further testing for dyslexia
Some pupil data – MidYIS 2009

Pupil | Vocabulary | Maths | Non-Verbal | Skills | Overall
(each entry: standardised score and band)
Pupil 01 | 122 A | 125 A | 116 A | 107 B | 126 A
Pupil 02 | 105 B | 110 A | 127 A | 95 C | 108 B
Pupil 03 | 105 B | 93 C | 110 B | 89 D | 99 C
Pupil 04 | 91 C | 116 A | 130 A | 115 A | 103 B
Pupil 05 | 111 A | 144 A | 122 A | 103 B | 129 A
Pupil 06 | 107 B | 112 A | 85 D | 97 C | 109 B
Pupil 07 | 115 A | 106 B | 100 C | 86 D | 112 A
Pupil 08 | 141 A | 137 A | 132 A | 135 A | 143 A
Pupil 09 | 104 B | 92 C | 105 B | 109 B | 98 C
Pupil 10 | 99 C | 119 A | 114 A | 99 C | 109 B
Pupil 11 | 108 B | 126 A | 130 A | 140 A | 118 A
Pupil 12 | 106 B | 123 A | 120 A | 105 B | 116 A
Pupil 13 | 103 B | 96 C | 103 B | 104 B | 99 C
Pupil 14 | 108 B | 110 B | 112 A | 108 B | 110 A
Pupil 15 | 95 C | 104 B | 103 B | 122 A | 99 C
The class from Waterloo Road
A useful quick reference for staff
A Selection Of MidYIS Scores For 'Waterloo Road'
Why would this be a very challenging class to teach?

Surname | Sex | Vocabulary | Maths | Non-Verbal | Skills | MidYIS Score
(each entry: standardised score and band)
A | F | 81 D | 110 B | 108 B | 112 A | 94 C
B | F | 128 A | 107 B | 105 B | 94 C | 120 A
C | M | 106 B | 121 A | 103 B | 90 D | 114 A
D | F | 107 B | 84 D | 96 C | 107 B | 96 C
E | M | 96 C | 90 D | 130 A | 91 C | 92 C
F | F | 86 D | 86 D | 120 A | 74 D | 84 D
G | F | 100 B | 115 A | 80 D | 103 B | 108 B
H | F | 121 A | 96 C | 114 A | 86 D | 111 A
I | M | 92 C | 100 C | 96 C | 123 A | 95 C
J | M | 100 C | 105 B | 100 C | 99 C | 102 B
K | M | 128 A | 132 A | 114 A | 131 A | 133 A
L | M | 76 D | 70 D | 74 D | 73 D | 71 D

What do I need to know/do to teach this (difficult) class of twelve pupils?
These are real anonymised scores from a number of schools around the UK.
IPR Patterns – A Summary
• Vocabulary scores significantly lower than other component scores
– Second language? Deprived areas? Difficulty accessing the curriculum? Targeted help does work. Seen in nearly all schools. Worth further diagnosis. Could potentially affect performance in all subjects.
• Vocabulary scores significantly higher than other component scores
– Good communicators. Get on. Put Maths problems in words?
• Mathematics significantly higher than other scores
– From the Far East? Done entrance tests? Primary experience?
• Mathematics significantly lower than other scores
– Primary experience. Use words and diagrams? Sometimes difficult to change attitude. Difficulties with logical thinking and skills such as sequencing.
• Low Mathematics scores with high Non-verbal scores
– Use diagrams. Confidence building often needed.
• Pupils with non-verbal scores different from others (high) – Frustration? Behaviour problems? Don't do as well as good communicators or numerate pupils? Good at 3D and 3D-to-2D visualisation and spatial awareness. Good at extracting information from visual images.
• Pupils with non-verbal scores different from others (low) – Peak at GCSE? A level?
• Pupils with low Skills scores – Exams a difficulty after good coursework? Suggests slow speed of processing.
• High Skills scores – Do well in exams compared with classwork?
• The Average Pupil – They do exist!
• High scores throughout – A score above 130 puts the pupil in the top 2% nationally
• Low scores throughout – A score below 70 puts the pupil in the bottom 2% nationally
Interpreting IPRs Exercises
Have a look at the IPRs on the following pages.
These show examples for Yellis (Year 10) and
ALIS (Year 12) as well as MidYIS.
What do the scores suggest about the students
and how would you use this information to aid
the teaching and learning process for each of
them?
[Example Yellis IPR: stanines 1–9 shown for each component; Proof-Reading 88, PSA (Perceptual Speed & Accuracy) 108]
Yellis
Case Study 1
You are given data relating to an institution where students completed the ALIS computer
adaptive test. They are chosen because they show significant differences between the
various parts of the test. Remember scores are standardised around 100.
Name | Overall | Vocab | Maths | Non-Verbal | Average GCSE | A Level subjects chosen
(each test entry: standardised score and band)
A | 78 D | 49 D | 99 B | 92 C | na | Biology, Maths, Business, Art
B | 94 C | 115 A | 85 D | 104 B | na | Biology, Business, Psychology, English
C | 88 D | 97 C | 85 D | 104 B | 5.6 | History, Psychology, English, Media
D | 101 B | 107 B | 97 C | 80 D | 5.9 | Business, History, English, Drama
E | 104 B | 87 D | 112 A | 116 A | na | Biology, Physics, Maths, Business
F | 81 D | 47 D | 103 B | 111 B | na | Maths, Further Maths, Business
G | 93 C | 113 A | 84 D | 113 A | na | Biology, Business, French, Geography
H | 97 C | 111 A | 89 D | 99 C | 7 | Art, English, Psychology, Religious St.
I | 87 D | 68 D | 100 B | 109 B | 5.4 | Maths, Geography, French, Music
J | 105 B | 67 D | 124 A | 85 D | 6.1 | Maths, Further Maths, Psychology, Economics
K | 96 C | 71 D | 110 A | 97 C | na | Biology, Maths, Art, English
L | 92 C | 60 D | 111 A | 97 C | na | Maths, History, Religious St., English
a) Are there any apparent mismatches between the subjects being followed and this data?
b) What support can be given to those students who have weaknesses in Vocabulary or
Mathematics?
c) How might predictions made for these students be tempered in the light of the inconsistencies
in the test components and missing average GCSE points scores?
Case Study 2
What are the strengths and weaknesses of this A/AS level student?
To use the IPR (Individual pupil record) familiarise yourself with the terms
standard score, band, stanine, percentile and confidence band.
a) Which AS/A level subjects might be
avoided?
b) This student chose English, Film
Studies, Music Technology and
Psychology.
Is this a good choice?
Do you foresee any problems?
INSIGHT Pupil IPR

Component | Standard Score | KS3 Equivalent | Band | Stanine | Percentile
Speed Reading | 107 | 6c | B | 6 | 69
Text Comprehension | 104 | 5a | B | 5 | 60
Passage Comprehension | 98 | 5b | C | 5 | 45
Overall Reading | 103 | 5a | B | 5 | 59
Number & Algebra | 87 | 4a | D | 3 | 19
Handling Data | 104 | 6a | B | 6 | 61
Space, Shape & Measures | 89 | 5c | D | 4 | 23
Overall Mathematics | 93 | 5b | C | 4 | 31
Biology | 114 | 6a | A | 7 | 82
Chemistry | 127 | 7a | A | 9 | 96
Physics | 118 | 7c | A | 8 | 89
Overall Science | 122 | 7b | A | 8 | 93
Vocabulary | 106 | – | B | 6 | 65
Non Verbal | 108 | – | B | 6 | 70
Skills | 121 | – | A | 8 | 92
Overall Ability | 94 | – | C | 4 | 36

Comments?

[Chart: each component's standardised score plotted with its 95% confidence band, on an axis from 40 to 160]
Looking Forwards
Introduction to ‘Predictions’
Theory
How CEM ‘Predictions’ are made…
[Scatter diagram: results (E to A*) in Subject X plotted against 'Ability' (baseline score); the trend through the points generates the 'prediction' for each ability level]
Some Subjects are More Equal than Others….
[Chart (A-Levels): grade achieved (E to A*) against Average GCSE (5 = C, 6 = B, 7 = A, 8 = A*). The regression lines for Photography, Sociology, English Lit, Psychology, Maths, Physics and Latin differ by more than one grade at the same Average GCSE score.]
Some Subjects are More Equal than Others …
Performance varies between subjects, thus analysing and
predicting each subject individually is essential.
e.g. Student with Average GCSE = 6.0

Subject Choices | Predicted Grades
Maths, Physics, Chemistry, Biology | C, C/D, C/D, C/D
Sociology, RS, Drama, Media | B, B/C, B/C, B/C
Some Subjects are More Equal than Others …
[Chart (GCSE): grade achieved (F to A*) against baseline test score for Art & Design, Biology, Chemistry, Economics, English, French, Geography, German, History, ICT, Mathematics, Media Studies, Music, Physical Education, Physics, Religious Studies, Science (Double) and Spanish. At a given test score the subject lines span roughly one grade.]
Feedback
Predictions – MidYIS example
[Example predictions spreadsheet: point and grade predictions such as 5.0 (C), 6.0 (B), 4.4 (C/D), 4.8 (C) and 3.9 (D)]
Similar spreadsheets are available for Yellis and INSIGHT.
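The grade labels in these spreadsheets follow the GCSE points convention (5 = C, 6 = B, 7 = A, 8 = A*), with borderline point predictions shown as split grades. The sketch below reproduces the labels seen in the feedback examples; the quarter-grade window for split grades such as B/C is inferred from those examples, not a documented CEM rule:

```python
import math

# GCSE points-to-grade convention used in the feedback: G=1 ... A*=8.
GRADES = {1: "G", 2: "F", 3: "E", 4: "D", 5: "C", 6: "B", 7: "A", 8: "A*"}

def grade_label(points):
    """Label a points prediction, e.g. 4.8 -> 'C', 4.4 -> 'C/D'.
    Predictions within a quarter of a grade of a whole grade get that
    grade; anything in between is shown as a split grade (assumed rule)."""
    lower = math.floor(points)
    frac = points - lower
    if frac < 0.25:
        return GRADES[lower]
    if frac >= 0.75:
        return GRADES[lower + 1]
    return f"{GRADES[lower + 1]}/{GRADES[lower]}"

print(grade_label(5.4))  # B/C
print(grade_label(4.8))  # C
print(grade_label(3.9))  # D
```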
Adjusting Predictions in MidYIS / Yellis / INSIGHT
[Spreadsheet example: a teacher's adjustment of 0.5 of a grade applied to predictions; adjusted values shown include 6.3, 6.9, 5.8, 6.1 and 5.5]
Chances Graphs

Individual Chances Graph for Student no.5 – GCSE English
MidYIS Score 82, MidYIS Band D
Prediction/expected grade: 3.8 (grade D)

[Bar chart: percentage chance of each grade from U to A*. The most likely grade, D, is at 33%; percentages shown for the other grades include 23%, 23%, 10%, 5%, 3%, 2%, 1% and 0%.]
Post-16 : CABT vs Average GCSE
Average GCSE correlates very well with A-level / IB etc.,
but by itself is not sufficient…
• What is a GCSE ?
• Students without GCSE ?
• Years out between GCSE & A-level ?
• Reliability of GCSE ?
• Prior Value-Added ?
The Effect of Prior Value Added
[Diagram: three students each arrive at Average GCSE = 6 by different routes: beyond expectation (+ve value-added), in line with expectation (0 value-added) and below expectation (−ve value-added)]
Do these 3 students all have the same ability?
Rationale for CABT in addition to GCSE
• Do students with the same GCSE score from feeder schools with differing value-added have the same ability?
• How can you tell if a student has underachieved at GCSE, and thus can you maximise their potential?
• Has a student got very good GCSE scores through the school effort rather than their ability alone?
• How will this affect expectation of attainment in the Sixth Form?
• Can you add value at every Key Stage?
Baseline testing provides a measure of ability that (to a large extent) is independent of the effect of prior treatment.
‘Predictions’
[Diagram: predictions based on GCSE and predictions based on the baseline test each give a probability of achieving each grade and an expected grade. Which predicted grades are the most appropriate for this student?]
Step 1
Adjusting Predictions in ALIS (Paris Software)
[Screenshot: adjustment options include the 75th percentile and prior value-added]
Working with ‘Predictions’
(Average performance by similar pupils in previous years)
Objectives
• To gain understanding of the interpretation of ‘predictions’
• Remembering that they are not really PREDICTIONS but
part of a ‘chances scenario’
• Using chances to explore the setting of targets
• Discussion of monitoring performance against targets
Point and grade ‘predictions’ to GCSE

[Spreadsheet extract: point and grade predictions for English, Mathematics, French, History, Science, Biology and Art & Design for five Year 7 students]

Student no. | Sex | Form | MidYIS Score | MidYIS Band
1 | M | 7E | 131 | A
2 | F | 7E | 120 | A
3 | F | 7C | 110 | B
4 | F | 7J | 101 | B
5 | M | 7D | 82 | D

Point predictions run from around 7.5 for student 1 (grades A*/A to A/B) down to around 3.2 for student 5 (grades C/D to D/E). For student 4 (MidYIS 101, Band B) the point predictions across the seven subjects are 5.4, 5.8, 5.1, 4.8, 4.9, 4.9 and 4.9 (grades B/C to C).

Concentrate on student 4.

WHY ARE THE SUBJECT PREDICTIONS DIFFERENT?
Individual Chances Graph for Student 4 – GCSE English
MidYIS Score 101, MidYIS Band B
Teacher's Adjustment: 0 grades / levels / points
Prediction/expected grade: 5.1 (grade C)

[Bar chart: percentage chance of each grade from U to A*. The most likely grade, C, is at 39%; the other percentages shown are 27%, 19%, 9%, 5%, 1%, 1% and 0%.]

What are the chances
a) of getting a grade C or above?
b) of not getting a C?
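Answering a) and b) is a matter of summing bars on the chances graph. A sketch using illustrative percentages in the shape of the Student 4 graph; the exact bar heights here are assumptions, chosen only to sum to 100:

```python
# Illustrative chance-by-grade percentages (assumed values, roughly
# matching the Student 4 graph: most likely grade C at 39%).
chances = {"U": 0, "G": 1, "F": 1, "E": 9, "D": 19,
           "C": 39, "B": 26, "A": 5, "A*": 0}

order = ["U", "G", "F", "E", "D", "C", "B", "A", "A*"]
cut = order.index("C")
c_or_above = sum(chances[g] for g in order[cut:])   # chance of C or better
below_c = sum(chances[g] for g in order[:cut])      # chance of missing a C

print(c_or_above, below_c)  # 70 30
```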
Yellis predictive data: baseline score 103 (55%)

Table 1 – Yellis predictions
Subject | Predicted
Business Studies | 5.4 (B/C)
English | 5.7 (B/C)
French | 5.4 (B/C)
Geography | 5.6 (B/C)
Mathematics | 5.7 (B/C)
Physical Education | 5.7 (B/C)
Science: GCSE | 5.6 (B/C)
Science: GCSE Additional | 5.6 (B/C)
SC Religious Studies | 5.3 (B/C)
Weighted Average | 5.6 (B/C)

Table 2 – adjusted predictions (* = adjusted)
Subject | Predicted
Business Studies | 5.6 (B/C)*
English | 5.9 (B)*
French | 5.6 (B/C)*
Geography | 5.8 (B)*
Mathematics | 5.9 (B)*
Physical Education | 6.0 (B)*
Science: GCSE | 5.8 (B)*
Science: GCSE Additional | 5.8 (B)*
SC Religious Studies | 5.6 (B/C)*
Weighted Average | 5.8 (B)

Table 3 – predictions adjusted for individual subjects (* = adjusted)
Subject | Predicted
Business Studies | 5.4 (B/C)
English | 5.7 (B/C)*
French | 5.4 (B/C)
Geography | 5.8 (B)*
Mathematics | 5.9 (B)*
Physical Education | 6.7 (A/B)*
Science: GCSE | 6.3 (A/B)*
Science: GCSE Additional | 5.6 (B/C)
SC Religious Studies | 5.7 (B/C)*
Weighted Average | 5.8 (B)

Comment?
Chances graphs MidYIS and Yellis
Situation
You are a tutor to a Year 10 pupil and you wish to help him/her set target grades. Here is a chances graph based on the pupil's Year 7 MidYIS test (score 114) and one based on the Year 10 Yellis test (58%).
MidYIS Chances Graph
This graph is based on the pupil’s exact MidYIS score, adjusted to include
the school’s previous value-added performance.
Yellis Chances Graph
This graph is based on one ability band and has no
value-added adjustment.
a) What do the graphs tell you about this pupil’s GCSE chances in this subject
(Maths)?
b) What could account for the differences between the two graphs and are these
important?
IMPORTANT FOR STAFF AND STUDENTS TO UNDERSTAND THE DIFFERENCE
Fixed Mindset:
[My intelligence is fixed and tests tell me how clever I am.]
This graph tells me I’m going to get a B, but I thought I was going to get an A. I’m
obviously not as clever as I hoped I was and so the A and A* grades I’ve got for my
work so far can’t really be true.
Growth Mindset:
[My intelligence can develop and tests tell me how far I have got.]
This tells me that most people with the same MidYIS score as me achieved a B last
year, but I think I have a good chance of an A and I know that my work has been
about that level so far so I must be doing well. What do I need to do to be one of the
10% who gets an A*?
How was this information produced?
The MidYIS graphs are produced using the predictions spreadsheet. Select the pupil(s) and subject(s) to display or print using the GCSE
Pupil Summary 1 tab. Adjustments for value-added can be made for individual subjects on the GCSE Preds tab.
The Yellis graphs for all GCSE subjects (showing all four ability bands) can be downloaded from the Secondary+ website.
Commentary
From MidYIS – The most likely grade is a B (35%), but remember there is a 65% (100 − 35) chance of getting a different grade, and also a 75% (35 + 30 + 10) chance of one of the top three grades.
From Yellis – The most likely grade appears to be a C, but remember that the band has been decided over a range, not for the individual student, and this pupil's score is near the top of that range (58 compared with 60.8). It has also not been adjusted for this school's prior value-added.
In an interview with the student you have to use your professional judgement about that student, taking everything into account. Certainly the Yellis chart warns against complacency, but if the school has a strong value-added history it is better in this case to rely on the MidYIS chart for negotiating a target. Grade A is a fair aspirational target for the student, but a teacher cannot fairly be held accountable for not achieving this grade with this student. Even a very good teacher may only achieve a B or C with this student.
Can the aspirational target set for the student be the same as that used for staff accountability purposes? There is a trap here.
ALIS
You are the subject teacher and are discussing possible A2 target grades with individual students. You are about to talk to Jonathan, who achieved an average GCSE score of 6.22. Using the regression formula at A2 for this subject, this gives a statistical prediction of 28.35 × 6.22 − 99.57 ≈ 77 UCAS points (grade C at A2). Assume that the computer adaptive baseline test confirms this prediction. Chances graphs for this subject are shown, giving the percentage of students with similar profiles achieving the various grades.
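The statistical prediction quoted above is a straight-line regression from average GCSE score to UCAS points. A sketch reproducing Jonathan's figure with the coefficients given in the text; the points-to-grade mapping uses the pre-2017 A-level UCAS tariff and is an assumption on our part:

```python
def predicted_ucas_points(avg_gcse, slope=28.35, intercept=-99.57):
    """ALIS-style regression prediction at A2 for this subject,
    using the coefficients quoted in the case study."""
    return slope * avg_gcse + intercept

# Pre-2017 A-level UCAS tariff (assumed): E=40 up to A*=140.
TARIFF = {"E": 40, "D": 60, "C": 80, "B": 100, "A": 120, "A*": 140}

def nearest_grade(points):
    """Grade whose tariff value is closest to the predicted points."""
    return min(TARIFF, key=lambda g: abs(TARIFF[g] - points))

pts = predicted_ucas_points(6.22)
print(round(pts), nearest_grade(pts))  # 77 C
```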
Individual chances graph for Jonathan
a) Why are these two chances graphs different?
-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
(b) ‘Most candidates with Jonathan’s GCSE background score achieved a C in my
subject last year so Jonathan’s target grade should be a C’.
What are the weaknesses of this statement?
-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
(c) What other factors should be taken into consideration, apart from chances graph data, when determining a target grade?
-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
The difference in the chances graphs is that one of them provides for a range of
GCSE scores whilst the other is linked to Jonathan’s individual average GCSE
score of 6.22. The strength of the chances graph is that it shows more than a bald
prediction.
True, most students starting from an average GCSE score like Jonathan did
achieve a C grade at A2 in examinations for this subject. However the probability of
a B grade is also high since his score was not at the bottom of this range. This
might be reflected too if the department also has a history of high prior value
added. The converse is also true with a D grade probability warning against
complacency. Students are not robots who will always fit with statistics so it is
dangerous to make sweeping statements based on one set of results.
As well as looking at the prediction you should use the chances graph as a starting
point, with your professional judgement taking into account factors such as his and
the department’s previous performance in the subject, his attitude to work, what he
is likely to achieve based on your own experience. You might want to start with the
most popular outcome grade C and use your judgement to decide how far up (or
down!) to go. He may be a very committed student and if the department has
achieved high value added in the past, an A/B grade may be more appropriate
though A* looks unlikely. If you are using aspirational targets for psychological
reasons with students, then A may be appropriate even though it is less probable than B/C.
Key Questions for Intelligent Target Setting
• What type of valid and reliable predictive data
should be used to set the targets?
• Should students be involved as part of the
process (ownership, empowerment etc.)?
• Should parents be informed of the process and
outcome?
Key points to consider might include:
• Where has the data come from?
• What (reliable and relevant) data should we
use?
• Enabling colleagues to trust the data: Training
(staff)
• Communication with parents and students
• Challenging, NOT Demoralising, students…….
• Storage and retrieval of data
• Consistency of understanding what the data
means and does not mean
• The process of setting targets is crucial…….
There is wide-ranging practice using CEM data to set student,
department and institution targets.
Increasingly sophisticated methods are used by schools and colleges.
The simplest model is to use the student grade predictions. These
then become the targets against which student progress and
achievement can be monitored.
Theoretically, if these targets were to be met, residuals would be zero so
overall progress would be average.
The school/college would be at the 50th percentile.
More challenging targets would be those set on the basis of history. For example: Where is the school/college now? Where is your subject now?
If your subject's value-added history shows that performance is in the upper quartile, it may be sensible to adjust targets. This may have the effect of raising point predictions by between 0.2 and 0.5 of a grade.
This would be a useful starting point, but it would not be advisable to use the unadjusted predictions for below-average subjects, as this might lead to continuing under-achievement.
Yellis Predictions For Modelling
FOUR approaches
• Yellis GCSE predictions
• Yellis GCSE predictions + (say) 0.5 of a grade
• Prior value-added analysis based on 3-year VA per department
• 75th percentile (upper quartile) analysis
Subject | Number of Students | % A*–C | % A*–G | Average Grade
Art & Design | 68 | 84 | 100 | 5.2 (C)
Business Studies | 64 | 48 | 100 | 4.3 (C/D)
Design & Technology | 103 | 63 | 100 | 4.7 (C/D)
Drama | 27 | 85 | 100 | 5.3 (B/C)
English | 181 | 64 | 100 | 4.8 (C)
English Literature | 15 | 60 | 100 | 4.6 (C/D)
French | 53 | 64 | 100 | 4.9 (C)
Geography | 84 | 63 | 100 | 4.8 (C)
German | 7 | 71 | 100 | 5.1 (C)
History | 49 | 67 | 100 | 5.1 (C)
Home Economics | 48 | 48 | 100 | 4.5 (C/D)
ICT | 71 | 68 | 100 | 4.9 (C)
Maths | 180 | 54 | 100 | 4.5 (C/D)
Music | 12 | 67 | 100 | 5.2 (C)
Physical Education | 72 | 65 | 100 | 4.9 (C)
Religious Studies | 37 | 70 | 100 | 5.2 (C)
Double Science | 180 | 52 | 100 | 4.4 (C/D)
Welsh | 177 | 72 | 100 | 5.1 (C)

School Average GCSE score: 4.7 (C/D)

Counted Performance Statistics (Based on Subject Choice Predictions)
5 or more A* to C Grades: 106 (58%)
1 or more A* to C Grades: 141 (77%)
5 or more A* to G Grades: 181 (99%)
1 or more A* to G Grades: 181 (99%)
5 or more A* to C Grades inc Maths and English: 98 (54%)
2 or more A* to C Grades – Sciences: 93 (51%)
1 or more A* to C Grades – Modern Foreign Language: 36 (20%)

The underlying predictions summarised here are based on expectations for an average school achieving zero value-added results. Appropriate care should be taken in interpreting them within your school. Please note that the cut-off points for grade C and grade G have been set at 4.5 and 0.5 respectively. Owing to the sensitive nature of the cut-off points, predictions may vary for your school if the cut-off points were altered.
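The counted statistics above come from applying threshold counts to each student's predicted grades, with the grade C cut-off at 4.5 points as stated in the note. A sketch with a small invented cohort; the student data is hypothetical and only illustrates the counting:

```python
# Hypothetical per-student predicted GCSE points (G=1 ... A*=8);
# the grade C cut-off is 4.5 points, as in the note above.
students = {
    "S1": [5.1, 4.3, 5.6, 4.9, 6.2],
    "S2": [4.2, 4.6, 3.9, 4.4, 4.1],
    "S3": [6.5, 6.8, 5.9, 6.1, 7.2],
}

C_CUT = 4.5

def counted(preds, n, cut):
    """True if at least n predictions are at or above the cut-off."""
    return sum(p >= cut for p in preds) >= n

five_a_c = sum(counted(p, 5, C_CUT) for p in students.values())
one_a_c = sum(counted(p, 1, C_CUT) for p in students.values())
print(five_a_c, one_a_c)  # 1 3
```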
(*Predictions Adjusted for Positive Prior Value-added Performance)

Subject | Number of Students | % A*–C | % A*–G | Average Grade
Art & Design | 68 | 84 | 100 | 5.2 (C)
Business Studies | 64 | 48 | 100 | 4.3 (C/D)
Design & Technology | 103 | 87 | 100 | 5.3 (B/C)*
Drama | 27 | 100 | 100 | 6.0 (B)*
English | 181 | 69 | 100 | 4.9 (C)*
English Literature | 15 | 67 | 100 | 4.9 (C)*
French | 53 | 96 | 100 | 6.4 (A/B)*
Geography | 84 | 73 | 100 | 5.2 (C)*
German | 7 | 86 | 100 | 5.6 (B/C)*
History | 49 | 67 | 100 | 5.1 (C)
Home Economics | 48 | 79 | 100 | 5.2 (C)*
ICT | 71 | 96 | 100 | 5.7 (B/C)*
Maths | 180 | 57 | 100 | 4.6 (C/D)*
Music | 12 | 92 | 100 | 5.7 (B/C)*
Physical Education | 72 | 65 | 100 | 4.9 (C)
Religious Studies | 37 | 70 | 100 | 5.3 (B/C)*
Double Science | 180 | 59 | 100 | 4.7 (C/D)*
Welsh | 177 | 86 | 100 | 5.5 (B/C)*

School Average GCSE score: 5.1 (C)

Counted Performance Statistics (Based on Subject Choice Predictions)
5 or more A* to C Grades: 125 (69%)
1 or more A* to C Grades: 162 (89%)
5 or more A* to G Grades: 181 (99%)
1 or more A* to G Grades: 181 (99%)
5 or more A* to C Grades inc Maths and English: 102 (56%*)
2 or more A* to C Grades – Sciences: 106 (58%*)
1 or more A* to C Grades – Modern Foreign Language: 54 (30%*)
(*Predictions Adjusted for 75th Percentile)

Subject | Number of Students | % A*–C | % A*–G | Average Grade
Art & Design | 68 | 97 | 100 | 5.5 (B/C)*
Business Studies | 64 | 63 | 100 | 4.6 (C/D)*
Design & Technology | 103 | 73 | 100 | 5.0 (C)*
Drama | 27 | 96 | 100 | 5.5 (B/C)*
English | 181 | 70 | 100 | 5.0 (C)*
English Literature | 15 | 67 | 100 | 4.9 (C)*
French | 53 | 74 | 100 | 5.1 (C)*
Geography | 84 | 70 | 100 | 5.1 (C)*
German | 7 | 71 | 100 | 5.4 (B/C)*
History | 49 | 84 | 100 | 5.4 (B/C)*
Home Economics | 48 | 63 | 100 | 4.8 (C)*
ICT | 71 | 77 | 100 | 5.2 (C)*
Maths | 180 | 61 | 100 | 4.8 (C)*
Music | 12 | 83 | 100 | 5.5 (B/C)*
Physical Education | 72 | 72 | 100 | 5.2 (C)*
Religious Studies | 37 | 81 | 100 | 5.5 (B/C)*
Double Science | 180 | 59 | 100 | 4.7 (C/D)*
Welsh | 177 | 82 | 100 | 5.4 (B/C)*

School Average GCSE score: 5.0 (C)

Counted Performance Statistics (Based on Subject Choice Predictions)
5 or more A* to C Grades: 123 (68%)
1 or more A* to C Grades: 162 (89%)
5 or more A* to G Grades: 181 (99%)
1 or more A* to G Grades: 181 (99%)
5 or more A* to C Grades inc Maths and English: 109 (60%*)
2 or more A* to C Grades – Sciences: 106 (58%*)
1 or more A* to C Grades – Modern Foreign Language: 41 (23%*)
Case Study
Here is the Individual Pupil Record
from the ALIS computer adaptive
test taken in Year 12 for a current
Year 13 student.
This student had a high positive
value added in every GCSE subject
as measured using MidYIS as a
baseline.
(Average GCSE score 7.44)
On the next page are her A level
predictions and chances graphs.
Why are the predictions different?
Are the chances graphs useful
here?
Predictions and chances graphs
Using PARIS software and tweaking the predictions for prior value-added in these subjects, from a GCSE baseline A*s are predicted in three of the four.
If we did the same for the adaptive test baseline, solid Bs might be predicted in all three.
It is also worth looking at
the value added at GCSE.
See commentary
Commentary
The differences in prediction from the GCSE baseline and the computer adaptive
test for some students are interesting and these can be in either direction. Here
there has been a very large value added at GCSE which may or may not be
sustainable at A level. This student’s history is shown below:
GCSE predictions and value added from the two MidYIS baselines

Subject | MidYIS ALL prediction | Achieved | Raw VA | MidYIS IND. prediction | Achieved | Raw VA (IND.)
Drama | 5.8 (B) | A* | 2.2 | 6.6 (A/B) | A* | 1.4
English | 6.0 (B) | A* | 2.0 | 6.8 (A) | A* | 1.2
English Lit | 6.0 (B) | A | 1.0 | 6.7 (A) | A | 0.3
German | 5.5 (B/C) | A | 1.5 | 6.5 (A/B) | A | 0.5
Latin | 6.5 (A/B) | A* | 1.5 | 6.9 (A) | A* | 1.1
Maths | 6.0 (B) | A* | 2.0 | 6.9 (A) | A* | 1.1
Music | 5.9 (B) | A | 1.1 | 6.9 (A) | A | 0.1
Science | 5.8 (B) | A | 1.2 | 6.9 (A) | A | 0.1

('MidYIS ALL' predictions are from Year 7 data across all institutions; 'MidYIS IND.' predictions are from Year 9 independent-school data.)
Average GCSE score = 7.44
The value added here at GCSE is between one and two grades (against the all-institutions data from Year 7) and significantly positive in each subject (against the independent-school data from Year 9).
If we measure this student's value added from an average GCSE score of 7.44 next year, it will not tell the whole story: we need to look at the value added from the computer adaptive test as well.
The chances graphs should be used with extreme caution here, and a growth mindset is vital if they are used with students.
Case study: setting departmental targets
• Uses valid and reliable data, e.g. chances graphs
• Involves sharing data with the students
• Gives ownership of the learning to the student
• Enables a shared responsibility between student, parent(s)/guardian, and the teacher
• Encourages professional judgement
• Leads to teachers working smarter, not harder
• Leads to students being challenged and not 'over-supported', thus becoming independent learners…
DEPARTMENT: GCSE ANALYSIS

year    no. of pupils    av. raw resid.    av. std. resid.
2006    66               0.8               0.6
2007    88               0.8               0.5
2008    92               1.1               0.8
2009    108              0.7               0.6

n.b. A raw residual of 1.0 is equivalent to one grade.
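One way to summarise the yearly figures is a pupil-weighted average of the raw residuals. Weighting by cohort size is an illustrative assumption here, not CEM's published method:

```python
# Pupil-weighted average of the department's raw residuals, 2006-2009.
# year -> (number of pupils, average raw residual); figures from the table.
years = {2006: (66, 0.8), 2007: (88, 0.8), 2008: (92, 1.1), 2009: (108, 0.7)}

total_pupils = sum(n for n, _ in years.values())
weighted = sum(n * resid for n, resid in years.values()) / total_pupils
print(f"Four-year pupil-weighted raw residual: {weighted:.2f} grades")
# A raw residual of 1.0 is one grade, so this department adds
# roughly 0.8 of a grade per pupil over the period.
```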
TARGETS FOR 2011, using CEM predictive data and dept's prior value-added
The target grade has a prior value-added of 0.8

student  sex  prediction  pred grade  target  target grade  dept adj grade
1        M    5.4         (B/C)       6.2     B             A
2        F    3.8         (D)         4.6     C             C
3        M    3.6         (D/E)       4.4     D             D
4        F    4.2         (D)         5.0     C             D
5        M    5.7         (B/C)       6.5     B             B
6        F    6.5         (A/B)       7.3     A             A*
7        M    7.0         (A)         7.8     A*            A*
8        M    3.8         (D)         4.6     C             C
9        F    4.2         (D)         5.0     C             C
10       M    5.9         (B)         6.7     A             B
12       M    3.8         (D)         4.6     C             D
etc.
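The target column can be reproduced by adding the department's prior value-added to each prediction and mapping points back to grades. A sketch; the 8 = A* down to 1 = G points scale and round-half-to-even rounding are assumptions that happen to match the table:

```python
# Turn CEM point predictions into department targets by adding the
# department's prior value-added (0.8 here), then map points to grades.
# ASSUMPTIONS: GCSE points scale 8=A* ... 1=G, and Python's default
# round-half-to-even rounding; both reproduce the table above.
GRADES = {8: "A*", 7: "A", 6: "B", 5: "C", 4: "D", 3: "E", 2: "F", 1: "G"}
PRIOR_VA = 0.8

def target_grade(prediction: float) -> str:
    """Add the department's prior value-added, then round to a grade."""
    return GRADES[round(prediction + PRIOR_VA)]

predictions = [5.4, 3.8, 3.6, 4.2, 5.7, 6.5, 7.0, 3.8, 4.2, 5.9, 3.8]
print([target_grade(p) for p in predictions])
```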
Student no. 1: GCSE Geography
[Individual chances graph for Student A, GCSE English. MidYIS score 105, MidYIS Band B; teacher's adjustment: 0 grades/levels/points. Prediction/expected grade: 5.4 (B/C). Bar chart of percentage chances for each grade from U to A*, peaking at 36% on the most likely grade.]
Student no. 1: GCSE Geography
[The same chances graph for Student A, GCSE English, with a teacher's adjustment of 0.8 grades/levels/points. Prediction/expected grade: 6.2 (B). The whole distribution shifts upward, again peaking at 36% on the most likely grade.]
Results
[Bar chart of the grades actually achieved by the group: 13, 19, 23, 21, 10 and 6 students per grade.]
COMMENTS?
Monitoring Student Progress
Monitoring students' work against target grades is established practice in schools and colleges, and there are many diverse monitoring systems in place. Simple monitoring systems can be very effective:
• Current student achievement compared to the target grade at predetermined regular intervals, to coincide with, for example, internal assessments/examinations
• Designated staff having an overview of each student's achievements across subjects
• All parents being informed of progress compared to targets
• Review of progress between parents and staff
• Subject progress being monitored by a member of the management team in conjunction with the head of subject/department
• A tracking system to show progress over time for subjects and students
Monitoring Progress:
Schools and departments use various monitoring systems for comparing present progress with either the target grade or, in some cases, a minimum acceptable grade or basic suggested grade. Six examples from schools are shown.
If you were Polly Bolton's form teacher, how would you approach a discussion with her parents at a parents' evening? Should parents be told the baseline scores?
Polly Bolton
[Tracking grid for Polly Bolton. For each subject (English, English Literature, Science, Science Additional, French, History, RS, Maths, …) the grid shows the teacher, the current grade at Autumn, Spring and Summer, the Year 10 exam grade, the target grade, whether that target is SECURE / LIKELY / POSSIBLE / UNLIKELY, an effort score and any concern code. Polly's statuses range from SECURE through LIKELY and POSSIBLE to UNLIKELY, with ORG (organisation) the most frequent concern code.]

KEY - target is: SECURE - LIKELY - POSSIBLE - UNLIKELY
effort: 5 excellent - 4 good - 3 satisfactory - 2 poor - 1 very poor
concern: WW working well - ATT attitude - BEH behaviour - TEN attendance - PUN punctuality - HW homework - CON confidence - ORG organisation - EAL language
Tracking at departmental level for one student
Department: Geology    Student: Peter Hendry    2006-8
[Progress chart plotting marks against the target grade for six assessments (test: Geol; test: igneous rocks; pract: rock textures; homework: rock cycle; test: dating; test essay: radiometric dating), dated 15/09/2006 to 21/11/2006. Marks of 97%, 84%, 68%, 57%, 54% and 50% are plotted against grade boundaries A to E/U.]
Traditional mark book approach
[Mark book grid, Biology, Yr 12, 2007-08. For each student (Alice Briggs, Kevin Fletcher, Felicity Green, Michael Havard) the grid records an initial target grade and a negotiated target grade, then, for OCT, DEC and MAR, the current level together with ratings for effort, punctuality and meeting deadlines.]
Targets for learning…. reporting to pupils
Astronomy 7N: MidYIS Test Review
[Scatter graph: pupils A to ZB plotted by MidYIS score (67-123) on the x-axis against class test score (5-83%) on the y-axis, with a fitted trend line.]
MIDYIS ON ENTRY - KEY STAGE 3 STATUTORY TEACHER ASSESSMENT - SOSCA STANDARDISED SCORES
[Extract from a whole-school tracking spreadsheet. Columns include: Surname, Forename, Gender, % Attendance Y10, London Reading; MidYIS on entry (Overall standardised score and band, plus Maths, Vocabulary, Non-Verbal and Skills standardised scores); Key Stage 3 statutory teacher assessments by subject (Art, Design & Technology, English, Geography, History, Inf Tech, Maths, MFL, Music, Phys Ed, Science); standardised residuals MidYIS-KS3 for English, Maths and Science; and SOSCA standardised scores (Reading, Biology, Chemistry, Physics, and Maths with Space, Number and Handling Data components).]
Not a label for life
... just another piece of information
• The chances graphs show that, from almost any baseline score, students can end up with almost any grade: there are just different probabilities for each grade, depending on the baseline score
• In working with students, these graphs are more useful than a single predicted or target grade
• Chances graphs show what can be achieved:
  - By students of similar ability
  - By students with lower baseline scores
Student 1
Student 2
Student 3
Student 4
Student 4 - IPR
Performance Monitoring
Introduction to Value-Added
Theory
How CEM 'Value-Added' is calculated…
[Diagram: scatter of Result against Ability (Baseline) for Subject X, with a trend line through the points. Students above the line have positive VA, students below have negative VA; the vertical distances from the line are the residuals.]
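The idea can be sketched with an ordinary least-squares trend line: fit result against baseline for the cohort, and read each student's value-added as the residual. The data below is made up for illustration; CEM's production models are more sophisticated than a single straight-line fit:

```python
# Value-added as a residual from a least-squares trend line:
# fit result ~ baseline for the cohort, then each student's residual
# (actual minus predicted) is their value-added.
baselines = [85, 92, 100, 104, 110, 118, 125]  # ILLUSTRATIVE baseline scores
results = [3.9, 4.6, 5.1, 5.6, 5.8, 6.9, 7.4]  # ILLUSTRATIVE exam points

n = len(baselines)
mean_x = sum(baselines) / n
mean_y = sum(results) / n
# Simple OLS slope and intercept.
slope = (sum(x * y for x, y in zip(baselines, results)) - n * mean_x * mean_y) / (
    sum(x * x for x in baselines) - n * mean_x * mean_x
)
intercept = mean_y - slope * mean_x

residuals = [y - (intercept + slope * x) for x, y in zip(baselines, results)]
for x, y, r in zip(baselines, results, residuals):
    # Positive residual = above the trend line = positive VA.
    print(f"baseline {x}: predicted {intercept + slope * x:.2f}, actual {y}, VA {r:+.2f}")
```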
Burning Question :
What is my Value-Added Score ?
Better Question :
Is it Important ?
Key Value Added Charts
1) SPC (Statistical Process Control) chart
[Chart: VA score plotted by year, 2000 to 2010, with control limits. Performance above expectation: good practice to share? Performance in line with expectation sits between the limits. Performance below expectation: problem with teaching & learning?]
2) Subject Bar Chart
Average Standardised Residuals by Subject
[Bar chart: average standardised residual for each GCSE subject, from Additional Applied Science to Spanish (including Short Course Religious Studies). Most subjects sit between about -0.4 and +1.1; one subject stands out well below the rest at -2.9.]
3) Scatter Plot
Religious Studies
Scatter Plot Example 1: A2 – English Literature. General underachievement?
Scatter Plot Example 2: A2 – English Literature. Too many U's?
Other things to look for…
Why did these students do so badly?
Why did this student do so well?
How did they do in their other subjects?
Post-16: Impact of Baseline Choice on Value-Added
Same School - Spot the Difference? GCSE as Baseline vs Test as Baseline
Does the Type of School make a Difference?
Comparison to all schools
Comparison to Independent Schools Only
Comparison to FE Colleges Only
Comparison to all schools
Questions:
→ How does the unit of comparison used affect the Value-Added data, and what implications does this have for your understanding of performance?
→ Does this have implications for Self Evaluation?
Using Value-Added Data
Necessary knowledge base to use CEM systems to their potential
1. The forms of Value Added Data:
• scatter graphs
• raw and standardised residuals
• SPC charts
• tables of data
• use of PARIS for further analyses (e.g. by gender, teaching group)
2. Predictive Data:
• point and grade predictions
• importance of chances graphs
• availability of different predictive data
3. Baseline Data:
• band profile graphs
• IPRs
• average GCSE score
• computer adaptive tests
4. Attitudinal Data
If you have the tools, you can use them to:
• Make curriculum changes
• Adjust staffing structure and cater for student needs
• Strengthen self-evaluation procedures, including the analysis of examination results using value-added data
• Inform the target-setting process
• Feed school and department development plans
• Improve your monitoring and reporting procedures
• Provide information to governors
• Have conversations with feeder primary schools
• Etc. etc.
Below are the value-added charts from Yellis to GCSE for two contrasting institutions. Which subjects are outside the confidence limits in a 'negative value added' direction? There must be questions to ask about teaching and learning there. Which subjects are outside the confidence limits in a 'positive value added' direction?
Any result that falls in the outer shaded area is unlikely to be down to chance: the probability here is about 1 in 20. Outside the 99.7% confidence limit, the probability that the result is due to chance is less than 3 in 1,000.
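The shaded bands can be sketched numerically, assuming they take the usual normal-theory form: standardised residuals have roughly mean 0 and SD 1 nationally, so the standard error of a cohort average is 1/√n. CEM's published limits may be computed slightly differently:

```python
import math

# Confidence limits for a subject's average standardised residual.
# ASSUMPTION: normal-theory limits of ±1.96/sqrt(n) (95%) and
# ±3/sqrt(n) (99.7%); CEM's exact method may differ.
def limits(n_students: int) -> tuple:
    se = 1 / math.sqrt(n_students)  # standard error of the cohort average
    return 1.96 * se, 3.0 * se      # 95% limit, 99.7% limit

for n in (10, 25, 100):
    lim95, lim997 = limits(n)
    print(f"n={n:>3}: 95% limit = +/-{lim95:.2f}, 99.7% limit = +/-{lim997:.2f}")
```

Note how quickly the limits narrow as the cohort grows: a residual of 0.5 is unremarkable for 10 students but well outside the limits for 100.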
GCSE value added: a challenging school
GCSE value added: a successful school
[Two bar charts of average standardised residuals by subject, one per school. The subjects run from Art & Design to Welsh and include vocational and short-course options; the residuals range from about -1.3 to +2.0.]
Here is a value-added subject report from a recent examination session at a school.
[Scatter plot: GCSE score on the vertical axis against MidYIS score on the horizontal axis.]
Write the equivalent GCSE grades next to the points scores.
Compare the value-added performance of candidates scoring A*, B, and D grades.
Which result would cause you to ask questions?
Compare the data for Student A and Student B

Student A   Sex: Female   Yellis Band: B   Yellis scores: Maths 46, Vocab 61, Pattern 53, overall 54

Subject             Weight  YELLIS      Achieved    Raw       Standardised
                            Predicted   Grade       Residual  Residual
Drama               1       5.6 (B/C)   8 (A*)      2.4       1.8
English             1       5.4 (B/C)   7 (A)       1.6       1.6
English Literature  1       5.4 (B/C)   7 (A)       1.6       1.4
French              1       4.9 (C)     8 (A*)      3.1       2.5
Geography           1       5.3 (B/C)   8 (A*)      2.7       2.2
Maths               1       5.2 (C)     7 (A)       1.8       1.7
Physical Education  1       5.4 (B/C)   6 (B)       0.6       0.4
Double Science      2       5.1 (C)     8 (A*)      2.9       2.6
Welsh               1       5.6 (B/C)   7 (A)       1.4       1.0
Weighted Average            5.3 (B/C)   7.4 (A*/A)  2.1       1.8

Student B   Sex: Male   Yellis Band: A   Yellis scores: Maths 55, Vocab 71, Pattern 76, overall 63

Subject             Weight  YELLIS      Achieved    Raw       Standardised
                            Predicted   Grade       Residual  Residual
Business Studies    1       5.7 (B/C)   4 (D)       -1.7      -1.4
Drama               1       6.1 (B)     6 (B)       -0.1      0.0
English             1       6.0 (B)     5 (C)       -1.0      -1.0
English Literature  1       6.0 (B)     5 (C)       -1.0      -0.9
Geography           1       6.0 (B)     3 (E)       -3.0      -2.4
Maths               1       6.0 (B)     6 (B)       0.0       0.0
Physical Education  1       6.0 (B)     2 (F)       -4.0      -3.1
Double Science      2       5.8 (B)     5 (C)       -0.8      -0.7
Welsh               1       6.1 (B)     5 (C)       -1.1      -0.8
Weighted Average            6.0 (B)     4.6 (C/D)   -1.4      -1.1
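As a check on the Weighted Average rows, Student A's figures can be recomputed directly, with grades converted to points (A* = 8 down to G = 1) and Double Science carrying weight 2:

```python
# Recompute Student A's weighted averages: Double Science counts twice
# (weight 2), every other subject once. Figures from the table above.
subjects = [
    # (weight, predicted points, achieved points, raw residual)
    (1, 5.6, 8, 2.4),  # Drama
    (1, 5.4, 7, 1.6),  # English
    (1, 5.4, 7, 1.6),  # English Literature
    (1, 4.9, 8, 3.1),  # French
    (1, 5.3, 8, 2.7),  # Geography
    (1, 5.2, 7, 1.8),  # Maths
    (1, 5.4, 6, 0.6),  # Physical Education
    (2, 5.1, 8, 2.9),  # Double Science (double weight)
    (1, 5.6, 7, 1.4),  # Welsh
]

total_weight = sum(w for w, _, _, _ in subjects)
pred_avg = sum(w * p for w, p, _, _ in subjects) / total_weight
achieved_avg = sum(w * a for w, _, a, _ in subjects) / total_weight
resid_avg = sum(w * r for w, _, _, r in subjects) / total_weight

print(f"predicted {pred_avg:.1f}, achieved {achieved_avg:.1f}, raw residual {resid_avg:.1f}")
```

This reproduces the table: predicted 5.3, achieved 7.4, raw residual 2.1.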
Find students A and B on each of the scatter graphs: English and Maths

Scatter graph: English
[Scatter plot: GCSE grade, U (0) to A* (8), against YELLIS test score (0-100%).]

Scatter graph: Maths
[Scatter plot: GCSE grade, U (0) to A* (8), against YELLIS test score (0-100%).]

Is there anything to learn from these scatter graphs?
Common Scenario…
Jane has completed her sixth-form studies and a review has been received for her by the college after the A level results. Choose one subject at a time and look carefully at what happened in that subject, both from a baseline of average GCSE grades and from a baseline of the computer adaptive test.
This student has been placed in different bands: Band B from average GCSE score and Band C from the computer adaptive test. This sometimes happens. It may be that the student had an off day when she did the computer adaptive test, or there may have been a lot of 'spoon feeding' at GCSE. Jane may do better at coursework! Even though we may not know the cause, the discrepancy can act as a warning when analysing results, though the predictions are not wildly out.
Profile Sheet: Jane (from Average GCSE)
Year: 2007    DOB: 01/06/89    (Average GCSE = 6.00 (Band B))
Review: Final_Result

                              PREDICTIONS            FINAL RESULTS      RESIDUALS
Subject                       Avg points  Avg grade  Points   Grade     Raw     Standardised
(A1) Health & Social Care     39.64       C          30.00    D         -9.64   -0.66
(A2) Religious Studies        89.84       B/C        80.00    C         -9.84   -0.52
(A2) English Literature       87.29       B/C        80.00    C         -7.29   -0.40
(A2) Drama & Theatre Studies  91.84       B/C        100.00   B         8.16    0.46
Chances Graphs - Band B from average GCSE
Individual Chances Graphs for Jane from average GCSE score
Profile Sheet: Jane (from Computer adaptive test)
Year: 2007    DOB: 01/06/89    (Online adaptive test = 0.11 (Band C))

                              PREDICTIONS            FINAL RESULTS      RESIDUALS
Subject                       Avg points  Avg grade  Points   Grade     Raw     Standardised
(A1) Health & Social Care     25.97       D/E        30.00    D         4.03    0.24
(A2) Religious Studies        79.01       C          80.00    C         0.99    0.04
(A2) English Literature       74.30       C/D        80.00    C         5.70    0.26
(A2) Drama & Theatre Studies  85.16       B/C        100.00   B         14.84   0.70
a) Did Jane reach her potential in all subjects?
_________________________________________________________________

b) Jane had been set aspirational targets prior to AS and A level by her teachers, as below:
Health and Social Care   C
Drama                    B
Religious Studies        B
English                  B
Were these reasonable target grades?
_________________________________________________________________

c) Why should these grades not be used for accountability of her teachers?
_________________________________________________________________
Commentary
a) Jane certainly reached her potential in Drama and Theatre Studies, with positive standardised residuals by both methods. On the basis of the computer adaptive test she broadly reached her potential in all subjects. On the basis of average GCSE, two A level subjects were broadly down by about half a grade and she dropped a grade in the AS.
b) Hopefully you agree these were reasonable target grades. Remember we don't know the student, but use the chances graphs. If these were aspirational grades for the student, then holding a department's staff accountable on that basis is not appropriate; it certainly is appropriate on the basis of a whole class's average standardised residuals, particularly over a number of years.
YELLIS ATTITUDINAL
You are looking at the attitudinal feedback from Year 10 over time:
Do you notice a pattern between the four charts? There was an initiative in
the school that contributed, but was it sustainable?
Yellis Further Comparison charts for English and Maths
What concerns you about these charts? Can you suggest which is the stronger
department?
Departmental analysis (based on average GCSE score baseline)
SUBJECT A (A LEVEL)
[Two charts of average standardised residual (Final_Result) by exam year, 2003 to 2011: one for A level, one for AS level, each on a scale of -2.0 to +2.0.]
There may be lots of reasons for changes in performance, but here one factor is known by the school: the subject teacher for Subject A went on long sick leave for one of the autumn terms. When do you think this happened? To help you, the AS chart is shown below for the same subject. It may or may not be relevant.
This school has a 5-year development plan which includes as one of its goals:
“To help students prepare for university and the world of work by developing
independent learning skills, the ability to reflect and to learn from others and to
maximise the benefits to learning offered by emerging technologies.”
The graphs on the next page reflect students’ perceptions of the style of
learning that has been adopted in their A-level classes in two broadly similar
subjects.
a) If the students’ perceptions are an accurate reflection of what takes place in
the classroom, which subject seems more on board with the school’s
development plan?
b) How would these perceptions inform the Senior Management Team’s
evaluation of progress with its 5-year plan if the subject achieving significantly
better value-added results was
i) Subject 1?
ii) Subject 2?
Subject 1
Subject 2
Case study A: ALIS value-added data
Many sets of VAD are available!
From average GCSE baseline:
• all ALIS cohort
• type of Institution
• syllabus
Also the same from the baseline test
SPC Chart with confidence limits: WHOLE SCHOOL
All ALIS Cohort
Syllabus
Institution
Using PARIS software: Baseline Test
Whole School
From your perspective, which set of VAD would you use for the
different user groups? (Governors, HoDs, Media, Parents, SMT/SLT...)
• USE ONE YEAR'S DATA WITH CAUTION!
• Better to use three years' data, as patterns over time are more significant.
Using data to inform leadership decisions
Some key questions:
1. Which data do I need AND which data do I not
need? (e.g. MidYIS cohort or Independent
Sector)
2. What does the data mean and what does the
data not mean? (e.g. staff INSET and support)
3. Who is the data for?
4. Storage, retrieval and use of data (e.g. self-evaluation and preparing for inspection)
Using this data well should allow us to do our best to help every student at least achieve, if not exceed, their potential.
It may challenge:
• The culture of 'my' school/college
• Accountability policy
• Expectations
• Staff training in the use of data, and the ability to cope with data (data overload)
• Integrating the data into school procedures: storage, retrieval, distribution and access
• Roles and responsibilities
Who should data be
shared with?
Colleagues
A. Subject Teachers
B. Heads of Department
C. Pastoral Staff
D. Managers
Subject Teachers/HODs
1. This will be interpreted as a personalised
prediction
2. The data doesn’t work for this particular student
3. You’re raising false expectation – he’ll never
get that result
4. You’re making us accountable for guaranteeing
particular grades – when the pupils don’t get
them we’ll get sacked and the school will get
sued
Subject Teachers/HODs
Remind them that:
1. Baseline data can give useful information about a
pupil’s strengths and weaknesses which can assist
teaching and learning
2. “Predictions” are not a substitute for their professional
judgement
Reassure them that:
1. It is not a “witch hunt”
2. Value added data is used to assess pupil performance
not teacher performance!
Pupils
1. Make sure they know why they are taking the
test
2. Make sure they take it seriously
3. Make sure they don’t deliberately mess it up in
order to lower their BSGs!
4. Be prepared to look for clear anomalies and retest if necessary
5. Explain the chances graphs to them clearly
Parents
1. Make sure they know why the pupils are taking
the test
2. Explain the results to them
3. Explain lots of times that the chances graphs
and BSGs do NOT give personalised
predictions
4. Ensure that they receive good quality feedback
from staff when ambers or reds are awarded
5. Encourage them to ask lots of questions
YOUR QUESTIONS
Thank You
Robert Clark
(robert.clark@cem.dur.ac.uk)
Neil Defty
(neil.defty@cem.dur.ac.uk)