OECD country practices on reporting and implications for Australia

Barry McGaw
Director for Education
Organisation for Economic Co-operation and Development
2005 Curriculum Corporation Conference
Curriculum and assessment: Closing the gap
Brisbane, Australia
2-3 June 2005
National reflections on PISA results.
Germany seeking more specific comparisons
[Scatter plot: reading literacy performance (vertical axis, 420-560 score points) against social equity (horizontal axis: OECD regression slope minus country regression slope, -20 to 25), dividing countries into quadrants of high/low quality and high/low equity. Countries shown include Finland, Canada, Ireland, New Zealand, Australia, the United Kingdom, Korea, Japan, Sweden, Germany and Mexico, among others. Source: OECD (2001) Knowledge and skills for life, Table 2.3a, p.253.]
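To make the horizontal axis concrete (a sketch based only on the axis label; the symbols and the socio-economic index are introduced here for illustration, not taken from the OECD report): if each country's social gradient is the slope from regressing student reading scores on a socio-economic background measure, the equity value plotted is the average OECD slope minus the country's own slope, so countries whose results depend less on background than the OECD average sit further to the right:

$$\text{score}_{ic} = \alpha_c + \beta_c\,\text{SES}_{ic} + \varepsilon_{ic}, \qquad \text{equity}_c = \beta_{\text{OECD}} - \beta_c$$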
Denmark’s concern about efficiency
[Scatter plot: reading literacy performance (vertical axis, 400-550 score points) against cumulative expenditure per student to age 15 (horizontal axis, US$10,000-80,000 PPP equivalent). Countries shown include Finland, Ireland, Korea, Australia, the UK, Denmark, the USA and Mexico, among others. Source: OECD (2001) Knowledge and skills for life, Fig. 3.7a, p.91.]
Place of national assessments
• Why bother?
  – Recognition that more could be known domestically
  – Denmark judged to have no 'culture of evaluation'
  – Some countries report that international comparisons have stimulated domestic evaluations
  – Weighing a pig won't make it fatter, but without weighing the pig how can you know how well the feeding regime is working?
• Purposes of assessment
  – Accountability – summative
  – Improvement – diagnostic, formative
• Scope
  – Sample or census? Depends on whether the focus is on the system, schools or students
• Driving system reform may be helped by having disaggregated data
• Improving systems by monitoring and improving units:
  – education
  – health
A UK health example: Reducing wait time in Accident and Emergency Units
[Chart: % of A&E patients waiting no more than 4 hours, monthly from September 2001 to December 2004 (scale 70-100%). Annotations: target first introduced in National Health Service Plan, June 2000; Public Service Agreement announced, June 2001; no improvement occurring. Source: Barber, M. (2005) Presentation to Informal Meeting of OECD Education Ministers, St Gallen, Switzerland.]
[Chart: the same series, % of A&E patients waiting no more than 4 hours, 2001-2004 (scale 70-100%). Added annotation: Dept of Health taskforce begins. Source: Barber, M. (2005) Presentation to Informal Meeting of OECD Education Ministers, St Gallen, Switzerland.]
[Chart: the same series, % of A&E patients waiting no more than 4 hours, 2001-2004 (scale 70-100%). Added annotation: Accident & Emergency included in hospital star ratings (after the Dept of Health taskforce begins). Source: Barber, M. (2005) Presentation to Informal Meeting of OECD Education Ministers, St Gallen, Switzerland.]
[Chart: the same series, % of A&E patients waiting no more than 4 hours, 2001-2004 (scale 70-100%). Added annotations: target revised to take account of clinical exceptions; incentive scheme introduced; performance management; tailored support for specific problems. Source: Barber, M. (2005) Presentation to Informal Meeting of OECD Education Ministers, St Gallen, Switzerland.]
[Chart: % of A&E patients waiting no more than 4 hours, monthly 2001-2004, redrawn on a full 0-100% scale. Source: Barber, M. (2005) Presentation to Informal Meeting of OECD Education Ministers, St Gallen, Switzerland.]
• Driving system reform may be helped by having disaggregated data
• Improving systems by monitoring and improving units:
  – schools?
  – teachers?
Data form and data use
• Breadth of data
  – What we measure signals what we value.
  – Does what we don't measure signal what we don't value?
  – Watch for unintended consequences.
• Type of data
  – student performances
    - measurements of current performance
    - estimates of value added by school
    - comparisons with 'like' schools
  – other data on schools – input, process, outcomes?
• Uses of data
  – school (and system) only
  – public use
    - results or rank orders
    - website accessibility (UK, Norway, Just4kids, Standard & Poor's)
Do assessment programmes make a difference?
• Research evidence
  – a little, but it is positive – programmes improve systems
    - Hanushek, E.A. & Raymond, M.E. (2004) The effect of school accountability systems on the level and distribution of student achievement, Journal of the European Economic Association, 2(2-3), pp. 406-415.
• System evidence
  – England reports improvement among the poorest-performing schools
  – Less improvement among the next 15%
Importance of monitoring the monitoring
• Evaluating trends in performance
  – overall
  – for subgroups (as in US No Child Left Behind Act requirements)
• Evaluating interventions intended to improve performance
Thank you.