SSME: Snapshot of School Management Effectiveness


EdData II

Education Data for Decision Making

Snapshot of School Management Effectiveness

Aims, Initial Development, Instruments, Methods

December 18, 2008

Dr. Luis Crouch

RTI International, Research Triangle Park, North Carolina, USA

About This Presentation

• This presentation was prepared for the Snapshot of School Management Effectiveness (SSME) Workshop held in Washington, DC, December 18, 2008. The workshop was organized by RTI International for a group of international education experts. Participants reviewed pilot applications of the SSME, the initial motivations for its development, and the validity of the resulting data. Participants also provided advice related to future development of the instrument.

• The USAID EdData II project is led by RTI International. Activities related to the SSME are carried out under EdData II Task Order Number 1, EHC-E-01-04-00004-00.

• Icons on some slides in this presentation represent links to embedded files that are not available in the PDF version of this document. To obtain copies of the embedded files, please contact Dr. Luis Crouch, lcrouch@rti.org, 202-974-7819.


Acknowledgments

As overall task manager for this effort, RTI thanks

• Professor Hyacinth Evans, Dr. Carol Gentles, and Dr. Lorna Down of the University of the West Indies, Mona, Jamaica; and

• Fernando Rosas, Jorge Luis Bazán, Oscar Millones, and Jesus Salinas of Fundación para el Desarrollo Agrario, Lima, Peru, for their implementation of the pilot SSME in their respective countries.


Lack of Quality and Management Data

1. Government and donor focus on

• Access

• Completion

2. Perception that quality and management data collection is

• Difficult

• Costly

3. Projects, loans started without much data on quality

• “Walkabout” or administrative data; or spotty, old, unsystematic data on quality issues.

– Large data sets are available if a country participates in the Trends in International Mathematics and Science Study (TIMSS) or the Programme for International Student Assessment (PISA), but many countries do not.

• “Sector assessments,” often with little systematic data.

4. Ultimate aim is to improve instruction . . .


Why SSME?

Challenge: Develop a tool that

– Is better and more systematic than “walkabout” collection

– Offers fresher data, is available for any country, and costs less than large international assessments

– Note: There have been some country-specific attempts to bridge the gap with a “systematic walkabout,” but is it the right point in the tradeoff?

SSME

1. Quick Application

• 1 assessor

• 1 school day

2. Just-large-enough sample size

• 50-70 schools

• Some motivation from the health sector’s attempts to do “just large enough”; e.g., Expanded Programme on Immunization (EPI) survey tools (30 clinics, 7 users per clinic)

• 90% confidence interval (CI), 10 points wide
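
To see why 50-70 schools lands near the “just large enough” point, here is a minimal sketch (not from the original deck) of the textbook sample-size formula for a proportion. It assumes simple random sampling, worst-case p = 0.5, and reads “10 points wide” as the CI half-width:

    # Sketch only: sample size for a 90% CI around a school-level proportion.
    # Assumes simple random sampling and the worst case p = 0.5.
    import math

    Z_90 = 1.645  # z-score for a two-sided 90% confidence interval

    def schools_needed(half_width, p=0.5):
        """Smallest n keeping the 90% CI half-width at or below half_width."""
        return math.ceil((Z_90 ** 2) * p * (1 - p) / half_width ** 2)

    print(schools_needed(0.10))  # -> 68, inside the 50-70 school range above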

Yet, enough to . . .


Offer Rich, Multifaceted Data

• Systematic

• With fixed protocol

– School

– Principal

– 2 classrooms

– 2 teachers

– 8 children

– 1-2 parents

• With an “optimal” sample size

• One person/day per school to administer the instrument


Process Followed

• Reviewed literature (Lockheed and Verspoor, Heneveld, Schiefelbein and Wolff, Moura Castro, etc.).

• Reviewed practical checklists and tools (Schiefelbein’s “measurement stick”) from projects and experts.

• Compiled large item data bank.

• Crossed out redundancies; streamlined.

• Met with first expert panel in fall of 2006.

• Sought written input from network of experts.

• Streamlined; took on board recommendations.

• Conducted two pilots in 2007: Jamaica and Peru.

• Evaluated results and called second expert panel in 2008.

• Refine; talk about next steps; implement them (see agenda for today).


Item Bank Summary 1

• 101. Teachers’ access to pedagogical advice, support, teamwork

• 102. Teachers’ instructional practices

• 103. Learning evaluation practices and feedback

• 105. School head’s engagement in the school

• 106. Student engagement in learning

• 107. Student learning performance

• 108. Teachers’ pedagogical preparation and training

• 110. Learning expectations on the part of educators for their students

• 201. Equipment and materials in the classroom

• 203. Class composition (by gender, age, grade level, etc.)

• 204. Student management and discipline in the classroom

• 207. Use of class time

• 301. Communication

• 303. Planning, decision making, and management teamwork

• 305. School funding, financial management

• 306. Infrastructural management


Item Bank Summary 2

• 307. Equipment and materials management

• 308. School head characteristics

• 311. Personnel management and treatment

• 312. Student attendance

• 313. Student management/treatment

• 314. Time organization and use

• 319. Overall school characteristics

• 401. Parent, community in-kind, or financial contribution to school

• 402. Parent, community role in school management and governance

• 404. Parent engagement and investment in own child’s/children’s learning

• 406. School-initiated communication with parents

• 409. Parent perceptions of school quality

• 501. Financial management and legislation

• 502. Human resource management

• 503. Monitoring and supervision

• 504. Pedagogical advice and support

• 505. Sectoral planning

• 506. Educators’ professional development

• 507. Overall management support provided by district or broader system


Instruments

1. Principal interview: overall school issues (~100 items)

2. School observation: overall school issues (~40 items)

• Are teachers teaching?

• School-level inputs

3. Teacher instrument (2): methods of teaching (self-reported) [~60 items]

4. Classroom observation (2): inputs in classroom (e.g., have books) [~25 items]

5. Classroom activity grid (4 “slices”) (methods observation) [~40 items]

6. Student instrument (8): quick reading, triangulate teacher reports (e.g., do students get homework?) [~40 items, plus reading]

7. Parent instrument (1-2): how much support do they provide; how well does the school report to them? [~50 items]


Seem to Work

1. Distinguishes between effective and ineffective behaviors practiced

2. Distinguishes schools with more effective vs. less effective behaviors

3. Allows regional or cross-country comparisons

4. Confirms relationships between behaviors and student performance


Distinguishes Between Effective Behaviors

If differences are this big, then they tend to be statistically significant, even with a small sample.

If differences are big enough to worry about, then they are big enough to detect with validity, even if sample is 50-70 schools. But not smaller than that!

Have we found an optimal point?

[Bar chart: distribution of schools by student attendance band (60% or less, 61-80%, 81-90%, 91-99%, 100%); bar values 30.4%, 25.3%, 20.2%, 15.2%, 8.9%]
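
As a rough illustration of the detection claim above (a sketch under assumptions, not an SSME calculation): a plain two-proportion z-test, with the group sizes invented for illustration, shows that a 30-point gap between two groups of 30 schools each is already significant at the 90% level:

    # Sketch only: z statistic for the difference between two independent
    # proportions, assuming simple random sampling.
    import math

    def two_prop_z(p1, p2, n1, n2):
        p_pool = (p1 * n1 + p2 * n2) / (n1 + n2)
        se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
        return (p1 - p2) / se

    # A 30-point gap (e.g., 80% vs. 50% attendance) across two groups of 30 schools:
    print(round(two_prop_z(0.80, 0.50, 30, 30), 2))  # ~2.44, past the 1.645 cutoff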


Statistical Examples for 90% CI

Schools with chalkboards: 93% to 100%

Children with language books: 75% to 86%

% pupils’ exercise books with only a few pages marked: 60% to 71%

So, can rank order, with statistical power, key behaviors or issues.

Sample = 64 schools
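
These ranges are consistent with the usual normal-approximation interval. A minimal check, assuming a simple random sample of 64 schools (the 96.5% input is a back-solved assumption, not a reported figure):

    # Sketch only: two-sided 90% CI for a proportion, truncated to [0, 1].
    import math

    Z_90 = 1.645

    def ci_90(p_hat, n=64):
        hw = Z_90 * math.sqrt(p_hat * (1 - p_hat) / n)
        return max(0.0, p_hat - hw), min(1.0, p_hat + hw)

    low, high = ci_90(0.965)  # if ~96.5% of sampled schools had chalkboards
    print(f"{low:.0%} to {high:.0%}")  # about 93% to 100%, as in the first line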


Can Flag Issues that Need Immediate Attention

• In Jamaica, 11% of teachers absent from classrooms.

• Only 31% of principals reported textbooks received per policy and on time.

− Delayed books arrived 2.1 months late on average.

• 45% of Peruvian principals had not received an inspection visit in the past year.

• 77% of Peruvian principals said it is impossible to remove a nonperforming teacher.


Peruvian Schools by Management Index Score

[Bar chart: share of schools by School Management Index score: 0 = 45%, 1 = 32%, 2 = 13%, 3 = 9%]
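
The deck does not spell out how the index is scored; a plausible sketch is one point per effective management practice observed. The indicator names below are wholly hypothetical, not actual SSME index components:

    # Hypothetical sketch: a 0-3 index scoring one point per practice observed.
    def management_index(school):
        indicators = [
            school.get("has_written_plan", False),
            school.get("records_up_to_date", False),
            school.get("received_supervision_visit", False),
        ]
        return sum(bool(x) for x in indicators)

    print(management_index({"has_written_plan": True, "records_up_to_date": True}))  # -> 2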


Jamaican Classrooms by Availability of Textbooks

[Bar chart: share of classrooms by % of students with a language arts book: 40% or less = 20%, 41-60% = 17%, 61-80% = 22%, 81-99% = 24%, 100% = 17%]


Availability of Textbooks

Big differences are obvious! Easy to check.


Allows Data Comparison Across Countries

(or regions within countries)

Indicator                         Peru (%)   Jamaica (%)
Weekly student absence                14          12
Repeaters                             11           2
Student work on walls                 27          80
Repetition of teacher phrases         82          60


Confirms Relationships Between Behaviors and Student Performance

But note: Purpose is not to duplicate “production function” or “effective schools” studies (sample size not large enough). Yet it would be disconcerting if no relationship could be found.

[Bar chart: student performance by School Management Index score (0-3); bar values 0.4, 0.63, 0.65, and 1.55]


Jamaica Teacher Feedback and Reading Fluency

[Bar chart: reading fluency by frequency of teacher marks (None, Some, Many, All); bar values 32, 57, 61, 74]


Things That Seem to Work

Ability to capture extremes in behavior.

Even if “only” 12% of school days had been missed on average the previous week, 8% of children had missed 3 days or more.

Even if schools reported being in session 4 hours (self-reported data), the bottom 10% were open only 2.75 hours.
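
Reporting a distribution’s tail alongside its mean is what surfaces such extremes. A minimal sketch with invented data:

    # Sketch only: the bottom decile tells a different story than the mean.
    import numpy as np

    hours_open = np.array([4.0, 4.5, 3.9, 4.2, 2.5, 4.1, 3.0, 4.3, 2.8, 4.0])
    print(hours_open.mean())              # ~3.7 hours: looks fine on average
    print(np.percentile(hours_open, 10))  # ~2.8 hours for the bottom 10%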


Things That Do Not Work So Well

• Could not find wholly effective or wholly ineffective schools, except in outcome terms.

– No group of schools seemed to cluster a lot of ineffective behaviors.

– Rely on learning results, not indicators of process management, to decide to intervene or close down schools.

– Can clearly identify extremes on specific behaviors or small clusters of 3-4 behaviors (e.g., marking exercise books, comments on tests, frequency of homework).

• Instruments are long and multiple; databases must be joined (a sketch of such a join follows this list)

– Analysis is not easy; it is hard to envision district offices doing it.

– “20-question” version? Shorter and quicker? For analysis reasons, not cost reasons.
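
A sketch of what the multi-instrument join involves; file and column names are hypothetical, and the aggregation choice (school-level means) is one option among several:

    # Sketch only: each instrument lands in its own table keyed on a shared
    # school ID; lower-level instruments are averaged up to the school first.
    import pandas as pd

    schools  = pd.read_csv("principal_interview.csv")  # one row per school
    teachers = pd.read_csv("teacher_instrument.csv")   # ~two rows per school
    students = pd.read_csv("student_instrument.csv")   # ~eight rows per school

    teacher_means = teachers.groupby("school_id", as_index=False).mean(numeric_only=True)
    student_means = students.groupby("school_id", as_index=False).mean(numeric_only=True)

    merged = (schools
              .merge(teacher_means, on="school_id", suffixes=("", "_teacher"))
              .merge(student_means, on="school_id", suffixes=("", "_student")))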


Worked OK, Could Be Better

• In some areas, “effective” guidance from literature is somewhat ambiguous.

• This shows up in the teacher questionnaire. For example, how important is “group work”?

– Much advice and many donor projects push a discovery approach.

– For instance, such advice treats student repetition (of teacher phrases) as “bad.” Thus, frequent repetition would be a “bad” practice.

– But doubts arise as to whether more direct methods work better for some skills: perhaps some repetition every day might be wise for certain skills.

• A balanced view suggests that we ask teachers not about frequency of practice, but about time spent on practices, because no practices are wholly deleterious.

– But is it feasible to ask about time? Or observe only?


Questions

• Narrow, methodological

– Is sample size enough? Are we hitting “sweet spot”?

– Is it the right approach not to attempt to confirm basic literature, but just to use it?

– Is it acceptable not to be able to find “basket case” schools or “super” schools?

– Length, complexity: we will address these here

• But: Radical? 20-question version?

– Teacher questionnaire: formulate questions as frequency or amount of time?


Questions to Discuss Later Today

Instruments and methodology

• Should there be a formal item bank? On Web site?

• Make available sample questionnaires, or let people pick and choose from item bank?

• Allow people to contribute to item bank?

• Is 1-person-day method worth preserving?

• Develop analysis protocols/manuals or just assume people can analyze? Is it worth it?

How to get more people using tools (such as SSME) to drive quality and learning, rather than just research? Should this “insertion” itself be piloted?

Should one develop a more formal community of practice? Not for SSME, but for the development and use of practical, empirical, fairly standardized school quality measurement tools? How formal?

