Office of Research
2013 Research Digest
2013 Highlights

Brief Reviews of the Research Literature (page 4)
Formative and Informative Research (page 5)
Evaluation Studies (page 9)
Our staff (page 14)
Milestones (page 14)
We in the Office of Research believe in the power of logic models to guide
planning for the work of organizations. We appreciate the way they compel
people to carefully think through what outcomes and impacts they hope to see,
and what specific things they will need to do to achieve them. And best of all
from our perspective, logic models provide a clear path for checking, at every
step of the way, the effectiveness of their planning—that is, which activities they
should continue because they are producing the desired outcomes, which ones
to abandon, and which ones need adjustments. Formative and summative
evaluations at their best provide precisely this type of information. So, in keeping
with the advice we have so often given others, in 2013 we developed our own
logic model. Our Highlights section for this Research Digest briefly describes
evidence that we are making progress toward our own hoped-for outcomes.
Outcome 1. An institution-wide expectation that formative and summative evaluations
will be a necessary component of educational programming.
Since 2012, our office has worked closely with the WVDE Office of Early
Learning (OEL) to create a logic model intended to guide all of their work in
response to the Governor’s challenge to develop a comprehensive PreK–Grade
3 literacy program, as well as their work in developing a high-quality workforce
in West Virginia’s universal preschool program, and other aspects of PreK–Grade
5 curriculum and instruction. OEL has formed workgroups to design activities
and outputs (i.e., services and products) to achieve particular outcomes in each
of the task areas in their logic model. Late in 2013, research staff began working
with OEL staff to develop indicators for each of the outputs and outcomes, which
will be used to help them measure the effectiveness of their work with early
childhood educators across the state. Other offices in the Department are now
considering the potential for using logic models in their own work.
Outcome 2. An institution-wide expectation that the results of research and evaluation
studies should inform appropriate audiences in a timely manner and impact
organizational decision-making.
In preparation for the 2014 legislative session, we were asked by the state board to conduct a study of the
appropriate length for planning periods at the various grade levels and for the different types of class schedules.
Within a 3-month timeframe, staff designed, conducted, and reported a study of the issue that included both
a review of the research literature (with support from the federal regional educational laboratory serving the
Appalachia region), and a survey of a representative sample of 2,000 West Virginia educators (see Instructional
Planning Time: A Review of Existing Research and Educator Practice During the 2012-2013 School Year, page 5).
The report was delivered to the state board and Governor well in advance of the session.
Another study, Improving School Discipline Data Collection and Reporting: A Status Report for the 2012–2013
School Year, released in December 2013 (see article, page 6), included a statewide analysis of school disciplinary
incidents reported during the 2012-2013 school year—the first full year under the revised Policy 4373, and the
first time an in-depth annual analysis of student behavior has been undertaken. Findings from the analysis were
provided to help inform districts and schools about what supports they may need to improve school climate,
including more positive approaches to student discipline. In addition, the project provided an opportunity for the
WVDE to build a database with corresponding queries and standardized report templates that will enable WV to
replicate the project and generate similar statewide and district level reports annually with nominal effort. This
will be particularly relevant as the Department provides annual reports
and policy recommendations to the state board. It is our view that the
work has generated increased support for the issue of improving school
discipline practices. Feedback has been positive; recommendations
stemming from the project have been well received.
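The replicable reporting workflow described above amounts to re-running a fixed set of aggregations over each school year's referral records. As a minimal sketch of the idea—the record layout, district names, and codes below are hypothetical, not the actual WVEIS schema:

```python
from collections import Counter

# Hypothetical referral records; the actual WVEIS schema and codes differ.
referrals = [
    {"district": "Kanawha", "level": 1, "action": "detention"},
    {"district": "Kanawha", "level": 2, "action": "in-school suspension"},
    {"district": "Cabell",  "level": 1, "action": "detention"},
    {"district": "Cabell",  "level": 1, "action": "out-of-school suspension"},
    {"district": "Cabell",  "level": 3, "action": "out-of-school suspension"},
]

def district_level_counts(records):
    """Standardized summary: referral counts by (district, behavior level).

    Because the query is fixed, it can be re-run against each new school
    year's data to generate comparable district-level reports annually.
    """
    return Counter((r["district"], r["level"]) for r in records)

for (district, level), n in sorted(district_level_counts(referrals).items()):
    print(f"{district}: Level {level} referrals = {n}")
```

The same pattern extends to statewide totals and to tallies of interventions/consequences; only the grouping key changes.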
We were also charged with conducting formative evaluation studies
of two crucially important pilot projects, providing data about
implementation issues that were used to improve the rollout of major
statewide programs:
• During 2011-2012, teachers in 25 WV schools (12 counties)
participated in the pilot test of the new educator evaluation system. The first summary and cross analysis of data collected and shared with project leaders during the pilot was published midyear: West Virginia Revised Educator Evaluation System for Teachers 2011-2012: First Year Pilot Report (page 7).
• The West Virginia Universal Free Meals Pilot project provided a nutritious breakfast and lunch to all students, regardless of financial need, in 72 schools in seven counties during the 2011–2012 school year. This report, published in early 2013, examines the implementation and impacts of the pilot, including both the benefits realized and the challenges encountered and overcome: West Virginia Universal Free
Meals Pilot: Evaluation Report (page 8).
Outcome 3. Protection of human subjects who participate either directly or
indirectly in research and evaluation activities conducted by Department
staff or external researchers.
Our staff monitored 17 studies through the office's Institutional Review Board
(IRB) during 2013, including new applications, modifications, and the continuation
of studies. All IRB members were in compliance with all education and
procedural requirements. Additionally, Office of Research staff reviewed
and responded to nine external requests for student data, and were deeply involved in developing the Department’s
new data governance structures.
Outcome 4. Increased capacity of Department program staff, district staff, and school staff to engage in self-evaluative
activities.
In addition to the logic model activities described above, staff conducted a comprehensive review of the research
literature on effective practices in professional development (see Creating the Context and Employing Best Practices
for Teacher Professional Development: A Brief Review of Recent Research, page 4). Findings from the review were
presented to the state board’s High Quality Educator committee, and were used by research staff to develop a new
survey instrument to measure the quality of professional learning sessions being offered by WVDE staff, the Center
for Professional Development, institutions of higher education, and regional education service agencies across the
state. The instrument is also available to any district or school personnel who wish to use it in their own professional
development program evaluations.
Also in 2013, staff presented evaluation information at the Safe and Supportive Schools Summer Conference, the
Student Success Summit, and the KidStrong Conference.
Outcome 5. Research and evaluation needs of the Department and other clients fully met.
Last, but not least, some of our evaluation activities are ongoing and are required by federal or other funding,
such as those described next:
• Student responses to the WESTEST 2 Online Writing Assessment are scored by a computer-scoring engine. We conduct an annual scoring comparability study (see Findings from the 2012 West Virginia Online Writing Scoring Comparability Study, page 9) that compares scoring by trained human raters to scoring by the computer engine.
• The 4-year federal Safe and Supportive Schools (S3) program supports targeted interventions to improve
and measure conditions for learning at the high school level. Each year we conduct an evaluation targeted to different aspects of the program (see West Virginia Safe and Supportive Schools Project: Year 2 Implementation Evaluation Report, page 10).
• The Special Education Technology Integration Specialist (SETIS) program provides professional development for special education teachers to assist them in achieving proficiency with 21st Century Technology Tools. This study, The West Virginia Special Education Technology Integration Specialist
(SETIS) Program: 2011-2012 Evaluation Report (page 12), examines SETIS program implementation,
use, and impact across three key stakeholder groups: SETIS, teacher colleagues, and school
administrators.
Occasionally we are asked to evaluate short-term programs done in collaboration with other groups, such as
the SCALE Project, which focused on professional development and technical assistance provided by the WVDE
and the West Virginia Symphony Orchestra (WVSO). Teachers from 16 high-poverty elementary schools helped
students plan and implement an arts-based cross-curricular project, and prepared them to attend a theme-related
concert performed by the WVSO (see Evaluation of the Student-Centered Arts-Learning Environments (SCALE)
Project, 2013 Report, page 11).
All in all, it was a busy and productive year. At this writing, we are well into a number of projects that we believe will
provide relevant, timely, and useful information to WVDE staff, the state board, the legislature, and to educators
and citizens at large. Follow our progress on our website: http://wvde.state.wv.us/research/. And call us if you
would like help in developing your own logic model.
Brief Review of the Research Literature
Creating the Context and Employing Best Practices for Teacher Professional
Development: A Brief Review of Recent Research
Patricia Cahape Hammer, September 2013
A review of the research on teacher professional development identified an emerging consensus on important
contextual and implementation characteristics that can promote or inhibit teachers’ use of new knowledge and
skills in their classroom practice.
Findings. Teachers’ professional development does not happen in a vacuum and should not be a purely individual
pursuit. Research suggests that professional development is best viewed as one component in an overall system
that also requires alignment among tests, policy, and curriculum. Further, when curriculum for improving teaching
overlaps with curriculum and assessment for students, teaching practice and student learning are more likely to
improve. On the other hand, when policies and implementation do not meet these conditions—for example, by
introducing new assessments or curriculum without offering teachers adequate opportunities to learn them or by
offering professional development that is not well aligned—the chances for success are greatly reduced. Within
this context, research has shown that effective professional development tends to have the following elements:
Content and content pedagogy focus—This element includes both deepening teachers’ knowledge of the subject
matter they are teaching and the pedagogical approaches that have been shown to be successful in helping
students learn that subject matter. Effectiveness is improved if the professional development uses the curriculum
materials that teachers will later use with their students.
Coherence—This element involves providing professional
development experiences in a progression that builds on previous
experiences and aligns with school goals and with state standards, curriculum, and assessments. Coherent
professional development programs encourage continuing professional communication among teachers, either
in their own school or with others in the district who teach similar subject matter or students.
Active learning—Opportunities for active learning can include reviewing student work, practicing a new skill
and obtaining feedback, planning how new curriculum materials and new teaching methods will be used in the
classroom, and engaging in discussions and in written work.
Collective participation—Professional development that has collective participation of teachers from the same
school, department, or grade helps increase opportunities to discuss concepts, skills, and problems that arise
when teachers work to integrate what they have learned into their classroom practice. Over time, it can lead to
a professional culture in which teachers in a school or teachers who teach the same grade or subject develop
a common understanding of instructional goals, methods, problems, and solutions—an understanding that is
sustained over time, even when some teachers leave and others join the group.
Duration, including time span and contact hours—Depending on the complexity and difficulty of the knowledge
and skills teachers are learning, the number of contact hours may vary, but research suggests that at least 30
hours are needed to impact student achievement. Sustaining the experience over one or more school years is
also important, allowing more opportunity for teachers to try out new practices and benefit from additional
feedback and communication with trainers, coaches, or colleagues in professional learning communities in
their schools.
For more information, contact author Patricia Cahape Hammer, Office of Research (phammer@k12.wv.us), or
download the full review and bibliography from the WVDE Office of Research website at http://wvde.state.wv.us/
research/reports2013.html.
FORMATIVE AND INFORMATIVE
RESEARCH
Instructional Planning Time: A Review of
Existing Research and Educator Practice
During the 2012-2013 School Year
Nate Hixson, Amber D. Stohr, and Patricia Cahape
Hammer, November 2013
A study of instructional planning periods was
undertaken in late 2013 pursuant to West Virginia
State Code §18A-4-14, which states: “The state board
shall conduct a study on planning periods. The study
shall include, but not be limited to, the appropriate
length for planning periods at the various grade
levels and for the different types of class schedules.”
Method of study. A review of the research literature
and an educator survey were conducted to study
this issue. The educator survey was administered
between August 19 and September 30, 2013 to
a representative sample of 2,000 West Virginia
educators.
Findings. Research on the impact of individual
planning is limited; however the use of collaborative
planning has been associated with improved
student achievement, especially at the secondary
level. Currently, there is no definitive research-based recommendation regarding the amount of
instructional planning time needed to realize benefits
to students. Results of the survey revealed that
elementary educators had the lowest average daily
planning time of all programmatic levels (40 minutes)
followed by middle school (51 minutes) and high
school (60 minutes). Elementary educators, who have
an average of six daily preps compared with three
preps for middle and high school educators, have
considerably less time to plan per daily prep—about
9 minutes compared to more than 20 for middle and
high school educators. A high percentage of middle
school educators reported their schools use both
independent and team planning (71%), compared
with elementary and high schools. High school was
the only programmatic level where a vast majority
of individuals reported having only independent
planning time (74%). High school educators working
within a block schedule (over a third) reported, on
average, having approximately 40 more minutes of
in-school planning time available than educators
in traditional schedule high schools, even though
the average number of preps (about three) does
not vary significantly among traditional and block
schedule high schools. Despite large differences
in the amount of time available for planning each
day and per prep, there was almost no difference
in the amount of additional time (69 minutes)
educators reported spending planning outside of
school hours. Educators overwhelmingly indicated
that duties beyond instructional planning often usurp
their planning time. These duties include IEP and
SAT meetings, student interventions, administrative
tasks, providing coverage for other educators, and
a variety of other tasks. Some are central to effective
instruction, but many are solely preparatory in nature
or administrative. There is a sentiment that these
tasks greatly impact the amount of time reserved for
actual lesson planning. On average, West Virginia
educators believe they ideally need about 22 more
minutes of planning time at school daily to support
effective instruction.
Recommendations include: (a) maintain or increase
current levels of planning time; (b) advocate strongly
for the integration of collaborative planning as a
central feature of school practice, especially among
secondary schools; (c) provide support to district and
school leaders to build leaders’ capacity to prioritize
and protect collaborative time, organize collaborative
teams that are working in alignment with other school
and district goals, and establish expectations for
collaborative planning; (d) consider teacher role as
a factor in determining the amount of planning time
necessary; and (e) consider seeking additional input
from administrators and LEAs regarding this issue.
For more information, contact Amber Stohr, Office
of Research (astohr@k12.wv.us), or download the
full report at http://wvde.state.wv.us/research/
reports2013.html.
Improving School Discipline Data Collection and
Reporting: A Status Report for the 2012-2013 School Year
Andy Whisman, December 2013
A statewide analysis was conducted on school disciplinary incidents reported
during the 2012-2013 school year—the first full year under the revised Policy 4373.
Findings from the analysis are provided to help inform districts and schools about
what supports they may need to improve school climate, including more positive
approaches to student discipline.
Method of study. Using 2012-2013 data entered into the West Virginia Education
Information System (WVEIS), we conducted two sets of analyses—one focused
on discipline referrals (DRs) to examine the number, seriousness, and types of
behaviors and interventions used by schools; and a second addressed questions
about student subgroup representation in the discipline data.
Findings. The analysis used 220,656 discipline referrals entered into WVEIS, which
represents a rate of 786 discipline referrals per 1,000 students. Some schools
submitted no DRs, suggesting underreporting. About 45% of DRs were made for
students in high school, 39% for middle school, and 17% for elementary school.
About two thirds of DRs were for Level 1 minimally disruptive behaviors, followed
by 27% for Level 2 and 10% for Level 3 behaviors. Referrals for the most severe
and illegal behaviors (Level 4) were rare and accounted for less than 1% of all DRs.
In response to these DRs, about two thirds of interventions/consequences used
by schools were detentions, in-school suspensions, or out-of-school suspensions
(26%, 19%, and 17%, respectively). About a third of interventions/consequences
for Level 1 minimally disruptive behaviors were some type of detention. However,
nearly 27% were in-school suspensions or out-of-school suspensions. There also
were 12 expulsion-related actions associated with Level 1 behaviors, which
may be disproportionate to the behaviors involved. Most students (78%) were
absent from the discipline data (no DRs were made for them), while many other
students were referred for only a single offense. Students with multiple referrals,
however, accounted for 88% of all DRs. Many students were reported for five or
more offenses; the highest number for a single student was 71. Black students
and students with disabilities were present in the discipline data at rates higher
than their representation in the overall student population. Risk ratio calculations
indicate Black students to be about two or more times more likely to experience
suspensions—although this disparity is lower in West Virginia than for the majority
of other states. Students with disabilities also are at higher risk.
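The two summary statistics behind these findings, the referral rate per 1,000 students and the risk ratio, are straightforward to compute. A minimal sketch: the enrollment figure below is an assumption back-solved from the reported rate (it does not appear in the report), and the subgroup counts are invented purely for illustration:

```python
def rate_per_1000(referrals: int, enrollment: int) -> float:
    """Discipline referrals per 1,000 enrolled students."""
    return referrals / enrollment * 1000

def risk_ratio(group_events: int, group_n: int,
               others_events: int, others_n: int) -> float:
    """Risk of an outcome (e.g., suspension) for a subgroup,
    relative to the risk for all other students."""
    return (group_events / group_n) / (others_events / others_n)

# Reported referral count; enrollment (~280,700) is assumed, chosen only
# to reproduce the reported statewide rate of 786 per 1,000 students.
print(round(rate_per_1000(220_656, 280_732)))  # 786

# Invented subgroup counts: a ratio of about 2 means the subgroup is
# suspended at roughly twice the rate of everyone else.
print(round(risk_ratio(900, 10_000, 12_000, 270_000), 1))  # 2.0
```

A risk ratio near 1.0 indicates proportional representation; values well above 1.0 flag the kind of subgroup disparity the report recommends investigating further.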
Limitations of study. 2012–2013 was a transition year as West Virginia deployed
a new discipline management system. It is not clear what effect this transition had
on the completeness or accuracy of data summarized in this report.
Recommendations. Four recommendations are offered: (a) encourage diligence
in accurately reporting discipline behaviors as required by Policy 4373; (b)
provide training/technical assistance specific to positive discipline approaches
and alternatives to suspension; (c) build district and school staff capacity to
provide appropriate behavioral interventions in the context of the Support for
Personalized Learning, three-tiered framework;
and (d) further investigate subgroup disparities and
deliver professional development and technical
assistance specific to minimizing them.
For more information, contact coauthor, Andy
Whisman, Office of Research (swhisman@k12.
wv.us), or download the full report at http://wvde.
state.wv.us/research/reports2013.html.
West Virginia Revised Educator Evaluation System for Teachers 2011-2012:
First Year Pilot Report
Anduamlak Meharie and Nate Hixson, June 2013
During 2011-2012, teachers in 25 WV schools (12
counties) participated in the pilot test of the new
educator evaluation system. This is the first summary
and cross analysis of data collected and shared with
project leaders during the pilot.
Method of study. Data were collected in surveys and
focus groups conducted throughout the pilot year.
We also analyzed content from electronic documents
submitted by educators as required components of
the system.
Findings. At the pilot’s conclusion, the distribution
of summative ratings was as follows: emerging
(14.5%), accomplished (76.1%), and distinguished
(9.3%). (Unsatisfactory ratings were prohibited.) Some
components of the system were well implemented,
especially some aspects of student learning goals
(collaboration, rigor, and comparability). Yet, fewer
than the required number of classroom observations
took place, nearly half of teachers did not complete
their student learning goals on time, and of those
who did, many did not include the required two data
points in time. Given that administrators reviewed and
approved these goals, evidence suggests administrators need
more training. Surveys indicate that the revised system
led to greater understanding of the WV professional
teaching standards, the process of setting student
learning goals, and identifying ways to achieve them;
and to increasing the use of effective instructional
strategies. Still, educators reported the revised system
required too much time, had too many technology-related issues, and lacked needed access from home.
Notably, although the majority of teachers indicated
that particular parts of the system had a positive impact
on them, a smaller proportion indicated the system
as a whole had a positive impact on them overall as
educators. Preliminary evidence suggests at least two
factors were measured by the new system: inputs
(related to Standards 1-5) and outputs (the student
learning goals portion of Standard 6). Correlation
data indicate that the input measures are clearly and
strongly related to one another and to a lesser extent
to some of the output measures. Teachers gave high
marks to the quality of training, but less than two
thirds thought they received beneficial feedback from
administrators or that the system was implemented well
in their schools, indicating the need for more training
to develop administrators’ capacity to implement the
system.
Limitations of study. This study involved primarily
historically low-performing schools, and a small
number of volunteers; consequently the findings are
not generalizable and should not be used for making
summative judgments.
Recommendations include: (a) provide ongoing training
and support to administrators and teachers (with a
stronger focus on student goal setting), and provide all
educators access to the system outside of school; (b)
comprehensively monitor implementation at schools,
relationships among professional teaching standards
within various groups of schools, and range-of-effectiveness ratings; (c) develop a classroom-level
measure of student growth; and (d) establish a technical
advisory committee, streamline the evidence form, and
establish a protocol for managing the revision of student
learning goals.
For more information, download the full report or executive summary at
http://wvde.state.wv.us/research/reports2013.html.

West Virginia Universal Free Meals Pilot: Evaluation Report
Anduamlak Meharie, Andy Whisman, Nate Hixson, Patricia Cahape Hammer,
Yetty A. Shobo, Amber Stohr, January 2013

The West Virginia Universal Free Meals Pilot project provided a nutritious
breakfast and lunch to all students, regardless of financial need, in 72 schools
in seven counties during the 2011–2012 school year. This report examines the
implementation and impacts of the pilot, including both the benefits realized
and the challenges encountered and overcome.

Method of study. The report draws on information from surveys, individual and
focus group interviews, extant data sources, and WESTEST 2 results.

Findings. At the conclusion of the pilot project’s first year, analysis of
WESTEST 2 data revealed no major differences in student achievement—an
unsurprising finding given the brief duration of the project. However, schools
reported having healthier students, more nutritious food, and more food options.
Stakeholders also reported that the overall environment of the schools improved,
and behavior problems decreased, while students’ excused and unexcused absence
records indicate that attendance rates in high schools leveled off rather than
continuing to decline. Teacher-student relationships reportedly improved in
elementary schools that implemented breakfast-in-the-classroom. According to
most teachers, students also exhibited better concentration, higher levels of
energy, and a more active engagement in the classroom. Research suggests that
achievement gains may be expected in the future as a result of these
improvements. Major stakeholder concerns included financing the program;
inadequate kitchen equipment and cooking staff to produce more school-made
meals; finding strategies to prevent loss of time for classroom instruction;
insufficient time between breakfast and lunch; student wait time for meals and
lack of time to eat; and food waste. For the most part, these issues became
less of a concern by the end of the year, and the overall sentiment toward the
program remained very high. The overwhelming majority of stakeholders reported
that they wished to continue implementation of the program despite any
challenges they encountered.
Recommendations. The program should be expanded
and a longitudinal study of these 72 schools should
coincide to analyze long-term impacts. To alleviate
financial concerns, the WVDE should continue to aid
districts in obtaining funding. Time should be allocated
for key stakeholders to meet and exchange information
about successful strategies. Districts initiating the
program must provide schools with adequate time to
make arrangements to avoid shortages in staffing,
kitchen equipment, and supplies. Counties and schools
should involve all relevant stakeholders in the decision
making process, especially regarding the choice of
appropriate breakfast strategies, scheduling, and type
and quality of meals. Districts must allow schools the
freedom to explore strategies that fit their needs (e.g.
grab-and-go breakfast versus breakfast-after-first).
Staff and students must be well informed that participation
in school meals is voluntary and that, while calories are limited
per meal, students may have as many fruits and vegetables as they want. Proper
monitoring of the program will help ensure informed
decision making regarding meal schedules, lunch lines,
and food distribution, to help alleviate concerns about
student hunger and food waste.
For more information, contact Andy Whisman, Office of
Research (swhisman@k12.wv.us), or download the full
report from the Office of Research website at http://
wvde.state.wv.us/research/.
EVALUATION STUDIES
Findings from the 2012 West
Virginia Online Writing Scoring
Comparability Study
Nate Hixson and Vaughn Rhudy, September 2013
Student responses to the WESTEST 2 Online
Writing Assessment are scored by a computer-scoring engine. The scoring method is not widely
understood among educators, and there exists a
misperception that it is not comparable to hand
scoring. To address these issues, the West Virginia
Department of Education (WVDE) conducts an
annual scoring comparability study that compares
scoring by trained human raters to scoring by the
computer engine.
Method of study. This year, 45 educators from
West Virginia participated in the study. Each
scored a set of training essays and operational
student essays that also were scored by the
scoring engine. Each operational essay was
scored independently by two human raters.
Human raters’ scores were compared to each
other and to the engine. Two research questions
were posed: (RQ1) What is the level of calibration to
the automated scoring engine achieved among human
raters as a result of the training provided by the
WVDE? and (RQ2) What is the comparability of scores
assigned by human rater pairs as well as between
human-to-engine pairs?

This year, 45 educators from West
Virginia each scored a set of training
essays and operational student essays
that also were scored by the scoring
engine. Human rater pairs tended
to provide the most consistent scores.
However, in many cases we found
that human raters were more likely to
agree with the engine's scores than
with each other's.
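Comparability analyses of this kind rest on pairwise agreement between scorers. As a rough sketch only, using hypothetical scores (the study's actual calibration criteria and statistics are not reproduced here), exact and adjacent agreement between any two scorers, human or engine, can be computed as follows:

```python
def agreement_rates(scores_a, scores_b):
    """Return (exact, adjacent) agreement between two score lists.

    Exact agreement: identical scores on an essay.
    Adjacent agreement: scores within one point of each other.
    Illustrative sketch only; not the WVDE study's actual metrics.
    """
    if len(scores_a) != len(scores_b):
        raise ValueError("score lists must be the same length")
    n = len(scores_a)
    exact = sum(a == b for a, b in zip(scores_a, scores_b))
    adjacent = sum(abs(a - b) <= 1 for a, b in zip(scores_a, scores_b))
    return exact / n, adjacent / n

# Hypothetical data: one human rater's scores vs. the engine's scores
human = [4, 3, 5, 2, 4, 3]
engine = [4, 3, 4, 2, 5, 3]
exact_rate, adjacent_rate = agreement_rates(human, engine)
```

The same function applies to human-human pairs and human-engine pairs, which is what allows the two kinds of pairing to be compared on equal footing.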
Findings. Approximately 58% of human raters met
three industry-standard calibration criteria; the
remainder did not. Human rater pairs tended to
provide the most consistent scores. However, in many
cases we found that human raters were more likely
to agree with the engine's scores than with each
other's. When disagreements did occur, though,
human raters consistently scored student essays
slightly higher than the engine. We believe this
outcome should help mitigate concerns that the
engine scores student essays wildly differently from
regular classroom educators, or that the engine
scores essays too forgivingly.

Limitations of study. We do not draw definitive
conclusions about the consistency of the engine
from the results of this study, because so few raters
met rigorous standards for calibration. However, we
note that the test vendor has provided considerable
evidence to establish the comparability of the scoring
process, based upon studies that use only human
raters judged to be experts according to industry-standard
criteria.

Recommendations. Continue to use the annual
comparability study as a professional development
experience for educators, and collect additional data
about educators' perceptions of the accuracy and
fairness of scores assigned by the engine.

For more information, contact Vaughn Rhudy, Office
of Assessment and Accountability (vrhudy@k12.wv.us),
or download the full report on the Office of Research
website (http://wvde.state.wv.us/research/reports2013.html).

West Virginia Safe and Supportive
Schools Project: Year 2 Implementation
Evaluation Report
Andy Whisman, June 2013

The 4-year federal Safe and Supportive Schools (S3)
program supports targeted interventions to improve
and measure conditions for learning at the high
school level. For 2011-2012 (Year 2), two evaluation
questions were investigated: (EQ1) To what extent do
participating schools implement the program with
fidelity relative to the WV Model for Positive School
Climate (WVMPSC)? and (EQ2) To what extent do
program initiatives improve school climate and
culture?

Method of study. To assess implementation fidelity,
we developed 4-point rubrics for each WVMPSC core
activity, which were used in school-level assessments
by school climate specialists (SCSs) and S3 school
teams. To assess school climate improvement, we
compared overall school climate index scores for S3
schools between the 2010-11 and 2011-12 school
years to assess change over time, and in the WV
School Climate Survey asked students and staff to
indicate whether 22 items corresponding to the
school climate index had changed compared to the
previous year.

Findings. Regarding fidelity of implementation,
improvements were made across most core activities,
aligned with all stages of implementation, moving
from being altogether missing or implemented with
weak fidelity in 2011 to being implemented at weak
to moderate fidelity in 2012. Both school-based S3
teams and SCSs indicated marked improvements
relative to the strategic steps of the WVMPSC over
the 2 years. SCSs tended to be more guarded in
their assessments, however. For some core activities
they indicated schools' implementation fidelity to be
at lower levels than the school S3 teams rated
themselves, including (a) informing parents and
community partners about the S3 initiative and
securing their commitment; (b) building understanding
of S3 behavioral norms among school staff; and
(c) using assessment results to identify factors
contributing to school climate problems, set priorities
or plan activities, and select appropriate interventions.
Regarding impacts of the S3 program on school
climate and culture, S3 intervention schools showed
significant improvement, with medium to large effect
sizes in school climate as measured by the WV School
Climate Index. Based on survey data, however, there
appears to be a fairly wide gulf between students
and staff, with students much more likely to report
that conditions stayed about the same, whereas staff
were much more likely to report that conditions had
gotten better.

Both school-based S3 teams and school
climate specialists (SCSs) indicated
marked improvements relative to the
strategic steps of the WV Model for
Positive School Climate (WVMPSC) over
the 2 years. . . . SCSs tended to be more
guarded in their assessments, however.

Limitations of study. It will not be possible to determine
whether the improvements in the Index observed in this
year's study are genuine until data are collected for the
full 4 years (including a comparison group of
nonintervention schools).

Recommendations. Schools should (a) establish or refine
behavior norms and expectations to be brief, positively
stated, and inclusive of students and staff; (b) expand
approaches for communicating and teaching behavior
norms and expectations; (c) select and implement school
climate interventions based on thorough assessments
of factors leading to school climate problems; and
(d) investigate gaps between students' and staff's
perceptions of school climate improvements, to identify
factors driving the perceptions of both groups.

S3 intervention schools showed significant
improvement, with medium to large effect
sizes in school climate as measured by the
WV School Climate Index.

For more information, contact Andy Whisman, Office
of Research (swhisman@k12.wv.us), or download
the full report at http://wvde.state.wv.us/research/
reports2013.

Evaluation of the Student-Centered
Arts-Learning Environments (SCALE)
Project, 2013 Report
Patricia Cahape Hammer and Nate Hixson, June
2013

The SCALE Project focused on professional
development and technical assistance (PD/TA)
provided by the West Virginia Department of
Education (WVDE) and the West Virginia Symphony
Orchestra (WVSO) that enabled teachers to integrate
arts into other curricular areas through cross-discipline
collaboration. Teachers from 16 high-poverty
elementary schools helped students plan and
implement an arts-based cross-curricular project,
and prepared them to attend a theme-related
concert performed by the WVSO.

Method of study. We conducted surveys of teachers
and PD/TA providers to collect data about the quality
of the PD/TA provided; the fidelity of implementation
at each of the schools; and, in a pretest/posttest
survey, changes in student engagement, school
climate/culture, and improvements in lesson design.

Findings. PD was well attended and received
remarkably high overall ratings. In most schools,
the SCALE Project was well implemented. Overall,
schools saw the greatest level of implementation for
student engagement in the arts and the lowest level
for improving lesson design. Implementation was
far from even, however. For schools with lower levels
of implementation, the most common challenges
included (a) forming a team; (b) holding regular
team meetings; (c) providing PD for school staff in
arts integration; (d) involving content areas other
than the arts; and/or (e) involving only some, not
all, of their classrooms. Across all schools, small
improvements were noted for (a) students staying
on task, and (b) student motivation. Looking only
at schools new to the program, we found small
improvements in student behavioral and cognitive
engagement—especially, higher levels of students
staying on task and believing they were learning in
their classes—as well as increases in collaboration
with community members and more use of dance/
movement strategies in lesson planning. Teachers
from high-implementation schools reported higher
overall behavioral and cognitive engagement
among students—especially students (a) staying on
task, (b) preferring more challenging assignments,
and (c) following instructions—as well as less
integration of creative writing instructional strategies
and more use of dance/movement instructional strategies.
Non-arts teachers also reported more use of dance/
movement strategies, and more collaboration with
arts teachers.
Teachers from high-implementation
schools reported higher overall
behavioral and cognitive engagement
among students— especially students
(a) staying on task, (b) preferring
more challenging assignments, and
(c) following instructions.
Limitations of study. All data are self-reported and
thus subject to various threats to validity, such as
social desirability bias (when respondents provide
overly positive responses to a survey or questionnaire
due to their desire to be viewed favorably) or
nonresponse bias (when respondents who elect not
to participate in a survey differ in a meaningful way
from those who do).
Looking only at schools new to
the program, we found small
improvements in student behavioral
and cognitive engagement—
especially, higher levels of students
staying on task and believing they
were learning in their classes.
Recommendations. Based on these findings we
recommend the following: (a) continuing this project;
(b) working to sustain initial excitement so that
schools with previous experience can continue to
realize benefits; (c) encouraging and supporting full
implementation of all components of the program;
(d) making sure participating schools build in sufficient
common planning time to support the necessary
collaboration; and (e) developing strategies to ensure
that once the school project concludes, the faculty
does not return to business as usual.
For more information, contact Patricia C.
Hammer, Office of Research (phammer@k12.
wv.us), or download the full report at http://
wvde.state.wv.us/research/reports2013/
EvaluationoftheSCALEProject2013Report.pdf
The West Virginia Special Education
Technology Integration Specialist
(SETIS) Program: 2011-2012
Evaluation Report
Amber Stohr, March 2013
The Special Education Technology Integration
Specialist (SETIS) program provides professional
development for special education teachers to assist
them in achieving proficiency with 21st Century
Technology Tools. In 2011–2012, its 7th year, the
program trained 16 special educators as models,
coaches, and mentors of technology integration at
schools and within classrooms. This study examines
SETIS program implementation, use, and impact
across three key stakeholder groups: SETISs, teacher
colleagues, and school administrators.
Method of study. SETIS candidates were surveyed once,
using a retrospective pre-post survey administered at
the conclusion of the school year. Teacher colleagues
and school administrators, identified and invited
by SETIS due to their close working relationships,
participated in pre-post surveys administered at the
beginning and ending of the school year.
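Retrospective pre-post designs like this one yield paired ratings whose change is typically summarized with mean differences and an effect size. As an illustrative sketch only, with hypothetical data and one common formulation of Cohen's d for paired scores (not necessarily the exact procedure used in the SETIS evaluation):

```python
from statistics import mean, stdev

def cohens_d_paired(pre, post):
    """Cohen's d for paired (pre/post) ratings: the mean of the
    individual gains divided by the standard deviation of the gains.

    Illustrative sketch only; the SETIS evaluation's actual
    statistical procedures are not reproduced here.
    """
    diffs = [b - a for a, b in zip(pre, post)]
    return mean(diffs) / stdev(diffs)

# Hypothetical retrospective pre/post self-ratings on a 5-point scale
pre_ratings = [2, 3, 2, 1, 3, 2]
post_ratings = [4, 4, 3, 3, 5, 4]
d = cohens_d_paired(pre_ratings, post_ratings)
```

By the conventional benchmarks, d values around 0.8 or above are read as large effects, which is the sense in which "large to very large effect sizes" is used in the findings below.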
Findings. The program is successfully equipping SETIS
with the capacity needed to implement technology
integration in schools and classrooms, as evidenced
by significant differences in mean scores and large to
very large effect sizes in the SETIS retrospective
pre- and postprogram self-ratings. Teachers indicated
SETIS activities led to increases in coteaching among
teachers and SETIS, improved technology integration
in classrooms, raised technology knowledge among
teachers, and enhanced student experiences.
School administrators reported greater student
engagement as a result of integrating technology
into their classwork. Teacher colleagues and school
administrators reported leveraging SETISs’ skills and
resources in the ways they anticipated. SETISs named
administrative support as the most common factor in
facilitating meaningful collaboration with teachers.
Program barriers were perceived by SETISs and
school administrators as moderate. SETISs reported
a lack of time as their largest barrier; computer
access for students and internet speed were also
primary concerns. Survey results revealed 25% of the
participating administrators were not aware a SETIS
would be present in their schools at the beginning of
the school year.
Teachers reported increases in
coteaching experience with SETISs,
improved technology integration in
classrooms, increased technology
knowledge, and enhanced student
experiences.
Limitations of study. Relying upon self-reported
information carries the risk of response bias. Among
teachers and administrators, small sample sizes and
the inability to track response rates or match
pre- and post-survey results were also limitations.
Recommendations. With the capacity to train 25 SETISs
per year and increasing technological demands in
classrooms, program staff are urged to recruit more
SETIS candidates. Other recommendations include
encouraging SETIS candidates to conduct more
staff development at their schools; providing SETISs
expanded opportunities to work together in
face-to-face settings, to help them more effectively
implement technology integration within the
specialized content of special education; improving
communication at all program levels to ensure
greater awareness of SETISs' presence in schools
and the optimal use of their skills and resources;
promoting scheduling that allows teachers and
SETISs time to cocreate technology-integrated lesson
plans; and incorporating mechanisms in future
evaluations that will allow for tracking and matching
of teacher and administrator responses in pre- and
postprogram surveys.
School administrators observed
increased student engagement as a
result of integrating technology into
their classwork.
For more information, contact Amber Stohr, Office
of Research (astohr@k12.wv.us), or download
the full report from the WVDE Office of Research
website at http://wvde.state.wv.us/research/
reports2013.html.
Our Staff
Our expert staff is trained and experienced in state-of-the-art qualitative and quantitative social
science and assessment methodologies. In 2013, our staff included the following:
Nate Hixson, M.A., Assistant Director
Patricia Cahape Hammer, M.A., Coordinator, Research Writer
Jason E. Perdue, M.A., Technical and Online Assessment Coordinator
Amber D. Stohr, M.A., Coordinator, Research and Evaluation
Steven A. (Andy) Whisman, Ph.D., Coordinator, Research and Evaluation
We were supported by three staff associates:
Jennifer Kozak, Secretary II
Cathy Moles, Secretary III
Kristina Smith, Secretary II
Milestones
The Office of Research joined the Office of
Assessment and Accountability, under the
leadership of executive director Juan D’Brot,
M.A. The reconfigured Office of Assessment,
Accountability, and Research moved into newly
renovated offices on the eighth floor of Building 6
on the state capitol campus.
Monica Beane, Ph.D., former assistant director
in the Office of Research, was appointed executive
director of the Office of Professional Preparation.
Anduamlak (Andu) Meharie, Ph.D., left the
West Virginia Department of Education to take
a position in the Dallas Independent School District’s
Department of Evaluation and Accountability, allowing
him, his wife, and children to join their extended families
in the Dallas area.
Jennifer Kozak joined the Office of Research to take a
position as Secretary II, in which capacity she performs a
variety of data entry and other technical tasks.
Enhancing Learning. For Now. For the Future.