FINAL REPORT – Lewis, Pearson, Hill

SECTION 1
The Challenge of Data-Driven Decision-Making in Metro Schools
Metropolitan Nashville Public Schools (MNPS) has communicated to the authors of this
capstone project a belief that principals have insufficient and variable capacity to use
data, with problematic implications for the district’s reliance on principals to train
teachers to use data. Given this perceived gap in principals’ capacity and the district’s
reliance on principals to train teachers, MNPS is left with uncertainty about the capacity
and quality of data use among all of its educators. Having identified this dilemma, MNPS
charged our capstone team with focusing on what it deems the foundational issue –
principals’ capacity to use data and to develop the same capacity among their teachers.
Specifically, they have asked us to: (1) evaluate the scope of the principal data use
problem, (2) assess how well the district’s data training strategies are aligned with data
use needs, and (3) develop recommendations for improved data training strategies.
In order to address these issues, our team has developed a research design to answer the
following questions of interest to MNPS:

● How many principals understand how to analyze, interpret, and use achievement data?
● What is the level of understanding and use among principals?
● To what extent do principals understand the various types and uses of achievement assessments?
● What is the level and quality of principals’ communication to teachers regarding achievement data and how to use that data to drive instruction?
● How can MNPS improve the training strategies that they already employ, and what additional training strategies are needed?
This report seeks to illuminate the nature of data-driven decision-making (DDDM) in
MNPS, assess the effectiveness of the district’s training strategies, and make
recommendations for improved data use. In doing so, the following topics will be
discussed. First, the importance of quality DDDM practices in schools will be placed in
the context of current literature. Then, the contextual surroundings of MNPS will be
examined. An explanation of our study’s methodology will be followed by a presentation
of the study’s key findings. Then, we will make recommendations for improved DDDM
as well as suggest specific action steps to assist in implementing these recommendations.
Finally, a plan for evaluating the district’s continued efforts to improve DDDM
will be followed by a concluding discussion of noteworthy issues.
What we know about DDDM
National Landscape of DDDM
The research questions proposed by MNPS arise out of a belief among district
administrators that DDDM holds the potential to positively impact student achievement.
This belief is grounded in a body of literature that suggests that the relatively recent
emphasis on using data to inform instruction, arising out of No Child Left Behind, does
hold powerful potential for improving learning and teaching. For instance, the U.S.
Department of Education asserts that:
Effective teachers use data to make informed decisions and are constantly improving
classroom practice to better serve their students. One of the most important aspects of
good teaching, as most teachers know, is the ability to discern which students are learning
and which are not, and then to tailor instruction to meet individual learning needs (No
Child Left Behind: A Toolkit for Teachers, p. 45).
With No Child Left Behind, schools are required to regularly assess students in core
academic areas and to disaggregate test results by race, gender, socioeconomic level, and
disability. Thus, schools are faced with an unprecedented amount of student achievement
data. The production of this student achievement data and its careful disaggregation hold
the potential to help educators improve learning opportunities for students. However, by
itself, a careful report of student achievement results will do little to improve future
instructional practices. According to Marsh et al. (2006), such raw data must be
organized in light of an understanding of its surrounding context and will only in turn
become “actionable knowledge” when educators “synthesize the information, apply their
judgment to prioritize it, and weigh the relative merits of possible solutions. At this
point, actionable knowledge can inform different types of decisions.” (p. 3)
Reasons for Using Data
Following this line of thinking, the questions before us are: To what extent is the student
achievement data available to MNPS principals “actionable”? And if it is, to what extent
is this potentially useful data actually being used to guide decision-making? Again, the
increased volume of student achievement data that comes with increased testing will not
improve learning and teaching by itself. To improve learning opportunities for students,
educators must develop a fluent understanding of this data and actively apply it.
Supovitz & Klein (2003) provide a comprehensive description of ways teachers and
administrators can and should use student achievement data:
●Informing instruction. Standards-based reform offers hope for improved student
achievement only if teachers make informed, wise decisions regarding what material
should be taught to which students and in what manner. If teachers continue to simply
move students uniformly through a selected textbook, then the potential power of a rich
supply of student achievement data is lost. Effective teachers use data for determining
lesson objectives, sorting students into flexible ability groups, and aligning lessons with
standards.
●Identifying low-performing students. Schools that use data effectively identify
and discuss students who are in the most need of remedial instruction. After determining
which students are significantly behind, data is analyzed in order to determine which
skills need the most immediate attention. Finally, teachers make individualized
instructional plans for students that outline action steps and strategies that will address
specific student needs.
●Planning for professional development. Schools should use data to determine
which areas and skills their students have not sufficiently mastered. In areas where
school-wide deficiencies are noted, then professional development opportunities may be
offered for the entire staff. Similarly, individual student and teacher weaknesses can be
identified and addressed with teacher-specific professional development opportunities.
●Goal setting. Effective principals communicate goals to their staffs in terms of
measurable student achievement data. Such measurable data provide teachers with “clear
measures of improvement and progress,” as measured by both ambitious annual goals
and intermediate goals that can show relative progress towards the long-term goal (p. 22).
●Celebrating accomplishments. After goals are set, progress towards these goals
should be celebrated with teachers and students. Celebrations may be in the form of
large, whole-school recognition of performance on “high stakes” achievement tests or
simply teacher recognition for student growth on regular, ongoing individual
assessments. In both cases, the purpose of recognizing growth in student achievement is
to encourage teachers and students.
●Visual means of focusing priorities. Prominently displaying student
performance data in the school provides important symbols regarding the value placed by
the school on student achievement.
●Supporting evidence for parents. Student data should be readily available for
parents. A rich variety of student data—from standardized test scores to examples of
student work—is necessary for parents to understand the quality of work that their
children have produced and are capable of producing.
The Promise of Effective DDDM
Clearly, some schools are more effective than others in using student achievement data. A
few findings on characteristics of effective schools stand out in relation to the research
questions that MNPS has charged us to answer. First, effective schools have a “clearly
stated purpose and well-planned data collection and analysis” (Keeney, 1998, p. 14).
While the authors of this study found that it can be effective for schools to simply dive
into data and spend hours “swimming around in it,” they determined that it is more
productive to have a clear purpose for the data analysis, focused on student learning, at
the outset. Concurrent with this focus should be school-wide support for analysis of
student achievement data. Armstrong and Anthes (2001) found that principals must “lead
and support the use of data within the school” and expect every teacher to regularly use
the data. In order for teachers to successfully meet such an expectation, they must be
provided with significant professional development time to build the competencies
needed to understand and use the data, and they must be provided the time needed to
thoroughly examine, understand, and apply student data.
Today, there is much hope that DDDM can be used by schools in order to locate specific
student needs and to tailor instruction to match these identified needs. However, some
educators harbor a hesitant attitude toward data use, even though they are the ones
whose practice can be most informed by it. Teachers can be slow to embrace data. In
fact, a number of teachers actually view data as “the enemy,” seeing a push to collect
more and more varied data as part of a desire to document their faults. In this view,
students who are low performing are understood to be taught by incompetent teachers,
while high performing students are thought to be capable of success independent of
teacher efforts. Coupled with this suspicious view of data is the feeling that the
requirement to “use” data is a time-consuming burden for teachers, rather than an
opportunity to understand and improve student outcomes (Doyle, 2003).
However, such a wary attitude toward DDDM can be overcome. Marsh et al. (2006)
identify multiple factors associated with the effective application of data for improved
instruction. Specifically, in order for data to be used by educators, it must be of high
quality and easily accessible. Similarly, data should be received in a timely manner. In
order to ensure productive use of data, educators must be provided the necessary support
to build their capacity to effectively use the data. And finally, educators must be
motivated to use the data.
Student achievement data serves two linked purposes. First, assessment results
communicate an absolute level of student performance at a given point in time. Test
results most obviously inform educators of the performance levels of students, subgroups
of students, schools, or districts. It is at this point that many educators seem to recoil at
student achievement data, presumably because many students are not consistently
performing at desired levels. A short review of the literature suggests that the use of data
must not stop here. Second, rather than only showing absolute levels of student
performance, an effective understanding and application of student data frequently
informs instructional decision-making during the course of a school year, and ultimately,
raises student achievement.
Metropolitan Nashville Public Schools has expressed a belief that its efforts to encourage
principals to apply student achievement data for the purposes of improving students’
learning opportunities have not been fully effective. In the next section, we will examine
the situational context of MNPS in order that the results of our investigation might reflect
the contextual realities faced by this district.
SECTION 2
Contextual Analysis of Metropolitan Nashville Public Schools
Metropolitan Nashville Public Schools (MNPS) recognizes that quality principal
leadership is an essential component for school success. Principals need the knowledge,
resources, and motivation to guide teaching and learning within their schools.
Specifically, as standards-based accountability continues to be a prominent presence in
public schools, principals’ ability to understand, analyze, and use achievement data to
guide instruction is of utmost importance. Within this context, MNPS believes that their
principals’ capacity to use data is paramount for improving teachers’ capacity to use data,
and ultimately, for the success of their students. Consequently, the district is alarmed by
its perception that principals have insufficient and variable knowledge about, interest in,
and ability to use achievement data for decision-making. This perceived dilemma does
not operate in a vacuum; rather, it is accompanied and compounded by the district
landscape and trends that characterize MNPS. To fully grasp the dynamics of principal
data use, therefore, it is essential to understand the context within which the dilemma
exists.
Overview of the MNPS Context
The MNPS Landscape
MNPS is a large urban district in middle Tennessee; in fact, it is the 49th largest school
district in the nation. As of the 2006-07 school year, it comprises 133 schools, 133
principals, 9,800 certified and support staff, and over 74,000 students. Over the past five
school years – from 2001-02 to 2006-07 – the district’s pre-kindergarten through twelfth
grade enrollment has increased from 69,700 to 74,155 students (MNPS, 2006-2007 Facts,
2007).
Beyond its size, the district is marked by the vast diversity of its students and educational
offerings. MNPS offers an array of school types, including neighborhood schools,
enhanced option schools, charter schools, alternative education and adult education
programs, special education as well as gifted and talented schools, and professional
development schools. The district’s student population has a higher percentage of
minority ethnicities than white students – 48.3% black, 35.3% white, 13% Hispanic, 3%
Asian, and less than 1% Indian and Pacific Islander. Additionally, the student population
represents 84 different countries and speaks 77 different languages; the district – one of
136 in Tennessee – serves nearly one-third of the state’s English Language Learner
population (MNPS, 2006-2007 Facts, 2007). Compounding the ethnic diversity
so common to the district is the diversity of students within schools during the course of a
single school year. MNPS, like many urban districts, has a high rate of transient students,
especially among its low-SES and immigrant populations.
Workforce Trends
MNPS employs a vast number of educators and experiences a phenomenon common to
many large urban settings – educator mobility within its schools. The district employs
over 9,800 certified and support staff, making it the second largest employer in Davidson
County, Tennessee and the fourth largest employer in the state. Of those staff, there are
over 5,700 certified staff including: 4,915 teachers, 241 principals and assistant
principals, 66 certified district officials, 74 reading specialists, and a number of guidance
counselors, librarians, social workers, and psychologists (MNPS, 2006-2007 Facts,
2007).
MNPS not only employs a large number of educators, but also deals with the reality of
principal and teacher mobility.1 However, the amount of mobility is less noteworthy than
the rate at which new educators annually enter the district. The Statistical Report for the
State of Tennessee (2003) provides an overview of this workforce trend within MNPS
over a five-year period (1996-97 until 2000-01). Although this timeframe precedes the
arrival of the district’s current administration, it does highlight the realities of educator
mobility and the rate of new educators entering the district each school year.
Table 1: MNPS Dynamics of the Workforce

                              1996-97   1997-98   1998-99   1999-2000   2000-01
Total Workforce               100%      100%      100%      100%        100%
Retained Educators            90.6%     86.8%     88.8%     87.6%       88.3%
Newly Hired                   9.4%      13.2%     11.2%     12.4%       11.7%
  Transfers In                7.2%      8.2%      5.4%      6.8%        7.2%
  Reentrants                  46.2%     37.4%     37.0%     40.6%       36.5%
  First-time Educators        46.6%     54.5%     57.7%     52.5%       56.3%
Total Exiting Next Year       8.8%      8.4%      10.3%     12.2%       --
  Transfers Out               10.0%     11.9%     12.0%     12.5%       --
  Leaving TN Education        90.0%     88.1%     88.0%     87.5%       --
District Attrition            8.8%      8.4%      10.3%     12.2%       --
Transfers In/Transfers Out    0.76      1.08      0.48      0.56        --

Data from Statistical Report for the State of Tennessee (2003), page 137.
As detailed in Table 1 (MNPS Dynamics of the Workforce), MNPS does retain the vast
majority of its educators from one school year to the next. However, of those who are
newly hired, over 90% are either new to the teaching profession or have not been in the
workforce for at least one full school year. Of those exiting the district, approximately
90% are leaving the field of education in the state of Tennessee altogether. Finally, the
district’s transfer-in to transfer-out ratio is below one in three of the four years reported
and trends downward over time. This indicates that more educators are transferring out
of MNPS to other districts in the state than are transferring in from them; that is, the
district is running a persistent deficit of recently experienced teachers.
Educator mobility is not only a district phenomenon, but a school one as well. Surveys
conducted for this project asked principals and teachers to provide the overall number of
years for which they had been a principal or teacher as well as the number of years they
1 We were unable to get principal and teacher turnover/mobility data from the district
office. Therefore, the above proxies – while not as updated and direct as current district
data might be – were compiled to provide an indication of those trends.
have been in their position at their current school. This was an effort to gather another
proxy measure of mobility within schools. Table 2 details these findings:
Table 2: Years in Position v. Years at School

             Avg. Years in Position   Avg. Years at Current School
Principals   5.0                      3.1
Teachers     12.2                     5.4

Average years are the mean score for each variable.
As the data portray, both principals and teachers have longer careers in education than
tenures at their current schools. While not exhaustive, these data point to a trend in which
educators move between schools more often than they leave the profession. This trend is
coupled with reports from district officials (2006, 2007) that the district has experienced
a great deal of school leadership instability over the last five years, during which time
every principal has been replaced or transferred. These indicators speak to the
intra-school staff instability common to MNPS.
Achievement Trends
Over the past five years, MNPS has made noteworthy strides in student achievement, but
not all indicators are encouraging. Some of the more promising trends include the rise in
district reading scores, graduation rate, and high school preparation (MNPS, MNPS
Accountability Results, 2007).
● At the culmination of the 2005-06 school year, 89% of MNPS third-graders were at or above grade level, up from 49% in the 2000-01 school year.
● The high school graduation rate has increased by 10.5 percentage points over the past three school years, including a notable 12.4 percentage point increase for the district’s black student population.
● Similarly, the number of special education students earning regular education diplomas has grown by 21.2 percentage points since 2002-03.
● More middle school students are entering high school with high school credits. In fact, over 5,000 entering high school students had up to four credits in the 2006-07 school year, a 75% increase from the 2,900 middle school students who held similar credits in 2001-02.
As alluded to earlier, these encouraging trends are tempered by noteworthy obstacles to
student achievement. MNPS did not meet benchmarks for district-wide Adequate Yearly
Progress in the 2005-06 school year, landing the district in School Improvement 2
status. The district has missed AYP for three consecutive school years and is facing
mounting sanctions as a result. Even though 85 of the district’s 133 schools did meet
requirements for AYP, the district did not achieve all of the necessary aggregate
benchmarks. More specifically, MNPS did not meet benchmarks for the following
categories (MNPS, MNPS Accountability Results, 2007):
● Mathematics performance among African-American, economically disadvantaged, special education, and ELL students
● Reading performance among African-American, Hispanic, economically disadvantaged, special education, and ELL students
Nor did the district meet other state benchmarks for non-academic indicators: the district
dropout rate of 19% surpassed the state goal of 10%; its high school attendance rate of
89.3% fell short of the state goal of 93%; and its graduation rate of 68.3% failed to reach
the state benchmark of 90% (TDOE, Report Card 2006, 2007).
Revisiting the Dilemma
As stated previously, the perceived dilemma of principal data use is situated within the
district landscape and trends that characterize MNPS. The confluence of phenomena –
including the diversity, size, and mobility of the district’s student population, the size and
mobility of its teaching force, and the pressures and obstacles to student achievement –
exacerbates concerns about the adequacy of principals’ expertise for data use.
Characteristics of the student population add to the complexity of collecting, analyzing,
interpreting, and applying achievement data to instructional decision-making. There are
not only great numbers of students for whom data exist, but also many subgroups with
varying achievement trends. Additionally, the transient nature of many students makes
the application of data all the more difficult, as students tend to move within and out of
the district in sizeable numbers. Consequently, by the time data findings are ready for
application, schools may have a different student makeup than at the time of data
collection.
These obstacles are compounded by the stream of new educators who enter the school
district each year and the net loss of experienced teachers. This deficit of experienced
educators results in a constant pool of employees who tend to be less familiar with the
district’s achievement data, as well as uninformed and inexperienced in the practice of
DDDM. It creates additional demand on district officials and principals to train educators
to understand and apply data to their instructional practice.
Finally, the ever-present demands of national and state accountability systems create an
environment in which achievement is all the more transparent, and educators are
expected to understand and apply achievement results to their daily practice. All of
these contextual forces simultaneously influence and aggravate MNPS’s efforts to
improve principals’ data use practices.
SECTION 3
Theory of Action for Principal DDDM
A review of the research literature and the contextual realities of MNPS provide a
conceptual framework for understanding the district’s approach to developing the desired
DDDM capacities among its principals. As emphasized repeatedly by district officials,
the MNPS philosophy for DDDM is to work directly with principals who, in turn, are
expected to train their own school staff in the understandings and practices required for
DDDM. It is a philosophy that hinges on the quality by which these DDDM capacities
are transferred down the chain of command – from district officials, to principals, and
finally to teachers – with the hope of ultimately improving the quality of classroom
instruction and student learning.
Synthesis of Research and Contextual Analyses
The research on DDDM reveals strategies and resources necessary to facilitate the
development of high quality DDDM practices within schools. A review of these findings
speaks to the importance of the following key components:
● Data must be organized in light of an understanding of its surrounding context and the challenges and opportunities it presents for teaching and learning (Marsh et al., 2006).
● Educators must have a focused plan for data collection and analysis (Keeney, 1998).
● As instructional leaders, principals should guide and support the development of DDDM within their schools; such support includes setting expectations for data use, training teachers for DDDM, and providing the necessary time to examine and apply data to instructional practices (Armstrong & Anthes, 2001).
● Educators’ attitudes and motivation matter for DDDM; aversions to data collection and data use can stymie efforts to develop DDDM capacities (Doyle, 2003; Marsh et al., 2006).
● Data must be of high quality, easily accessible to those who need it, and provided in a timely manner (Marsh et al., 2006).
Although such components do not guarantee the implementation of successful DDDM
and improved teaching and learning, they do facilitate the transformation of data from an
information resource to actionable knowledge; that is, educators will have the capacity to
synthesize data and apply it to the most pressing educational issues (Marsh et al, 2006).
As reiterated in the research literature, quality DDDM must recognize the contextual
opportunities and barriers that exist within MNPS.
As the previous section discusses, MNPS is a large urban district, characterized by
immense educator and student populations, both of which tend to be marked by
instability, especially intra-school instability. These realities not only have implications
for the district’s approach to DDDM, but also bear on whether the approach proves to be
well-aligned with the demands of these conditions.
District’s Theory of Action for DDDM
Following a synthesis of the research literature and district context, a more vivid picture
of the MNPS approach to DDDM became evident. As detailed in Figure 1 below, an
assumed approach to DDDM, and more specifically, the development of principal
DDDM capacity, would include a series of interdependent inputs, intermediate goals, and
long-term and ultimate outcomes. This theory of action provides an overview of the
necessary components that must be investigated to truly understand the extent to which
principals’ DDDM capacity exists and whether it is having the desired impact on
teaching and learning.
Figure 1: MNPS Theory of Action for Principal DDDM

INPUTS
District Resources
  Expert staff
  Technology and information resources (i.e., data reporting and communication)
  Training for principals and teachers
↓
INTERMEDIATE GOALS
Principal Capacity
  Attitudes
  Knowledge/Self-Efficacy
  Experience
Principal Behavior
  Data use
  Instructional leadership (i.e., use of time, training teachers)
↓
LONG-TERM OUTCOMES
Teacher Capacity
  Attitudes
  Knowledge/Self-Efficacy
  Experience
Teacher Behavior
  Data use
↓
ULTIMATE OUTCOMES
Improved Student Achievement 2

2 Student achievement, while the ultimate outcome of this theory of action for principal
DDDM, is not addressed by this capstone project. The very nature of an ultimate outcome
suggests a result expected over an extensive period of time. As our capstone project
examines only a snapshot in time, it is neither appropriate nor possible to analyze the
existence of student achievement outcomes resulting from DDDM practices.
District Resources
District officials have devised a series of strategies to support principals’ learning of
DDDM. Of particular interest for our research project are the information reports,
expertise, and formal training opportunities provided to principals. The district intends
for these resources to equip principals with the information they need to develop their
own DDDM skills and those of their teachers. More specifically, as revealed through all
of our district official interviews, MNPS uses the following mechanisms:
● Central office staff meet with principals several times each month to focus on expectations for curriculum and assessment and guidance for data interpretation. Principals meet with (1) the Director of Schools once a month and (2) the district’s Chief Instructional Officer and Director of Assessment and Evaluation once a month, by grade tier, and biweekly if they are in a high-priority school.
● Central office officials also conduct on-site consultations with school staff upon a principal’s request. Consultations usually occur during in-service days and focus on interpretation of school-specific achievement data.
● Additionally, the district’s Office of Assessment and Evaluation provides principals with ongoing data reports throughout the school year. For example, our team reviewed all documents provided to principals during their monthly meetings over the course of the 2005-06 school year. The focus of these student achievement reports was on reporting and interpretation of state achievement assessments (i.e., TCAP and Gateway) and district-level assessments, with most data aggregated at the district or school level.
● Finally, MNPS has recently (Spring 2006) implemented a new standards-based computer software program, the EduSoft Assessment Management System, designed to streamline data collection and analysis, facilitate creation and management of district assessments, and provide educators with timely feedback on students’ test results. The district has also hired a full-time EduSoft consultant to train educators in its implementation.
Principal Capacity and Behavior
The intermediate goals within this theory of action are the development of principals’
capacity and behavior for DDDM. Principal capacity refers to attitudes about, knowledge
of, and experience with achievement data. Research literature emphasizes that before
behavior can change, the necessary attitudes, knowledge, and skills must be established
(Armstrong & Anthes, 2001; Keeney, 1998). It is, therefore, assumed that in order for
principals to implement desired DDDM practices they must have the requisite attitudes,
knowledge, and skills. More specifically, they must value data as a useful and meaningful
tool for improving teaching and learning, possess the knowledge to analyze, interpret,
and apply data for instructional improvement, and be able to impart such understandings
to teachers.
Principals’ attitudes, knowledge, and skills provide the platform for changed behavior,
resulting in the desired DDDM practices. Principal data use is a multi-faceted concept,
including the type of data used, frequency of data use, reasons for its use, and objectives
for its application. Principals are expected to analyze, interpret, and apply achievement
data in their professional responsibilities for improving teaching and learning
opportunities within their schools (MNPS district official interviews, 2006-07).
MNPS expects principals to act as instructional leaders in their schools, providing
teachers with the necessary training to implement desired DDDM practices, ultimately to
improve the quality of classroom teaching and learning (MNPS district official
interviews, 2006-07). Instructional leadership also involves the ability to establish a
positive school culture supportive of DDDM, including the development of strong
collegial relations and emphasis on high expectations for teaching and learning. The
professional culture of a school influences the practice of DDDM through the emergence
of professional expectations and incentives. Research suggests that a collective
commitment to the value and practice of DDDM influences the realization of such
practices (Petrides & Nodine, 2005; Torrence, 2002; Watson & Mason, 2003).
Teacher Capacity and Behavior
The long-term outcomes within this theory of action are the development of teachers’
capacity and behavior for DDDM. As with principals, teachers’ capacity includes their
attitudes about, knowledge of, and experience with achievement data, all of which form
the foundations for desired DDDM practices. Similar to their principals, teachers must
value data as a useful and meaningful tool for improving teaching and learning and
possess the knowledge to analyze, interpret, and apply data for instructional improvement
in the classroom.
Again, teachers’ DDDM behavior is a multi-faceted concept, including the type of data used, the frequency of data use, the reasons for its use, and the objectives for its application.
Ultimately, teachers are expected to use achievement data to improve their ability to meet
the learning needs of their students (MNPS district official interviews, 2006-07).
Revisiting the Research Objectives
This theory of action provides our team with a conceptual framework to guide our
evaluation. As discussed earlier, MNPS charged our team with several objectives,
namely: (1) evaluate the scope of the principal data use problem, (2) assess how well the
district’s data training strategies are aligned with data use needs, and (3) develop
recommendations for improved data training strategies. Several guiding questions stem
from these objectives, including:
 How many principals understand how to analyze, interpret, and use achievement
data?
 What is the level of understanding and use among principals?
 To what extent do principals understand the various types and uses of
achievement assessments?
 What is the level and quality of principals’ communication to teachers regarding achievement data and how to use that data to drive instruction?
 How can MNPS improve the training strategies that they already employ, and
what additional training strategies are needed?
As is evident in the district’s theory of action for principal DDDM, these questions –
while of importance to the district – do not address some of the mediating issues that
influence the quality of principals’ data use. They focus extensively on principals’
knowledge and use of data, as well as their behavior as instructional leaders. However,
they do not address a critical component of principal capacity, namely their attitudes
about DDDM. Nor do these stated questions grapple with a key indicator of principals’
effectiveness as instructional leaders for DDDM – teachers’ DDDM capacity and
behavior. Accordingly, our project considers some of these additional issues in order to
gain a fuller understanding as to how well the reality of principal DDDM aligns with the
desired process.
SECTION 4
Design and Methodology
In order to address these research objectives in a rigorous and systematic manner, our
team has developed a multiple-method research design, including a series of surveys,
interviews, and document analyses. As discussed below, and further detailed in Figure 2,
these research procedures will enable us to develop a comprehensive understanding of the
data use dilemma, triangulate our findings, and determine how the district might best
address unmet training and support needs for principals and teachers across MNPS.
Figure 2: Key Construct Measurement Plan

[The original figure is a matrix indicating which data sources – principal surveys, teacher surveys, district official interviews, principal interviews, and district documents – are used to measure each key construct. The constructs are:]

District Resources: expert staff; technology and information resources (i.e., data accuracy and availability); training for principals; training for teachers
Principal Capacity: attitudes; knowledge/self-efficacy; experience
Principal Behavior: data use; instructional leadership
School Culture: professional collegiality; expectations for improvement
Teacher Capacity: attitudes; knowledge/self-efficacy; experience; perceived barriers
Teacher Behavior: data use
Principal and Teacher Surveys
We administered surveys to principals and teachers within MNPS to gauge their
perspectives and experiences related to data use and training. Gaining insight from
principals is essential, as they are the primary subjects under study. However, the insight of teachers is a necessary supplement, as it is hoped that principals’ role as instructional leaders will shape teachers’ attitudes about, knowledge of, and behavior for DDDM.
Additionally, surveying teachers enables us to confirm – or disconfirm – information
from principals regarding their professional practices. We asked principals and teachers
about very similar constructs in the separate surveys in order to investigate and compare
evidence regarding the inputs, intermediate outcomes, and long-term outcomes
comprising the district’s theory of action for DDDM (Appendix C. Principal Survey,
Appendix E. Teacher Survey). Consequently, these surveys provide us with invaluable
information for addressing our research objectives and investigating the existence of any
disconnects between principals’ and teachers’ views about and behavior for data use.
Sampling Procedures and Survey Administration
The samples for the principal and teacher surveys were determined based upon two
primary objectives: obtaining responses that could be generalized to the entirety of
MNPS, and doing so within the time confines presented by this project. Principal surveys
were administered to all 133 principals at district-led principal meetings. Members of our
research team attended three consecutive principal meetings, divided into grade levels
(e.g., elementary school, middle school, high school), during late fall of 2006. Access
was provided to us by several high-ranking district officials, and they set aside
approximately 25 minutes at the beginning of each meeting for us to explain and
administer the principal survey. Although we could not promise anonymity, as we were
present at the survey administration, we did uphold the confidentiality of respondents’
answers by not requiring that they provide any personal identifying information. This
captive audience of principals and the guarantee of confidentiality resulted in a robust response rate of 94% (125 of 133 principals); more specifically, 89% of elementary principals (65 of 73), 94.4% of middle school principals (34 of 36), and 100% of high school principals (16 of 16).3 This robust response rate upholds our confidence in the generalizability of our principal survey findings; that is, the survey responses are an accurate portrayal of principals throughout the entire school district.

3 The principal survey respondents also included six from an “other” grade configuration, while four respondents did not specify the grade level of their current administrative placement.

Teacher surveys were administered using a noticeably different procedure. We could not feasibly administer surveys – or gain access – to all teachers in MNPS, due to time and resource constraints. Therefore, we devised a sampling method that would enable us to capture the responses of a wide range of teachers representing each grade tier (e.g., elementary school, middle school, high school) so as to align with the way in which we administered the principal surveys by grade tier. In order to maintain the anonymity of survey respondents, and because of the logistical burden of gaining direct access to teachers (i.e., via personal email or mailing address), we decided to conduct a stratified cluster sample of schools in order to gain indirect access to individual teachers, who were in fact the unit of analysis.
Our team first stratified district schools by grade tier. In order to conduct a stratified
random sample, we had to determine the sample size that would be statistically required
for each stratum in order to maintain the statistical viability of the sample and to detect a
medium effect size (0.5). Therefore, we used the mean, standard deviation, and
confidence interval of 95% for key variables on the principal survey to get an estimate of
variation on key variables for the population of schools that we were sampling for the
teacher surveys.4 Although not a perfect estimate of teacher responses, we assumed that
principal responses were the closest proxy we had for teacher responses, given the
district’s theory of action that principals are the leaders of DDDM within schools. Using
this procedure, we selected a simple random sample within each grade stratum to get a
total of 38 schools (21 elementary schools, 10 middle schools, 6 high schools, and 1
other5) of the 124 total schools (73 elementary schools, 36 middle schools, and 15 high
schools). 6
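The sample-size calculation described above (detailed in footnote 4) can be sketched in code. The school counts below come from this report, but the per-stratum standard deviations and the error bound B are hypothetical placeholders, not the values we actually estimated from the principal survey:

```python
import math

def neyman_sample_size(strata, B):
    """Total sample size n for a stratified sample, given per-stratum
    population sizes N_h and standard deviations S_h and error bound B:
        n = (sum N_h*S_h)^2 / (N^2 * B^2/4 + sum N_h*S_h^2)
    """
    N = sum(N_h for N_h, _ in strata)
    sum_NS = sum(N_h * S_h for N_h, S_h in strata)
    sum_NS2 = sum(N_h * S_h ** 2 for N_h, S_h in strata)
    n = sum_NS ** 2 / (N ** 2 * B ** 2 / 4 + sum_NS2)
    return math.ceil(n)

def allocate(strata, n):
    """Allocate n across strata proportionally to N_h*S_h:
        n_h = n * (N_h*S_h) / sum(N_h*S_h)."""
    sum_NS = sum(N_h * S_h for N_h, S_h in strata)
    return [round(n * (N_h * S_h) / sum_NS) for N_h, S_h in strata]

# Illustrative inputs only: 73 elementary, 36 middle, and 15 high schools
# (N = 124), with hypothetical standard deviations on a key survey variable.
strata = [(73, 0.9), (36, 0.8), (15, 0.7)]
n = neyman_sample_size(strata, B=0.25)
print(n, allocate(strata, n))
```

With different (real) estimates of the S_h values and bound B, this same arithmetic yields the 38-school sample reported in the text.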
In order to get indirect access to the teachers within those 38 schools, we requested that
the district’s Human Resources office provide us with a count of core academic teachers
within each of those schools. Core academic teachers include those who teach any course in the following four subjects: Mathematics, English, Social Studies, and Science. Teachers are counted if they teach any of these courses, even if only for one period per day (e.g., a teacher who teaches Science in period 1 and P.E. in periods 2 through 8). All
courses in the four key subjects were counted, including special education, remedial,
advanced, honors, and AP/IB courses. We focused on these core academic teachers
because, in discussion with high ranking district officials, we determined that this was the
cadre of teachers who most frequently encounter student achievement data; and, it was
not feasible for us to administer surveys to all teachers within those 38 randomly selected
schools.
This sampling procedure culminated in a total sample of 1,271 core academic teachers (660 elementary school, 272 middle school, 307 high school, and 32 other). Unfortunately, we still could not get access to teachers by email or direct mail because of the need to maintain anonymity, so we mailed the appropriate number of surveys to each of the 38 schools, requesting that they be returned within a three-week period using the district’s cost-free mailing service. This procedure did have its limitations, resulting in a rather low response rate of 632 teachers (50%), consisting of 352 elementary teachers, 155 middle school teachers, and 90 high school teachers.7

4 To determine the total number of schools to be sampled for the teacher surveys, we used the following calculation: n = (ΣNhSh)² / [N²(B²/4) + ΣNhSh²]. To determine the number of schools to be sampled within each grade-level stratum, we then used: nh = n(NhSh) / (ΣNhSh).

5 One of the schools selected has a grade configuration of grades 7-12 and is labeled as a high school in MNPS; however, for the sake of survey analyses, we treated this school as having an “other” type of grade configuration.

6 Note that when conducting the stratified random sample of schools, we used a total N of 124 schools, which excluded 9 schools, including special education, alternative education, gifted/talented, adult education, and charter schools. We did not include these non-traditional schools because their data use needs are systematically different from those of the traditional district schools.
We recognize that this does not provide the robust evidence necessary to make district-wide generalizations about teacher survey findings; but, as will be discussed later, we believe that it is a more accurate snapshot of evidence than the district has yet had regarding teachers’ views and experience related to data-driven decision-making.
Survey Design and Analysis
As previously outlined in Figure 2, the principal and teacher surveys are tools to explore
key aspects of the MNPS theory of action for principals’ data use. More specifically,
both surveys include concepts of district resources, principal and teachers’ capacity and
behavior for data use, as well as school culture. As identified in the literature, these
concepts address the assumed inputs as well as intermediate and long-term outcomes of
the district’s approach to principals’ data use practices.
This comprehensive overview also provides an opportunity to focus on guiding questions
for this research project, including:
 How many principals understand how to analyze, interpret, and use achievement
data?
 What is the level of understanding and use among principals?
 To what extent do principals understand the various types and uses of
achievement assessments?
 What is the level and quality of principals’ communication to teachers regarding achievement data and how to use that data to drive instruction?
These questions are descriptive in nature, and such was the primary purpose of the
surveys. However, the survey concepts also provided an opportunity to analyze the
interplay between resources, attitudes, knowledge, and actions among principals and
teachers, acknowledging the research literature that speaks to the necessity of all
components for successful data use training.
In order to appropriately address the concepts of district resources, principal and
teachers’ capacity and behavior for data use, as well as school culture, our research team
consulted previously validated survey instruments and research literature in the field of
education. These resources provide a sound conceptual foundation for the way in which
constructs were operationalized in our surveys. Appendix D, Principal Survey Concept
Map, and Appendix F, Teacher Survey Concept Map, provide a detailed description of
the sources used to address key constructs in order to ensure the validity and reliability of
survey items.
7 Among teacher respondents, there were also 18 who identified their grade level as “other” and 17 who did not specify their grade level.
Survey Limitations
Although our research team relied upon research literature in the construction of survey
instruments in order to preserve the internal validity of survey items, there were other
factors of the survey process that limit the external validity (i.e., generalizability) and
reliability of survey results. As mentioned previously, the principal survey was
administered to the vast majority of MNPS principals (94% response rate) leading to a set
of survey findings with robust external validity. However, the same is not true for the
findings emanating from the teacher survey, which had a much lower response rate of
50%. Even though teachers were selected for participation through a stratified random
cluster sampling procedure, teacher responses cannot be assumed to generalize to all
teachers within the district; in fact, it may be that teacher respondents are systematically
different from the rest of the teachers who did not participate in the survey.
However, several characteristics of survey respondents do approximate those of teachers
throughout the entire district, thereby strengthening some confidence in the findings; for
example, teacher representation by grade tier, their average years of experience, and their
education level are all similar to teachers throughout MNPS (Table 3: Teacher
Respondents v. Population Comparison). Nonetheless, these similarities cannot overcome other potential systematic differences, such as attitudinal differences that lead some teachers to participate in a research study, or principals who encouraged participation more than others did.
Table 3: Teacher Respondents v. Population Comparison

Grade Level Distribution
                        Survey Respondents    Teacher Population
Elementary School       55.7%                 51.9%
Middle School           24.5%                 21.4%
High School             14.2%                 24.2%

Average (Mean) Years of Teaching Experience
                        Survey Respondents    Teacher Population
Elementary School       13.0 years            13.0 years
Middle School           10.9 years            11.5 years
High School             10.4 years            14 years

Level of Education
                        Survey Respondents    Teacher Population
Bachelor’s Degree       35.8%                 39.8%
Master’s Degree         38.4%                 36.7%
Master’s plus Degree    19.1%                 18.8%
Doctorate Degree        2.2%                  4.6%

Note – Teacher population characteristics were detailed in the district’s 2006-2007 Facts online manual (http://www.mnps.org/AssetFactory.aspx?did=14406)
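One way to quantify how closely the respondent grade-tier mix matches the district is a chi-square goodness-of-fit statistic. This is an illustrative sketch only: the respondent counts come from this report, while renormalizing the population shares (which exclude “other” placements) is our assumption.

```python
# Respondent counts by grade tier (elementary, middle, high) from the survey.
observed = [352, 155, 90]
# District-wide teacher shares for the same tiers (Table 3), renormalized
# so that the three tiers sum to 1 (the remainder are "other" placements).
pop_shares = [0.519, 0.214, 0.242]
total_share = sum(pop_shares)
n = sum(observed)
expected = [n * s / total_share for s in pop_shares]

# Pearson chi-square statistic: sum of (observed - expected)^2 / expected.
chi2 = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
print(round(chi2, 1))
```

Against a chi-square distribution with two degrees of freedom, a statistic this large mainly reflects the underrepresentation of high school teachers visible in Table 3.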
Additionally, neither survey was entirely protected against threats to reliability. Principal and teacher surveys were administered in settings that might have elicited responses that participants would not otherwise have provided in more private circumstances.
Principal surveys were administered at district-led meetings in the presence of high-ranking district officials. This group setting and the presence of higher-ups might have led principals to respond less candidly than they would have in a more private setting. Similarly, teacher surveys were administered to respondents at school, and we relied upon principals to help distribute and collect surveys. We chose this process to navigate the limited time and access we had to reach teachers, but it could also have implications for teachers’ willingness to be entirely candid in their survey responses.
Finally, findings stemming from these surveys should be interpreted with the
understanding that self-reporting can invite bias. Both principals and teachers may have
responded in ways that do not completely align with the reality of their attitudes or
behaviors.
District and Principal Interviews
While principal and teacher surveys enable us to learn more about data practices and
perspectives from a large number of educators across MNPS, they cannot provide us with
the depth of information that we need to fully understand the context, dynamics, and
rationale for data use practices. The optimal way to study these issues is to talk to the people entrenched in these practices and to learn from them about their own thoughts and experiences
(Peshkin, 1993). As is detailed in Figure 2, the district official and principal interviews
address many of the key concepts related to inputs and intermediate and long-term
outcomes of the district’s theory of action for principal data use. Therefore, we
conducted a series of interviews with district officials and principals to garner rich
information and illuminate nuances about principals’ data use practices.
District Official Interviews
District officials were targeted for interviews in order to allow us to develop a greater
sense of the district context, including its instructional, organizational, and historical
dynamics. We believed that we could learn more about why the district perceived a dilemma in principals’ data use, what led to its development, and how the district has been addressing it. Interviewees were purposefully selected in order to
garner information from high-ranking officials who work in positions of authority with
principals and specifically in the areas of data, instruction, and staff development.
Interestingly, identifying the appropriate interviewees was somewhat of a challenge, so we utilized a snowball procedure, asking each interviewee to refer us to other officials who might provide the information we sought. By the culmination of our research, we had successfully completed interviews with four high-ranking district officials.
The district interviews were our only opportunity to gather detailed information from
senior administrators from the district and, therefore, we developed a comprehensive
interview protocol that addressed a number of key concepts from the theory of action,
including: district resources such as the expertise of district staff for data use, information
resources, and training opportunities for principals and teachers; principal capacity and
behavior for data use; as well as teacher capacity and behavior for data use. The protocol
also addressed questions regarding the district’s philosophy for data use and why the
district perceived there to be a data use dilemma among principals. This structured
protocol resulted in interviews lasting approximately 30 to 45 minutes.
Interviews were conducted one-on-one between an interviewee and a member of the
research team. They were audio-recorded to ensure accuracy of notes, as well. Appendix
A provides the template of the protocol used for district official interviews.
Principal Interviews
We also conducted interviews with a purposeful sample of principals from the set of 38
schools selected using the stratified random cluster sampling procedure for teacher
surveys. We selected a total of 12 interviewees, identifying four principals from each
grade tier, each representing a varying level of administrative experience. Based upon
the literature, we believed years of administrative experience to be an important
characteristic with implications for data use expertise. However, we were only able to
successfully conduct 10 of these 12 interviews, consisting of four elementary school,
three middle school, and three high school principals.
These interviews focused on similar concepts addressed in the principal surveys, but were
able to dig deeper into the issues of why and how principals use data in their positions of
school leadership. These qualitative interviews both complement and supplement the
survey information that we gathered.
As we had the opportunity to gather extensive information from principals using the
surveys, and because of the limited time and difficulty of accessing principals, the
protocol was brief, yet purposeful. While focusing on some similar concepts as those in
the principal surveys – district resources, principal capacity and behavior for data use,
and teacher capacity and behavior for data use – the nature of questions addressed issues
such as how and why, as well as motivational and attitudinal questions that could not be
adequately gauged through a survey instrument. The structured protocol resulted in
interviews lasting approximately 10 to 15 minutes. Interviews were conducted
one-on-one by phone between an interviewee and a member of the research team. When
possible, interviews were also audio-recorded. Appendix B provides a template of the
protocol used for principal interviews.
Interview Analyses
The primary goal of interview analyses was to identify core themes and the ways in
which those themes enlighten our understanding of the project’s guiding questions. As
both Patton (2002) and Rubin and Rubin (2005) emphasize, the purpose of analysis is to
uncover and make sense of the core consistencies and “parallel ideas” (p. 207) that
address the conceptual framework of our study. Accordingly, a systematic review of the
interviews was paramount for identifying the content, themes, and patterns within the
interviewees’ responses. This process involved multiple stages of reviewing recorded
interviews, identifying emerging themes, re-evaluating interviews for accuracy and
completeness of interpretation, and synthesizing the interplay between the extrapolated
themes.
Initially, we each conducted a thorough review of audio-recorded and hand-written notes.
Subsequently, we shared our initial impressions and findings in order to clarify themes
and develop a common understanding of the concepts most recurring in the interviews.
With these established concepts, we revisited the interviews in order to identify
illustrative quotes and responses most relevant to the themes that were unfolding in the
interviewees’ responses. Upon completing this analysis and re-analysis of the responses, we synthesized the findings to see how they worked together to inform the original research questions.
Interview Limitations
These interviews provided a wealth of information about data use, specifically among
principals, in MNPS; however, the findings do have limitations. Specifically, interviews rely on self-reporting by interviewees, thereby introducing potential bias. Self-reporting bias – whether conscious or not – can result from interviewees presenting themselves in an unduly positive light or from the difficulty of recollecting past phenomena.
This threat to validity is tempered by strategies that reinforce the internal and external
validity of interview findings. In order to enhance internal validity, we relied on existing
research literature to devise interview protocols that exhaustively and accurately
represent the constructs of primary interest for the study.
Additionally, we employed critical case and stratified purposeful sampling in order to
establish a stronger defense for external validity (Patton, 2002). We purposefully
sampled four high-ranking district officials in order to gather information from critical
cases in MNPS. These cases (i.e., individuals) represent positions of authority in the
district hierarchy, and more specifically, those positions that work most frequently with
DDDM and have the ability to affect the direction of DDDM in future years. As
mentioned previously, we used a theory-based stratified purposeful sample of principals
in order to capture interview findings from a broad representation of individuals from
various grade tiers. Drawing upon our stated theory of action, we targeted principals with
varying years of administrative experience, believing that this background characteristic
has implications for DDDM capacity.
District Documents
Our third research strategy involved content analyses of relevant district documents.
MNPS district officials provided our research team with existing documents related to the
training services that they provide to principals, including agendas from district-wide
principal meetings and samples of data reports prepared for schools by district officials.
These documents allowed us to investigate the nature of information resources provided
to principals and their staff, a key input in the theory of action guiding our research.
More specifically, these documents enabled us to uncover the content, focus, and quality
of training provisions and data reports.
While not a full picture of the district’s training provisions, these documents do provide a
glimpse into the most frequently discussed topics addressed during correspondence
between the district and its principals. They also provide a source of information that
does not stem from respondents’ perceptions, as do the surveys and interviews. This is,
therefore, a key information resource for triangulating some of the findings emanating
from surveys and interviews.
SECTION 5
Findings about DDDM in MNPS
Using a multi-method design, our capstone team generated a wealth of findings to
address the district’s research objectives, namely: (1) evaluate the scope of the principal
data use problem, (2) assess how well the district’s data training strategies are aligned
with data use needs, and (3) develop recommendations for improved data training
strategies. This examination is of particular interest for MNPS as they, admittedly, have
not yet systematically evaluated the quality of DDDM throughout the district (District
official interviews, 2006-07).
This section addresses the first two stated objectives in order to lay the groundwork for the third task of developing recommendations. The sections that follow provide a
detailed overview of our findings, using the research objectives and theory of action to
guide our analyses. More specifically, we uncovered the key findings that address each
of these primary issues:
 The nature of principals’ DDDM capacity and behavior
 The nature and quality of principals’ communication to teachers
 The nature of teachers’ DDDM capacity and behavior
 The nature of the district’s support for and barriers to DDDM
 The most pressing needs to improve DDDM capacity and behavior for MNPS educators
Principals’ DDDM: Capacity and Behavior
As discussed in Section 3, Theory of Action for DDDM in MNPS, the district’s most
immediate objectives for DDDM are to establish principals’ capacity and behavior for
DDDM. Specifically, they intend to ensure that principals possess the requisite attitudes
and knowledge to practice desired DDDM behaviors, such as improving teaching and
learning in their schools and training teachers to do the same. The following sub-sections
begin to reveal the nature of such measures of capacity and behavior among MNPS
principals.
Overview of Key Findings
Principals tend to value data that they perceive as having the most utility for improving
instruction at their schools.
In contrast to district officials’ concerns about principals’ variable knowledge of DDDM, most
principals report feeling adept at DDDM.
The majority of principals reportedly use grade-appropriate assessments frequently; however,
there is evidence that self-reporting biased some of these results.
Principals use state and district assessments for many similar reasons related to curriculum,
instruction, and student learning improvements; the similarity of use is somewhat surprising
given the distinct purposes of these two types of assessments.
Principals’ Attitudes about DDDM
In the current context of high-stakes accountability for public education, district officials
believe that MNPS principals are undoubtedly of the mindset that DDDM is a necessity
for school improvement, although they may not have an affinity for the amount of data
reporting and analysis that accompanies those efforts (District official interviews, 2006-07). These sentiments are reinforced by national, state, and local efforts for outcomes-based school improvement. No Child Left Behind, the Tennessee accountability system, and local pressures have led principals to focus with greater intensity on using achievement
data as a resource for school leadership. As one district official (2007) explained:
You know, we’re long past the days where you just teach the same thing each
semester that you were used to teaching the previous semester. I think those days are
gone. They [principals and teachers] are very much into looking at student data:
where are they [students] academically and where do we need to be?
Despite believing that principals have an overall appreciation for the necessity of using
data to guide school improvement efforts, district officials suspect that principals tend to
value achievement data that they (1) understand and (2) view as having utility for school
improvement efforts (District official interviews, 2006-07). While officials’ suspicions
are admittedly founded on anecdotal evidence, what we uncovered from principals’
interview responses seems to confirm their beliefs. Although principals did not reach consensus when asked to identify the type of achievement data that they find most valuable, the rationale for their choices reveals a similar mindset; specifically, they identified data that they find useful for informing instruction (Principal interviews, 2007).
For example, several principals identified the district’s standards-based assessments and
ThinkLink as having great utility for instructional improvement due to the timeliness of
results and formative nature of the assessments.
Principals’ Knowledge about DDDM
As the research literature reiterates, principals’ capacity for DDDM stems not only from
their attitudes toward data, but their knowledge about it as well. According to our survey
and interview findings, principals consider themselves to be highly informed and
knowledgeable when it comes to DDDM. When asked how frequently they feel comfortable using data, nearly 90 percent of survey respondents answered “often” or “always.” This same trend emerged from the interview findings, whereby all interviewees described themselves as being very adept at using student achievement data (Principal interviews, 2007).
This finding – while admittedly predicated on principals’ self-reports – suggests that
district officials’ belief that principals have variable understanding and comfort using
data in their school leadership positions, especially among less experienced principals
(District official interviews, 2006-07), may not be completely accurate. All principal
interviewees, even those beginning principals, expressed a high level of comfort with
using data. Additionally, survey responses reveal no difference in expressed comfort
with DDDM between beginning (0 to 2 years), mid-career (3 to 9 years), and veteran
(10 or more years) principals. As shown in Table 4 and Table 5 below, there is no
statistically significant difference in principals’ comfort with using data based on either
their overall years of administrator experience (p = 0.41) or their years serving as
principal at their current school (p = 0.34).
Table 4: Comfort Using Data by Overall Years Administrator Experience (N=111)

                  Sum of Squares    df    Mean Square      F      Sig.
Between Groups         1.001         2       .500        .898     .410
Within Groups         60.188       108       .557
Total                 61.189       110

Table 5: Comfort Using Data by Years as Principal at Current School (N=111)

                  Sum of Squares    df    Mean Square      F      Sig.
Between Groups         1.241         2       .620       1.100     .337
Within Groups         59.768       106       .564
Total                 61.009       108
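The comparisons reported in Tables 4 and 5 are one-way ANOVAs. The sketch below reproduces the same Sum of Squares / Mean Square / F computation by hand; the comfort ratings and group sizes are made up for illustration and are not the actual survey data.

```python
# One-way ANOVA computed by hand, mirroring the Sum of Squares / df /
# Mean Square / F layout of Tables 4 and 5.
def one_way_anova(groups):
    all_vals = [x for g in groups for x in g]
    n, k = len(all_vals), len(groups)
    grand_mean = sum(all_vals) / n
    # Between-groups sum of squares: variation of group means around the grand mean.
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    # Within-groups sum of squares: variation of values around their own group mean.
    ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    df_between, df_within = k - 1, n - k
    f = (ss_between / df_between) / (ss_within / df_within)
    return ss_between, ss_within, f

# Hypothetical 4-point comfort ratings by experience group (illustrative only).
beginning  = [3, 4, 3, 2, 4, 3]   # 0 to 2 years
mid_career = [4, 3, 3, 4, 2, 4]   # 3 to 9 years
veteran    = [3, 4, 4, 3, 3, 2]   # 10 or more years

ss_b, ss_w, f = one_way_anova([beginning, mid_career, veteran])
print(f"SS between = {ss_b:.3f}, SS within = {ss_w:.3f}, F = {f:.3f}")
```

A small F (and correspondingly large p-value, as in Tables 4 and 5) indicates that the differences in mean comfort across experience groups are not statistically significant.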
Principals’ Behavior for DDDM
Of ultimate concern is the nature of principals’ actual DDDM practices. Accordingly, we
examined the frequency of principals’ use of various types of data and the reasons for
which they typically use data in their positions as school leaders. This review of data use
provides for a comprehensive understanding of principals’ approach to DDDM in MNPS.
Frequency and type of data
Our survey asked principals to identify how often they use various types of achievement
data, including state- and district-level assessments. As detailed in Table 6, state
assessments can be divided into categories by grade tier based on applicability to that
particular grade level. We used these categories to evaluate the use of state assessment
data by principals’ respective grade level placements.
Table 6: State Assessments by Grade Level

Elementary School     Middle School         High School
TCAP Proficiency      TCAP Proficiency      Gateway
TCAP Writing          TCAP Writing          TCAP Writing
TCAP Value Added      TCAP Value Added      State End-of-Course
As shown in Table 7, when analyzing principals’ frequency of using state assessments by
grade-appropriate levels, the majority of principals consistently reported using the state
assessment data “often” or “always.” Additionally, some high school principals
reported using data from the grade level immediately preceding theirs – middle school.
For example, 50 percent of high school principals responded that they “often” or
“always” use results from TCAP Proficiency tests, while fewer (31.3 percent) use TCAP
Value-Added results.
Table 7: Percent of Principals “Often” or “Always” Using State Assessments

Type of                All Principals   Elementary     Middle School   High School
Assessment                (n=112)       Principals     Principals      Principals
                                         (n=62)         (n=34)          (n=16)
TCAP Proficiency           85.7%          92.3%          94.2%          50.1%*
TCAP Writing               70.6%          58.5%          85.3%          87.5%
TCAP Value-Added           61.1%          67.7%          64.7%          31.3%*
Gateway                    36.5%           N/A            N/A           93.8%
State End-of-Course        33.3%           N/A            N/A           75.0%

* Although TCAP Proficiency and Value-Added are not administered at the high school level, we included
these percentages to see how frequently principals make use of assessment results that might be informative
about the achievement of students entering their grade level.
It is worth noting a discrepancy that we uncovered between principals’ self-reports of
using TCAP Value Added and evidence of their actual use. On the survey, principals
were asked to report how often they used the TCAP Value Added (i.e., TVAAS) website.
This website is a free, comprehensive data analysis tool for school and individual student
achievement data, including predictions for future levels of achievement. Principals are
also able to provide teachers with passwords for logging on themselves. When asked
how frequently they used the site, 52.4 percent reported “often” or “always,” 22.2 percent
reported “sometimes”, while 17.5 percent reported “rarely” or “never.” District officials
also provided us with a report detailing the frequency by which principals use this
website. Table 8 below indicates the number of times a school has logged on as of
November 2006 for the 2006-2007 school year.
Table 8: Frequency of Using TVAAS Website

School   #    School   #    School   #    School   #    School   #
A        5    Q        6    GG       9    WW       7    MMM      3
B        4    R       11    HH       1    XX       1    NNN     13
C       11    S        8    II       1    YY       2    OOO      1
D       38    T        2    JJ       1    ZZ       2    PPP     19
E        2    U        1    KK       2    AAA      3    QQQ      1
F        4    V        1    LL       1    BBB      4    RRR      1
G        3    W        3    MM       2    CCC      6    SSS      9
H        2    X        1    NN      10    DDD      1    TTT      7
I        1    Y        3    OO       6    EEE      4    UUU     10
J        2    Z       11    PP       2    FFF      1    VVV      5
K        3    AA       3    QQ       4    GGG      2    WWW      8
L        4    BB       1    RR       5    HHH      5    XXX      4
M        7    CC       6    SS      18    III      1    YYY      3
N        1    DD       2    TT       1    JJJ      5    ZZZ      9
O        9    EE      34    UU       2    KKK      3
P       17    FF      12    VV       4    LLL      8
Interestingly, over the first four months of the school year, only 78 of 133 district
schools had logged onto the website. Of those, 66 had logged on fewer than ten times,
and of these, over one-quarter had logged on only once. While not a perfect assessment
of principals’ use of TVAAS, this is a telling proxy suggesting that principals’ survey
self-reports may be overestimated.
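These summary figures can be tallied directly from the login counts in Table 8; the sketch below transcribes the counts in school order (A through ZZZ) and reproduces the arithmetic.

```python
# Login counts for the 78 schools, transcribed from Table 8 in school order.
logins = [
    5, 4, 11, 38, 2, 4, 3, 2, 1, 2, 3, 4, 7, 1, 9, 17,       # A-P
    6, 11, 8, 2, 1, 1, 3, 1, 3, 11, 3, 1, 6, 2, 34, 12,      # Q-FF
    9, 1, 1, 1, 2, 1, 2, 10, 6, 2, 4, 5, 18, 1, 2, 4,        # GG-VV
    7, 1, 2, 2, 3, 4, 6, 1, 4, 1, 2, 5, 1, 5, 3, 8,          # WW-LLL
    3, 13, 1, 19, 1, 1, 9, 7, 10, 5, 8, 4, 3, 9,             # MMM-ZZZ
]

total_district_schools = 133
schools_logged_on = len(logins)                  # schools with at least one login
under_ten = sum(1 for n in logins if n < 10)     # logged on fewer than ten times
only_once = sum(1 for n in logins if n == 1)     # logged on exactly once

print(schools_logged_on)         # 78 (of 133 district schools)
print(under_ten)                 # 66
print(only_once)                 # 18, which is over one-quarter of the 66
```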
We also asked principals how often they use results from a number of district
assessments, with particular attention to those applicable to all grade tiers: Language!
Assessments, district math assessments, and district writing assessments. As is evident in
Table 9, the vast majority of principals at all grade levels consistently report “often” or
“always” using data from district assessments.
Table 9: Percent of Principals “Often” or “Always” Using District Assessments

Type of               All Principals   Elementary     Middle School   High School
Assessment               (n=112)       Principals     Principals      Principals
                                        (n=62)         (n=34)          (n=16)
Language Test             76.2%          70.8%          88.2%          75.0%
District Math             87.3%          92.3%          82.4%          81.3%
District Writing          91.3%          93.8%          91.2%          87.5%
Principals were also asked about their frequency of using TCAP data. TCAP assessments
are applicable to all grade levels and include TCAP Proficiency exams for third through
eighth grade and TCAP writing assessments for elementary, middle, and high school.
Therefore, it would be expected that all principals, regardless of grade level, would make
use of this assessment data for DDDM. Our survey asked principals to identify how
often they used various types of TCAP data results, including: performance level
breakdowns, scale scores, disaggregated subgroup results, grade-level and classroom
results, individual student results, as well as trend data over several years. Figure 3
displays the distribution of principals’ responses regarding the availability and use of
these types of assessment results.
Figure 3: Principals’ Level of TCAP Data Use
[Bar chart: for each type of TCAP result – percent of students at performance levels, scale scores, subgroup results, grade-level results, classroom results, individual student results, and test score trends across years – bars show the percent of principals selecting each response, from “not available” through “available, used extensively.”]
N=122 principal respondents
Well over three-quarters of principals consistently reported using the various types of
assessment results either moderately or extensively, with most reporting the latter; the
only exceptions were classroom-level results and trend data, which had a more even
split between moderate and extensive use.
Reasons for data use
Using surveys, we examined the reasons for which principals use assessment data in their
positions as school leaders. What we uncovered is a tendency among principals to use
state assessment and district assessment data for many similar reasons, which is a
somewhat unexpected finding considering that the two exist for very distinct purposes. State
assessments are summative in nature, providing end-of-year, one-time feedback about the
level of students’ mastery of various subject areas. District assessments, however, are
formative and intended to provide educators with frequent feedback on students’
academic progress throughout the course of a school year. Figures 4a and 4b provide an
illustrative breakdown of how similarly principals tend to use these two distinct types of
assessments.
Figure 4a: Principals’ Reasons for Using State Assessments
[Bar chart: percent of principals responding “not applicable,” “never/rarely,” “sometimes,” or “often/always” for each reason: identify remedial needs, individualize instruction, correct curriculum gaps, increase parent involvement, evaluate teacher performance, recommend tutoring, school improvement planning, identify staff development needs, assign student grades, and assign or reassign classes/groups.]
N=121 principal respondents
Figure 4b: Principals’ Reasons for Using District Assessments
[Bar chart: percent of principals responding “not applicable,” “never/rarely,” “sometimes,” or “often/always” for each reason: identify remedial needs, individualize instruction, correct curriculum gaps, increase parent involvement, evaluate teacher performance, recommend tutoring, school improvement planning, identify staff development needs, assign student grades, and assign or reassign classes/groups.]
N=124 principal respondents
These charts reveal a tendency among principals to “often” or “always” use both state
and district assessment data for a variety of similar reasons focused on student learning
needs and instructional improvement efforts, including: (1) identifying remedial needs;
(2) individualizing instruction for students; (3) correcting curriculum gaps; (4)
recommending tutoring for students; (5) preparing school improvement plans; and (6)
identifying staff development needs. Among other reasons for using assessment data,
principals cite increasing parent involvement, evaluating teacher performance, and
assigning students to classes or groups on a less consistent basis, while the majority of
principals “rarely” or “never” use state or district assessment data for assigning grades to
students.
Transfer of DDDM: From Principals to Teachers
Critical to the district’s expectations for principals’ DDDM is the ability to foster
teachers’ own DDDM capacity and behavior. The district relies heavily on principals to
transfer their own understanding of DDDM to teachers; therefore, we closely examined
the nature of principals’ efforts to transfer training to their teachers.
Overview of Key Findings
 There is a disconnect between principals’ expectations that teachers use data to inform
instruction and their leadership role as one that guides teachers’ ability to interpret data.
 Teachers have variable – and less than encouraging – experiences when it comes to the
provision of formal training for DDDM by their principals.
 Despite the inconsistency of formal training, most teachers believe that they have adequate
support from their principals for DDDM.
Expectations for Teacher DDDM
There appears to be an overall inconsistency between principals’ expectations for
teachers’ DDDM, the leadership role that they expect themselves to play in that process,
and the practices that they enact to fulfill those expectations. To begin with, principal
interviewees consistently spoke of an expectation that teachers use student achievement
to drive instruction. More specifically, they expect teachers to use data to determine
which students need help in various subjects and to plan lessons in accordance with
students’ demonstrated needs. In the words of one middle school principal, teachers
should use data to know “where [students] have come from” and to determine “where
you are with them and . . . where you need to be” (Principal interviews, 2007).
Principals also communicated a perceived role of ensuring that teachers understand
student achievement data. As one principal explained, it is the principal’s responsibility
to
“help [teachers] understand how to make sense of the masses [sic] of data that is
available to them.”
Less frequently, principals spoke of a need to ensure that teachers then use this
understanding in order to effectively inform instructional decision-making (Principal
interviews, 2007). Hence, there is a disconnect between principals’ expectations that
teachers use data to inform instruction and their perceived leadership role as one that
focuses more on data interpretation.
Training for Teacher DDDM
Principals vary regarding the nature of staff development at their schools. Approximately
two-thirds (65 percent) of principals responded “often” or “always” to a survey item
inquiring about the frequency with which they provide staff with necessary training for
DDDM; however, nearly one-third (28 percent) reported that they only “sometimes” do
this. Additionally, 63 percent of principals “agree” or “strongly agree” that they have
control over staff development plans at their school, while almost 20 percent “disagree”
or “strongly disagree” with this statement. Combined, these discoveries cast doubt upon
the consistency by which teachers are able to receive the expected DDDM training from
their own principals.
Efforts to uncover the nature of these staff development efforts were not entirely fruitful,
but we did discover a few noteworthy trends. Principal interviewees consistently
described staff development as “meetings” of some sort, whether one-on-one with
teachers, organized in groups such as by grade level or subject area, or conducted with
the entire school staff. It was difficult to identify any consistent formal training
procedures enacted by principals within their schools, although most principals did
discuss a similar focus on data examination and interpretation so that teachers might
understand student learning through data results.
In order to triangulate these findings, we compared results from principal surveys to those
of teacher survey respondents. When asked how often their principal provides training
for data use, less than half (40 percent) responded “often” or “always,” while nearly
two-thirds of principals responded “often” or “always” to a similar question on the survey
administered to principals.
Another set of survey items asks teachers to identify the provider – either school-based,
district-based, other, or not provided at all – of various types of DDDM training,
including:
 Classroom strategies for using data,
 Using technology for DDDM,
 Connecting data to student learning, and
 Evaluating and interpreting data
Although school-based training is the most frequently cited response for each type of
training activity – suggesting the role of principals in transferring learning for DDDM –
slightly under half of teachers chose that response. The one exception is training related
to the evaluation and interpretation of data, for which approximately 65 percent
responded that they receive school-based training.
Another set of survey items asked teachers about various characteristics of professional
development related to DDDM. Specifically, the items addressed whether (1)
professional development is focused on DDDM, (2) it helps teachers to better
understand data, and (3) it helps teachers to apply data to their teaching. In response to all
of these items, just over half of teachers either agree or strongly agree.
These less than encouraging findings are tempered by the fact that 65 percent of teachers
said that they “strongly agree” or “agree” that principals provide the necessary support
they require for DDDM. This alludes to the possibility that teachers do not perceive
formal training as the only type of necessary support to foster their capacity for DDDM.
That is, teachers may believe that other resources – such as access to data or informal
discussions with their principal – provide them with the support they need for DDDM.
Further, over 60 percent of teachers stated that principals “often” or “always” use
achievement data to develop instructional interventions. And, over two-thirds responded
that they “strongly agree” or “agree” that open and honest discussions about data occur at
their school. These last two examples imply that other types of principal support exist at
the school-level, suggesting that the findings may be evidence of a culture of shared
commitment to using achievement data productively for the progress of teaching and
learning. Nonetheless, the inconsistency of formal DDDM training experiences among
the district’s teachers should be noted.
Teachers’ DDDM: Capacity and Behavior
An analysis of MNPS teachers’ DDDM capacity and behavior provides an approximation
of principals’ ability to transfer DDDM capacity to their staff. As mentioned previously,
this transfer of training is a critical component of the district’s theory of action.
Admittedly, an assessment of teachers’ DDDM is not a perfect indication of principals’
ability to transfer understanding as instructional leaders; however, it is appropriate to
examine because if teachers have weak DDDM capacity and behavior it is unlikely that
principals are (1) acting as instructional leaders for DDDM or (2) enacting the most
appropriate efforts to foster DDDM among teachers.
Accordingly, this section provides an overview of teachers’ attitudes toward, knowledge
about, and behavior related to DDDM in order to garner a better understanding as to the
effectiveness of principals’ ability to pass DDDM capacity on to teachers.
Overview of Key Findings
 While district officials and principals believe teachers’ attitudes toward and knowledge about
DDDM vary throughout MNPS, the majority of teachers report that they value DDDM and
perceive themselves to be knowledgeable about it.
 Even when controlling for grade-appropriate assessments, teachers report using different types of
state and district assessments with noticeable variation.
 When it comes to making instructional and curricular decisions, teachers tend to use district
assessment results more than state assessment results.
 Teachers and principals have similar perceptions of DDDM norms among schools’ teaching
staffs, both believing that teachers practice DDDM frequently.
 Teachers perceive that teacher colleagues practice DDDM more than they do themselves.
What makes this examination all the more necessary is the suspicion among district
officials that principals’ ability to train teachers in DDDM is deficient. A few officials
believe that principals lack the depth of understanding necessary to pass knowledge about
data interpretation and data use on to teachers. Others see the problem as one of resource
limitations, specifically a lack of time to delve into deep learning opportunities with
school staff. As one district official explains:
I think it is slower in that process [principals training teachers in DDDM] than it had been
with [district training] the principals because of time factor. Some of our principals work
with their teachers during grade level team meetings, but that only works if you have
common planning time; and, depending on school schedules not all of the schools have
common planning time for all of their grade levels. So, I think that there’s more room for
improvement there.
(District official interview, 2007)
Teachers’ Attitudes about DDDM
District officials (interviews, 2006-07) have variable perceptions about teachers’ attitudes
regarding the value of data for instruction. As is evident from officials’ quotes below,
some cast teachers as resentful of data, others believe them to be proponents, while others
perceive teachers to have inconsistent appreciation for data use.
At the classroom level there are probably a lot of teachers who think, ‘just let me teach
my kids and I can evaluate their performance.’ I think that there is going to be some
resistance in a lot of classrooms to too much time being spent on data.
We still have principals and teachers who get through a testing window and they’re just
glad to be done.
They are very much into looking at students’ [data]: where are they and where do we
need to be?
I think that is how you get to the teachers; you have to make it personal and useful to
them individually.
(District official interviews, 2006-07)
In contrast to these uncertain perceptions is the consistency with which teachers
responded to survey questions regarding the value of data use. Survey items asked
teachers the extent to which they agree or disagree with a series of statements, including:
 “I value data for making instructional decisions.”
 “Data is important for monitoring student performance.”
 “Data assists with improving instruction.”
 “Data is positive for student learning.”
As detailed in Figure 5, the vast majority of teachers either “agree” or “strongly agree”
with each of the statements.
Figure 5: Teachers’ Perceptions about the Value of Data
[Bar chart: percent of teachers responding “strongly disagree,” “disagree,” “agree,” or “strongly agree” to each statement: value data for instructional decisions; data important for monitoring student performance; data assists with improving instruction; data is positive for student learning.]
N=620 teacher respondents
We also combined these four survey items into a composite measure of teachers’ value
for data use. With its high internal reliability (alpha = 0.91), this composite provides a
comprehensive variable for assessing teachers’ value for data as an instructional tool.
We found a mean score of 3.24 (between “agree” and “strongly agree” on the 4-point
Likert scale) with little variation (standard deviation = 0.57), which suggests consistent
opinions among teachers that data is a valuable resource for instruction – in notable
contrast to district officials’ variable beliefs.
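The internal reliability of such a composite is conventionally measured with Cronbach’s alpha. The sketch below computes it from scratch; the 4-point responses for six hypothetical respondents are made up for illustration and are not the actual survey data.

```python
# Cronbach's alpha: the ratio of shared variance among items to the variance
# of the total composite score, scaled by the number of items.
def cronbach_alpha(items):
    """items: one list of respondent scores per survey item."""
    k = len(items)
    n = len(items[0])
    def variance(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    item_vars = sum(variance(item) for item in items)
    # Each respondent's total score across the k items.
    totals = [sum(item[i] for item in items) for i in range(n)]
    return (k / (k - 1)) * (1 - item_vars / variance(totals))

# Hypothetical 4-point agreement responses (1 = strongly disagree ...
# 4 = strongly agree) for the four "value of data" items; illustrative only.
responses = [
    [4, 3, 4, 3, 4, 2],  # "I value data for making instructional decisions."
    [4, 3, 4, 3, 4, 3],  # "Data is important for monitoring student performance."
    [3, 3, 4, 3, 4, 2],  # "Data assists with improving instruction."
    [4, 2, 4, 3, 3, 2],  # "Data is positive for student learning."
]
alpha = cronbach_alpha(responses)
composite_mean = sum(sum(item) for item in responses) / (4 * 6)
print(alpha)           # ~0.92 with these illustrative responses
print(composite_mean)  # 3.25
```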
Teachers’ Knowledge about DDDM
Teachers’ knowledge about DDDM is another contributing factor to their overall capacity
to use data to inform instruction. Principals, for their part, appear to believe that teachers
do not possess the requisite knowledge to enact DDDM. All principal interviewees
expressed that their teachers would benefit from learning more about DDDM. The most
commonly stated issue was teachers’ ability to interpret data reports, with the other less
frequently expressed concern being the application of data to classroom-level decision
making (Principal interviews, 2007).
Again, teachers’ survey responses are in contrast with principal concerns. Survey items
asked teachers the extent to which they agree or disagree with a series of statements,
including:
 “I know how to collect data.”
 “I understand reports on student achievement.”
 “I am skilled with technology to use data.”
 “I know how to use data to improve instruction.”
As detailed in Figure 6, the vast majority of teachers either “agree” or “strongly agree”
with each of the statements.
Figure 6: Teachers’ Knowledge for Data Use
[Bar chart: percent of teachers responding “strongly disagree,” “disagree,” “agree,” or “strongly agree” to each statement: know how to collect data; understand reports on student achievement; skilled with technology to use data; know how to improve instruction using data.]
N=616 teacher respondents
As with the measure of teachers’ value for data use, we combined these four survey items
into a composite measure of teachers’ knowledge for data use. With its high internal
reliability (alpha = 0.82), this composite provides a comprehensive variable for assessing
teachers’ knowledge for data use. We found a mean score of 3.07 (favoring the response
of “agree” on the 4-point Likert scale) with little variation (standard deviation = 0.59),
which again suggests consistent opinions among teachers that they are knowledgeable
about data and the strategies for understanding it.
Teachers’ Behavior for DDDM
Similar to the principal survey, we examined the frequency of teachers’ use of various
types of data and the reasons for which they typically use data in their positions as
classroom instructors. This review of data use provides a comprehensive understanding
of teachers’ approach to DDDM in MNPS. It offers valuable information to confirm or
disconfirm principals’ and district officials’ doubts about teacher capacity for data use, as
previously described. Additionally, it enables our team to compare principals’ DDDM
behavior to that of teachers.
Frequency and type of data
As presented in previous discussions of principal behavior for DDDM, we also examined
the frequency by which teachers indicate using various types of grade-appropriate state
and district assessments. Table 10 below provides another overview of those
assessments. Due to the distinct nature of various grade configurations, it is most
appropriate for us to examine data use among each grade level separately.
Table 10: State and District Assessments by Grade Level

            Elementary School       Middle School           High School
State       TCAP Proficiency        TCAP Proficiency        Gateway
            TCAP Writing            TCAP Writing            TCAP Writing
            TCAP Value Added        TCAP Value Added        State End-of-Course
District    Language! Assessment    Language! Assessment    Language! Assessment
            District Writing        District Writing        District Writing
            District Reading        District Math           District Math
            District Math
Elementary schools typically use self-contained classes as the instructional norm; that is,
the same teacher will instruct a group of students in all core subject areas (e.g.,
reading/language arts, math, science, social studies). Accordingly, it would be expected
that, in general, all elementary teachers responding to this survey would need to use all
types of data listed above in similar ways. As is evident in Figure 7 below, teachers
tended to respond in very similar ways, with most types of assessments leaning
heavily toward one response category. For example, well over half of elementary
teachers responded that they use district math and district reading assessments “often” or
“always”; and nearly 50 percent indicated the same for district writing assessments.
Among other assessments, the district’s language test was “not applicable” for over 60
percent of teachers in elementary school, which may not be surprising as the assessment
begins in 4th grade. What is more surprising is the frequency with which elementary
teachers identify all state assessments as “not applicable” to their teaching position; over
40 percent responded “not applicable” for TCAP proficiency tests, over 50 percent for
TCAP writing, and nearly 50 percent for TVAAS. Even though TCAP testing does not
begin until the 3rd grade, it would seem beneficial for lower grades (e.g., kindergarten
through 2nd) to, at the very least, become familiar with school results as one indicator of
the adequacy with which students are being prepared for later elementary grades.
Figure 7: Elementary Teachers’ Type of Data Use
[Bar chart: percent of elementary teachers responding “not applicable,” “do not have access,” “never/rarely,” “sometimes,” or “often/always” for each assessment: TCAP Proficiency, TCAP Writing, TVAAS, Language! Assessments, District Math, District Reading, and District Writing.]
N=344 elementary teacher respondents
Analyses of data use among middle school and high school teachers must take into
consideration the distinct nature of their responsibilities compared to those in elementary
schools. As teachers tend to assume subject area specialties, they would logically not
find all types of subject-specific assessments applicable to their positions. However, it is
reasonable to assume that reading/language arts and math teachers would benefit from
examination and use of the state and district assessments listed in Table 10, as their focus
tends to be on these subject areas. Among our survey respondents, 54.2 percent and
40.6 percent of middle school teachers are reading/language arts and math instructors,
respectively; 40 percent and 19 percent among high school respondents. Figure 8 and
Figure 9 below illustrate the extent to which middle and high school teachers use
these state and district assessments.
Middle school teachers are much more disposed to consider TCAP proficiency and
TCAP writing assessments to be applicable to their positions and useful, as
approximately 50 percent of respondents indicated that they “often” or “always” use
those data results. Among the other assessments there appears to be great variation in
terms of applicability and use by middle school teachers, with the most common
responses being “never” or “rarely” and “often” or “always”, implying that there is a
good deal of inconsistency among this group of teachers in MNPS.
Figure 8: Middle School Teachers’ Type of Data Use
[Bar chart: percent of middle school teachers responding “not applicable,” “do not have access,” “never/rarely,” “sometimes,” or “often/always” for each assessment: TCAP Proficiency, TCAP Writing, TVAAS, Language! Assessments, District Math, and District Writing.]
N=150 middle school teacher respondents
Several distinct patterns emerge from an examination of high school teachers’ use of state
and district assessments. The most noteworthy trend is the variation among teachers’
responses to each type of state and district assessment; no single answer category
assumed even half of responses. Nonetheless, certain response categories do stand out on
a number of assessment types. First, teachers most frequently responded that they
“often” or “always” use Gateway (approximately 47 percent) and End-of-Course data
(approximately 38 percent). Additionally, the most common response to all district
assessments was that teachers “do not have access”, with between 30 and 40 percent of
high school teachers indicating that reply.
Figure 9: High School Teachers’ Type of Data Use
[Bar chart: percent of high school teachers responding “not applicable,” “do not have access,” “never/rarely,” “sometimes,” or “often/always” for each assessment: TCAP Writing, Gateway, State End-of-Course, Language! Assessments, District Math, and District Writing.]
N=88 high school teacher respondents
Some form of TCAP data is applicable to all grade levels – TCAP Proficiency in grades
3 through 8 and TCAP Writing for elementary, middle, and high school. Therefore, we
further analyzed the extent to which teachers make use of these data results in their
positions as classroom instructors. As with our principal survey, we asked teachers to
identify how often they used various types of TCAP data results, including: performance
level breakdowns, scale scores, disaggregated subgroup results, grade-level and
classroom results, individual student results, as well as trend data over several years.
Figure 10 below displays the distribution of teachers’ responses regarding the availability
and use of these types of assessment results.
Figure 10: Teachers’ Level of TCAP Data Use
[Bar chart: for each type of TCAP result – percent of students at performance levels, scale scores, subgroup results, grade-level results, classroom results, individual student results, and test score trends across years – bars show the percent of teachers selecting each response, from “not applicable” and “not available” through “available, used extensively.”]
N=603 teacher respondents
Overall there is great variation in responses, as no response category has over 30 percent
of teachers responding; however, some recurring trends do emerge. First, the most
frequent response to each level of data is “not applicable”; nearly a third of teachers
responded so for each category of data. This finding is unexpected, as our survey was
only administered to teachers in core subject areas (i.e., those who would need to use data
for their given subject area). Oddly, the second most common response for each category
of data is that data is “available and used moderately.” This implies that teachers’
DDDM practices with TCAP data are quite inconsistent throughout the district. It is
worth noting that “percent of students at performance levels” and “individual student
results” are most frequently used by teachers (i.e., “used moderately”, “used
extensively”), but still only approximately 40 percent of teachers reported those two
responses. In fact, no data category has over 50 percent of teachers reporting that they
moderately or extensively make use of it.
Differences between teachers and principals?
When comparing teachers’ and principals’ responses about the frequency with which they use
various types of data, it becomes clear that their behaviors are quite different. As discussed
previously, the vast majority of principals across all grade levels consistently use various state
and district assessments extensively. Teachers, however, exhibit a great deal of variation, even
when accounting for grade-appropriate assessments. And when patterns do emerge, they tend
toward responses that data is either “not applicable” or that they “do not have access” to data
results, especially among elementary and high school teachers, respectively.
In terms of using various levels of TCAP data, similar differences are evident. Specifically,
nearly half of principals responded that they use each level of data extensively, while teachers’
responses were spread across different answer categories, the most frequent being that data is
“not applicable” to their teaching position.
Reasons for data use
We also examined the reasons for which teachers personally use state and district
assessment data in their positions as classroom instructors. These reasons include: (1)
identifying remedial needs, (2) individualizing instruction, (3) improving curriculum, (4)
improving parent involvement, (5) evaluating one’s own teaching, (6) recommending
tutoring, (7) working on school improvement plans, (8) identifying personal professional
development needs, (9) assigning grades to students, and (10) sorting students into
classes or groups. As evidenced in Figures 11a and 11b on the following pages, teachers tend to
find district assessments more applicable than state assessments, and they use the former for
various reasons more frequently than the latter.
Among state assessments, teachers’ most common response was that they “often” or “always”
use results for the reasons outlined above, with the exception of several categories: improving
parent involvement, identifying personal professional development needs, assigning student
grades, and sorting students into classes/groups. However, it should be noted that even among
the other categories, no more than 40 percent of teachers responded “often” or “always.” It was
also surprising to discover that over 20 percent of teachers responded “not applicable” to each
category; that is, over one-fifth of teachers believe that using data for the reasons listed above is
not applicable to their roles as teachers.
Teachers appear much more predisposed to using district assessments for various instructional
purposes. In fact, for all but one category – assigning student grades – the most common
response was “often” or “always”; and 50 percent or more of teachers responded that way for a
majority of categories, including identifying remedial needs, individualizing instruction,
improving curriculum, evaluating own teaching, recommending tutoring, and working on school
improvement planning. In contrast to state assessments, only 10 percent or fewer of teachers
responded “not applicable” to each category. Finally, fewer than 20 percent of teachers
responded that they “never” or “rarely” use data for each of the given reasons, except for the
divergent category of assigning student grades, for which the figure was nearly 40 percent.
Figure 11a: Teachers’ Reasons for Using State Assessments

[Bar chart omitted. For each of ten reasons (identify remedial needs, individualize instruction, improve curriculum, improve parent involvement, evaluate own teaching, recommend tutoring, school improvement planning, identify personal PD needs, assign student grades, sort students into classes/groups), teachers reported: not applicable; never/rarely; sometimes; or often/always.]

N=604 teacher respondents
Figure 11b: Teachers’ Reasons for Using District Assessments

[Bar chart omitted. Same ten reasons and response categories as Figure 11a.]

N=606 teacher respondents
Differences between teachers and principals?
Overall, teachers’ reasons for using data parallel those of principals. Among both groups, the
most frequent response was “often” or “always” using data for the given reasons, and both
teachers and principals were least likely to use data for assigning grades to students. However, a
few noteworthy differences do exist. First, although both groups cite “often” and “always” as the
most common responses, principals do so at higher rates, suggesting that principals use data for
various instructional purposes more frequently than teachers. Additionally, principals are more
likely to use state and district assessments in very similar ways, whereas teachers tend to use
district assessments more than state assessments in their role as classroom instructors.
Norms of DDDM
Our team used an additional means to gauge the nature of teachers’ data use. In addition to
asking teachers about their personal data use practices, we asked them to report on the nature of
data use among teachers at their respective schools. These survey items measure the culture – or
norms – of DDDM among schools’ teachers. We also asked principals to respond to these same
survey items, in order to compare their perceptions of DDDM norms with those of teachers.
Table 11 provides an overview of teachers’ and principals’ responses to a series of survey items
asking respondents to identify how frequently teachers use data for various reasons.
Table 11: Principal and Teacher Perceptions about DDDM Norms

                                          Never/Rarely          Sometimes           Often/Always
Reason for Data Use                    Principal  Teachers   Principal  Teachers   Principal  Teachers
                                        (n=121)   (n=625)    (n=121)   (n=625)    (n=121)   (n=625)
Determine student instructional needs     2.4%      0.1%      12.7%     16.8%      81.0%     71.9%
Plan instruction                          3.2%      2.0%      10.3%     17.6%      84.9%     71.0%
Monitor student progress                  2.4%      1.7%      11.9%     16.6%      84.9%     71.1%
Sort students into ability groups         9.5%      6.1%      24.6%     20.4%      64.3%     59.4%
Strengthen instructional content          5.6%      4.9%      15.1%     19.1%      78.6%     61.4%
Improve teaching strategies               4.0%      3.3%      23.0%     22.0%      71.4%     63.3%
Involve parents in student learning      14.3%     14.5%      38.1%     27.7%      46.0%     43.0%
Recommend tutoring                        7.2%      5.6%      33.3%     23.4%      58.0%     61.4%
Identify special education needs          3.2%      3.1%      21.4%     21.5%      74.6%     65.2%
Measure instructional effectiveness       6.4%      4.9%      40.5%     24.4%      52.4%     60.2%
Collaborate to improve curriculum
  and instruction                         4.8%      7.2%      34.1%     27.2%      59.6%     53.9%
Make instructional changes                4.0%      3.3%      31.0%     25.0%      64.3%     62.2%
It is evident that both principals and teachers perceive DDDM practices among teachers to be
robust; for each type of DDDM activity – except “involving parents in student learning” – the
majority of respondents from both groups believe that teachers “often” or “always” enact these
practices. Additionally, these survey results suggest that principals and teachers have a similar
conception of DDDM norms among teachers, as their responses are quite parallel. Principals do,
however, tend to perceive teachers as practicing DDDM more frequently than teachers perceive
themselves; that is, they are more likely to respond “often” or “always” to most of the survey
items.
In addition to comparing teachers’ responses to those of principals, it is also of interest to
examine whether teachers’ perceptions of DDDM norms align with their personally reported
reasons for data use. Admittedly, the nature of the survey items does not permit a complete
comparison across all DDDM norms, but there are several opportunities for examination.
Analyses of these responses, reported below in Table 12, raise questions about the alignment
between teachers’ perceptions of personal use and those of their colleagues. Overall, teachers
tend to believe that school-wide DDDM practices occur more frequently than they report for
themselves, and these differences are most pronounced between teachers’ personal reasons for
using state assessment results and DDDM norms. It should be clarified that these findings do not
automatically discount teachers’ reports of personal use or those of their colleagues, for it is
possible – although somewhat unlikely – that teachers were thinking of DDDM norms for
achievement data other than state and district assessments (e.g., classroom-level assessments).
Table 12: Comparing Teachers’ Personal Data Use and Perceived DDDM Norms

                               Never/Rarely              Sometimes                Often/Always
                          State  District  Norms    State  District  Norms    State  District  Norms
Reason for Data Use      (n=604) (n=606)  (n=625)  (n=604) (n=606)  (n=625)  (n=604) (n=606)  (n=625)
Determine student
  instructional needs     10.2%   12.2%    0.1%    21.5%   19.5%    16.8%    38.8%   56.8%    71.9%
Sort students into
  ability groups          18.2%   18.7%    6.1%    20.3%   19.1%    20.4%    26.3%   46.1%    59.4%
Involve parents in
  student learning        22.0%   19.8%   14.5%    24.5%   29.6%    27.7%    25.0%   38.7%    43.0%
Recommend tutoring        11.5%   13.6%    5.6%    21.7%   23.3%    23.4%    37.3%   51.7%    61.4%
Measure instructional
  effectiveness            7.9%    9.2%    4.9%    20.1%   21.7%    24.4%    43.2%   58.1%    60.2%

Note: “State” and “District” columns report teachers’ personal use of state and district assessment
results; “Norms” columns report teachers’ perceptions of school-wide DDDM practices.
District Support for DDDM: Resources and Barriers
In order to develop recommendations for DDDM in MNPS, it is important not only to study
educators’ DDDM capacities and behaviors, but also to examine the resources currently provided
by MNPS towards this effort. Analysis of district documents, interviews, and survey findings
provides a comprehensive overview of these resources and their strengths and limitations.
Overview of Key Findings
MNPS provides a number of resources – including frequent meetings, data reporting, and assessment
software – related to DDDM; however, most emphasize data interpretation over data application.
While principals believe data is accurate and accessible, teachers are less convinced.
Both principals and teachers perceive untimely data reports and competing professional demands as
barriers to DDDM.
In addition to the concern of time, principals and teachers – as well as district officials themselves – agree
that the district has an underdeveloped and insufficient process for learning about educators’ needs for
DDDM.
District Resources for DDDM
MNPS provides an array of formal and informal learning opportunities for DDDM, most geared
towards principals. These resources include a combination of district-led meetings, on-site
school consultations, data reports, and technological support. The following sub-sections of this
report further explain the nature of these resources, their limitations, and educator perceptions of
these resources.
In general, principals seem very pleased with the nature of district support, but that is not the
case with teachers. We asked principals and teachers about the extent to which they agree that
the district provides needed support for DDDM. Over 80 percent of principal survey
respondents replied “agree” or “strongly agree”, while fewer than 30 percent of teachers replied
the same. These general statements provide a glimpse at educators’ perceptions, but further
analysis of documents, interviews, and survey results uncovers the nuances of these experiences.
Principal meetings and on-site consultations
Central office staff arrange regular meetings with principals several times each month, with a
focus on learning more about expectations for curriculum and assessment, as well as guidance
for data interpretation. Principals meet with (1) the Director of Schools once a month and (2) the
district’s Chief Instructional Officer and Director of Assessment and Evaluation once a month,
by grade tier, and biweekly if they are in a high priority school. The nature of these district-led
meetings has evolved over time, as explained by one district official below.
We were all kind of green with some of the use of the data, and so it didn’t take the first time
because it was just about a three hour workshop and we know that you have to have continuous
learning so we brought people back again, worked on it some more by school clusters; and, we
have a very large school system so we have 13 clusters of school groups, and we worked in those
clusters, as well as, as a whole district with principals and assistant principals. We did this over
the past two years. This fall, we worked especially with middle and high school. We had worked
a lot with elementary schools last year and in that training.
(District official interview, 2007)
We also conducted a review of all documents that were provided by the district to elementary,
middle, and high school principals at principal meetings during the 2005-06 academic year.
Various types of student achievement data were presented, with the vast majority of data coming
from state achievement assessments (i.e. TCAP and Gateway) and district level assessments. All
but two of the 34 reviewed documents presented aggregate student achievement data for the
entire district or for individual schools. As district officials emphasize the importance of using
data to improve the effectiveness with which principals inform instructional decision-making, it
is important to note that the following types of documents did not appear to be provided to
principals:
 Tools for interpreting available achievement data reports, especially at the classroom
level
 Descriptions of the potential power of student achievement data for improving
instructional decision-making
 Examples of how student achievement data should be applied to inform instructional
decision-making
In sum, it appears that principal meetings tend to focus on data interpretation, with some
emphasis on applications relevant to the responsibilities of school leaders, such as understanding
students’ scale scores on TCAP, interpreting the state’s annual report card, understanding trend
data across years, and using data to develop school improvement plans (District official
interviews, 2007). What tends to be lacking is a focus on instructional applications at the
classroom level. Additionally, a good portion of these meetings seems to focus on long-term data
use (e.g., interpreting trend data, school improvement planning), yet the district is characterized
by significant principal mobility, much of which results from transfers initiated by district
officials.
On-site consultations are provided by the Office of Assessment and Evaluation upon the request
of principals. These consultations usually take place during schools’ in-service days and offer
more in-depth interpretation of school-specific data than can often be accomplished by a
principal (District official interviews, 2006). Staff within this office tend to have limited time to
accomplish these consultations, as the district heavily relies upon them to fulfill these school
requests. As one official explained (2006),
Of course, it would certainly be more helpful if we had some additional assistance in our
assessment department. I think they’re really stretched and when they try to go out to individual
schools, individual cluster groups, its hard for them to cover all the territory.
Data reports
Additionally, the district’s Office of Assessment and Evaluation provides principals with ongoing data reports throughout the school year. With only one exception, principal interviewees
(2007) were consistently pleased with the support that they receive from the central office in
organizing and accessing student data reports. Principals expressed appreciation for the
formatting of reports and for the responsiveness of the central office to principal requests for
information. As one middle school principal said,
We get anything we need. We just have to ask Paul Changas. We can get just about anything we
need in any form that we need it. I commend our research and evaluation department for being
able to spit out just about anything that we need.
Both district official and principal interviewees recognize that dissemination and discussion of
data reports is a major focus in MNPS. And while some district officials wonder if they provide
too much information, most principals seem to feel that they and their teachers receive necessary,
quality data. Teachers, however, seem less convinced that data is accessible and accurate, as
evident below in Table 13. Teachers’ responses raise doubts about the district’s and principals’
effectiveness in providing teachers with the information needed for quality DDDM practices.
Table 13: Data Accuracy and Availability

                                          “Strongly Agree”/“Agree”
                                       Principals (n=123)   Teachers (n=619)
Teachers can easily access
  needed information                         73.7%               42.9%
Data available for those who need it         68.0%               49.3%
Data is accurate and complete                81.1%               54.7%
One final recurring theme was principals’ persistent concern about the timeliness of student
assessment results (Principal interviews, 2007), particularly for standardized year-end state
assessments. Although one principal suggested that the state department of education might be
responsible for the delayed reporting process, many principals seemed to believe that the district
is responsible for the slow turnaround of student testing results. This is a potent concern for
principals, one that needs to be discussed more openly in order to clarify the source of the
problem and identify strategies to resolve or compensate for it.
Technological support
MNPS has recently (Spring 2006) implemented the use of a new standards-based computer
software program, EduSoft Assessment Management System, designed to streamline data
collection and analysis, facilitate creation of and management of district assessments, and
provide educators with timely feedback on students’ test results. The district has also hired a
full-time EduSoft consultant to train educators in its implementation. Earlier attempts at training
in Spring 2006 were unsuccessful, as the district enacted a train-the-trainer model.
Unfortunately, many of the first-round trainees were unenthusiastic about the new system and
about the responsibility of training, while others were unaware of, or uncommitted to, the
expectation of scaling up training efforts at their respective schools (District official interviews,
2006).
Even if the new training approach provides quality learning opportunities, the district must also
encourage principals and teachers to value EduSoft as a worthwhile tool for DDDM. As one
principal interviewee (2007) explained, the system is time-consuming and, while capable of
providing useful data feedback, it is rarely worthwhile to spend so much time working with the
software.
Barriers to DDDM
In addition to learning about resources for DDDM, this project seeks to understand the barriers
that impede the practice of data use. Two primary types of barriers emerge from our
examination of interviews and surveys – limited time and limited understanding of educators’
data needs.
The issue of limited time is the most prominent barrier expressed by principals and teachers. In
fact, six of the 10 principal interviewees (2007) said that lack of time kept them from using
student data more effectively. Two other related issues compound the barrier of limited time –
delayed data reporting and high student mobility. The former issue, discussed earlier, refers to
state assessment results being provided to schools too late to be useful for school year planning.
Often, data reports are received in the weeks either preceding or following the start of a new
school year, leaving very little time to analyze the data, let alone apply it to instructional
planning. High student mobility also diminishes the ability to use data to inform instruction.
Presumably, students’ data do not always transfer readily when students move from one school
to another.
Principals were not alone in their concerns about limited time. We asked teachers to reply to a
series of survey items inquiring about various types of barriers to data use, including: outdated
technology, doubts about the importance of data, lack of principal support, and too many
teaching demands (i.e., not enough time). Teachers’ responses to these items varied, with
several noteworthy findings emerging.
 Nearly two-thirds of teachers rated “doubts about the importance of data” as either not a
barrier or a small barrier, supporting previous findings about the extent to which teachers
value data use.
 The majority of teachers (60 percent) think that “lack of principal support” is not a
barrier.
 Over half of teachers (60 percent) think “too many teaching demands” is a large barrier,
contributing to earlier evidence that lack of time significantly impedes using data well.
In addition to the concern of time, principals and teachers – as well as district officials
themselves – agree that the district has an underdeveloped and insufficient process for learning
about educators’ needs for DDDM. When asked if the district surveys educators “to see what
data is needed to inform instruction,” over half of teachers reported “never” or “rarely” with less
than 15 percent reporting “often” or “always.” Complementing these trends are principals’
responses; nearly half replied “never” or “rarely”, and less than 15 percent said “often” or
“always.”
An Examination of DDDM in MNPS:
Explaining Emerging Outcomes
Before developing recommendations for MNPS, it is important to understand how the findings
discussed above interrelate to shape outcomes of interest in the district’s theory of action for
principal DDDM. Accordingly, this section presents further analyses to clarify how the inputs,
intermediate goals, and long-term outcomes of the theory of action are related. In doing so, it
addresses the following questions:
 Do district support for DDDM, principal support for DDDM, and teacher experience
explain differences in teachers’ attitudes about and knowledge of DDDM?
 Is district support or principal support related to DDDM norms among school teachers?
 Do DDDM norms differ by grade level?
Overview of Key Findings
District and principal support for DDDM, in addition to teachers’ background, do help to explain
teachers’ attitudes about and knowledge of DDDM, but only minimally.
District and principal support for DDDM are both positively related to DDDM norms among teachers,
with the latter having the strongest relationship.
Norms of DDDM do vary between teachers of different grade levels, with elementary teachers exhibiting
the strongest DDDM norms.
Explaining Teachers’ Attitudes about and Knowledge of DDDM
The district believes that a measure of principals’ success with DDDM is the development of
teachers’ capacity for DDDM. As discussed throughout this report, teachers’ capacity includes
both attitudes about and knowledge of DDDM. It is, therefore, appropriate to learn more about
the factors that contribute to teachers’ capacity.
Using teacher survey results, we analyzed the extent to which various inputs – district support for
DDDM, principal support for DDDM, overall years of teaching experience, years teaching at
current school, and level of education – contribute to the nature of teachers’ attitudes about
DDDM.8 For this analysis, we used a composite measure of four survey items intended to gauge
the extent to which teachers value data as an important tool for instruction.9
In a regression analysis, all inputs except level of education prove to be statistically significant in
explaining differences in the extent to which teachers value data. As detailed in Table 14, all of
these variables are positively related to teachers’ attitudes, with one exception: the longer a
teacher has been at their current school, the less they value data. Principal support has the
strongest explanatory power, but not by a wide margin over the other inputs.
Table 14: Explaining Teachers’ Attitudes about DDDM

                                        Unstandardized           Standardized
                                        Coefficients             Coefficients
                                        B        Std. Error      Beta           Sig.
(Constant)                              2.770    .090                           .000
Receive needed support from district     .059    .021            .131           .005
Receive needed support from principal    .073    .023            .149           .001
Highest degree                          -.003    .025           -.004           .919
Years as teacher                         .008    .003            .131           .008
Years at current school                 -.011    .005           -.109           .022

a. Dependent Variable: Value of Data
Overall, most of the variables significantly contribute to explaining variability in teachers’
attitudes, with principal support having the greatest explanatory power. Nonetheless, much
remains unknown about why teachers’ attitudes vary, because the model explains only 8 percent
of the variation.
We conducted a similar regression analysis to better understand variation in teachers’ knowledge
of DDDM, using the same input variables as the previous model: district support for DDDM,
principal support for DDDM, overall years of teaching experience, years teaching at current
school, and level of education. For this analysis, we used a composite measure of four survey
items intended to gauge the extent of teachers’ knowledge of DDDM.10
8 Our measures of district and principal support are based upon two survey items asking the extent to which teachers agree that (1) the district provides needed support for DDDM and (2) their principal provides needed support for DDDM.
9 The composite measure of teachers’ value of DDDM is a four-item scale with high internal reliability (alpha = 0.913).
10 The composite measure of teachers’ knowledge of DDDM is a four-item scale with high internal reliability (alpha = 0.816).
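The footnotes report Cronbach’s alpha as the internal reliability of each composite scale. For readers unfamiliar with the statistic, here is a minimal sketch of its computation; the five respondents’ item scores below are hypothetical, not drawn from the survey:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a multi-item scale.

    items: list of k lists, each holding one item's scores across the
    same n respondents.
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)
    """
    def variance(xs):  # sample variance (n - 1 denominator)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    k = len(items)
    sum_item_vars = sum(variance(col) for col in items)
    totals = [sum(row) for row in zip(*items)]  # each respondent's total score
    return k / (k - 1) * (1 - sum_item_vars / variance(totals))

# Hypothetical responses from five teachers to a four-item scale (1-5 Likert)
item1 = [4, 5, 3, 4, 2]
item2 = [4, 4, 3, 5, 2]
item3 = [5, 5, 2, 4, 1]
item4 = [4, 5, 3, 4, 2]
alpha = cronbach_alpha([item1, item2, item3, item4])
print(round(alpha, 3))
```

Values near 1.0, like the 0.913 and 0.816 reported in the footnotes, indicate that the items move together closely enough to be averaged into a single composite.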
As detailed in Table 15, this model is more robust, explaining approximately 14 percent of the
variation in teachers’ knowledge of DDDM. Additionally, only three input variables – district
support, principal support, and level of education – are statistically significant, and each exhibits
a positive relationship with the dependent variable: as each of these inputs increases, so does
teachers’ knowledge of DDDM. Principal support again has the strongest explanatory power.
Table 15: Explaining Teachers’ Knowledge of DDDM

                                        Unstandardized           Standardized
                                        Coefficients             Coefficients
                                        B        Std. Error      Beta           Sig.
(Constant)                              2.270    .091                           .000
Receive needed support from district     .072    .021            .153           .001
Receive needed support from principal    .114    .023            .225           .000
Highest degree                           .076    .025            .119           .003
Years as teacher                         .004    .003            .065           .170
Years at current school                 -.005    .005           -.044           .344

a. Dependent Variable: Knowledge of Data Use
Relating DDDM Norms to District and Principal Support
The above analyses suggest that district support for DDDM, and especially principal support,
contribute to variations in teachers’ attitudes about and knowledge of DDDM; both having a
positive relationship with those outcomes of interest. To further the analyses along the theory of
action, it is also appropriate to learn how district and principal support are related to DDDM
norms among school teachers. The following statistical analyses disclose these relationships,
using a composite measure of DDDM norms.11
Conducting simple correlations, we found that both district support and principal support for
DDDM are positively related to DDDM norms. Specifically, district support has a statistically
significant, small positive relationship (r = 0.34); principal support has a statistically significant,
moderate positive relationship (r = 0.44). Therefore, as district support increases, so do DDDM
norms among school teachers, and the same holds for principal support.
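The coefficients above are Pearson product-moment correlations, which can be reproduced with a short routine. The ratings below are hypothetical 1-5 responses invented for the example, not the actual survey data:

```python
def pearson_r(xs, ys):
    """Pearson product-moment correlation between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical 1-5 ratings: principal support vs. a DDDM-norms composite
support = [1, 2, 2, 3, 3, 4, 4, 5, 5, 5]
norms   = [2, 1, 3, 2, 4, 3, 4, 4, 3, 5]
r = pearson_r(support, norms)
print(round(r, 2))
```

The statistic ranges from -1 to 1; by the conventions used in this report, values around 0.3 count as small and values around 0.4 to 0.5 as moderate positive relationships.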
Understanding DDDM Norms between Grade Levels
MNPS frequently conducts principal meetings along grade-level lines, meeting separately with
elementary, middle school, and high school principals. Accordingly, we examined whether
DDDM norms actually differ by grade level; if not, it might make sense for the district to at least
consider other training arrangements.
11 The composite measure for DDDM norms is a 12-item scale with high internal reliability (alpha = 0.962).
Using analysis of variance (Table 16), we uncovered statistically significant differences by grade
level. Elementary teachers have the highest mean score (3.88) on the composite measure of
DDDM norms, followed by middle school (3.34) and then high school (2.61).12 These
differences in means are all statistically significant (p < .05).
Table 16: Comparing DDDM Norms by Grade Level

ANOVA
                  Sum of Squares     df    Mean Square       F       Sig.
Between Groups       171.806          3      57.269        49.621    .000
Within Groups        701.708        608       1.154
Total                873.514        611

Pairwise comparisons of mean DDDM-norms scores (unique pairs):

Comparison                      Mean Difference    Std. Error    Sig.
Elementary vs. Middle                .54482*         .10356      .000
Elementary vs. High school          1.27007*         .12863      .000
Elementary vs. Other                2.01038*         .25961      .000
Middle vs. High school               .72525*         .14392      .000
Middle vs. Other                    1.46555*         .26751      .000
High school vs. Other                .74031*         .27818      .040

* The mean difference is significant at the .05 level.
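The F statistic in Table 16 comes from a one-way analysis of variance, which compares variation between group means to variation within groups. A minimal sketch, using hypothetical composite scores by grade tier rather than the survey data:

```python
def one_way_anova_f(groups):
    """F statistic for a one-way ANOVA over several groups of scores.

    F = (between-group SS / df_between) / (within-group SS / df_within)
    """
    n = sum(len(g) for g in groups)
    grand_mean = sum(sum(g) for g in groups) / n
    # Between-group sum of squares: weighted squared deviations of group means
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    # Within-group sum of squares: squared deviations inside each group
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    ms_between = ss_between / (len(groups) - 1)   # df_between = k - 1
    ms_within = ss_within / (n - len(groups))     # df_within = n - k
    return ms_between / ms_within

# Hypothetical DDDM-norms composite scores by grade tier (not the survey data)
elementary = [4.0, 3.8, 4.2, 3.6, 4.4]
middle     = [3.4, 3.2, 3.6, 3.0, 3.8]
high       = [2.6, 2.4, 2.8, 2.2, 3.0]
f_stat = one_way_anova_f([elementary, middle, high])
print(round(f_stat, 2))
```

A large F, such as the 49.621 reported above, indicates that the group means differ by far more than within-group noise would predict; the post-hoc pairwise comparisons then identify which specific grade levels differ.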
Overview and Implications of Findings
Key Findings and Implications for DDDM in MNPS
The above findings provide a comprehensive overview of DDDM in MNPS, addressing each of
the key components of the theory of action for DDDM. Overall, it appears that perceptions of
principals’ and teachers’ capacity for DDDM are inconsistent among key district stakeholders.
District officials have doubts about principals’ capacity, while principals report consistently
positive attitudes about, knowledge of, and behavior related to DDDM. The same is true of
teachers; that is, principals and some district officials have doubts, while teachers’ self-reports
stand in contrast to them.
Additionally, it appears that resources for developing DDDM – whether district meetings, data
reports, or principal support and training for teachers – are focused on data interpretation as
opposed to strategies for applying data for the improvement of teaching and learning, the latter
being the espoused goal of DDDM in MNPS. This discrepancy is compounded by a seeming
lack of time to devote to meaningful DDDM practices.

12 Mean scores for DDDM norms are based upon a five-point scale (never, rarely, sometimes, often, always).
These findings raise several key questions pertinent to the development of recommendations for
MNPS, namely:
 How can MNPS more accurately and regularly gauge the DDDM practices of its
educators?
 How can MNPS better direct DDDM resources and training efforts toward the
application of data for decisions about teaching and learning at the classroom level?
 How can MNPS provide principals with more explicit training focused on training
teachers in DDDM for instructional improvement?
SECTION 6
Recommendations
Drawing upon the findings emanating from our evaluation of DDDM in MNPS, our capstone
team devised a set of recommendations to improve DDDM among the district’s educators, with a
focus on principal leadership development. These recommendations reflect key principles of
model principal development programs throughout the nation while recognizing the distinct
contextual realities of MNPS.
Overview of Key Recommendations
Invest in teacher leadership.
Establish expectations of participation in current leadership development opportunities.
Create a district-wide induction program for principals and teachers.
Increase principal time to focus on leadership for learning.
Access funding resources for high quality leadership development.
Context
Building leadership capacity in schools continues to be a challenge for school districts. Districts
face huge shifts in expectations and accountability from state and federal authorities, along with
a shortage of qualified school leaders. According to Arthur Levine (2005):
Schools have the job of educating a population – that is experiencing dramatic demographic
changes, growing increasingly diverse, and becoming more and more segregated by income and
race – to meet today’s more rigorous state standards. And they must do this with a shrinking
number of experienced administrators and teachers due to retirements and departures from the
profession (p. 11-12).
In Rethinking the Professional Development of School Leaders, Kochan, Bredeson, and Riehl
(2002) expand the issues facing principal professional learning beyond attrition and
accountability:
The principal’s job responsibilities have been further complicated by expanded demands from
external constituencies, rapid growth in research on teaching and learning, changing
demographics of our population, and burgeoning access to information resulting from explosions
of new technologies (p. 289).
Part of the challenge facing school districts and school leaders is not only the changing landscape
of schools today but also the inability of university leadership development programs to
adequately prepare school leaders for the complexities of the principalship.
Through the examination of educational leadership preparation in universities across the nation,
Levine (2005) found that
Because traditional educational administration programs have not prepared school leaders for
their jobs, new providers have sprung up to compete with them. Because they have failed to
embrace practice and practitioners, their standards have fallen, and school systems have created
their own leadership programs (p. 68).
This is particularly the case in urban school districts like Metropolitan Nashville Public Schools.
Many urban school districts are designing multi-dimensional programs that identify leadership
potential in current employees, provide intensive leadership training for teachers and assistant
principals, and then support principals with high quality professional learning. This requires a
change in philosophy on the part of the school district and a shift in fiscal priorities to provide high
quality staff development for leadership. In light of tightening fiscal challenges for districts,
leadership development is often one of the first areas of the budget to be cut, causing a domino
effect of ill-equipped leaders serving challenging schools ineffectively. How does a district
break the cycle?
MNPS needs to develop a comprehensive leadership support program to develop leadership
skills in future leaders and support the various needs of current school leaders if it wishes to
improve its educators’ DDDM practices. To begin, it is recommended that the district examine successful
models of leadership development already employed in several school systems across the nation.
Model District Programs
Chicago Public Schools
Chicago Public Schools is one district that understands the importance of school leadership
development. The Chicago Public Schools contracts with the Chicago Leadership Academies
for Supporting Success (CLASS) to provide leadership development for the 1,800 school leaders in
the school system. The annual cost of the partnership to Chicago Public Schools is $1.8 million.
CLASS
CLASS operates with a clear vision and theory of action for change. The program goal of
CLASS is to develop leaders who embody the knowledge, skills, attitudes, behaviors, and
aspirations to lead schools where continuous capacity building leads to increased student
achievement. The theory of action includes the following steps (CLASS website, 2007):
Participants attend CLASS workshops.
↓
Participants learn workshop content through interaction, application and reflection.
↓
Participants access related resources for their schools.
↓
Participants apply content and strategies in the school.
↓
Participants expand and deepen their knowledge through succeeding sessions.
↓
Participants use content and resources to build capacity of others through professional
development.
↓
Participants implement coaching support and monitoring systems at the school level.
↓
Evidence of implementation is demonstrated by walk throughs, school visits, and
artifacts.
↓
Participants receive coaching and follow up from Area Instructional Officers.
↓
Staff performance and student achievement improve.
CLASS has a clear mission for leadership development at various levels of leadership skill and
experience. To address the diverse needs of school leaders, CLASS runs a variety of programs
including LAUNCH for aspiring school leaders, LIFT for beginning principals, Illinois
Administrators’ Academy (IAA), and the Chicago Academy for School Leadership (CASL). Each
program targets specific leadership needs for professional learning.
Aspiring leaders
CLASS offers the LAUNCH program for aspiring school leaders. Initiated in 1998, the
LAUNCH program “identifies, trains, and supports principals for the Chicago Public School
system.” According to the CLASS website (www.classacademies.org), LAUNCH has graduated
120 principals for Chicago Public Schools. The program has three phases: a
summer leadership academy, a one-year internship, and a network that provides ongoing
professional development. The curriculum for the program is based on the Chicago Public
Schools Principal Competencies. A key component of the program is the partnership with
Northwestern University’s Kellogg Graduate School of Management. The summer leadership
academy takes place on the Northwestern campus and program participants have access to the
University’s facilities and staff.
In addition to LAUNCH, the Chicago Academy for School Leadership (CASL) runs an assistant
principal’s academy. Over a one-year period, participants are trained in six topics including
“leadership, relational, literacy, mathematics, change, and organizational development.” The
focus of the program is on leadership for DDDM and increased student learning.
First year principals
First year principals in Chicago Public Schools benefit from the direction and support provided
by LIFT, the Leadership Initiative for Transformation. LIFT focuses on eight topics,
including:
 Instructional Leadership
 Human and Physical Resource Management
 Strategic Planning
 Technology
 Evaluation
 Financial Planning and Management
 Chicago Public Schools policies, goals, and system initiative implementation
The program uses a cohort model including a three day new principal institute, monthly
meetings, mentors, and field study groups.
Experienced administrators
Experienced administrators have access to ongoing professional development provided by
CLASS. The Illinois Administrators Academy (IAA) provides workshops and seminars to help
school administrators complete required professional development to renew their professional
license with the state of Illinois. Presently, administrators are required to complete 20 hours of
professional development and at least one Administrator Academy class annually.
A second program offered to veteran school leaders is the Chicago Academy for School
Leadership (CASL), which is described as a “portfolio program” of CLASS. CASL offers long-term
learning seminars to principals who have completed LIFT. Currently, three different
seminars are offered in the areas of leadership, high quality DDDM, and “change agentry.” The
program duration is one year and includes a portfolio component where participants are asked to
gather evidence of the implementation of new strategies and concepts that have been introduced
through the seminars.
Baltimore City Schools
Baltimore City Schools is another district that has a focused commitment on professional
learning for school leaders. Baltimore City Schools is a strong model for Metro Nashville
because it is only slightly larger in size. For the 2006-2007 school year, Baltimore City Schools
has 193 schools and 82,381 students. The district has a teacher and principal quality team that
oversees the professional learning for the school system. According to the district website
(www.bcps.k12.md.us), the Teacher/Principal Quality Team is responsible for the creation and
coordination of programs that promote personal growth and the preparation of tomorrow’s
school leaders. The district has created and expanded opportunities for “teacher leaders, aspiring
leaders, assistant principals, principals and senior staff” and is participating in several leadership
development partnerships, including: the Aspiring Leaders Program; the Principals’ Induction
Institute, a program of support for first and second year principals; and New Leaders for New
Schools, an intense preparation program for aspiring principals and assistant principal cohorts.
This shared responsibility for professional learning between teachers and leaders is a philosophy
that permeates the Baltimore City Schools professional development mission. The model for
leadership development includes recognition that a variety of programs are needed to address the
diverse needs of school leaders in different phases of leadership. The district allocates
approximately $1 million annually to leadership development programs that are district-run and
university-supported in order to meet the diverse needs of their learning organization. These
programs can be divided into three categories: future leadership development, new leader
support, and veteran administrator renewal. All three program types have a different emphasis.
Future leadership development
The first category, future leadership development, includes the Aspiring Leaders Program in
partnership with local universities; New Leaders for New Schools, a national initiative that
places aspiring administrators in an internship with a veteran principal for one year; and assistant
principal cohorts in partnership with the Institute for Learning at the University of Pittsburgh.
The Aspiring Leaders Program is coordinated by the district in partnership with local universities
willing to design and implement a cohort program of 18 credit hours that meets the state criteria
for the Administrator I certification, which is the credential required to be an assistant principal
in Maryland. Universities that are interested in providing these courses must apply with the
district and be approved. The district then allows teachers to apply for placement in the program,
and the district pays 100% of the cost. Teachers completing Aspiring Leaders are placed in
assistant principal positions in the district and encouraged to finish their master’s in Educational
Leadership to receive a full principal’s certification. Currently, 67 teachers participate in one of
the three cohorts. The district hopes to have 100 teachers enrolled in this program at a time to
provide a continuous pool of candidates for vacant assistant principal positions.
Once assistant principals are assigned to a school, they are encouraged to participate in the
Assistant Principal’s Cohorts, a partnership with the University of Pittsburgh’s Institute for
Learning. This program is designed to help assistant principals change their roles from
administrative tasks to instructional leadership. The intensive program includes monthly
meetings, outside readings, projects, and discussion groups designed to increase the capacity of
assistant principals in instructional leadership.
In addition to these two programs, Baltimore City Schools received a US Department of
Education grant to help fund the New Leaders for New Schools program. This program is
approved by the State of Maryland as an alternate licensure program. Once candidates complete
the program, they are placed in a principalship in the district.
New leader support
Recognizing a need for the intensive support of new school principals, Baltimore City Schools
has forged a partnership with Johns Hopkins University to provide a principal induction institute.
This program focuses on providing new school leaders with access to current research,
researchers, and colleagues to improve problem-solving and capacity-building competencies.
New principals participate in the Principal Induction Institute for the first three years of their
principalship in Baltimore City Schools.
Veteran administrator renewal
A final phase of professional learning in Baltimore City Schools is the selection and training of
veteran administrators to serve as mentors for new and aspiring administrators. The purpose of
this program is two-fold. Through the training program, experienced principals have the
opportunity to learn and access new research and current tools for school leadership, increasing
their capacity as leaders. In turn, these mentor administrators work closely with aspiring school
leaders, providing advice, support, and a collegial partner for problem solving.
Lessons Learned from Model Programs
Both of these model programs were selected because of their focused emphasis on various
phases of leadership development. In addition, the programs are comprehensive in addressing
the needs of the learner at each particular phase of leadership. Unifying themes that MNPS
should consider in the design and implementation of leadership support include:
 Forging partnerships with higher education institutions that provide access to
current research in education.
 Differentiating professional learning for various phases of leadership
development by recognizing the differences between aspiring administrators,
new principals, and experienced school leaders.
 Understanding that there will be beliefs and procedures unique to MNPS that
should be embedded in a local leadership development strategic plan.
 Actively identifying and training teacher leaders and aspiring administrators
to include training specifically for teachers interested in the assistant
principalship and assistant principals planning to move into the principalship.
 Recognizing that leadership development requires a targeted investment of fiscal
resources and time. Many of these programs are comprehensive and represent a relatively
small percentage of the district’s overall staff development budget.
Recommendations
Although concerns over principals’ DDDM initiated this inquiry, our capstone team finds that
many other factors must be considered in developing the capacity of school leaders in MNPS.
Recognizing that school leadership is becoming increasingly complex with new
demands from social, political, economic, and pedagogical perspectives, it is recommended that
MNPS adopt a multidimensional approach to the identification, development, and support of
building level leaders. The following action steps are recommended:
 Invest in teacher leadership.
 Establish expectations of participation in current leadership development
opportunities.
 Create a district-wide induction program for principals and teachers.
 Increase principal time to focus on leadership for learning.
 Access funding resources for high quality leadership development.
Invest in Teacher Leadership
Change is a prevalent theme when it comes to principals in MNPS. In the past five years, almost
all of the district’s schools have seen a change in head principal. Although change brings
new ideas and possible improvements, if it is not tempered with some stability, it can challenge
teachers in even the best schools. Teachers, by contrast, are often less transient; many stay in the
same building for their entire careers. To deeply embed a culture of DDDM in a learning community,
it is imperative to identify teacher leaders and build their capacity to understand and use data to
impact student learning. To do so, it is recommended that each school train a team of teachers
and administrators in the following areas:
 Basic statistics,
 TCAP interpretation,
 The TVAAS website,
 Differences in the types of standardized tests used in the district, and
 Research-based instructional strategies and interventions with empirical evidence of
impact on student learning.
These site data teams will be responsible for gathering evidence and working together with the
learning support team at the central office to write and implement the state required school
improvement plans.
Although a participatory process for school improvement planning already exists in most
schools, the difference here is the strategic identification and training of teacher leaders. Often,
teacher leaders are asked to serve on school improvement or building leadership teams without
the requisite knowledge to understand and interpret data. Due to frequent change in the
principalship, the learning support team at the central office should work with principals to
carefully select teacher leaders who have a vested interest in their particular school.
Induction
Building capacity in teachers and principals begins on the first day they walk into the district
as new employees. The beliefs, mission, and vision of the organization must be introduced and
translated into action steps and behaviors for new staff members. In Creating Great Schools, Phil
Schlechty (2005) refers to induction programs as a “high leverage activity”:
Creating an induction system is one such high-leverage activity, and any leader who is serious
about changing systems needs to give considerable personal attention to the design of this system
and should also be prepared to play a central role in the processes that are developed (p. 67).
Schlechty (2005, p. 68) goes on to identify the results of effective induction programs, which
ensure that:
 Knowledge about the school’s moral, aesthetic, technical norms and local conventions is
widespread throughout the group.
 Most members of the group have the skills needed to comply with the norms and have
internalized the norms to the point that the primary means of control is self control.
Consequently, the need for formal control is limited, and when notable deviations do
appear, informal sanctions will be sufficient to restore conformity.
 Commitment to the organization is high, and turnover is low.
 Peer support is high and sustained.
 Patterned deviation is limited.
In other words, everyone in the organization has a shared set of beliefs and actions. They
understand why and how the organization conducts business. In the case of MNPS, this would
mean translating the belief that data should be used to inform instructional practice and increase
student achievement into a tangible set of strategies and expectations. Everyone coming into the
district would understand the central purpose and ways of acting to accomplish the goals. This
requires the creation and sustained implementation of an induction program that supports
teachers and principals during their first two years of service in MNPS. Induction includes
the recruitment and retention of quality personnel.
Establish Expectations and Support Principal Participation in State and Local Programs
State programs
Principals are currently required by the state board to participate in the Tennessee Academy for
School Leaders (TASL) and complete continuing education hours every two years to renew their
license. Additionally, new principals are issued a Beginning Administrator License (BAL) and have
three years to complete a process, usually guided by a university, to advance their license to a
Professional Administrator License (PAL). TASL programs are paid for with state funding and
are free to school systems and school leaders in Tennessee public schools. Although a variety of
programs are offered, two in particular – the Beginning Principals Academy and the Assistant
Principals Academy – would benefit MNPS and help build the capacity of its school
leaders to use DDDM.
Several principals have participated in these programs in the past; however, there should be an
expectation that all beginning principals and assistant principals participate in these TASL
academies. These programs are provided at no cost to the district and allow principals to access
high quality training and network with State Department of Education resources, university
personnel, new principals across the state, and mentor principals.
The Assistant Principals Academy focuses on individual leadership development. Assistant
principals are encouraged to evaluate their individual leadership styles and build capacity as
instructional leaders. This is a development program that helps assistant principals think about
their role in schools and develop skills that would be critical as they move into an executive
principal position in a learning organization.
The Beginning Principals Academy is designed to support the needs of principals in their first
two years as a school leader. Principals are trained in DDDM, communication skills,
instructional pedagogy, financial management, and state policies/procedures. In addition,
academy participants benefit from a school climate inventory (SCI), administered by the
University of Memphis, to evaluate their leadership and identify needs in their learning
community. This inventory is provided free of charge to the program participant and is
administered both years the principal is in the academy. Additionally, academy participants are
assigned a mentor principal from a school with their same grade configuration. This mentor
administrator provides counsel and support for the new principals, facilitates a site visit to their
school, and visits the school of the new principal to observe, listen, support, and provide
suggestions.
Local programs
In addition to these state-supported programs, principals in MNPS have access to the Principal
Leadership Academy of Nashville (PLAN) offered by Vanderbilt University’s Peabody College.
PLAN is a leadership development program that works in cooperation with MNPS and focuses
on leadership for learning. The program is grounded in the belief that school leaders must have
the capacity to guide ongoing teacher and student learning. Towards this end, each year PLAN
provides a year-long development program to a cohort of existing and aspiring school leaders.
Cohorts meet for two weeks of intensive classes each June to kick off the year-long program.
This initial induction is followed by twice-a-month meetings throughout the remainder of the
school year. These PLAN meetings are supplemented by (1) a PLAN mentor who works more
intensively with leaders throughout the school year, (2) practice in conducting classroom
observations focused on instructional quality, and (3) the completion of a final deliverable to
address the real-time challenges facing their schools.
Currently, PLAN targets existing and aspiring school leaders who tend to be among the strongest
leaders within MNPS, the rationale being that the program wants to invest in leaders with
potential and commitment to the district. While this is a viable rationale for selection, the process
should be revisited if MNPS wants to develop the leadership skills – and especially the
DDDM capacity – of a broader range of leaders. So that more principals and aspiring
principals may benefit from the program, the district could consider two options: (1) admit a
group of participants with more diverse abilities into PLAN, or (2) provide opportunities for
PLAN participants to work with non-participants in an effort to broaden the impact of the
program among district educators.
Time
The overwhelming consensus of principals and teachers in this study is that a lack of time is a
barrier to DDDM. Although it is difficult to create more time during the instructional day, if the
district wants DDDM to be a part of organizational culture, more time must be allocated for
principals and teachers to work with data. To help with this process, there are two major
recommendations:
 Conduct an audit of administrative policies and procedures. Survey
principals and central office personnel to identify administrative processes that can be
eliminated or streamlined. Look for ways to use technology to organize and access
information. Identify bureaucratic barriers and reorganize central office services to
support DDDM in schools. It is recommended that the district contract with an outside
consultant to complete this audit and make recommendations to the administrative team.
 Increase the contract for principals from 11 months to 12 months. This additional month
of time can be used to plan professional development for individual schools, carefully
analyze and disaggregate data, and participate in leadership development programs
specifically designed to support principals in their roles as instructional leaders.
Access Funding Sources for High Quality Leadership Development
Funding for staff development is always a challenge in public schools. Annually, MNPS faces
budget cuts, and staff development programs receive a low priority. Presently, the professional
development department oversees a budget of approximately $10 million, of which about
$80,000 is spent annually on leadership development. To build capacity in school leaders, high
quality staff development for principals must be a priority and receive the needed fiscal support.
It is recommended that MNPS seek private funding and grants to support the development of
leadership training programs. Two sources of funding are currently available:
 Broad Foundation – The Broad Education Foundation, founded in 1999 by Eli Broad,
states that its mission is to “dramatically improve urban public education through better
governance, management, labor relations and competition.” Currently, the Broad
Foundation is accepting proposals from the top 100 urban districts in the nation for
programs to train aspiring principals. The deadline for proposal submission is June 15,
2007, with winning proposals receiving funding in January 2008. Proposal applications
are available on the Broad Foundation website at
http://www.broadfoundation.org/funding/rfp.shtml.
 The US Department of Education regularly provides grants and funding to Local
Education Agencies for the development of leadership preparation programs. As of
March 1, 2007, no programs are presently offered; however, the status of
federal grants is constantly changing, and funds may be available in the near future.
Requests for Proposals are posted on the Department of Education website at
http://www.ed.gov/fund/landing.jhtml?src=rt.
SECTION 7
Implementation Goals and Plan
The recommendations and future actions outlined in the previous section are meant to improve
the delivery of staff development and professional learning related to DDDM for MNPS. This
section provides further guidance as to the goals and action steps for implementing these
recommendations.
Overview of Key Implementation Steps
 Influence educators’ attitudes about student learning and achievement.
 Increase educators’ access to technology tools and data sources.
 Impact principals’ and teachers’ aptitude for data use.
 Facilitate instructional adaptations, whereby teachers will change their instructional practice
based on data to increase student learning and achievement.
This section provides a rationale for why our team believes these particular
interventions and actions are related to the intermediate goals and long-term outcomes identified
in the theory of action shown again here in Figure 12. It is understood that district constraints
(i.e., the MNEA contract, current budget, time, and current organizational model) impact the actual
implementation of recommended action. Additionally, some strategies recommended will have a
greater return on investment for the district than others as it relates to the ultimate outcome,
improved student achievement. Nonetheless, we believe that the objectives and action steps
discussed in this section address key components of the theory of action for DDDM, and
therefore, are essential to the improvement of DDDM among district educators.
Figure 12: Theory of Action Revisited
INPUTS
District Resources
Expert Staff
Technology and Information Resources
(i.e., Data reporting and communication)
Training for principals and teachers
↓
INTERMEDIATE GOALS
Principal Capacity
Attitudes
Knowledge/Self-Efficacy
Experience
Principal Behavior
Data use
Instructional leadership (i.e., use of time, training teachers)
↓
LONG-TERM OUTCOMES
Teacher Capacity
Attitudes
Knowledge/Self-Efficacy
Experience
Teacher Behavior
Data use
↓
ULTIMATE OUTCOMES
Improved Student Achievement
Implementation Goals
In planning for implementation, our capstone team has identified four areas of change, as
illustrated below in Figure 13. The implementation change cycle includes the following goals:
 Influence attitudes about student learning and achievement.
 Increase access to technology tools and data sources.
 Impact principal and teacher aptitude for data use.
 Facilitate instructional adaptations – teachers will change their instructional practice based
on data to increase student learning and achievement.
All four of the goals identified in the implementation change cycle are interrelated and cannot be
addressed in isolation. However, as Figure 13 illustrates, all four goals are connected in helping
achieve the ultimate outcome of improved student achievement.
Figure 13: Implementation Change Cycle
[Figure: a cycle in which four goals – Influence Attitudes, Increase Access, Impact Aptitude, and
Instructional Adaptation – surround the central outcome of Student Learning and Achievement.]
Influence Attitudes about Student Learning and Achievement
The first area targeted by the implementation change cycle is principal and teacher attitudes
about DDDM. This goal connects to the intermediate goals and long-term outcomes of the
theory of action: principal and teacher capacity. Often, school leaders feel overwhelmed by the
volume of data available. Teachers and principals are frustrated by the lack of time available to
analyze data, and some have come to resent conversations focused on DDDM. Influencing
attitudes about DDDM establishes a foundation for change. Principal attitudes help shape the
culture of the learning organization. If DDDM is a priority for MNPS, principals must believe
that it has a relationship to student learning and act according to that belief in the design and
sustenance of their schools as a learning community. In Professional Learning Communities
That Work: Best Practices for Enhancing Student Achievement, DuFour and Eaker (1998) write:
Principals of professional learning communities lead through shared vision and values rather than
rules and regulations. Principals of a learning community engage the faculty in a co-creation of
shared vision and values. They facilitate consensus building and conflict resolution and
demonstrate their sincere interest in finding common ground. When their staff members become
collectively committed to shared vision and values, principals then focus on these common hopes
and commitments as the driving force in school improvement (p. 184-185).
To influence attitudes at the building level about the importance of DDDM, principals must first
believe in its value, and then translate that belief into a culture in their own learning communities
with teachers and staff.
Teacher attitudes impact the delivery of instruction. Spillane and Louis (2002) write, “Teachers’
beliefs and expectations are also critical, because they may influence the ways in which
classroom opportunities to learn are mobilized.” Attitudes direct actions in schools around
assessment practices and student learning. If principals do not create a culture of learning around
the belief that data informs instructional practice, and teachers do not believe that data is a
valuable part of their classroom practice, then it is very difficult to connect data use to student
learning. Staff development interventions recommended in the previous section must include as
a core component activities that challenge principal and teacher beliefs about the connection
between student learning and assessment.
Increase Access
Teachers and principals cannot use data they do not have readily available. New technology
tools can help teachers and schools organize and access student achievement data more
efficiently than in previous years. However, this data is only effective if the right people have
access when they need it. For example, our capstone team found that most schools accessed the
TVAAS website fewer than five times during the past school semester. This tool allows
principals to assign individual teacher passwords, giving teachers access to student performance
data, student growth data, projection graphs indicating trajectories for future performance, and
disaggregated group data. If every teacher had accessed the site even once, the login rate would
have been much higher. The low login rate suggests that either teachers do not have access or
they do not understand the value of the tool.
Communicating available technology tools and ensuring access is the second goal of the
implementation plan.
Impact Aptitude
Having data is not enough for teachers and principals. To use data effectively, education
practitioners must understand the types and uses of data in educational assessment and
accountability systems. To develop this capacity, school leaders and teachers must engage in
conversations about student data, statistical measures applied in student assessment practices,
and appropriate interpretations and application of data. NSDC Standards for Staff Development
(2001) explains:
If data are to provide meaningful guidance in the process of continuous improvement, teachers
and administrators require professional development regarding data analysis, designing
assessment instruments, implementing various forms of assessment, and understanding which
assessment to use to provide the desired information. Because the preservice preparation of
teachers and administrators in assessment and data analysis has been weak or nonexistent,
educators must have generous opportunities to acquire knowledge and skills related to formative
classroom assessment, data collection, data analysis, and data-driven planning and evaluation (p.
16).
In other words, principals and teachers must understand what the data tells them about student
learning and have regular building-level professional learning opportunities focused on data use.
Instructional Adaptation
Understanding data is not enough. One of the most difficult pieces in the DDDM process is
helping teachers translate data into a change in instructional practice. Often, this requires
teachers to develop skills in differentiation of instruction and the ability to provide targeted
instructional support to individuals or small groups of students in the classroom. Teachers need
help answering the question, “Now, what?” In the implementation change cycle, once teachers
believe that data informs instruction, have access to data and technology tools needed to organize
the data, and understand the different types and appropriate uses of the data, they can proceed to
changing instructional practice to support identified student learning needs. Not only do teachers
need help developing this capacity; principals also need skill in instructional leadership to support
teachers in evaluating student data and translating the data into measurable actions. This last
component appears to be the area in need of greatest improvement among MNPS educators.
Implementation Plan
Our capstone team’s recommendations can be implemented in three phases. As mentioned
earlier in the recommendations section, building the capacity of school leaders for
DDDM will require an investment of time and financial resources to adequately support the
identified needs. This should, however, be viewed as an investment in the future leadership for
learning in MNPS. Providing targeted support for instructional leadership over time will help
MNPS reach the final outcome of increased levels of student learning and achievement.
Phase One
In phase one, it is recommended that MNPS focus on the use of present resources and secure
funding for future programs. The following steps are recommended for immediate action during
the remainder of the 2006-2007 school year:
 Complete the Broad Foundation application for funding before the June 15, 2007
deadline. This document can be used as a foundation and rationale for the concept
paper required as part of the grant application. These funds could provide the
necessary resources to fully implement the suggested programs in phases II and III.
 Require current first-year principals to enroll in the Beginning Principal’s Academy
(BPA) provided by the Tennessee Academy for School Leaders (TASL).
Additionally, any new principals hired after the issuance of this report should be
required, as part of the administrator induction process in MNPS, to register for BPA
with TASL.
 Identify future principals presently serving as assistant principals and encourage them
to enroll in the TASL Assistant Principals Academy.
Phase Two
Phase two recommendations require some financial planning and resource allocation or
adjustment. The recommendations in phase two are to be implemented in the 2007-2008 school
year.
 Lengthen the contract of all principals from 11 months to 12 months. This increased
time would allow principals to attend high-quality staff development, analyze student
data, and plan for instruction and professional learning in their schools.
 Develop a plan to ensure that all building-level principals participate in Principal
Leadership Academy Nashville (PLAN) at Vanderbilt University. Initiate a
conversation with Vanderbilt about designing and implementing a principal induction
program for new principals in the district.
 Train site-based data teams for each school. Have principals identify 4-5 teacher
leaders in each school building who have the potential to serve as data team leaders in
their schools. Meet with data teams once a month during the 2007-2008 school year
to build the capacity of each data team and school principal for DDDM in the school.
Focus the content of the training on understanding research that supports DDDM in
learning communities, research-based instructional strategies that support student
learning, and the capacity to analyze student data using statistical methods
appropriate to state and district test data. Teams should be trained in using the
TVAAS website and Edusoft software to organize school-level data. In each monthly
meeting, teams should work with authentic, real-time data from their schools.
 Provide monthly professional learning for building-level principals. These meetings
should focus on leadership issues impacting the profession and provide an
opportunity for principals to network and develop skills around the theme of
instructional leadership in learning communities. To help frame the design of these
professional meetings, the National Staff Development Council’s Standards for Staff
Development can be used to define appropriate formats and topics of conversation.
 Conduct an audit of principal administrative tasks. Identify tasks that can be
reassigned to central office support staff, building-level support staff, or assistant
principals to provide additional time for principals to focus on instructional
leadership.
 Design an RFP (request for proposal) for a university partnership to train aspiring
leaders. This program should be designed to embed the skill set needed by effective
school leaders of the future. The program should include a licensure and internship
component.
 Identify and begin to train retired principals willing to serve as mentors and
leadership coaches for new school principals.
Phase Three
The following steps should be implemented in the 2008-2009 school year.
 Begin a teacher induction program for new teachers. This program should include an
orientation during the summer before the first day of school, a minimum of two
school days or professional development days during each semester of the first two
years as a teacher, and a week-long summer institute following the first two years of
teaching. Topics should include assessment systems and practice, curriculum,
technology, district core beliefs about teaching and learning, and differentiation of
instruction.
 Design a differentiated leadership training program for veteran administrators. Seek
input from higher education institutions to design relevant, cutting-edge academies for
school leaders. These might include online professional resources and networks,
summer institutes with world-class experts in education, topical study groups
facilitated by experts in the field, or local district-initiated and -led groups focused on
identified district needs.
Finally, we recognize that any of these recommendations could be implemented with variable
success if quality checks are not instituted. Therefore, the following section addresses the
advised steps for monitoring the quality and effectiveness of these program recommendations for
improving principal leadership for DDDM.
SECTION 8
Future Evaluation of DDDM in MNPS
This section provides a rationale and action steps for monitoring and evaluating the fidelity by
which key recommendations are implemented as well as the effectiveness of these
recommendations for improving DDDM in MNPS. Drawing upon a body of literature that
speaks to the importance of monitoring change efforts, we expand upon the following key ideas.
Overview of Key Evaluation Steps
MNPS should monitor the quality of DDDM improvement efforts using a combination of
formative and summative evaluation initiatives.
Formative evaluation efforts should monitor the development of educators’ attitudes about,
knowledge of, and practice of DDDM in order to regularly adjust improvement strategies.
Summative evaluation efforts should provide MNPS with evidence to assess the impact of their
strategies to improve DDDM.
In the previous two sections, we have outlined recommendations for improving the quality of
DDDM in MNPS. These recommendations include:
 Investing in teacher leadership
 Establishing expectations of participation in current leadership development
opportunities
 Creating a district-wide induction program for principals and teachers
 Increasing principal time to focus on leadership for learning
 Accessing funding resources for high quality leadership development
In addition to these recommendations, we have outlined specific action steps to facilitate their
implementation. Specifically, we have advised MNPS to implement the following core
components of the implementation change cycle:
 Influence attitudes about student learning and achievement.
 Increase access to technology tools and data sources.
 Impact principal and teacher aptitude for data use.
 Instructional adaptations – Teachers will change their instructional practice based
on data to increase student learning and achievement.
Rationale for Program Evaluation
These recommendations require substantial resources in the form of personnel, time, and
financial investment. It is a large task to bring about comprehensive, significant improvement in
principal and teacher use of student achievement data. Our capstone team believes that such a
significant investment will be wisely made if it results in school and classroom level
improvements in using student achievement data to drive instructional decision-making. In order
to gauge the effectiveness of such a bolstered effort to encourage DDDM, it will be necessary for
the district to engage in on-going evaluations of its efforts.
At a time when program evaluations were beginning to be widely used, Cohen and Garet (1975)
suggested that
[s]ocial policy can be improved by getting better information about it. . . The idea that better
knowledge produces better decisions is not exactly a political novelty . . . . But reforming social
policy on a large scale by changing the information upon which it is based is a recent
development and it is one of considerable proportions (p. 123).
Cronbach et al. (1980) provided a similar rationale for the general need for evaluative studies of
social programs:
An evaluation pays off to the extent that it offers ideas pertinent to pending actions and people
think more clearly as a result. To enlighten, it must do more than amass good data. Timely
communications—generally not “final” ones—should distribute information to the persons
rightfully concerned, and those hearers should take the information into their thinking. To speak
broadly, an evaluation ought to inform and improve the operations of the social system. (Toward
Reform of Program Evaluation; quoted in Rossi et al., 2004, p. 24).
As illustrated above, efforts to measure the effectiveness of programs and policies offer great
opportunity to improve future performance. This potential applies for all kinds of social
programs and policies. Certainly, those who make decisions or design programs of significant
importance should be keenly interested in their subsequent consequences and results.
While program evaluation can be helpful for a variety of situational contexts, logic implies that it
should most vigorously be applied towards a program whose very purpose is to improve the use
of DDDM. To rephrase the above quotes, program evaluation holds the potential to produce
valuable data that can be applied to improve future decisions. These quotes speak of program
evaluation as being, in its purest form, a provision for DDDM. MNPS should use a vigorous
application of program evaluation (i.e. DDDM) in order to inform decisions regarding its own
efforts to improve DDDM in the district. Doing so would serve to improve the quality of the
adopted DDDM program and also model the very behavior that MNPS desires to see in its
building principals and teachers.
The Challenge of Transfer of Training
MNPS has expressed a disappointment with its efforts to train principals to use DDDM, and the
results of our study confirm the belief that professional development of principals regarding
DDDM has not transferred as productively as possible to the “core technology” of schooling:
teaching and learning in the classroom (Murphy, 2001). Although disappointing, such a result
should not be a major surprise. Across organizations, there is a significant challenge in ensuring
that the professional development provided to employees is actually applied in the workplace.
Broad and Newstrom (1992) have strong words regarding this challenge:
Most . . . investment in organizational training and development is wasted because most of the
knowledge and skills gained in training (well over 80%, by some estimates) are not fully applied
by those employees on the job (p. ix).
Thus, while we offer substantial recommendations and action steps to improve the professional
development and skills offered to principals and teachers, MNPS should also recognize the
difficulty in ensuring that training will result in a higher quality use of skills. It is this challenge
that, once again, points to the need to systematically monitor the effectiveness of the newly
bolstered efforts to encourage DDDM in MNPS schools.
The actual application of newly acquired skills can be as challenging as the initial learning of
these skills. The difficulty clearly lies in ascertaining what amount of information learned in
professional development activities is actively applied to instructional decision-making in
schools. MNPS must seek to learn, in increasingly accurate terms, about the actions of principals
and teachers inside the school building and classroom. This is a difficult task, as evaluations
described below will rely in large part on self reports. It is here that proactive steps must be
taken in order to ensure that principals and teachers understand that their self reports will neither
personally benefit nor harm them. Similarly, principals and teachers must be brought to
understand the importance of the provision of accurate data during such ongoing evaluations. In
sum, the district must work to build a climate of trust, or at least begin with a circle of trust
around issues of DDDM, in order to make sure that the data upon which decisions are made are
accurate. Further steps to ensure that data provided during evaluations of the district’s DDDM
program is transparent and accurate will be discussed in the specifics of program evaluation
below.
Future Evaluations of DDDM
In this section, we will outline specific recommendations for ensuring continued and productive
evaluations of the district’s policies and training surrounding DDDM. As discussed previously,
such evaluation holds the potential to inform decisions related to the program being studied. The
context surrounding MNPS, like all American school districts, is one that is continually pushing
for increasingly more data upon which to base instructional decisions. It is our recommendation
that MNPS continue to embrace such a desire for accountability and student achievement. A
sign of this commitment will be evident in the following recommended evaluative strategies.
First, we will examine methods to institute periodic, formative evaluations. Then, we will
describe a method for conducting annual evaluations, which are meant to be summative as well
as formative in nature. We recognize that constraints on personnel, time, and other resources
may limit the extent to which all evaluative initiatives can be implemented. However, we do
believe that the methods discussed below present several important options for consideration if
MNPS is to truly improve DDDM among its educators.
Ongoing and Formative Evaluation
The purpose of measuring results of the district’s DDDM efforts in an ongoing, formative
manner is to provide regular feedback regarding which components are working well and in
which areas the district needs to improve. In the same way that teachers and schools desire
regular feedback regarding their students’ learning, so the district needs to receive similar
information regarding principal and teacher learning about DDDM.
It is recommended that MNPS conduct these formative evaluations of principal and teacher use
of DDDM. In a sense, this ongoing formative evaluation can be compared to teacher-created
formative assessments whose results help to inform the teachers’ future instruction of his or her
students. These formative assessments are meant to be conducted by and for the district with the
sole purpose of providing useful data that will inform future professional development
opportunities offered by the district throughout the remainder of that academic year. Below are
the main components and characteristics of this continually formative assessment.
Data collection and reporting
Results of formative assessments of program implementation will be provided to all pertinent
MNPS staff at the time of issuance of student report cards. There are multiple reasons for such
timing. First, reporting results every nine weeks[13] provides frequent enough reporting for staff
responsible for professional development to gauge which areas of training are being adequately
understood and which areas or types of training need to be improved. Secondly, timing the
reports with student report cards provides a symbolic statement of the district’s commitment to
using data to improve student learning opportunities and also to improve internal operations.
Findings from these formative evaluations are to be reported only internally within MNPS.
These formative assessments are solely meant to inform decisions made regarding MNPS efforts
to improve DDDM. Thus, there is no need to report periodic findings to individuals outside of
MNPS. Additionally, limiting usage to internal improvement will
communicate to all MNPS that there are no rewards or sanctions accompanying positive or
negative results. This will encourage more accurate self reports on surveys.
Components of formative evaluations
Teacher Surveys. Teachers will be surveyed in order to ascertain their views of (1) DDDM and
(2) the effectiveness of district efforts to improve data use. Specifically, these surveys will
provide district staff with a periodic snapshot of teacher attitudes towards data use, types of data
used by teachers, availability of student data, purpose of data use, frequency of data use,
professional development, professional climate, and barriers towards data use. The teacher
survey found in Appendix E, which was used in our evaluation of the district’s use of DDDM, is
a starting point for developing this teacher survey. MNPS should revise this survey instrument
in order to drive at particular points of interest. A sample representative of the entire district
should be surveyed during each reporting period.
Teacher Cohort Study. In addition to the above broad surveys of a representative sample of
teachers, MNPS should also conduct a longitudinal study of a purposefully chosen teacher
cohort. This study will allow the district to monitor the growth (or lack thereof) of a small group
of teachers in their abilities and applications of DDDM. This will allow MNPS to conduct
follow-up interviews with a limited number of teachers in order to dig more deeply into
particular areas of interest, especially in understanding how DDDM is being applied to improve
classroom instruction. While uncovering classroom practice is a time intensive process, this
cohort study model is a viable strategy to do so because of the opportunities for qualitative
analyses among a more manageable number of teachers.
[13] Beginning in the 2007-08 school year, MNPS will be changing from a six-week report card schedule to a
nine-week schedule.
Principal Surveys. Principals will be surveyed for the same purposes as teachers. MNPS should
regularly ascertain from principals their views of (1) DDDM and (2) the effectiveness of district
professional development efforts. The principal survey should run largely parallel with the
teacher survey in content in order to shed light on the validity of respondent answers. For
instance, if principals report that all teachers have been trained in the use of DDDM, while most
teachers report not having been trained, then MNPS will be alerted to the need to re-communicate
the importance of accurate self reporting on formative surveys. All principals should be
surveyed during each reporting period.
Survey Logistics
It is crucially important that survey respondents provide accurate information. Response sets
that show an unrealistically positive portrayal of data use will negate the entire purpose of this
formative assessment, namely to improve internal operations of the district’s DDDM training
efforts. Such a situation is analogous to the difficulties that a teacher would face in attempting to
make sense of and apply inaccurate student achievement data. While our capstone team has
reason to believe that some respondents provided unrealistically positive responses to survey
items, MNPS has the opportunity to improve the authenticity of survey response data by
ensuring that the following logistical measures are taken. Unlike our capstone team, MNPS will
be able to completely control all details of evaluations, and it should ensure that such control is
utilized to provide for data of the highest quality.
Most importantly, MNPS should ensure anonymity of survey responses. Principal surveys could
continue to be efficiently completed at principal meetings. During this time, it is recommended
that all central office personnel leave the room. Also, individual surveys can be placed by each
respondent in a sealed envelope, which would be collected before district staff members return to
the meeting. Principals should be assured that there is no identifying feature on the surveys.
Teachers should be given the option of completing an on-line or paper survey. Providing such a
choice may help to ensure a higher response rate. Paper surveys should be distributed with a
self-addressed envelope to be returned to the central office via board mail. In both cases,
teachers should be assured that there is no identifying feature to surveys.
For the cohort teacher study, MNPS should assure teachers that their responses will be kept
anonymous where possible and confidential when anonymity is not feasible.
Assessments of educator knowledge of DDDM
As a part of ongoing formative evaluations, our team suggests that teachers and principals should
be assessed to determine their knowledge of DDDM. Surveys that seek to determine attitudes
towards and usage of DDDM are insufficient tools for determining an important outcome of
professional development: improved principal and teacher knowledge of the content of DDDM.
These assessments should be conducted in a manner that conveys their importance without
causing undue individual discomfort or embarrassment.
Meaningful assessment is clearly a large component of today’s standards-based reform, and there
is much truth in the saying that what gets measured gets done. The unusual step of testing for
principal and teacher knowledge of DDDM would serve to provide the district with useful
formative data regarding strengths and weaknesses of principals’ and teachers’ knowledge, and
demonstrate to principals and teachers that the district truly values test data as an indispensable
part of a sincere commitment to student learning. Such a symbolic demonstration could only
strengthen principal and teacher enthusiasm for actively seeking to understand and apply student
achievement data.
The logistics for administering these assessments depend on the MNPS work schedule and
calendar, which are unknown to the present researchers. Regardless of how these assessments are
administered, we do recommend that they address concepts beyond static knowledge of data and
assessment facts; they should also gauge educators’ ability to apply assessment results to various
instructional scenarios. We are confident that reliable, valuable data regarding principal and
teacher knowledge of DDDM can and should be obtained.
Annual Program Evaluation
While the purpose of the above assessment plan is strictly formative in nature, an annual
summative evaluation of the district’s DDDM program efforts is meant to provide a more
thorough and comprehensive assessment of the program. As this summative evaluation would
likely be used to make decisions regarding increased/decreased funding, expansion/contraction
of program services, and even continuance of the program, it is recommended that the annual
evaluation be conducted by a third party from outside of MNPS. Below are some suggestions
for possible providers of high quality program evaluations.
Suggestions for Evaluators
Future Peabody College Capstone Teams
This report has been conducted by a capstone team of doctoral students at Peabody College of
Vanderbilt University. This is the first cohort of Ed.D. students at Peabody College and is
similarly the first year that Peabody students have conducted a capstone project. Future cohorts
will be conducting similar projects, and it would perhaps be mutually beneficial to MNPS and
Peabody College for future Ed.D. students to build on the work of this initial project.
University Faculty Researchers
The issue of DDDM is currently one that researchers are actively studying. University
researchers might be enticed by the opportunity to work with a district that is welcoming and
open to productive inquiry. If this route were chosen, it is recommended that MNPS begin with
an invitation to Peabody College professors and researchers in order to capitalize on their
experience and benefit from a continuity of purpose with this initial capstone project.
Educational Consultants
Although hiring consultants to evaluate the MNPS program is clearly the most costly option, it is
believed that a high quality, year-end evaluation would be worth the expense. The district
simply cannot afford to put great effort into this DDDM project without an assurance of a
realistic report of program effectiveness.
Focus of Summative Evaluation
Our team recommends that MNPS contract with one of the above providers of evaluative
research for an annual, high quality evaluation of the DDDM program’s effectiveness. MNPS
should charge the researchers with providing a comprehensive evaluation of the following areas
of interest. The issues of concern are those that Rossi et al. (2004) suggest as components of
high quality program evaluations.
Need for the Program
It is essential to determine whether or not the program in question “addresses a significant social
need in a plausible way and does so in a manner that is responsive to the circumstances of those
in need” (Rossi et al., 2004, p. 102). The need for the DDDM program should not be taken for
granted. In order to productively add work loads to individuals within organizations, it is
necessary to thoughtfully remove work that has become non-essential or even obsolete.
Although the need for a stringent program is now clear, DDDM will hopefully become ingrained
in schools’ professional cultures in the future. In such a case, a less intense program of
professional development would be increasingly appropriate.
Program’s Design
The design of a program is influenced, whether directly or indirectly, by a set of assumptions and
theories regarding how productive change can be realized. While it is crucial for programs to be
implemented as designed, it is equally important that the fundamental structure of the program is
sound (Rossi et al., 2004). Programs of action that are based on faulty ideas will always be
ineffective. In any given year, MNPS must be provided with information regarding the
soundness of its process-guiding theory. Simply, MNPS must ensure that the following question
is answered: Does the manner in which MNPS seeks to improve the use of DDDM make sense?
Implementation
Once a program is based on sound design theory, the program must then be carried out in the
way intended. This will be especially crucial and difficult in a very large organization such as
MNPS that attempts to utilize shared leadership. With a reliance on a multitude of individuals
comes the challenge of ensuring that everybody remains committed to the same agreed upon plan
of action. Here, the question is, “Is MNPS implementing the DDDM program in the way in
which it was designed?”
Outcomes
A quality program evaluation will answer the question, “What has the program accomplished?”
Rossi et al. (2004) provide an interesting and useful definition for program outcomes, suggesting
that “outcomes . . . must relate to the benefits those products or services might have for the
participants, not simply their receipt” (p. 205). In this sense, the outcomes of the DDDM project
must be more than the exposure (receipt) of professional development regarding DDDM.
Rather, an example of a desired outcome would be the internalization of the offered professional
development and the subsequent transfer of the training to the school and classroom. It is clear
that MNPS understands the importance of this specific outcome, as it was the perceived lack of a
transferred “benefit” by previous professional development in DDDM that was the impetus for
this original study.
Efficiency
Clearly, the district might incur high costs by adopting our suggested recommendations for
improving DDDM. Although it is assumed that the district will benefit by the resulting
improved use of student data in schools and classrooms, we cannot necessarily assume that the
benefits are worth the costs. The costs must be determined, carefully assessed, and weighed in
order to determine whether or not the benefits accrued to the district are worth the costs paid.
It is hoped that this section has communicated to MNPS the significant possibilities that future
program evaluation can offer to its objective for improved DDDM. The adoption of our
recommendations and action steps will require substantial effort and financial investment. It is
prudent to understand the effectiveness of programmatic efforts in both a formative and
summative manner. As a district that is committed to improving understanding and use of
DDDM, it is only appropriate that those responsible for the professional development of
principals and, ultimately, teachers in this area would themselves rely on the data-driven nature
of program evaluation.
SECTION 9
Discussion and Conclusion
In this section, we revisit key findings and provide a further rationale for our recommendations
for MNPS. Overall, we believe that our findings – while highlighting several key areas for
improvement – present a relatively positive outlook for educators’ DDDM capacity. As this
section discusses, these findings should be interpreted with caution as closer analyses uncover
evidence of educators’ misunderstanding of DDDM, further establishing a rationale for the
recommendations provided in this report.
Encouraging Survey and Interview Data
Our initial evaluation of the findings suggests a more positive story about
DDDM in MNPS schools than was expected. Because the origin of this project was MNPS
officials’ expressed disappointment in the capacity of principals and teachers for DDDM, we
expected to find disappointing results regarding DDDM in our study. MNPS officials expressed
frustration that the results of their efforts to train principals had been largely meager; in a sense,
we were charged with determining where an assumed problem lay. However, information
provided by principals and teachers conveys a much more positive picture of DDDM.
For example, according to our survey and interview findings, principals consider themselves to
be knowledgeable about and competent in DDDM, with nearly 90% of respondents indicating
that they were “often” or “always” comfortable using data. Similarly, principal interviews
provided additional confirmation for this perceived mastery of DDDM; all interviewees
described themselves as being proficient in using student achievement data. Interestingly, there
was no difference in comfort level with data based on years of experience. Principals roundly
expressed frequent and skilled use of student achievement data.
In parallel findings, teachers expressed general appreciation for and confidence in their ability to
use student achievement data. A few findings stand out. First, teachers indicated consistently
strong agreement regarding the value of student data to inform instructional decisions, monitor
student performance, assist with improving instruction, and positively contribute to student
learning. Teachers also communicated uniform confidence in their ability to collect, understand,
and use data to improve instruction.
Reconciling Reported Data with District Perceptions
This project was conducted because MNPS officials firmly believed that
principals, and consequently teachers, were deficient in both DDDM skills and usage. In fact, the
researchers were charged with uncovering the circumstances that surround this “problem.” With
consistently positive principal and teacher reports of DDDM, the question arises: What can
explain this obvious disconnect? In proposing answers to this question, we will first discuss the
possibility of artificially exaggerated survey responses. Then, we will assume that the
self-reported data are accurate and discuss the possible explanatory power of the inherent limitations in
the nature of our study. In the end, we believe that the difference between communicated district
beliefs regarding principal and teacher DDDM and the data directly reported by principals and
teachers can be reconciled.
Interpreting Self-Reported Data
Although our findings are surprisingly positive, reasons for caution should be carefully noted.
There are multiple reasons to believe that school-level understanding and use of DDDM is not as
positive as the picture presented by principal/teacher surveys and principal interviews.
Specifically, there are possible biases arising from self-reports, a definite disconnect between
reported data use and confirmed data use, and examples of discrepancies between teacher reports
and principal reports.
First, it should be made clear that the quality of information obtained may have been affected by
the overwhelming reliance on self-reports. It is well understood that self-reports on surveys tend
to carry the possibility of less than fully reliable data. The likelihood of individuals
providing inaccurate data in this study is increased because, in almost all survey items,
the respondent would have been able to identify the “best” answer. Respondents may have had a
natural tendency to select the socially or professionally desirable answer to present themselves
and their schools in a positive light, and a hesitancy to risk perceived possible sanctions
accompanying undesirable responses.
In ideal circumstances, it is possible to minimize the dangers of self-reports. However, actual
circumstances did not permit this team to customize data collection in such favorable ways.
Instead, it was necessary for us to rely on building principals to determine the means by which
teacher surveys were distributed and collected in their schools. It is likely that teacher surveys
were completed during faculty meetings. In such a circumstance, it is possible that teachers
experienced a certain degree of implicit pressure to respond to questions favorably. Similarly,
principals completed surveys during a district-wide principal meeting. While such a controlled
setting helped tremendously in obtaining a 94% response rate, it is likely that the presence of
high level district administrators in the room affected the approach with which principals
answered survey questions.
Similarly, a short lead time with this project made it difficult for the researchers to introduce to
survey respondents the purpose of the study and, importantly, those things for which the study
would not be used. While survey respondents were assured of anonymity and interviewees were
promised confidentiality, we are not fully confident that study participants felt completely
assured that their responses could in no way be used to penalize them. While we have no
conclusive evidence regarding this, it would seem logical that the level of comfort with which
respondents answered questions depended heavily on the amount of trust that respondents felt for
the school and system in which they worked. The researchers cannot speak definitively about
the nature of trust that exists through the various schools and MNPS.
Supporting this hesitancy to give full confidence to the self-reported data is the fact that we have
actual evidence of having received possibly inaccurate data. Specifically, as previously
discussed, there is a discrepancy between principals’ self-reported use of TCAP Value Added
(TVAAS) data and evidence of their actual use. On the survey, principals were asked to report
how often they used the TCAP Value Added website. Of the respondents, 52.4 percent reported
“often” or “always” using the TVAAS website; 22.2 percent reported “sometimes” using
the site; and only 17.5 percent reported “rarely” or “never” logging onto the site.
Interestingly, however, over the course of the first four months of the school year, only 78 of the
133 district schools ever logged onto the website. Of those who did access the site, 66 schools
had logged on fewer than ten times, and a quarter of these had only logged on once.
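The size of this gap is easy to make concrete. The following sketch is a minimal illustration only; the figures are taken from the survey results and login records described above, and the variable names are our own.

```python
# Minimal sketch contrasting principals' self-reported TVAAS use with the
# observed login records described in the text above.
total_schools = 133
schools_with_any_login = 78   # schools that logged on at least once in four months

# Self-reported use: "often"/"always" (52.4%) plus "sometimes" (22.2%) responses
reported_at_least_sometimes = 52.4 + 22.2   # percent of principals

# Observed use: share of schools with any login activity at all
observed_any_use = 100 * schools_with_any_login / total_schools

print(f"Reported at least occasional use: {reported_at_least_sometimes:.1f}%")  # 74.6%
print(f"Schools with any logins at all:   {observed_any_use:.1f}%")             # 58.6%
```

Even before accounting for how rarely most of those 78 schools logged on, the observed share of schools with any use of the site falls well below the self-reported rate.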
In another instance, we compared results from principal surveys to those of teacher surveys.
When asked how often their principal trains teachers on data use, less than half (40 percent) of
teachers responded “often” or “always.” However, nearly 66 percent of principals responded “often” or
“always” to a similar question on the principal survey. Although less powerful than the TVAAS
login information, the discrepancy in the amount of training provided by principals is another
indication of the possibility of overestimation in self-reported data.
TVAAS login data is the only survey item that we were able to check against known, objective
data. Because the results of this check indicate that principals may have provided
overestimations of their data use in this instance, the researchers are left with understandable
hesitancy to assume that overestimation did not occur in other survey and interview items.
Although we were not able to check self-reported teacher data against known teacher data, the
researchers have the same questions surrounding teacher-reported survey data.
Evidence of Misunderstanding
Although the researchers believe it is likely that self-reports affected the quality of data, it is
possible that the data provided by respondents is largely an accurate representation of their
perceptions. In this case, it should be noted that perceptions still do not necessarily represent
reality. For instance, teachers and principals reported great confidence in their command of
DDDM. While it may be true that principals and teachers believe that they are skilled in
understanding and applying student achievement data, this does not necessarily mean that they
are actually skilled. Self-reports of ability are not necessarily accurate measures of ability.
Clearly, individuals commonly misunderstand or lack full understanding of topics and issues
without being aware of their non-proficiencies. In fact, there is evidence of such possible
misunderstandings in the self-reported survey data.
For example, as discussed earlier, principals tend to use state assessment and district
assessment data for many of the same purposes, an unexpected finding considering
that they are different types of assessments. State assessments are summative and provide
annual reports about student proficiency levels in core academic classes. District assessments,
by contrast, are formative and are meant to provide educators with regular feedback on students’
academic progress at various times throughout the course of a school year.
We found in the teacher surveys an example of teachers unwittingly demonstrating a less than
thorough understanding of the potential benefits of state assessments. It was surprising to note
the frequency with which elementary teachers believe that all state assessments are “not
applicable” to their teaching position. Over 40 percent of teachers see no application for TCAP
proficiency tests. Although TCAP testing does not begin until the 3rd grade, one would expect
kindergarten through second grade teachers who consistently value and apply data to examine
their students’ achievement results in future grades in order to understand the level to which their
students were prepared for higher elementary grade standards.
These two examples of possible incomplete understandings and applications of student
achievement data provide useful material to consider when contemplating the apparent
disconnect between principal and teacher self reports of DDDM and the district’s perception of a
“problem.” The above examples serve as hints to the incomplete and/or insufficient mastery of
DDDM that the district believes to exist among principals and teachers.
Conclusion
MNPS charged this capstone team with identifying the “scope of the problem” regarding DDDM
and suggesting means by which to improve DDDM. The information reported to the capstone
team via surveys and interviews contradicted the originally expressed view of the district
regarding a “problem.” For the most part, the information we received indicates that there are
far fewer problems with DDDM among principals and teachers. If these results are taken at
face value, then this should be a significant source of encouragement for district officials.
However, our team suggests that the district may not necessarily hold inaccurate perceptions of
the DDDM problem, given the limitations of self-reported data. Additionally, we did identify some
important concerns in the self-reported data that gave rise to the following areas in need of
improvement:
 MNPS needs to more accurately and regularly gauge the DDDM practices of its
educators.
 MNPS needs to direct DDDM resources and training efforts toward the application of
data for decisions about teaching and learning at the classroom level.
 MNPS needs to provide principals with more explicit training focused on training
teachers in DDDM for instructional improvement.
In response to these needs, we have made recommendations for improvement, including
investing in teacher leadership, establishing expectations of participation in current leadership
development opportunities, creating a district-wide induction program for principals and
teachers, increasing principal time to focus on leadership for learning, and accessing funding
resources for high quality leadership development. Finally, MNPS is encouraged to continue
with future evaluations of its efforts to improve DDDM. High quality program evaluations will
provide increasingly useful data that will support district efforts to improve its training of
principals and teachers in DDDM.
References
Armstrong and Anthes (2001). Identifying the factors, conditions, and policies that
support schools’ use of data for decision making and school improvement: Summary of
findings. Education Commission of the States. Retrieved on February 27, 2007 from
http://www.ecs.org/clearinghouse/30/69/3069.htm
Baltimore City Schools. (2007). Retrieved March 5, 2007 from
http://www.bcps.k12.md.us.
Broad Foundation (2007). Retrieved March 7, 2007 from www.broadfoundation.org.
Broad, M. & J. Newstrom (1992). Transfer of Training. Da Capo Press.
Chicago Leadership Academies for Supporting School Success. (2007). Retrieved March
1, 2007 from www.classacademies.org.
Cohen, D., & Garet, M. Ch. 11: Reforming educational policy with applied social
research, pp. 123-140. In D. Anderson & B. Biddle (1991), Knowledge for
policy: improving education through research. London: Falmer Press.
Consortium for School Networking (2006). 3-D Self-Assessment. Retrieved September
24, 2006 from http://www.3d2know.org/assessment/survey.cfm
Doyle, D. (2003). Data-driven decision-making. The Journal, 30(10). Retrieved
February 27, 2007 from http://thejournal.com/the/printarticle/?id=16368
DuFour, R., & Eaker, R. (1998). Professional learning communities at work: Best
practices for enhancing student achievement. Bloomington, IN: National Education
Service.
Englert, K., Fries, D., Goodwin, B., Martin-Glenn, M., Michael, S. (2004).
Understanding how principals use data in a new environment of accountability. Aurora,
CO: Mid-continent Research for Education and Learning. Retrieved from
http://www.mcrel.org/topics/products/189/.
Keeney, L. (1988). Using data for school improvement. Annenberg Institute for School
Reform. Retrieved on February 27, 2007 from
http://www.annenberginstitutute.org/tools/using_data.pdf
Kochran, F., Bredesen, P., & Riehl, C. (2002). Chapter 13: Rethinking the professional
development of school leaders. In J. Murphy (Ed.) The Education Leadership Challenge:
Redefining Education for the 21st Century. Chicago, IL: University of Chicago Press.
Levine, A. (2005). Educating school leaders. Washington, D.C.: The Education Schools
Project. Retrieved March 7, 2007 from http://www.edschools.org/reports_leaders.htm.
Marsh, J.A., Pane, J.F., & Hamilton, L.S. (2006). Making sense of data-driven decision
making in education. RAND, Occasional Paper.
Minnesota Department of Education (2006). Statewide Data-Driven Readiness Study.
Principal and Teacher Survey.
Murphy, et al. (2001). The productive high school: Creating personalized academic
learning communities. Newbury Park, CA: Corwin.
NSDC (2001). Standards for staff development revised. National Staff Development
Council, Oxford, OH.
No Child Left Behind: A toolkit for teachers. Retrieved February 25, 2007 from
http://www.ed.gov.
Patton, M.Q. (2002). Qualitative Research and Evaluation Methods. Thousand Oaks, CA: Sage.
Petrides, L. and Nodine, T. (May 2005). Anatomy of school improvement:
Performance-driven practices in urban school districts. The Institute for the Study of
Knowledge Management of Education, The NewSchools Venture Fund. Retrieved on
July 23, 2005 from
http://www.newschools.org/viewpoints/documents/District_Performance_Pra
ctices.pdf
Rossi, P.; M. Lipsey; & H. Freeman (2004). Evaluation: A Systematic Approach.
Thousand Oaks, California: Sage Publications, Inc.
Rubin, H. & Rubin, I. (1995). Qualitative Interviewing. Thousand Oaks, CA: Sage.
Schlechty, P. (2005). Creating great schools: Six critical systems at the heart of
educational innovation. San Francisco, CA: Jossey-Bass.
Spillane, J.P., & Seashore-Louis, K. (2003) School improvement processes and practices:
Professional learning for building instructional capacity. In J. Murphy (Ed.) The
Educational Leadership Challenge: Redefining Leadership for the 21st Century. Chicago,
IL: University of Chicago Press.
Supovitz, J.A. and Klein, V. (2003). Mapping a course for improved student learning:
How innovative schools systematically use student performance data to guide
improvement. Consortium for Policy Research in Education.
Survey of Chicago Public School Principals (2005, Spring). Consortium on Chicago
School Research. Retrieved September 24, 2006 from www.consortium-chicago.org .
Tennessee Department of Education Website (2006). www.state.tn.us/education.
Torrence, V.D. (2002). Principals’ use of data: A national perspective. Dissertation,
Educational Leadership and Policy Studies, Virginia Polytechnic Institute.
Watson, J. and Mason, S. (2003). Understanding schools’ capacity to use data. American
Educational Research Association.
Appendix A
General District Interview Protocol
Introduction: Thank you for participating in this interview today. We plan to keep the
interview to less than one hour. The purpose of the study is not to evaluate your leadership or
performance as a school leader but rather to learn more about data use and training in Metro
Nashville Public Schools. With your permission, we would like to audio record this interview to
ensure accuracy of notes. All audio recordings will be destroyed once the answers have been
transcribed. Your participation is voluntary and you can stop if at any time you feel
uncomfortable.
Are you ready to begin?
(Turn audio tape on)
On the record, ask: “Do we have your permission to record this interview?”
Background Questions
1. How long have you worked in education?
2. How long have you served in this position with MNPS?
3. Other than your current position with MNPS, what professional roles have you filled in
education?
Data use in MNPS
Perception of and Attitude toward Data Use:
1. What do you believe to be the MNPS philosophy toward data-based decision-making?
Any examples?
a. Why did this philosophy develop? By whom?
b. How does the district communicate this philosophy to its educators (i.e.,
principals and teachers)?
2. Do you believe that data use is a priority in MNPS? Why or why not?
a. If it is a priority, how does the district communicate that to principals?
i. Time focused on data use?
ii. Resources/funding focused on data use?
b. If it is not a priority, do you believe it should be?
3. What is the general attitude about data use in MNPS among district officials? Among
principals? Among teachers?
a. That is, do people value data as a resource in decision-making at the district and
school-level? Are they motivated to use it?
Principal Capacity to Use Data:
1. Do you believe that data is used effectively or ineffectively by educators in MNPS?
Explain.
2. Do you believe that MNPS principals have the capacity to use data? Why do you believe
so?
a. What kind of experience and/or knowledge do they have to use data effectively?
3. Do they have the capacity to train teachers to use data effectively? Why do you believe
so?
a. What kind of experience and/or knowledge do they have to train teachers
effectively?
4. Do you believe that teachers have the capacity to use data? Why?
a. What kind of experience and/or knowledge do they have to use data effectively?
5. Do you believe that the district provides principals and teachers with the necessary
resources to improve their data use capacity? Why?
a. What kind of learning opportunities, information, and/or technical assistance does
the district provide to principals? To teachers?
b. How does the district determine principals’ capacity needs? Teachers’ capacity
needs?
Quality of Data Use
1. We have learned that some district officials view educators’ data use practices as
problematic.
a. Why do you believe they would feel this way?
b. Do you agree? Why or why not?
2. If you do not believe there is a problem …
a. Why do you believe data use practices are of such high quality in MNPS?
b. Do you think they could be improved? Why or why not?
i. If so, what efforts would you recommend to improve data use?
3. If you believe there is a problem …
a. How would you describe it?
b. Why do you believe this problem exists?
c. What efforts have been made to address this problem, if any?
d. What other efforts would you recommend to address this problem?
We are now finished with the interview. Do you have any questions for us?
Thank you very much for your time. We appreciate your insight as we learn more about data use
practices in MNPS.
Appendix B
MNPS Principal Interview Protocol
Introduction: Thank you for participating in this interview today. We plan to keep the
interview to 10 to 15 minutes in length. The purpose of the study is not to evaluate your
leadership or performance as a school leader but rather to learn more about data use and training
in Metro Nashville Public Schools. Your identity is confidential and will at no time be revealed.
PROFESSIONAL EXPERIENCE:
Professional Background:
1. How long have you been a principal? How long have you been the principal in this
school?
DATA QUESTIONS
Personal attitudes about data use
2. How would you describe your comfort level in using student data?
3. What student data is most valuable to your school?
Teacher data use
4. What role do you see yourself playing – if any – in helping teachers use student data?
a. What does data training look like at your school?
b. What are your expectations for teacher use of data in your school? (Probe - What
kind of data? How often?)
5. What data do teachers need to help improve student achievement?
6. What kind of training do teachers need to help improve their use of data to improve
student achievement?
Support for data use
7. What training do you need to improve your use of data driven decision making in your
school?
a. In your opinion, are these needs being addressed by current training opportunities
in the district?
8. What support do you receive from the central office in organizing and accessing student
data?
a. What do you do with the information you receive?
9. How could the district improve their support of you and your school? (probe – in the
area of using data)
10. In your opinion, what are the barriers to data use?
a. Personally?
b. In your school?
c. In MNPS district?
Appendix C
Principal Survey
Question #1: How often do you use test results from the following to improve teaching and
learning in your building?
(Circle one number for each item)
Scale: 0 = I do not have access to this data; 1 = Never; 2 = Rarely; 3 = Sometimes;
4 = Often; 5 = Always; 6 = Not applicable to my school/grade level

A. TCAP Proficiency Levels  0 1 2 3 4 5 6
B. TCAP Writing  0 1 2 3 4 5 6
C. Gateway  0 1 2 3 4 5 6
D. State End-of-Course Tests  0 1 2 3 4 5 6
E. ACT Test Results  0 1 2 3 4 5 6
F. Student IQ Test Data  0 1 2 3 4 5 6
G. SAT Test Results  0 1 2 3 4 5 6
H. TCAP Value Added  0 1 2 3 4 5 6
I. ACT EXPLORE Results  0 1 2 3 4 5 6
J. ACT PLAN Results  0 1 2 3 4 5 6
K. PSAT  0 1 2 3 4 5 6
L. Informal Reading Inventory (IRI)  0 1 2 3 4 5 6
M. IB Exam Results  0 1 2 3 4 5 6
N. AP Exam Results  0 1 2 3 4 5 6
O. IPT/CELLA  0 1 2 3 4 5 6
P. Language! Assessments  0 1 2 3 4 5 6
Q. District Math Assessments  0 1 2 3 4 5 6
R. District Reading Tests  0 1 2 3 4 5 6
S. District Writing Assessment  0 1 2 3 4 5 6
T. District Social Studies Test  0 1 2 3 4 5 6
U. District Science Assessment  0 1 2 3 4 5 6
V. District Art Assessments  0 1 2 3 4 5 6
W. District Music Assessments  0 1 2 3 4 5 6
X. District Foreign Language Test  0 1 2 3 4 5 6
Question #2: In the past 12 months, have you participated in training related to the
following topics? Circle all that apply.
Scale: 0 = Not at all; 1 = In the district; 2 = Outside of the district

A. Classroom strategies for applying the results of student achievement data  0 1 2
B. Using technology as a tool for interpreting the results of student achievement data  0 1 2
C. Understanding the connection between data-driven decision-making and student learning  0 1 2
D. The use of professional development time to focus on student data  0 1 2
Question #3: Teachers in my building use student achievement data to:
Scale: 0 = I do not know; 1 = Never; 2 = Rarely; 3 = Sometimes; 4 = Often; 5 = Always

A. Determine student instructional needs  0 1 2 3 4 5
B. Plan instruction  0 1 2 3 4 5
C. Monitor student academic progress  0 1 2 3 4 5
D. Sort students into ability groups  0 1 2 3 4 5
E. Identify areas where they need to strengthen content knowledge  0 1 2 3 4 5
F. Improve teaching strategies  0 1 2 3 4 5
G. Involve parents in student learning  0 1 2 3 4 5
H. Recommend tutoring services or other remediation programs  0 1 2 3 4 5
I. Identify students needing special education services  0 1 2 3 4 5
J. Measure the effectiveness of their instruction  0 1 2 3 4 5
K. Inform collaborative efforts to improve curriculum and instruction  0 1 2 3 4 5
L. Make changes in their instruction based on assessment results  0 1 2 3 4 5
Question #4: How long have you been a principal? _________________ Years
4A. How long have you been the principal at your current school? ____ Years
Question #5: How many college courses (3 credit hour equivalents) did you have in data
driven decision making or student assessment in your administrative course work?
____1  ____2  ____3  ____4  ____5  ____6  ____7
Question #6: Have you participated in the Principals Leadership Academy of Nashville
(PLAN) at Vanderbilt University?
__________YES
_______NO (If no, please move on to question 10)
Question #6A. If so, what YEAR did you participate? _________
Question #7: Did you complete the program? ________ YES
____________ NO
Question #7A: If not, Why not?_____________
Question #8: Would you recommend a MNPS principal colleague participate in PLAN if they
have not yet done so?
_____________YES
______________ NO
Question #9: Do you use the information learned in PLAN in your work as a principal?
_______________ YES
______________NO
Question #10: On average, how many hours per week do you spend on the following work
related activities:
Activity (record hours per week for each):
A. Curriculum  _____
B. Planning (General, SIP development)  _____
C. Budget  _____
D. Personnel  _____
E. Staff development  _____
F. Internal school management  _____
G. Central office meetings or tasks  _____
H. Student-related activities  _____
I. Working with parents  _____
J. Community organizations  _____
K. Personal professional development  _____
L. Evaluating teachers  _____
M. Student discipline  _____
Question #11: Please choose the best answer for each of the survey items below.
(Circle one number for each item)
Scale: 0 = Do not know; 1 = Strongly disagree; 2 = Disagree; 3 = Neutral; 4 = Agree; 5 = Strongly agree

A. Teachers in my school can easily access the information they need from school and district data systems  0 1 2 3 4 5
B. Teachers and students communicate frequently about student performance data  0 1 2 3 4 5
C. Student performance data available to me are accurate and complete  0 1 2 3 4 5
D. Student performance data are easily available to the people that need them  0 1 2 3 4 5
E. Parents and community members know what our school is doing and what is needed to improve student achievement  0 1 2 3 4 5
F. Staff development time is focused on using data to improve student learning and student achievement  0 1 2 3 4 5
G. Successful educational practices are widely shared in the district  0 1 2 3 4 5
H. My professional development has helped me use data more effectively  0 1 2 3 4 5
I. I have control over staff development plans for my school  0 1 2 3 4 5
J. Student achievement data are used to determine teacher professional development needs and resources  0 1 2 3 4 5
K. As a school we have open and honest discussions about data  0 1 2 3 4 5
L. I am a valued member of my district’s data-driven reform efforts  0 1 2 3 4 5
M. I receive the needed support from the central office to use data in my school  0 1 2 3 4 5
N. My school’s improvement goals are clear, measurable, and based on student data  0 1 2 3 4 5
O. Our district’s goals are focused on student learning  0 1 2 3 4 5
Question #12: Please choose the best answer for each statement below.
(Circle one number for each item)
Scale: 0 = Not important; 1 = Never; 2 = Rarely; 3 = Sometimes; 4 = Often; 5 = Always

A. The district has involved educators in the process of identifying and creating an inventory of data elements that must be captured to meet school and district reporting requirements.  0 1 2 3 4 5
B. The district information systems are currently capturing all data necessary to meet teacher, counselor, building administrator and central office administrator reporting requirements.  0 1 2 3 4 5
C. The district regularly surveys teachers, staff, and administrators to find out what information they need to make informed decisions.  0 1 2 3 4 5
D. The district is able to "easily" extract the data necessary to meet teacher, counselor, and building administrator needs.  0 1 2 3 4 5
E. The district is able to provide data necessary to follow trends as well as growth of individual students or cohorts of students over time.  0 1 2 3 4 5
F. The district has a process in place for identifying, recommending and implementing intervention strategies based on data that has been collected and analyzed.  0 1 2 3 4 5
G. The data provided by the district is up to date.  0 1 2 3 4 5
H. District and school administrators have plans to provide technical support to teachers in implementing targeted interventions to improve low performance for individual students, classes and schools based on the data that has been collected and analyzed.  0 1 2 3 4 5
Question #13: Please indicate the availability and your use of the following types of data
from the TCAP tests: (Circle one number in each row.)
Scale: 0 = Not available in this way; 1 = Available, have not reviewed; 2 = Available, have reviewed but did not use; 3 = Used minimally; 4 = Used moderately; 5 = Used extensively

A. Percent of students at each performance level (below proficient, proficient, etc.)  0 1 2 3 4 5
B. Scale scores or other scores that show how close students are to performance levels  0 1 2 3 4 5
C. Results for subgroups of students (e.g., racial/ethnic subgroups, LEP students, students with disabilities, economically disadvantaged students)  0 1 2 3 4 5
D. Results for each grade level  0 1 2 3 4 5
E. Results for each classroom  0 1 2 3 4 5
F. Results for individual students  0 1 2 3 4 5
G. Changes or trends in test results across years  0 1 2 3 4 5
Question #14: How often do you use results from STATE ASSESSMENTS in your school for
the following purposes:
(Circle one number for each item)

Response scale for each item: 0 = Not Applicable, 1 = Never, 2 = Rarely, 3 = Sometimes, 4 = Often, 5 = Always

A. Identify individual students who need remedial assistance
B. Tailor instruction to individual student needs
C. Identify and correct gaps in curriculum for all students
D. Improve or increase the involvement of parents in student learning
E. Evaluate teacher performance
F. Recommend tutoring or other educational services to students or their parents
G. Write school improvement plan
H. Identify staff development needs for my school
I. Assign student grades
J. Assign or reassign students to classes or groups
Question #15: How often do you use results from DISTRICT ASSESSMENTS in your school
for the following purposes:
(Circle one number for each item)

Response scale for each item: 0 = Not Applicable, 1 = Never, 2 = Rarely, 3 = Sometimes, 4 = Often, 5 = Always

A. Identify individual students who need remedial assistance
B. Tailor instruction to individual student needs
C. Identify and correct gaps in curriculum for all students
D. Improve or increase the involvement of parents in student learning
E. Evaluate teacher performance
F. Recommend tutoring or other educational services to students or their parents
G. Write school improvement plan
H. Identify staff development needs for my school
I. Assign student grades
J. Assign or reassign students to classes or groups
Question #16: Please indicate the degree or frequency for the following questions:
(Circle one number for each item)

Response scale for each item: 0 = Not Applicable, 1 = Never, 2 = Rarely, 3 = Sometimes, 4 = Often, 5 = Always

A. With what frequency are you comfortable using data-driven decision-making in your school?
B. The district office encourages our school to use data-driven decision-making.
C. The district office provides the necessary resources that make it possible for me to use data-driven decision-making.
D. The district provides training for data-driven decision-making.
E. I provide the necessary training for my staff in data use and data-driven decision-making.
F. I use the state TVAAS restricted website to analyze student test data.
G. How often do you believe that your teachers are using data-driven decision-making in their classrooms?
DEMOGRAPHIC INFORMATION
Question #17: What is the highest degree that you have earned?
[ ] Bachelors
[ ] Masters
[ ] Masters +30 hours
[ ] EdS
[ ] Doctorate
Question #18: Are you presently pursuing graduate coursework related to the field of
education?
[ ] Yes
[ ] No
Question #19: If you have a doctorate, please indicate the type:
[ ] PhD
[ ] EdD
Question #20: What is your gender?
[ ] male
[ ] female
Question #21: Which of the following is your current job assignment?
_____ High School Principal
_____ Middle School Principal
_____ Elementary School Principal
_____ Other:_____________________________________ (Please specify)
Thank you for your participation!
Appendix D, Principal Survey: Concept Map

INPUTS
Data accuracy and availability — Survey items: Q11 A, C, D; Q13 A-G; Q16 C — Source: Minnesota Statewide DDDM Readiness Study Principal Survey (2003)
District training for DDDM — Q2 A-D; Q11 M; Q16 D — Minnesota Statewide DDDM Readiness Study Principal Survey (2003)
Data reporting and communication — Q12 A-H — Consortium for School Networking 3D Self Assessment (2006)
Principal experience
1. Tenure — Q4; Q21 — Watson & Mason (2003)
2. Placement — Q5 — Watson & Mason (2003)
3. Education — Q17; Q18; Q19 — Levine (2005); Watson & Mason (2003)
Other training (i.e., PLAN) — Q6; Q7; Q8; Q9 — PLAN

INTERMEDIATE OUTCOMES
Principal knowledge/self-efficacy — Q11 H, L; Q16 A — Torrence (2002); Englert et al. (2004)
Principal data use
1. Type of data
a. National assessments — Q1 E, F, I, J, K, M, N — MNPS District Documents (Fall 2006)
b. State assessments — Q1 A, B, C, D, H; Q13 A-G; Q16 F — TN Department of Education, TVAAS Website (Fall 2006)
c. District assessments — Q1 P-X — MNPS District Documents (Fall 2006)
d. Special assessments — Q1 F, L, O — Englert et al. (2004)
2. Reasons for data use
a. State assessments — Q14 A-J — Supovitz & Klein (2003)
b. District assessments — Q15 A-J — Supovitz & Klein (2003)
c. Level of data (i.e., specificity of data results) — Q13 A-G — TCAP data (Fall 2006)
Instructional leadership
1. Use of time — Q10 A-M — Consortium on Chicago School Research (2005)
2. Training teachers — Q11 F, I, J; Q16 E — Minnesota Statewide DDDM Readiness Study Principal Survey (2003)
Professional collegiality — Q11 G, K, L — Spillane and Seashore-Louis (2003)
Expectations for improvement — Q11 E, N, O; Q16 B — Minnesota Statewide DDDM Readiness Study Principal Survey (2003)

LONG-TERM OUTCOMES
Teacher data use
1. Perceptions of data use among school teachers — Q3 A-L; Q11 B; Q16 G — Consortium on Chicago School Research (2005)
Appendix E
Teacher Survey
Question #1: How often do you use achievement data from the following assessments to
improve teaching and learning in your classroom?
(Circle one number for each item)

Response scale for each item: 0 = I do not have access to this data, 1 = Never, 2 = Rarely, 3 = Sometimes, 4 = Often, 5 = Always, 6 = Not applicable to my school/grade level

A. TCAP Proficiency Levels
B. TCAP Writing
C. Gateway
D. State End-of-Course Tests
E. ACT Test Results
F. Student IQ Test Data
G. SAT Test Results
H. TCAP Value Added
I. ACT EXPLORE Results
J. ACT PLAN Results
K. PSAT
L. Informal Reading Inventory (IRI)
M. IB Exam Results
N. AP Exam Results
O. IPT/CELLA
P. Language! Assessments
Q. District Math Assessments
R. District Reading Tests
S. District Writing Assessment
T. District Social Studies Test
U. District Science Assessment
V. District Art Assessments
W. District Music Assessments
X. District Foreign Language Test
Question #2: In the past 12 months, have you participated in training related to the
following topics? Circle all that apply.

Response options for each training type: 0 = Not at all, 1 = Conducted by my school, 2 = Conducted by the district (i.e., district training not specific to my school), 3 = Conducted outside of the district

Training Type:
A. Classroom strategies for applying the results of student achievement data
B. Using technology as a tool for interpreting the results of student achievement data
C. Understanding the connection between data driven decision making and student learning
D. Working with teachers in my school to evaluate and interpret student achievement data.
Question #3: Teachers in my building use student achievement data to:

Response scale for each item: 0 = I do not know, 1 = Never, 2 = Rarely, 3 = Sometimes, 4 = Often, 5 = Always

A. Determine student instructional needs
B. Plan instruction
C. Monitor student academic progress
D. Sort students into ability groups
E. Identify areas where teachers need to strengthen content knowledge
F. Improve teaching strategies
G. Involve parents in student learning
H. Recommend tutoring services or other remediation programs
I. Identify students needing special education services
J. Measure the effectiveness of their instruction
K. Inform collaborative efforts to improve curriculum and instruction
L. Make changes in their instruction based on assessment results
Question #4: Circle the best response for each item below.

Response scale for each item: 1 = Strongly disagree, 2 = Disagree, 3 = Agree, 4 = Strongly Agree

A. I value student achievement data as a useful tool for making instructional decisions.
B. Student achievement data is important for monitoring students’ academic performance.
C. Student achievement data assists teachers with improving instruction for students.
D. Student achievement data has a positive impact on student learning in school.
E. I am knowledgeable of ways to collect student achievement data.
F. I am able to understand reports on student achievement data.
G. I have the skills with technology that enable me to make use of student achievement data.
H. I know how to improve classroom instruction by using student achievement data.
Question #5: How long have you been a teacher (including this 2006-07 school year)?
_________________ Years
5A. How long have you been a teacher at your current school (including this 2006-07
school year)?
_________________ Years
Question #6: How many college courses (3 credit hour equivalents) have you had in data-driven
decision making or student assessment in your teacher preparation course work?
_____ 1   _____ 2   _____ 3   _____ 4   _____ 5   _____ 6   _____ 7
Question #7: To what extent do the following issues hinder teachers’ use of student achievement data to
make instructional decisions at your school? Circle the most appropriate response for each item below.

Response scale for each item: 1 = Not a barrier, 2 = Small barrier, 3 = Moderate barrier, 4 = Large barrier

A. Outdated technology
B. Lack of training/professional development
C. Lack of ongoing support from my school principal
D. Doubts about the importance of achievement data for improving student learning
E. Too many other demands related to teaching
Question #8: Please choose the most accurate answer for each of the survey items below.
(Circle one number for each item)

Response scale for each item: 0 = Do Not Know, 1 = Strongly Disagree, 2 = Disagree, 3 = Neutral, 4 = Agree, 5 = Strongly Agree

A. Teachers in my school can easily access the information they need from school and district data systems
B. Teachers and students communicate frequently about student achievement data
C. Student achievement data available to me are accurate and complete
D. Student achievement data are easily available to the people that need them
E. Parents and community members know how well our school is doing and what is needed to improve student achievement
F. Staff development time is focused on using achievement data to improve student learning and student achievement
G. Successful instructional practices are widely shared in my school.
H. My professional development has helped me to better understand my students’ achievement data results.
I. My professional development has helped me to apply my students’ achievement data results to my classroom instructional strategies.
J. In my school, student achievement data are used to determine teachers’ professional development needs and resources
K. As a school, we have open and honest discussions about achievement data
L. I am a valued member of my school’s efforts to improve the use of achievement data to guide instructional decisions.
M. I receive the needed support from the central office to use achievement data in my school
N. I receive the needed support from my school principal to use data in my school.
O. My school’s improvement goals are clear, measurable, and based on student achievement data
P. My school’s goals are focused on student learning
Question #9: Please choose the best answer for each statement below.
(Circle one number for each item)

Response scale for each item: 0 = Not Important, 1 = Never, 2 = Rarely, 3 = Sometimes, 4 = Often, 5 = Always

A. The district regularly surveys teachers to find out what information they need to make informed instructional decisions.
B. The achievement data provided to me by the district is up to date.
C. District administrators use achievement data to develop and implement instructional interventions to improve low performance of individual students and classes.
D. My school principal uses achievement data to develop and implement instructional interventions to improve low performing students and classes.
E. The district provides me with achievement data necessary to follow trends as well as growth of individual students or cohorts of students over time.
F. My school principal provides me with achievement data necessary to follow trends as well as growth of individual students or cohorts of students over time.
Question #10: Please indicate the availability and your use of the following types of data
from the TCAP tests: (Circle one number in each row.)

Response scale for each row: 0 = Not applicable to my teaching position, 1 = Not available in this way; Available and… 2 = Have not reviewed, 3 = Have reviewed but did not use, 4 = Used minimally, 5 = Used moderately, 6 = Used extensively

Type of data:
A. Percent of students at each performance level (below proficient, proficient, etc.)
B. Scale scores or other scores that show how close students are to performance levels
C. Results for subgroups of students (e.g., racial/ethnic subgroups, LEP students, students with disabilities, economically disadvantaged students)
D. Results for each grade level
E. Results for each classroom
F. Results for individual students
G. Changes or trends in test results across years
Question #11: How often do you use results from STATE ASSESSMENTS in your school for
the following purposes:
(Circle one number for each item)

Response scale for each item: 0 = Not applicable to my teaching position, 1 = Never, 2 = Rarely, 3 = Sometimes, 4 = Often, 5 = Always

A. Identify individual students who need remedial assistance
B. Tailor instruction to individual student needs
C. Identify and correct gaps in curriculum for all students
D. Improve or increase the involvement of parents in student learning
E. Evaluate your own teaching performance
F. Recommend tutoring or other educational services to students or their parents
G. Write school improvement plan
H. Identify personal staff development needs
I. Assign student grades
J. Assign or reassign students to classes or groups
Question #12: How often do you use results from DISTRICT ASSESSMENTS in your school
for the following purposes:
(Circle one number for each item)

Response scale for each item: 0 = Not applicable to my teaching position/subject, 1 = Never, 2 = Rarely, 3 = Sometimes, 4 = Often, 5 = Always

A. Identify individual students who need remedial assistance
B. Tailor instruction to individual student needs
C. Identify and correct gaps in curriculum for all students
D. Improve or increase the involvement of parents in student learning
E. Evaluate your own teaching performance
F. Recommend tutoring or other educational services to students or their parents
G. Write school improvement plan
H. Identify personal staff development needs
I. Assign student grades
J. Assign or reassign students to classes or groups
Question #13: Please indicate the frequency for the following questions:
(Circle one number for each item)

Response scale for each item: 0 = Not Applicable, 1 = Never, 2 = Rarely, 3 = Sometimes, 4 = Often, 5 = Always

A. How often are you comfortable using achievement data to guide instructional decisions?
B. My school principal encourages teachers at my school to use achievement data to guide instructional decisions.
C. I have the necessary resources to use achievement data to guide instructional decisions.
D. The district provides training for using achievement data to guide instructional decisions.
E. My principal provides training for using achievement data to guide instructional decisions.
F. I use the state TVAAS results to analyze my students’ test data.
G. How often do you believe that teachers in your school are using achievement data to guide instructional decisions?
BACKGROUND INFORMATION
Question #14: What is the highest degree that you have earned?
[ ] Bachelors
[ ] Masters
[ ] Masters +30 hours
[ ] EdS
[ ] Doctorate
Question #15: Are you presently pursuing graduate coursework that is related to your teaching
profession?
[ ] Yes
[ ] No
Question #16: If you have a doctorate, please indicate the type:
[ ] PhD
[ ] EdD
Question #17: Which of the following categories best describes your school?
_____ Elementary school (i.e., ranges from Pre-kindergarten/Kindergarten to 4th grade)
_____ Middle school (i.e., ranges from 5th grade to the 8th grade)
_____ High school (i.e., ranges from 9th grade to the 12th grade)
_____ Other (i.e., includes a different grade configuration than identified above)
Please explain the “other” grade configuration _________________________
Question #18: Which of the following categories best describe your current teaching
assignment(s)? Please check all applicable responses.
_____ English, Reading, Language Arts teacher
_____ Mathematics teacher
_____ Science teacher
_____ History, Social Studies teacher
_____ Special education teacher (i.e., instruct special education students in a “pull out”
classroom setting; that is, students are not in a mainstream classroom)
_____ English Language Learner teacher (i.e., instruct English Language Learners in a “pull out”
classroom setting; that is, students are not in a mainstream classroom)
______ Advanced, Honors, AP/IB teacher
_____ Other: please specify _____________________________________
Thank you for your participation!
Appendix F
Teacher Survey: Concept Map

INPUTS
Data accuracy and availability — Survey items: Q8 A, C, D; Q9 B; Q10 A-G — Source: Minnesota Statewide DDDM Readiness Study Survey (2003)
DDDM Staff Development — Q2 A-D; Q8 F, H, I, J; Q13 D — Minnesota Statewide DDDM Readiness Study Principal Survey (2003)
Other district support for teachers — Q8 M; Q9 A, C, E — Consortium for School Networking 3D Self Assessment (2006)
Teacher experience
1. Tenure — Q5 — Watson & Mason (2003)
2. Placement — Q17; Q18 — Watson & Mason (2003)
3. Education — Q6; Q14; Q15; Q16 — Watson & Mason (2003)

INTERMEDIATE OUTCOMES
Instructional leadership/Principal support — Q8 N; Q9 D, F; Q13 E — Spillane and Seashore-Louis (2003)
Professional collegiality — Q8 G, K — Spillane and Seashore-Louis (2003)
Expectations for improvement — Q8 E, O, P; Q13 B — Minnesota Statewide DDDM Readiness Study Principal Survey (2003)

LONG-TERM OUTCOMES
Teacher attitudes re: value of data — Q4 A-D — Torrence (2002)
Teacher knowledge/self-efficacy — Q4 E-H; Q13 A — Torrence (2002)
Perceived barriers — Q7 A-E — USDOE (1999)
Teacher data use
1. Perceptions of data use among school teachers (i.e., Culture of DDDM) — Q3 A-L; Q8 B — Consortium on Chicago School Research (2005)
2. Type of data for personal use
a. National assessments — Q1 E, F, I, J, K, M, N — MNPS District Documents (Fall 2006)
b. State assessments — Q1 A, B, C, D, H — TN Department of Education, TVAAS Website (Fall 2006)
c. District assessments — Q1 P-X — MNPS District Documents (Fall 2006)
d. Special assessments — Q1 F, L, O — MNPS District Documents (Fall 2006)
3. Reasons for personal data use
a. State assessments — Q11 A-J — Englert et al. (2004)
b. District assessments — Q12 A-J — Torrence (2002)
c. Level of data (i.e., specificity of data results) — Q10 A-G — Torrence (2002); TCAP results (Fall 2006)