Race to the Top
Progress Update - On-site Review
Part B: In preparation for the on-site program review and Secretary stocktake, States must update
applicable questions (i.e., those for which there is new information) and provide appropriate
documentation to substantiate their responses for all relevant application sub-criteria (e.g., (A)(2) and
(D)(4)).1
1 Note that States will only be required to submit documentation for the on-site program review, not for monthly calls. States
should work with their Program Officers to determine relevant state-specific documentation.
Application sub-criterion:2 (C)(2) Accessing and Using State Data
2 All highlighted fields will be pre-populated by the Department Program Officer prior to State completion.
STATE’s goals for this sub-criterion:
• Continue providing data and information products that stakeholders can use to inform
their decisions regarding policy and services.
• Provide professional development on Common Education Data Analysis & Reporting
Systems (CEDARS) business intelligence tools and how to produce annual and
longitudinal reports.
Relevant projects:
• Continue providing data and information products (School Report Cards, etc.; as
discussed in RttT application) for use by various stakeholders
• Continue developing CEDARS products to make them more accessible and to increase
their use; continue providing professional development for school district and charter
school personnel regarding how to use CEDARS tools
1. Is the State on-track to implement the activities and meet the goals and performance
measures that are included in its approved scope of work for this sub-criterion? If
so, explain why. If not, explain why not.
Data and Information Products
North Carolina is on-track to meet the goals and performance measures for sub-criterion
(C)(2), though some activities are delayed. NC continues to deliver various products for
use by all stakeholders, including local education agency (LEA) personnel, parents,
researchers and the general public. These products include:
• School Report Cards – Due to the change in North Carolina's Accountability
model for the 2012-13 school year and the delay in receiving and evaluating
assessment data, the release of the School Report Card (SRC) will be later than
usual this year. This year's SRC release date is January 28, 2014. The assessment
results for 2012-13 were released at the State Board of Education Meeting on
November 7, 2013. This delay is due to the new assessments, which are aligned to
the new Standard Course of Study (Common Core and Essential Standards), and
the required additional processes to set achievement level cut scores.
• EVAAS reports – EVAAS (Education Value-Added Assessment System) results will be
released in January 2014.
• CEDARS – The CEDARS effort has slowed significantly in the past year. The
staff member responsible for CEDARS training has been reassigned to implement
the new Student Information System (SIS), so no new trainings have been held. In
addition, many of the staff in the LEAs that would have attended the CEDARS
training are working on the conversion to PowerSchool and are not available for
work in CEDARS. However, as we move forward with the implementation of
Home Base, CEDARS will play a more critical role with higher visibility than
originally expected.
• Data for External Users
o NC Education Research Data Center (NC ERDC)
o Carolina Institute for Public Policy
o The Employment Security Commission - Common Follow-up System
(CFS)
o RttT Evaluation Team
NC continues its partnership with the NC ERDC at Duke University and the
Carolina Institute for Public Policy at UNC-Chapel Hill. Through these
arrangements, governed by annually updated memoranda of understanding, the
North Carolina Department of Public Instruction (NCDPI) supplies data that is
made available to various researchers studying education policy issues. North
Carolina also provides data to the multi-agency team conducting the RttT
evaluation for the state.
2. Does the State have evidence indicating the quality of implementation for this
sub-criterion? What is/has the State doing/done as a result of this information?
Data quality continues to improve as a result of CEDARS implementation. At this time,
CEDARS is primarily used by internal NCDPI staff for data validation. Both the length
of time required for business owners to review data submissions and the turnaround time
for any re-work have diminished dramatically in the past year, as evidenced by the more
streamlined process for preparing files for Federal submission. For the 2011-12 and
2012-13 school years, NCDPI will submit all applicable Federal files from the CEDARS
data warehouse.
The validation tools built into CEDARS allow for more rapid identification of problem
areas that can quickly be turned over to the appropriate business area for resolution.
Inserting these edit checks earlier in the data-loading process has reduced the amount of
time spent correcting data and generating Education Data Exchange Network (EDEN)
reports. These validations have also resulted in fewer corrections required for the EDEN
reports upon submission.
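As a purely illustrative aid, the brief sketch below (in Python) shows the general pattern described above: applying edit checks while records are being loaded, so that problem rows are routed back to the responsible business area before downstream reporting. The field names, validation rules, and function names are hypothetical and are not taken from the actual CEDARS or EDEN implementations.

# Hypothetical illustration only; not NCDPI's actual CEDARS code.
def edit_checks(record):
    """Return a list of validation problems for one (invented) student-level record."""
    problems = []
    if not record.get("student_id"):
        problems.append("missing student_id")
    if record.get("grade_level") not in {str(g) for g in range(-1, 14)}:
        problems.append("grade_level out of range")
    return problems

def load_with_early_validation(records):
    """Validate records as they are loaded, instead of after the warehouse is populated."""
    accepted, rejected = [], []
    for rec in records:
        issues = edit_checks(rec)
        if issues:
            # Rejected rows go back to the owning business area for correction.
            rejected.append({"record": rec, "issues": issues})
        else:
            accepted.append(rec)
    return accepted, rejected

# Example: one clean record is loaded; one problem record is returned for correction.
sample = [{"student_id": "A1", "grade_level": "3"},
          {"student_id": "", "grade_level": "99"}]
loaded, returned = load_with_early_validation(sample)
print(len(loaded), "record(s) loaded;", len(returned), "returned for correction")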
3. What obstacles and/or risks could impact the State’s ability to meet its goals and
performance measures related to this sub-criterion?
The most significant factor putting us at risk of not achieving our goals is the limited
resources at both the SEA and LEA levels that must be spread across multiple projects.
Given how many tasks (particularly new initiatives) to which limited personnel in LEAs
must attend, engaging them in using data (through CEDARS or otherwise) in new ways
to make decisions is challenging. This lack of resources is particularly apparent this year
as we institute a new SIS at both the local and State level. Similarly, limitations on State
staff time translate into limitations on the outreach that NCDPI can provide. These realities
will have an impact on the rate at which we will make progress toward achievement of our
goals.
Evaluation: Based on the responses to the previous question, evaluate the State’s
performance and progress to date for this sub-criterion (choose one)
Red (1)
Orange (2)
Yellow (3)
Green (4)3
3 Red – substantially off-track and/or has significant quality concerns; urgent and decisive action is required; Orange – off-track
and/or there are quality concerns; many aspects require significant attention; Yellow – generally on-track and of high or good
quality; only a few aspects require additional attention; Green – on-track with high quality.
Paperwork Reduction Act Statement
According to the Paperwork Reduction Act of 1995, no persons are required to respond to a collection of information unless such
collection displays a valid OMB control number. Public reporting burden for this collection of information is estimated to
average 74 hours (annually) per response, including time for reviewing instructions, searching existing data sources, gathering
and maintaining the data needed, and completing and reviewing the collection of information. The obligation to respond to this
collection is required to obtain or retain benefit (34 CFR 75.720, 75.730-732; 34 CFR 80.40 and 80.41). Send comments
regarding the burden estimate or any other aspect of this collection of information, including suggestions for reducing this
burden, to the U.S. Department of Education, 400 Maryland Ave., SW, Washington, DC 20210-4537 or email
ICDocketMgr@ed.gov and reference the OMB Control Number 1894-0011.