(C)(2) Accessing and Using State Data, Part B Narrative, North Carolina, May 2014

Race to the Top Progress Update
Sub-criterion (C)(2)
Part B: In preparation for monthly calls, States must submit written responses to the following
questions for two application sub-criteria (e.g., (A)(2) and (D)(4)).1 All responses in this section
should be tailored to the goals and projects associated with this sub-criterion.
Application sub-criterion:2 (C)(2) Accessing and Using State Data
STATE’s goals for this sub-criterion:
• Continue providing data and information products that stakeholders can use to inform their decisions regarding policy and services.
• Provide professional development on Common Education Data Analysis & Reporting Systems (CEDARS) business intelligence tools and how to produce annual and longitudinal reports.
Relevant projects:
• Continue providing data and information products (School Report Cards, etc., as discussed in the RttT application) for use by various stakeholders
• Continue developing CEDARS products to make them more accessible and increase their use; continue providing professional development for school district and charter school personnel on how to use CEDARS tools
1. Is the State on-track to implement the activities and meet the goals and performance
measures that are included in its approved scope of work for this sub-criterion? If
so, explain why. If not, explain why not.
Data and Information Products
North Carolina is on-track to meet the goals and performance measures for sub-criterion
(C)(2). NC continues to deliver various products for use by all stakeholders, including
local education agency (LEA) personnel, parents, researchers and the general public.
These products include:
• School Report Cards – published January 28, 2014
• EVAAS reports
• CEDARS
• Data for External Users
  o NC Education Research Data Center (NC ERDC)
  o Carolina Institute for Public Policy
  o The Employment Security Commission for the Common Follow-up System (CFS)
  o RttT Evaluation Team

1 On each monthly call, program officers and states should work together to select two sub-criteria for the following month.
2 All highlighted fields will be pre-populated by the Department Program Officer prior to State completion.
o RttT Evaluation Team
NC continues its partnership with the NC ERDC at Duke University and the Carolina
Institute for Public Policy at UNC-Chapel Hill. Through these arrangements, governed
by memoranda of understanding, the North Carolina Department of Public Instruction
(NCDPI) supplies data that is made available to approved agents studying education
policy issues under the direction of NCDPI staff. North Carolina also provides data to the
multi-agency team conducting the RttT evaluation for the state as stipulated by USED.
CEDARS
While the amount of in-person CEDARS training has decreased, refresher documents and
videos are also available on the internet
(http://www.ncpublicschools.org/cedars/reporting/documentation/). These videos cover a
range of topics including an introduction to the CEDARS data warehouse and an
understanding of the purpose of the warehouse and how it is connected to other NCDPI
data systems; warehouse dashboard functionality; a description of the types of
dashboards published by user type; and a description of the available data. Additional
items of interest include how to register for an account, locating user guides, determining
who has been identified as a CDW Trainer, navigating the CDW load schedule and a
brief overview of the CDW Data Dictionary.

In addition to the overview webinars, the CEDARS team presented on CEDARS at the following conferences and meetings during the 2013-14 school year:
• Financial and Business Services summer conference – LEA level – July 25, 2013
• NC SIS Symposium – School level – demonstration sessions designed to introduce dashboards: Feb 18, 2014 (2 sessions)
2. Does the State have evidence indicating the quality of implementation for this sub-criterion? What is/has the State doing/done as a result of this information?
Data quality continues to improve as a result of CEDARS implementation. Both the
length of time required for business owners to review data submissions and the
turnaround time for any re-work have diminished dramatically in the past year, as
evidenced by the more streamlined process for preparing files for Federal submission.
For the 2012-13 school year, NCDPI was able to submit all applicable Federal files
(those without future due dates) from the CEDARS data warehouse.
The validation tools built into CEDARS allow for more rapid identification of problem
areas that can quickly be turned over to the appropriate business area for resolution.
Inserting these edit checks earlier in the data-loading process has reduced the amount of
time spent correcting data and generating Education Data Exchange Network (EDEN)
reports. These validations have also resulted in fewer corrections required for the EDEN
reports upon submission.
An unfortunate result of the success of RttT has been a significant decline in CEDARS
usage on the part of the LEAs, as both agency and LEA staff have focused their efforts on
the conversion of the SIS and the implementation of the IIS. Internal staff members
continue to use CEDARS to produce the federally required reporting to both EDEN and
OSEP. Dashboard development has diminished significantly during the last year given
resource constraints; the dashboards that have been developed (Discipline and LEP)
have been very useful to LEA program staff as a method of verification. Development
of all other program-specific dashboards will resume once the SIS and IIS efforts are
fully operational.
CEDARS is key both to the P-20W sectors, as the source of NCDPI longitudinal data,
and to RttT, as the warehouse for historical data not housed within the SIS.
3. What obstacles and/or risks could impact the State’s ability to meet its goals and
performance measures related to this sub-criterion?
As reported last year, the most significant factor putting us at risk of not achieving our
goals is the limited resources at both the SEA and LEA levels that must be spread across
multiple projects. Given the number of tasks (particularly new initiatives) to which limited
LEA personnel must attend, engaging them in new ways of using data (through CEDARS
or otherwise) to make decisions is challenging. This lack of resources, together with the
emphasis on the SIS conversion and IIS implementation, has had a profound effect on the
ability to enhance the CEDARS dashboards and to recapture the enthusiasm for CEDARS
that the LEA community showed two years ago. The interest is still there, as evidenced
by the attendance and questions during the recent Statewide Homebase conference, but
time is the critical factor in usage. The conversion of the SIS requires both LEA and SIS
staff to verify the data at such a detailed level that a dashboard appears redundant to
them at this time. Both staffs have affirmed that the CEDARS solution will be relevant
once a higher level of verification has been achieved.
Evaluation: Based on the responses to the previous questions, evaluate the State’s
performance and progress to date for this sub-criterion (choose one):
Red (1)
Orange (2)
Yellow (3)
Green (4)3

3 Red – substantially off-track and/or has significant quality concerns; urgent and decisive action is required. Orange – off-track and/or there are quality concerns; many aspects require significant attention. Yellow – generally on-track and of high or good quality; only a few aspects require additional attention. Green – on-track with high quality.