Race to the Top Progress Update

Race to the Top
Progress Update - On-site Review
Part B: In preparation for the on-site program review and Secretary stocktake, States must update
applicable questions (i.e., those for which there is new information) and provide appropriate
documentation to substantiate their responses for all relevant application sub-criteria (e.g., (A)(2) and
(D)(4)).1
Application sub-criterion:2 (C)(2) Accessing and Using State Data
STATE’s goals for this sub-criterion:

• Continue providing data and information products that stakeholders can use to inform
their decisions regarding policy and services.
• Provide professional development on Common Education Data Analysis & Reporting
Systems (CEDARS) business intelligence tools and how to produce annual and
longitudinal reports.
Relevant projects:

• Continue providing data and information products (School Report Cards, etc., as
discussed in the RttT application) for use by various stakeholders
• Continue developing CEDARS products to make them more accessible and increase their
use; continue providing professional development for school district and charter school
personnel on how to use CEDARS tools
1. Is the State on-track to implement the activities and meet the goals and performance
measures that are included in its approved scope of work for this sub-criterion? If
so, explain why. If not, explain why not.
Data and Information Products
North Carolina is on-track to meet the goals and performance measures for sub-criterion
(C)(2). NC continues to deliver various products for use by all stakeholders, including
local education agency (LEA) personnel, parents, researchers, and the general public.
These products include:

• School Report Cards – published October 25, 2012
• EVAAS reports
• CEDARS
• Data for External Users
  o NC Education Research Data Center (NC ERDC)
  o Carolina Institute for Public Policy
  o The Employment Security Commission for the Common Follow-up System (CFS)
  o RttT Evaluation Team

1 Note that States will only be required to submit documentation for the on-site program review, not for monthly calls. States
should work with their Program Officers to determine relevant state-specific documentation.
2 All highlighted fields will be pre-populated by the Department Program Officer prior to State completion.
NC continues its partnership with the NC ERDC at Duke University and the Carolina
Institute for Public Policy at UNC-Chapel Hill. Through these arrangements, governed
by annually updated memoranda of understanding, the North Carolina Department of
Public Instruction (NCDPI) supplies data that is made available for various researchers
studying education policy issues. North Carolina also provides data to the multi-agency
team conducting the RttT evaluation for the state.
CEDARS
NCDPI schedules monthly, two-hour webinars aimed at all State-, LEA-, and school-level employees. The training provides participants an introduction to the CEDARS data
warehouse and an understanding of the purpose of the warehouse and how it is connected
to other NCDPI data systems. Attendees are also provided a demonstration of warehouse
dashboard functionality; a description of the types of dashboards published by user type;
and a description of the available data. Additional items of interest include registering
for an account, locating user guides, determining who has been identified as a
CDW Trainer, navigating the CDW load schedule, and a brief overview of the CDW Data
Dictionary (http://www.ncpublicschools.org/cedars/reporting/events/).
Refresher documents and videos are also available on the internet
(http://www.ncpublicschools.org/cedars/reporting/documentation/ ).
In addition to the overview webinars, the CEDARS team presented at the following
conferences and meetings in the 2012-13 school year about CEDARS:

• English as a Second Language (ESL) Data Retreat – LEA level – Hands-on
sessions used for data integrity, ESL progression: September 10 & 12, 2012
• Data Management Group – SEA and LEA level – Demonstrated new dashboards
created by two business areas: September 20, 2012
• LEA Trainer Quarterly Meeting – SEA and LEA level – Designed to tie other
state-level initiatives to CDW: January 25, 2013
• SEA Regional Coordinators – SEA level – Hands-on sessions, with discussions on how to
engage Superintendents and leadership: February 11, 2013
• NCW Symposium – School level – Demonstration sessions designed to introduce
dashboards: February 26, 2013 (2 sessions)
• Finance Council of the Central Carolina Regional Education Service Alliance –
LEA Administrators – Demonstration sessions designed to introduce dashboards
and demonstrate custom report creation: May 9, 2013
NCDPI staff members also received two levels of Oracle training provided by the
vendor, focusing on the use of Oracle Business Intelligence Enterprise Edition (OBIEE). The
“Beginner” training was held March 6 and 7, 2013 and included twelve staff members.
“Advanced” training was held April 11 and 12, 2013 for nine staff members. A total of
fourteen distinct staff members participated in the training, with some attending both sessions.
NCDPI continues to update CEDARS information products and make them more
accessible to stakeholders. Efforts include reorganizing the CEDARS landing page
(http://www.ncpublicschools.org/cedars/) to make it more user-friendly for all users and
adding a data submission schedule to the site to assist users in identifying when certain
types of data are loaded into the CEDARS Data Warehouse. NCDPI has also added a
data dictionary for the CEDARS Data Warehouse. For each data element contained in
the warehouse, the dictionary provides the subject area, sub-folder, authoritative source,
and definition. The dictionary is complete except for the section for Career and Technical
Education (CTE) elements. Work on that section of the dictionary will continue in the
2013-14 school year.
To provide information to the field, NCDPI distributed a CEDARS newsletter in October
2012 (http://www.ncpublicschools.org/docs/cedars/reporting/documentation/2012october.pdf).
The first newsletter was published on the CEDARS webpage and distributed by
email to the CEDARS trainers. The CEDARS staff would like to produce the newsletter
quarterly, but because of resource limitations stemming from the implementation demands
of a new Student Information System (SIS), another newsletter has not been delivered.
NCDPI publishes dashboards in the CEDARS data warehouse at the request of SEAs
and LEAs. These dashboards provide ‘Detail’ and ‘Aggregate’ views based on user role
and allow users to interact with the data through built-in prompts. The dashboards are
printable and exportable in various formats. Two NCDPI business areas have created
dashboards: one contains aggregate-level data and may be viewed by any user; the other
contains student record-level data and is open only to users with individual-level access.
2. Does the State have evidence indicating the quality of implementation for this sub-criterion? What is/has the State doing/done as a result of this information?
Data quality continues to improve as a result of CEDARS implementation. Both the
length of time required for business owners to review data submissions and the
turnaround time for any re-work have diminished dramatically in the past year, as
evidenced by the more streamlined process for preparing files for Federal submission.
For the 2011-12 school year, NCDPI was able to submit all applicable Federal files from
the CEDARS data warehouse.
The validation tools built into CEDARS allow for more rapid identification of problem
areas that can quickly be turned over to the appropriate business area for resolution.
Inserting these edit checks earlier in the data-loading process has reduced the amount of
time spent correcting data and generating Education Data Exchange Network (EDEN)
reports. These validations have also resulted in fewer corrections required for the EDEN
reports upon submission.
Table 1 (below) reflects the most recent figures on usage over time. While gains
in usage have been modest and not sustained, NCDPI is encouraged that the spikes
in usage in the last quarter of 2012 coincided directly with a professional development
push from the SEA to LEA users. These professional development opportunities included
webinars, CDW newsletter publication, and a coordinated communication effort from the
CEDARS staff. Plans are to resume these efforts in 2013, with a focus on the third and
fourth quarters once the new SIS is in place. The value of CEDARS has also been noted
at the local level in North Carolina. After a recent training, an attendee commented, “I
just wanted to let you know how excited my staff is to see the data available to them. I'm
grateful to you and your leadership team for working hard on this project for us! I'll be
the first to tell you that I've already been able to pull information to help me fill out
federal reports that I have to do in my LEA.”
Table 1: CEDARS Usage Data

Year   Month   # Unique Users
2012   APR     4
       MAY     16
       JUN     18
       JUL     21
       AUG     21
       SEP     51
       OCT     69
       NOV     53
       DEC     54
2013   JAN     29
       FEB     30
       MAR     22
       APR     27
3. What obstacles and/or risks could impact the State’s ability to meet its goals and
performance measures related to this sub-criterion?
The most significant factor putting us at risk of not achieving our goals is the limited
resources at both the SEA and LEA levels, which must be spread across multiple projects.
Given how many tasks (particularly new initiatives) limited LEA personnel must attend to,
engaging them in using data (through CEDARS or otherwise) in new ways
to make decisions is challenging. This lack of resources is particularly apparent this year
as we institute a new SIS at both the local and State levels. Similarly, limitations on State
staff time translate into limitations on the outreach that NCDPI can provide. These realities
will affect the rate at which we make progress toward achieving our goals.
Evaluation: Based on the responses to the previous question, evaluate the State’s
performance and progress to date for this sub-criterion (choose one)
Red (1)
Orange (2)
Yellow (3)
Green (4)3

3 Red – substantially off-track and/or has significant quality concerns; urgent and decisive action is required. Orange – off-track
and/or there are quality concerns; many aspects require significant attention. Yellow – generally on-track and of high or good
quality; only a few aspects require additional attention. Green – on-track with high quality.
Paperwork Reduction Act Statement
According to the Paperwork Reduction Act of 1995, no persons are required to respond to a collection of information unless such
collection displays a valid OMB control number. Public reporting burden for this collection of information is estimated to
average 74 hours (annually) per response, including time for reviewing instructions, searching existing data sources, gathering
and maintaining the data needed, and completing and reviewing the collection of information. The obligation to respond to this
collection is required to obtain or retain benefit (34 CFR 75.720, 75.730-732; 34 CFR 80.40 and 80.41). Send comments
regarding the burden estimate or any other aspect of this collection of information, including suggestions for reducing this
burden, to the U.S. Department of Education, 400 Maryland Ave., SW, Washington, DC 20210-4537 or email
ICDocketMgr@ed.gov and reference the OMB Control Number 1894-0011.