SPDG Year 2 Grant Performance Report

U.S. Department of Education
Grant Performance Report Cover Sheet (ED 524B)
Check only one box per Program Office instructions.
[X] Annual Performance Report [ ] Final Performance Report
OMB No. 1894-0003
Exp. 04/30/2014
General Information
1. PR/Award #: H323A120020
2. Grantee NCES ID#: 13
(Block 5 of the Grant Award Notification - 11 characters.)
(See instructions. Up to 12 characters.)
3. Project Title: State Personnel Development Grant
(Enter the same title as on the approved application.)
4. Grantee Name (Block 1 of the Grant Award Notification.):Georgia Department of Education
5. Grantee Address (See instructions.)
6. Project Director (See instructions.) Name: Dr. Julia Causey Title: Program Manager, Division of Special Education, DOE
Ph #: (404) 657-9954 Ext: ( )
Fax #: (404) 651-6457
Email Address: jcausey@doe.k12.ga.us
Reporting Period Information (See instructions.)
7. Reporting Period:
From: 03/01/2013 To: 02/28/2014
(mm/dd/yyyy)
Budget Expenditures (To be completed by your Business Office. See instructions. Also see Section B.)
8. Budget Expenditures
Federal Grant Funds
a. Previous Budget Period
b. Current Budget Period
c. Entire Project Period
(For Final Performance Reports only)
Non-Federal Funds (Match/Cost Share)
$131,467.00
$1,156,038.00
Indirect Cost Information (To be completed by your Business Office. See instructions.)
9. Indirect Costs
a. Are you claiming indirect costs under this grant? _X_Yes __No
b. If yes, do you have an Indirect Cost Rate Agreement approved by the Federal Government? __X_Yes ___No
c. If yes, provide the following information:
Period Covered by the Indirect Cost Rate Agreement: From: _07/01/2013 To: 06/30/2014 (mm/dd/yyyy)
Approving Federal agency: ___ED ___Other (Please specify): ___________________________________________________
Type of Rate (For Final Performance Reports Only): ___ Provisional ___ Final ___ Other (Please specify): _______________
d. For Restricted Rate Programs (check one) -- Are you using a restricted indirect cost rate that:
_X__ Is included in your approved Indirect Cost Rate Agreement?
___ Complies with 34 CFR 76.564(c)(2)?
Human Subjects (Annual Institutional Review Board (IRB) Certification) (See instructions.)
10. Is the annual certification of Institutional Review Board (IRB) approval attached? ___Yes ___ No _X__ N/A
Performance Measures Status and Certification (See instructions.)
11. Performance Measures Status
a. Are complete data on performance measures for the current budget period included in the Project Status Chart? ___Yes _X__ No
b. If no, when will the data be available and submitted to the Department? __9___/__30___/____2017__ (mm/dd/yyyy)
12. To the best of my knowledge and belief, all data in this performance report are true and correct and the report fully discloses all
known weaknesses concerning the accuracy, reliability, and completeness of the data.
Name of Authorized Representative: John D. Barge
Title: State School Superintendent
Signature: _____________________________________________________
Date: _____/_____/_______
U.S. Department of Education
Grant Performance Report (ED 524B)
Executive Summary
PR/Award # (11 characters):H323A120020
The Georgia Department of Education (GaDOE), Division of Special Education Services and Supports (DSESS), has
conducted frequent needs assessments for State Performance Plan development and execution. Noting the need for increased
graduation with a high school diploma, decreased dropouts, and improved postsecondary outcomes, GaDOE, DSESS
initiated GraduateFIRST in 2007 using State Personnel Development Grant (SPDG) resources. By the fourth year of
operation, the graduation and math gaps between GraduateFIRST students with disabilities (SWD) and those in the rest of the
state had narrowed. In addition, the gap in English language arts had narrowed, and the dropout and suspension rates
had decreased by approximately 50 percent.
Noting this impact, the GaDOE decided to utilize the knowledge, resources, and interventions of GraduateFIRST to close the
gaps in other schools throughout Georgia. SPP baseline data also showed the need to improve transition plans within
student Individual Education Plans (IEPs) and to enhance assistance for children with autism or autism-like
behaviors. The SPP sets a goal of having at least 80 percent of youth with IEPs exiting high school enrolled in higher
education, competitively employed, or enrolled in some other postsecondary program. Currently, about 76 percent of
students with disabilities in Georgia are engaged 12 months after graduation. In addition, it was determined that for the
growing population of children with autism or autism-like behaviors, early intervention would be most effective.
To meet these needs, the GaDOE DSESS implemented two Initiatives: GraduateFIRST and the College and Career
Readiness/Transition Project. Partners in GraduateFIRST (Initiative 1) include the National Dropout Prevention Center for
Students with Disabilities (NDPC-SD) at Clemson University, 17 Georgia Learning Resource System (GLRS) Centers, the
Georgia Parent Mentor Partnership, Parent 2 Parent (P2P), and the GaDOE School Improvement Division. Partners in
Initiative 2 include the University of Kansas Transition Coalition and the National Secondary Transition Technical
Assistance Center (NSTTAC). A third Initiative was implemented with the Emory University Autism Center, which provides
incidental learning training for early learning teachers, providers, and parents.
Initiative 1, GraduateFIRST, has benefited from five years of experience working with selected schools under Georgia’s previous
SPDG and the first year of the current SPDG. During the last year, GraduateFIRST began working in close collaboration with the
School Improvement Division to impact identified participating schools with a uniform approach. GraduateFIRST was able
to provide many tools, professional development options, a website with resources, experienced regional Collaboration
Coaches, and accountability methods to the collaborative effort. Of prime importance to this effort are the Collaboration
Coaches, GLRS Directors, local team leaders and teams, plus the methods and procedures defined in the GraduateFIRST
Implementation Manual.
Schools and school districts applied and made the professional learning commitments required for GraduateFIRST participation.
Eighty-four schools were selected to participate in Year 2, with 16 new to the program, including 7 high schools and 15
middle schools. Participating schools include 23 elementary schools, 31 middle schools, 28 high schools, and 2 Academy
programs. Trainings are paired with fidelity tools to ensure participants have the knowledge and skills needed to implement
GraduateFIRST. These tools include the GraduateFIRST Data Probe and Discussion Guide, the GraduateFIRST Process and
Primary Area of Engagement Implementation Scales; Step-by-Step Process Guide to Improving Graduation Rates and
Achievement; and the NDPC-SD Attribute Forms for Affective Engagement, Attendance, Behavior, Academic Engagement,
Parent/Family Engagement, and School Climate. The tools are in the GraduateFIRST Implementation Manual and on the
website.
A fidelity tool to measure implementation of the overall structure of GraduateFIRST, as well as the primary areas of
engagement was administered in the fall of 2013 and again in the spring of 2014. Seventy schools out of 84 (83.3 percent)
received a rating of 2 (started out level) or higher on effectiveness of implementing their school plan. Primary area of
engagement scale scores received a 2 or higher rating in 73 reporting schools (94.8 percent).
Student assessment data is collected quarterly using the GraduateFIRST Assessment Tool that gathers attendance,
suspensions, and course failures for approximately 3,300 students. Attendance improved from Year 1 to Year 2, with both
middle schools and high schools showing a decline in absenteeism. The same observation was made for course failures and
suspensions. Middle schools showed larger gains than high schools.
Georgia’s Single Statewide Accountability System (SSAS) gathers numeric data on each school in the state system. Among
the indicators gathered are data on graduation, dropout rate, attendance and achievement. GraduateFIRST is using this
available data to assess outcome progress. Analysis found that students with disabilities (SWD) are graduating at a rate a
little below 40 percent, compared to slightly above 75 percent for students without disabilities in the participating
GraduateFIRST high schools. Students with disabilities have more days absent than students without disabilities with those
in high school having the most days absent. Students with disabilities fail more courses than students without disabilities with
the largest disparity in high schools on the End of Course Tests.
Initiative 2, the College and Career Readiness (CCaR) Project, focuses on compliant transition plans and positive
postsecondary outcomes for SWD. Current findings, as measured by Indicators 13 and 14 of the SPP, have found there is a
need for more secondary transition targeted assistance to improve postsecondary outcomes for SWD.
The CCaR Project is implemented and facilitated in 15 districts with coaching from seven CCaR Specialists (CCaRS) who
work with the district/school teams, under the direction and guidance of the project supervisor (Core). During Year 1, the
GaDOE reviewed all school district transition plans submitted to the GaDOE for compliance and based upon this review, 15
school districts were chosen for participation in the CCaR Initiative.
To facilitate the work of the CCaRS, the Kansas Transition Coalition provided a three-week Seminar Series that focused on
providing consistent knowledge of evidence-based quality and best practices in transition. In addition, a two-day CCaRS
Transition Coaching Institute was held in September 2013 on evidence-based practices in transition. Following this, 16
Hitting the Mark trainings were held in October-November 2013 for 225 participants within the 15 participating school
districts focusing on writing compliant transition plans. To assess these trainings, three-month post evaluations were gathered
for the Transition Institute and the Hitting the Mark training. The majority of the participants rated trainings as helpful.
In February 2014, the 15 participating CCaR school districts began a 12-week on-line training and coaching (TRAN-Qual)
series provided by the University of Kansas Transition Coalition. The purpose of this training is to guide the district through
a self-study of transition plan compliance and implementation of effective transition practices. Outcome data on participant
knowledge is collected and analyzed (pre- and post-testing). Success in the 15 school districts is being monitored using the
Quality Indicators of Exemplary Transition Programs, the Single Statewide Accountability System, and the one-year
post-exit survey of former graduates.
Initiative 3, the Autism Early Intervention project, addresses the increasing prevalence of autism and the role of early
intervention services for young children with autism or autism-like behaviors. The Emory University Autism Center (EAC)
has developed an evidence-based, systematic incidental teaching model for delivering instruction in preschool/prekindergarten classrooms that significantly improves the number of young children with autism or autism-like behaviors who
transition to general education classrooms. Seventeen classrooms from three Georgia regions (2 counties per region)
participated actively throughout Year 2. Over 56 teachers and providers were trained in three Rounds covering social
communication, social skills and kindergarten readiness. Approximately 20 parents participated in Round 1 and Round 3
trainings. A three-month follow-up of teachers, providers, and parents found that all rated the methods, techniques, etc. that
they learned as effective or very effective in their classrooms and homes. PLA-Check and the Incidental Teaching Rating
Scale are being used to measure implementation fidelity.
Georgia State Personnel Development Grant
Annual Performance Report – Year 2
Table of Contents
Cover Sheet (ED 524B)………………………………………………………………………………..…………. i
Executive Summary……………………………………………………………………………………..…..……. ii
Section A. Objectives Information and Related Performance Measures Data……………………………..….…. 3
Program Measure 1. Projects use evidence-based professional development practices to support the
attainment of identified competencies……………..……………………………………………………….…….. 3
I. Professional Development Components…………………………………………………………………...6
II. Post Professional Development Assessment…………………….………………………………………. 25
Program Measure 2. Participants in SPDG professional development demonstrate improvement in
implementation of SPDG-supported practices over time…………………………………………………..…......42
I. Fidelity Assessment…………………………………………………………………………………… 48
II. Three-Month Professional Development Follow-Up Assessment………………………………………. 65
III. Assessment of Coaching Effectiveness………………………………………………………………… 71
Program Measure 3. Projects use SPDG professional development funds to provide follow-up
activities designed to sustain the use of SPDG-supported practices……………………………………………...86
Section B. Budget Information………………………………………………………………………….………. 92
Section C. Additional Information…………………………………………………………………………….....94
Attachment A – Completed Worksheet – Initiative 1 – GraduateFIRST………………………………………...-2-
Attachment B – Completed Worksheet – Initiative 2 – CCaR Project………………………………………….-19-
Attachment C – Completed Worksheet – Initiative 3 – Autism Early Intervention Project…………………….-29-
Attachment D – GraduateFIRST Implementation Scale: Process and Student Engagement…………………….-49-
Attachment E – CCaR Fidelity Instruction – Quality Indicators of Transition Programs………………………...-62-
iv
U.S. Department of Education
Grant Performance Report (ED 524B)
Project Status Chart
PR/Award # (11 characters): ______________________
SECTION A - Performance Objectives Information and Related Performance Measures Data (See Instructions. Use as many pages as necessary.)
1. Project Objective
[ ] Check if this is a status update for the previous budget period.
Program Measure 1: Projects use evidence-based professional development practices to support the attainment of identified competencies.
1.a. Performance Measure
Percentage of evidence-based professional development components within Initiative 1 – GraduateFIRST, scoring 3 or 4 on the federal professional development worksheet rubrics.
Measure Type: Project | Quantitative Data
Target: Raw Number 13.6; Ratio 13.6/16; 85%
Actual Performance Data: Raw Number 16; Ratio 16/16; 100.0%

1.b. Performance Measure
Percentage of evidence-based professional development components within Initiative 2 – CCaR Project, scoring 3 or 4 on the federal professional development worksheet rubrics.
Measure Type: Project | Quantitative Data
Target: Raw Number 10.4; Ratio 10.4/16; 65%
Actual Performance Data: Raw Number 11; Ratio 11/16; 68.8%

1.c. Performance Measure
Percentage of evidence-based professional development components within Initiative 3 – Autism Early Intervention, scoring 3 or 4 on the federal professional development worksheet rubrics.
Measure Type: Project | Quantitative Data
Target: Raw Number 8; Ratio 8/16; 50.0%
Actual Performance Data: Raw Number 10; Ratio 10/16; 62.5%

1.d. Performance Measure
Percentage of School Team members participating in GraduateFIRST Forums reporting that the knowledge and information received was above or well above their expectations in being useful in their work with the schools to improve student engagement/achievement, as evidenced by post-training evaluations.
Measure Type: Project | Quantitative Data
Target: Raw Number 115; Ratio 115/135; 85.2%
Actual Performance Data: Raw Number 118; Ratio 118/135; 87.4%

1.e. Performance Measure
Percentage of GraduateFIRST School Team members/Team Leaders strongly agreeing that the training information received in monthly School Team meetings will be useful in their work with the schools to improve student engagement/achievement, as evidenced by post-training evaluations.
Measure Type: Project | Quantitative Data
Target: Raw Number 221; Ratio 221/276; 80.0%
Actual Performance Data: Raw Number 213; Ratio 213/276; 77.2%

1.f. Performance Measure
Percentage of participants in the Initiative 1 Engaging Families of Students with Disabilities for Student Achievement Forum who reported that the Forum was above or well above expectations.
Measure Type: Project | Quantitative Data
Target: Raw Number 55; Ratio 55/73; 75%
Actual Performance Data: Raw Number 61; Ratio 61/73; 83.6%

1.g. Performance Measure
Percentage of Initiative 2 – CCaR Project Specialists agreeing or strongly agreeing that the training information received would be useful in their work with the schools, as evidenced by post-training evaluations.
Measure Type: Project | Quantitative Data
Target: Raw Number 22; Ratio 22/24; 91.7%
Actual Performance Data: Raw Number 23; Ratio 23/24; 95.8%

1.h. Performance Measure
Percentage of Initiative 2 District participants who can apply the learning received in the Transition Institute and Hitting the Mark training with some or no support.
Measure Type: Project | Quantitative Data
Target: Raw Number 218; Ratio 218/257; 84.8%
Actual Performance Data: Raw Number 236; Ratio 236/257; 91.8%

1.i. Performance Measure
Increase in pre-post training assessments of Hitting the Mark participants.
Measure Type: Program | Quantitative Data
Target: Raw Number 17; Ratio 17/68.0; 25.0%
Actual Performance Data: Raw Number 22.8; Ratio 22.8/68.0; 33.5%

1.j. Performance Measure
Average percentage of ASPIRE Best Practice Forum participants reporting that they could apply the learnings across the Forum Learning Targets with no or some support.
Measure Type: Project | Quantitative Data
Target: Raw Number 31.5; Ratio 31.5/35; 90%
Actual Performance Data: Raw Number 32.3; Ratio 32.3/35; 92.3%

1.k. Performance Measure
Percentage of learning targets achieving statistically significant increases for Initiative 3 – Autism Early Intervention Project teachers and partners, as measured by post training feedback instruments.
Measure Type: Project | Quantitative Data
Target: Raw Number 10; Ratio 10/10; 100%
Actual Performance Data: Raw Number 10; Ratio 10/10; 100%

1.l. Performance Measure
Percentage of post training satisfaction items receiving an average rating of 5 or higher by parents participating in the Initiative 3 Autism Early Intervention Project.
Measure Type: Project | Quantitative Data
Target: Raw Number 55; Ratio 55/58; 94.8%
Actual Performance Data: Raw Number 57; Ratio 57/58; 98.3%

1.m. Performance Measure
Video on-line professional development modules and accompanying guides, using evidence-based practices to improve teacher effectiveness on the Teacher Keys Effectiveness System (TKES) performance standards, will be developed and posted on the GraduateFIRST website in collaboration with other DOE Divisions.
Measure Type: Project | Quantitative Data
Target and Actual Performance Data: NA for Year 2
Explanation of Progress (Include Qualitative Data and Data Collection Information)
I. Professional Development Components
1.a. – 1.c. – Professional development components – OSEP reporting worksheet
The completed OSEP Reporting Worksheets for Program Measure 1 of Georgia’s Initiative 1 (GraduateFIRST), Initiative 2 (CCaR Project) and
Initiative 3 (Autism Early Intervention) are attached to this Year 2 Annual Performance Report (Attachments A-C). Following is a discussion of the
activities related to the professional development domains, components, and specifications for each of the three Georgia Initiatives.
1.a. Assessment of professional development - Initiative 1 – GraduateFIRST
The OSEP professional development worksheet was completed independently by each member of the GraduateFIRST Design team using
SurveyMonkey and independently verified by the Director.
Components in Place:
Because of the five-year history of implementing GraduateFIRST, many tools, professional development options, a website with resources,
experienced regional Collaboration Coaches, and accountability methods were established and ready for use beginning with Year 1 of the SPDG.
During Year 1, a decision was made in the GaDOE to work with Focus Schools identified through the Georgia ESEA Flexibility Waiver as needing
specific supports and interventions for students with disabilities in the lowest performing subgroup [A(1) Selection]. The GaDOE SPDG staff
worked closely with the GaDOE School Improvement Division with selected schools (61 elementary schools, 64 middle schools, and 29 high
schools—154 total).
During spring 2013 (beginning of Year 2), the GraduateFIRST Project Application and Frequently Asked Questions (FAQ) were developed. Middle
and high schools in Georgia were invited to submit an application to participate in Years 2-5 of the SPDG. Elementary schools that participated in
GraduateFIRST during the 2012-2013 school year (Year 1) were also invited to submit an application. Selected schools agreed to these
GraduateFIRST Commitments [A(1) Selection]:
1. Increasing the graduation rate and closing the achievement gap for students with disabilities (SWD) as improvement priorities.
2. Ensuring participation and engagement of the principal and designated school team members in GraduateFIRST.
3. Enhancing family and community engagement.
4. Collecting and analyzing data, developing an Action Plan, identifying target list of students, and implementing evidence-based strategies with
fidelity.
5. Involving designated central office personnel, including the special education director, in GraduateFIRST initiatives.
In addition, through the application process, the schools committed to [A(1) Selection]:
1. Designating a school-based GraduateFIRST team that includes, at a minimum, the principal, the Team Leader, the School Improvement
Specialist, a general educator, and a special educator. In addition, schools could include guidance counselors, parent mentors or parents,
graduation coaches, and others, as appropriate.
2. Designating a school-based Team Leader (TL) to work directly with the GLRS Collaboration Coach (CC) to facilitate GraduateFIRST
communication and project goals and objectives.
3. Ensuring the participation of the principal and designated school team members in GraduateFIRST activities.
4. Collecting and analyzing data, developing the Action Plan, identifying a target list of students to be followed and assessed quarterly, and
implementing evidence-based strategies with fidelity.
5. Collecting and reporting project evaluation data including attendance, behavior, and achievement by grading periods for targeted students to
the Georgia Department of Education and the SPDG evaluators.
6. Maintaining ongoing communication with the families of the targeted students regarding GraduateFIRST activities.
Participating school activities and expectations (defined above), as well as the roles and responsibilities for Collaboration Coaches and Team Leaders
are further defined in the GraduateFIRST Implementation Manual [A(1) Selection]. The schools and school districts also made commitments for the
professional learning required for GraduateFIRST participation. The application was signed by the local educational agency (LEA) superintendent,
special education director, and the school principal indicating district-wide commitment for participation.
During Year 2, 84 schools (23 elementary schools, 31 middle schools, 28 high schools, including one middle/high school combination, and 2
Academy schools) are participating in GraduateFIRST. These schools represent 46 local educational agencies (LEAs). Also,
there are 26 schools that were part of GraduateFIRST Cohorts 1-3 from the previous SPDG, and 16 schools that are new to the GraduateFIRST
process. Of the 84 schools/programs participating in GraduateFIRST, 76 percent (63 schools) are also Focus Schools, and three schools and one
Academy are identified as Priority Schools. GraduateFIRST works closely with School Improvement to coordinate services for the 63 Title I
Schools.
Initiative 1 trainers are the 15 regional GraduateFIRST Collaboration Coaches (CC). The Collaboration Coaches received initial training and updates
from the National Dropout Prevention Center for Students with Disabilities (NDPC-SD). They have provided training for schools and school-based
teams for five years. NDPC-SD provides standardized training using modules that include curriculum and training materials across the topic areas to
ensure clear, consistent language and expectations of trainers. These modules incorporate adult learning principles [A(2) Selection], are skill based
with vital behaviors or core components that participants are expected to use as a result of the training [A(2) Selection], and incorporate
accountability for delivery and quality monitoring of training [A(2) Selection]. Additionally, trainers utilize evaluation data and fidelity data to
provide appropriate follow-up training.
Additional training is provided by the GaDOE Division of Special Services and Supports and the Statewide Lead Collaboration Coach. Trainers
receive updates and refreshers during monthly face-to-face Collaboration Coach professional learning sessions. Less experienced Collaboration
Coaches are paired with experienced Collaboration Coaches to co-deliver training [A(2) Selection]. This model allows new trainers to build their
skills with both the curriculum content and presentation delivery and receive feedback for further improvement.
Training is provided with planned follow-up to further the application of knowledge and skills [A(2) Selection]. Expectations for follow-up are
discussed and assigned at monthly Collaboration Coach meetings and at monthly regional Team Leader meetings. Notes and minutes of the
statewide meetings and coaching sessions are collected, analyzed and summarized by the SPDG evaluators, and shared with the Collaboration
Coaches to ensure clarity and consistency of expectations [A(2) Selection].
Georgia’s GraduateFIRST Project has a statewide Design Team and 1.0 FTE devoted to designing a training plan; ensuring all trainers meet the
expectations; planning of training events; and monitoring the efficacy of the trainers through evaluations, the training content, and the overall training
plan. The FTE spans two positions (Director and the Lead Collaboration Coach) that are assigned to supervise the training. The Director and the
Lead Collaboration Coach meet at least bi-weekly and provide reports at the monthly Design Team meetings [B(1) Training].
The Director and the Lead Collaboration Coach review the post-training evaluations and provide a report to the Design Team [B(1) Training].
For most of the schools participating in GraduateFIRST this year, initial training was provided during the 2012 Focus School Institute. For the 17
schools new to GraduateFIRST, online modules were developed and archived at www.gaspdg.org. Many returning GraduateFIRST schools also used
this information to review the GraduateFIRST process and procedures, update new school personnel, and refine implementation.
Additional training was provided through the 2013 GraduateFIRST Best Practice Forums (Elementary, Middle School, and High School). During
these one-day forums, evidence-based practices and school experiences were shared for participating elementary, middle, and high schools [B(1)
Training].
Georgia’s GraduateFIRST trainings incorporate effective adult learning principles and strategies including introduction of information,
illustration/demonstration, practice, evaluation, reflection, and mastery. Principles from these six adult learning practices are incorporated in order to promote
planning, application, and deep understanding of the GraduateFIRST process, practices, and skills [B(2) Training]. Learner acquisition,
use, and evaluation of new knowledge, materials, and practices for Team Leaders are promoted through the following cycle:
1. Information is introduced through the use of pre-training exercises, training lectures, and/or presentations.
2. This information is illustrated or demonstrated with real life examples, instructional videos, and active learner input. Using a guided process,
Team Leaders and the school-based teams are asked to select at least one Primary Area of Engagement (e.g., academic, behavior, cognitive,
and/or student engagement) that is the best contextual fit for their school.
3. Team Leaders are asked to apply the new knowledge and skills in real life application or problem solving tasks.
4. Team Leaders are then asked to evaluate their application by assessing strengths and weaknesses and review and make changes.
5. During the monthly coaching sessions with the Collaboration Coaches, the Team Leaders participate in group discussions and provide
feedback.
6. Finally, School Teams and Team Leaders are asked to self-assess using the GraduateFIRST Implementation Scale and document the evidence
of the implementation with artifacts [B(1) Training] and [B(2) Training].
Collaboration Coaches provide feedback and necessary revisions are made to the Action Plan.
The GraduateFIRST project uses a model provided by the NDPC-SD. Consistent with this model, GraduateFIRST has established core components
that training participants are expected to use as a result of training [B(3) Training]. These GraduateFIRST core components include:
1. Establish an effective GraduateFIRST team.
2. Collect and analyze data using the GraduateFIRST Data Probe and GraduateFIRST Data Probe Discussion Guide.
3. Use data to identify and prioritize areas of need for intervention including school climate, attendance, behavior, academic content and
instruction, family/community engagement, and student engagement and to identify a target group of students.
4. Use data to develop an Action Plan that includes the selection of evidence-based practices.
5. Select a group of target students to follow progress in attendance, course completion, and behavior (in- and out-of-school suspensions).
6. Implement and monitor the School Action Plan with fidelity including conducting baseline measures, collecting and analyzing progress
monitoring data using the GraduateFIRST Assessment Tool, and adjusting over time in accordance with progress monitoring data.
7. Evaluate the effectiveness of the School Action Plan.
Trainings are paired with fidelity tools to ensure participants have knowledge and skills needed to implement GraduateFIRST. These fidelity tools
include the GraduateFIRST Data Probe and Discussion Guide, the GraduateFIRST Assessment Tool, Implementation Scales, Step-by-Step Process
Guide to Improving Graduation Rates and Achievement, GraduateFIRST Timeline, and the NDPC-SD Attribute Forms for Affective Engagement,
Attendance, Behavior Engagement, Academic Engagement, Parent/Family Engagement, and School Climate. These tools and other resources are
compiled in the GraduateFIRST Implementation Manual and on the GraduateFIRST website www.gaspdg.org [B(3) Training].
All GraduateFIRST trainings include planned follow-up/coaching to ensure that participants are applying skills and knowledge to effectively
implement GraduateFIRST.
Collaboration Coaches work with Team Leaders and other School Team members monthly to coach the implementation of the GraduateFIRST
process. During these monthly coaching sessions, Team Leaders and other Team members participate in follow-up sessions and receive feedback
regarding implementation. [B(3) Training]
Learning targets are identified for all GraduateFIRST trainings/professional learning activities. Following the professional learning, participants
assess their knowledge on the learning targets. This information is used to improve future professional learning and to determine areas needed for
additional coaching or follow-up. Immediate post-training participant surveys are conducted to assess participant satisfaction and knowledge. Three-month post-training surveys are conducted to determine participants’ use of knowledge or skills. [B(4) Training]
In addition, Collaboration Coaches are expected to collect data on school implementation in order to provide individualized support as needed and to
help alleviate barriers to effective implementation [B(4) Training].
Each Collaboration Coach submits minutes from all Team Leader trainings and coaching sessions to the SPDG evaluators for review and completion
of monthly summaries. The Lead Collaboration Coach reviews the monthly summaries of the Team Leader minutes and shares this information with
the Design Team.
Collaboration Coaches participate in monthly sessions held by the GaDOE that focus on professional learning topics, participant feedback, and
refining coaching in the GraduateFIRST process. Collaboration Coaches provide support and feedback for each other as they refine their training and
coaching skills [B(5) Training].
All Coaches are required to submit monthly electronic logs of their coaching activities which are reviewed by the SPDG third party evaluators, the
Lead Collaboration Coach, and the GraduateFIRST Design Team [B(5) Training].
The Lead Collaboration Coach provides feedback and Collaboration Coach support through telephone conferences, other electronic communications,
as well as face-to-face meetings [B(5) Training].
The GraduateFIRST initiative has 1.0 FTE dedicated to overseeing coaching activities related to the implementation of GraduateFIRST. The
Director and the Statewide Lead Collaboration Coach, in collaboration with the SPDG Project Director, are responsible for job descriptions,
developing and facilitating training for the coaches, and using fidelity and outcome data to determine further training needs of the coaches [C(1)
Coaching].
Collaboration Coaches receive multiple sources of feedback about their coaching, including the GraduateFIRST Team Leader Coaching Evaluation,
GraduateFIRST Implementation Scales, records of application of knowledge and skills, satisfaction survey results, as well as student outcome data.
These data are used collectively to provide Collaboration Coaches feedback about performance and implementation outcomes [C(2) Coaching].
School Teams and Team Leaders provide information about the support and coaching they have received through the Coaching Effectiveness Survey,
which was administered in February 2014 [C(1) Coaching].
Coaching strategies used in the GraduateFIRST project that are appropriate for adult learners are [C(2) Training]:
1. Job-embedded coaching that addresses issues educators face daily in their schools and aligns with the GraduateFIRST and NDPC-SD
framework.
2. Coaching strategies matched to the needs and learning styles of the School Team.
3. Use of multimodal resources such as videos, podcasts, and articles about evidence-based practices.
4. Reflection about practice.
Collaboration Coaches provide assistive feedback related to the core components and skills for GraduateFIRST. In addition, Collaboration Coaches
help school-based Team Leaders address challenges and barriers faced in implementation. For GraduateFIRST schools that are also Focus Schools,
School Improvement Specialists and Collaboration Coaches share information about the implementation progress of each school [C(2) Training].
Collaboration Coaches help sustain continuous improvement through:
1. Regular meetings with district and building administrators, Team Leaders, and RESA School Improvement Specialists.
2. Multiple emails, face-to-face training, and use of on-the-job coaching.
3. Examples of various materials, forms, and strategies that are posted on the GraduateFIRST website and shared by Collaboration Coaches
statewide.
4. Pulse Check data administered by School Teams to determine the status of their School Action Plan and any changes needed. [C(2) Training]
In order to strengthen the coaching provided through the SPDG, a nine-part series dedicated to improving coaching was developed and archived online. All Collaboration Coaches participated in these coaching modules, which included topics on [C(2) Training]:
1. Effects of on-the-job/classroom coaching.
2. Differences between training and coaching.
3. Expert versus ongoing coaching.
4. Characteristics of effective coaching.
5. Reflective listening and barriers to reflective listening.
6. Praise, advice, and feedback.
7. Qualities of meaningful feedback.
The performance assessment Professional Development Domains [D(1) – D(5) Performance Assessment] are also in place. Each GraduateFIRST
school has designated a School Team that provides the leadership within the school for completing the GraduateFIRST process. Team members
develop a self-directed, continuous improvement Action Plan for the school. This Team is responsible for ensuring the GraduateFIRST components
are implemented so that continuous improvement of student outcomes drives policy decisions. Each School Team is composed differently based on
the identified needs, but School Team members usually include administrators, special educators, general educators, and School Improvement
Specialists, if assigned. [D(1) Training]
Each GraduateFIRST school has designated a school-based Team Leader who coordinates GraduateFIRST activities. Working directly with the Collaboration Coach, the Team Leader schedules and conducts school-level Team meetings each month, ensures appropriate time for project activities, collects and analyzes data, assists with the implementation of the Action Plan, and monitors implementation progress. [D(1) Training]
The School Team Leader submits student progress monitoring data for the targeted students quarterly to be entered into a statewide database so data
can be aggregated and disaggregated for analysis. Using multiple data sources, the School Action Plan is reviewed and revisions made as necessary.
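The statewide roll-up described above, in which quarterly submissions are aggregated and disaggregated for analysis, can be sketched in a few lines. The records and field names below are hypothetical illustrations, not the actual statewide database schema:

```python
from collections import defaultdict

# Hypothetical quarterly records for targeted students:
# (school, quarter, days_absent, discipline_referrals).
records = [
    ("School A", "Q1", 4, 1),
    ("School A", "Q2", 2, 0),
    ("School B", "Q1", 7, 3),
    ("School B", "Q2", 5, 2),
]

def aggregate(records, key_index):
    """Sum absences and referrals, grouped by the chosen key (school or quarter)."""
    totals = defaultdict(lambda: [0, 0])
    for rec in records:
        totals[rec[key_index]][0] += rec[2]  # days absent
        totals[rec[key_index]][1] += rec[3]  # referrals
    return dict(totals)

by_school = aggregate(records, 0)   # disaggregated by school
by_quarter = aggregate(records, 1)  # aggregated statewide by quarter
```

The same grouping can be run on any key in the record, which is what makes both aggregation and disaggregation possible from one submission format.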
Each GraduateFIRST school completes the two fidelity Implementation Scales, self-assessing in the fall and again in the spring. To confirm that the self-reported spring scores on the Implementation Scales are reliable across schools, districts, and coaches, Collaboration Coaches use a peer review process to verify 20 percent of the GraduateFIRST Implementation Scales, including scoring and submitted artifacts. [D(1) Performance Assessment]
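Drawing the 20 percent peer-review sample can be illustrated with a short sketch. The school names, list size, and fixed seed here are hypothetical; the report does not specify the Collaboration Coaches' actual selection procedure:

```python
import random

# Hypothetical list of schools whose Implementation Scale submissions
# are eligible for peer review.
submissions = [f"School {i}" for i in range(1, 51)]

# Draw a 20 percent sample (at least one school, so small lists are never skipped).
sample_size = max(1, round(0.20 * len(submissions)))
rng = random.Random(2014)  # fixed seed so the review list is reproducible
peer_review_list = rng.sample(submissions, sample_size)
```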
The Collaboration Coaches facilitate a Data Day for School Teams using a guided approach to analyze data and complete the GraduateFIRST Data
Probe identifying a group of target students and focus area(s) for intervention. [D(2) Performance Assessment]
Student progress monitoring data collected through the GraduateFIRST Assessment Tool are used as benchmarks for the School Team and Team Leader to re-evaluate the outcomes for each targeted student at least monthly or by grading interval. This information is used to adjust the interventions provided. [D(2) Performance Assessment] The SEA Design Team meets monthly to review data and identify areas for improvement and refinement while considering scalability and sustainability. [D(2) Performance Assessment]
Through the 2012 Focus School Institute, GraduateFIRST Fundamentals, and the 2013 Best Practice Forums, the School Teams have received
training on the core components for GraduateFIRST. Team Leaders and the School Team are responsible for orienting staff concerning the core
components and the primary area of engagement. In subsequent years, schools will be encouraged to broaden the scope of implementation and to involve more staff. [D(3) Performance Assessment]
Schools self-assess using the GraduateFIRST Implementation and Student Engagement Scales. These data are aggregated and disaggregated by
Collaboration Coach. These data also guide improvements in training, coaching, and educators’ practices in either academic, behavior, cognitive, or
student engagement areas and are shared with the SEA Design Team, Collaboration Coaches, and School Teams. Student outcome data are collected
quarterly on attendance, behavior, and course or academic performance for each student identified on the target list. These progress monitoring data
are used to help identify selected students who may need additional intervention or accommodations. These progress monitoring data are also used
to help guide the school leadership team and to identify any regional coaching needs. The results of the student outcome data are shared with the SEA Design Team and regionally with Collaboration Coaches. [D(3) Performance Assessment]
The SEA Design Team and the Collaboration Coaches review school and student outcomes to determine the need for program modifications,
additional training, and/or changes in procedures or practices. Based on this student outcome data, some schools have made changes in procedures,
practices, or building/student schedules. Collaboration Coaches use the student outcome data as the basis for coaching sessions. Student outcome
data and the post training surveys are used to plan future trainings and identify coaching needs. [D(4) Performance Assessment]
Implementation data from the GraduateFIRST Implementation Scales provide criteria, proficiency levels, and proficiency descriptions that are used
by School Teams, Team Leaders, and Collaboration Coaches. Schools are encouraged to celebrate progress toward goals. Collaboration Coaches
model this celebration by asking schools to report successes during the Team Leader and Team meetings. Team Leaders and administrators
showcased their successful implementation practices or “barrier busters” at state and regional meetings, state conferences, and the GraduateFIRST
Best Practice Forums. [D(4) Performance Assessment]
Procedures for submitting data for GraduateFIRST are detailed in the GraduateFIRST Implementation Manual. A monthly timeline and guide is
provided for Team Leaders and administrators to assist participating educators in understanding what data is required each month and how to submit
the data. Collaboration Coaches review with school administrators and Team Leaders how to submit data for GraduateFIRST. Schools new to
GraduateFIRST participated in an online module highlighting the GraduateFIRST process and the data submission required. Collaboration Coaches
assist with this data collection during the monthly Team Leaders coaching sessions. [D(5) Performance Assessment]
Through the 2012 Focus School Institute, GraduateFIRST Fundamentals, and monthly coaching, all GraduateFIRST administrators received training
on the GraduateFIRST framework and process, selecting a target group of students, and monitoring student progress to achieve higher student
outcomes. During Year 2, Collaboration Coaches supported administrators of the 18 new GraduateFIRST schools to ensure understanding of the
process.
During the GraduateFIRST Best Practice Forums (Elementary, Middle, and High School Forums) in December 2013, administrators, Team Leaders,
and other school/district representatives received additional information about GraduateFIRST supported practices and various ways to support
implementation. [E(1) Facilitative Administrative Support/Systems Intervention]
Throughout Year 2, Collaboration Coaches have made contact with administrators about school and student progress. Many administrators have
participated in the monthly coaching sessions with their Collaboration Coach. Additional resources and support materials such as videos, PowerPoint
presentations, links to archived webinars, and the Implementation Manual are available on the website to support professional development. [E(1)
Facilitative Administrative Support/Systems Intervention]
Focus School leadership team meetings are often held with both the School Improvement Specialist and the Collaboration Coach to provide ongoing
seamless support. Collaboration Coaches carefully review data to determine if administrators and Team Leaders need additional support or
professional development. Collaboration Coaches have concluded that in schools where the administrators are actively involved, there is greater
support and follow through. [E(1) Facilitative Administrative Support/Systems Intervention]
Building principals and the school leadership team in conjunction with their Collaboration Coach have continued to discuss perceived barriers to
implementation and ways to minimize those barriers. Some schools have revised policies and procedures to promote successful student outcomes.
School leaders have also discussed organizational changes that may be needed for successful implementation. [E(2) Facilitative Administrative
Support/Systems Intervention]
In GraduateFIRST, coaching is provided to help school leaders use student attendance, course performance, and discipline data to make decisions
about allocating resources to improve outcomes. [E(2) Facilitative Administrative Support/Systems Intervention]
For sustainability, the SEA Design Team recognizes the need for LEA district personnel to be actively involved in the implementation of
GraduateFIRST. The SEA Design Team is planning to pilot district-wide implementation in several sites next year. [E(2) Facilitative Administrative
Support/Systems Intervention]
The completed GraduateFIRST OSEP Reporting Worksheet of SPDG Evidence-based Professional Development Components is included in
Appendix A.
Rating of SPDG Evidence-Based Professional Development Components:
Of the 16 professional development domains within the OSEP Professional Development Worksheet, all 16 (100 percent) scored at least a 3 on a four-point rating scale, meeting the Year 2 goal. In addition, five of the 16 domains (31.3 percent) scored a 4. Table 1 below compares the professional development domain ratings in Year 1 and Year 2 of the SPDG; five domains improved from Year 1 to Year 2.
Table 1. Comparison of Year 1 and Year 2 professional development ratings – GraduateFIRST.

OSEP Professional Development Worksheet Item                    Year 1 Rating (2012-2013)   Year 2 Rating (2013-2014)
A(1) Selection                                                  4                           4
A(2) Selection                                                  3                           3
B(1) Training                                                   3                           4
B(2) Training                                                   3                           3
B(3) Training                                                   3                           3
B(4) Training                                                   3                           4
B(5) Training                                                   3                           3
C(1) Coaching                                                   3                           4
C(2) Coaching                                                   3                           3
D(1) Performance Assessment (Data-based Decision Making)        3                           3
D(2) Performance Assessment                                     2                           3
D(3) Performance Assessment                                     3                           3
D(4) Performance Assessment                                     3                           3
D(5) Performance Assessment                                     4                           4
E(1) Facilitative Administrative Support/Systems Intervention   2                           3
E(2) Facilitative Administrative Support/Systems Intervention   3                           3

Components to be focused on in Year 3:
During Year 3 of the SPDG, additional emphasis will be placed on the following:
1. Development of training and supports for a district focus for GraduateFIRST
2. Expansion and sharing of the successful modules from the GraduateFIRST Best Practice Forums
3. Continuation of sharing information and resources about improving and sustaining the GraduateFIRST Initiative
Help Requested from OSEP in Relation to Professional Development for this Initiative: Additional examples of how to sustain successful practices
at the school and district levels.
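The domain-rating summaries reported above (percent of domains rated at least 3, number of domains rated 4, and year-over-year improvements) can be reproduced with a short script. This sketch hard-codes the 16 Table 1 ratings in worksheet-item order and is illustrative only:

```python
# Year 1 and Year 2 ratings for the 16 domains, transcribed from Table 1,
# in worksheet-item order A(1) through E(2).
year1 = [4, 3, 3, 3, 3, 3, 3, 3, 3, 3, 2, 3, 3, 4, 2, 3]
year2 = [4, 3, 4, 3, 3, 4, 3, 4, 3, 3, 3, 3, 3, 4, 3, 3]

def summarize(ratings):
    """Percent of domains rated at least 3, and the count rated 4."""
    at_least_3 = sum(r >= 3 for r in ratings) / len(ratings) * 100
    fours = sum(r == 4 for r in ratings)
    return at_least_3, fours

pct, fours = summarize(year2)                        # 100.0 percent, five 4s
improved = sum(b > a for a, b in zip(year1, year2))  # five domains improved
```

Running `summarize` on both years confirms the figures cited in the narrative: 100 percent of Year 2 ratings are at least 3, five domains scored 4, and five domains improved from Year 1 to Year 2.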
1.b. Assessment of professional development – Initiative 2 – College and Career Readiness (CCaR)
The OSEP professional development worksheet was completed by the Coordinator of the CCaR Initiative, in collaboration with the College and
Career Readiness Specialists (CCaRS) and the SPDG design team and independently verified by other GaDOE staff working closely with the CCaR
Initiative. Following is a brief description of the progress being made in Initiative 2 with the professional development domains, components, and
specifications in the OSEP Worksheet.
Components in Place:
An emphasis of Initiative 2 in Year 1 and the beginning of Year 2 was on compliant practices in writing transition plans. Transition plans were
gathered from each school district from a pre-populated random sample. From January to March 2013, school districts verified the extent to which the transition plans within the student IEPs were compliant. From March to April, school districts submitted a 10 percent random sample of transition plans to the GaDOE for review. GaDOE and CCaR Project personnel reviewed 20 percent of these plans for compliance. Based on this
baseline data, 15 school districts were chosen for participation in the CCaR Initiative. Participating agreements, with clear expectations for
professional development, were made between the school districts and the GaDOE [A(1) Selection].
At the beginning of Year 2, agreements were made, with clear expectations, between partner agencies [i.e., Georgia Vocational Rehabilitation
Agency; Career, Technical and Agricultural Education (CTAE); and the Georgia Council for Developmental Disabilities], who agreed to provide
support to the CCaR participating school districts [A(1) Selection].
Contracts/agreements were completed that outlined roles and responsibilities between the National Secondary Transition Technical Assistance Center
(NSTTAC) and the University of Kansas (KU) Transition Institute [A(1) Selection]. The KU Transition Institute is providing specific professional
development for the CCaR Project Personnel and participating school districts with the purpose of improving transition interventions focused on
evidence-based practices. NSTTAC is providing additional training for the state schools. An additional partner, the Georgia Inclusive Post Secondary Education Consortium, has agreed to take an active role in developing practices and activities for the State Transition Plan [A(1) Selection].
Seven CCaRS were hired to be Initiative Coaches (one of the seven is a substitute) as well as one Core (supervisor) using contracts that outline clear
roles, responsibilities and expectations for professional development, coaching, and other support for the 15 participating school districts [A(1)
Selection] and [A(2) Selection].
The Kansas Transition Coalition, KU, provided a three-week Seminar Series for the CCaRS that focused on providing consistent knowledge of
evidence-based quality and best practices in transition. A two-day CCaRS Transition Coaching Institute was held in September 2013 on evidence-based practices in transition. The Institute also included strategies and roles for working with district/school level project leaders, strategies for
determining specific needs, CCaR Initiative planning and goal setting strategies, methods for problem solving and supporting school districts, and
strategies for engaging with project leaders through dialogue [B(1)-B(5) Training].
During October/November 2013 (Year 2), 16 Hitting the Mark trainings were held for 225 participants within the 15 participating school districts.
This training focused on writing and identifying components of a compliant transition plan using a Transition Documentation Checklist and rubric.
The seminar series, Transition Institute, and the Hitting the Mark training had clearly stated learning targets, incorporated adult learning principles
[B(2) Training] and were skill based [B(3) Training]. Pre-post training assessments were gathered for the Hitting the Mark training and the CCaRS Seminar Series. These pre-post training assessments and the post-session participant evaluations from the Transition Institute were used to review
training content and delivery to determine needed modifications [B(4) and B(5) Training]. Three-month post evaluations were gathered for the
Transition Institute and the Hitting the Mark training to determine the usefulness of the training received in the implementation of evidence-based
transition programs and strategies in the participating districts [B(4) Training]. The GaDOE conducted a 10% fidelity check for the Hitting the
Mark Training, which involved staff observing the training process used by the Core and CCaRS, as well as checking the fidelity of the training
through review of artifacts such as pre-post test data and learning targets.
In February 2014, the 15 participating CCaR school districts began a 12-week on-line training and coaching series (TRAN-Qual) supported by the
KU Transition Coalition. The purpose of this training is to guide the districts through a self-study of transition plan compliance and implementation of effective transition practices. The 12-week online training includes online learning, adult learning principles, group discussion, applied learning activities, data-based reflection on current practices/programs, and action planning and implementation [B(2)-B(3) Training]. Outcome data on participant knowledge were collected and analyzed (pre- and post-testing) [B(4) Training].
The SEA has provided a .50 FTE dedicated to creating and overseeing the implementation of the CCaR Project, including responsibility for
writing job descriptions, creating interview protocols to fill positions in the CCaR Project, setting up training, overseeing the fidelity of
implementation, collecting data using fidelity tools, reviewing outcome data to determine further training, and evaluating the effectiveness of the
Cores and CCaRS.
Cores and CCaRS are using multiple artifacts and sources of feedback to improve their coaching performance and support for participating schools [C(1)-C(2) Coaching]. The CCaR Coaches meet with their assigned district teams monthly to facilitate problem-solving discussions and the sharing of district transition practices and initiatives [C(1)-C(2) Coaching]. Regularly scheduled updates and fidelity information are discussed in monthly conference calls between the participating state schools and NSTTAC, as well as in regularly scheduled conference calls between the CCaRS and the Kansas Transition Coalition. [B(1) Training] and [C(1)-C(2) Coaching]
All of the 15 participating school districts have a Leadership Team that is responsible for the implementation of each initiative provided through
training from the CCaRS or self-initiated evidence-based practices in their Transition Action Plans. The Leadership Team meets monthly with the CCaRS, provides information on the progress of the initiatives, and provides data required by the project when necessary. Successes are also shared and celebrated. Monthly minutes and agendas are reviewed along with artifacts from the meetings. Where additional individualized feedback and support are needed, GaDOE staff have provided professional development and problem-solving sessions via webinars, emails, conference calls, and face-to-face meetings. [D(1)-D(2) Performance Assessment]
School districts practice monthly the skills acquired through the Hitting the Mark training; the CCaRS track this practice for training fidelity as well as district proficiency. Fidelity of participation, completion of components, and acquisition of knowledge in the KU Seminars are monitored by the KU Transition Coalition [D(1)-D(3) Performance Assessment]. The Self-Study
Seminars include: Introduction to IDEA and Secondary Transition, Implementing Transition Assessment, Family Involvement and Student
Involvement in Transition Planning, Preparing for Employment and Postsecondary Education, and Interagency Collaboration in Transition Planning.
Fidelity data gathered inform the implementation process and identify intervention changes needed for more effective outcomes, including district data for Indicator 13 (Effective Transition) and Indicator 14 (Post-Secondary Outcomes), as reported in the State Performance Plan (SPP). [D(3)-(5)
Performance Assessment]
Participating school districts are currently creating/revising their policies, practices, and procedures for transition with guidance from the GaDOE.
CCaRS monitor the progress of this process. District Transition Action Plans will be reviewed at the end of Year 2, with modifications to be carried
out during Year 3 of the SPDG (2014-2015 school year). [D(1)-D(3) Performance Assessment]
As required by the SPP and as an integral part of the CCaR Initiative, Indicator 13 (transition plan compliance) is monitored annually with a review of transition plans (Prong 1); additional transition plans are reviewed (Prong 2) for districts scoring less than 100 percent on Prong 1. Prong 1 and Prong 2 data reviews serve as one fidelity measurement for successful implementation of the CCaR project.
As districts/high schools and feeder middle schools have implemented the CCaR Initiative during Year 2, the CCaRS as well as district and school
administrators have provided facilitative administration and other support to project leaders. The Core Specialist continued to supervise and, in
conjunction with outside partners, provide training and technical assistance to the CCaRS. CCaRS provided ongoing coaching and technical
assistance to the 15 participating CCaR school districts with support from the Kansas Transition Institute. The participating state schools have been
supported with ongoing coaching and technical assistance from the NSTTAC [E(1)-(2) Facilitative Administrative Support/Systems
Intervention].
The completed CCaR Initiative 2 Reporting Worksheet of SPDG Evidence-based Professional Development Components is included within
Appendix B.
Rating of SPDG Evidence-Based Professional Development Components:
Of the 16 professional development domains within the OSEP Professional Development Worksheet for the CCaR Initiative, 11 (68.8 percent) scored at least a 3 on a four-point rating scale. This percentage slightly exceeded the Year 2 goal of 65 percent. Table 2 below compares the professional development domain ratings in Year 1 and Year 2 of the SPDG; improvements were made from Year 1 to Year 2 in four domains.
Table 2. Comparison of Year 1 and Year 2 Professional Development Ratings – CCaR Initiative

OSEP Professional Development Worksheet Item                    Year 1 Rating (2012-2013)   Year 2 Rating (2013-2014)
A(1) Selection                                                  3                           3
A(2) Selection                                                  3                           3
B(1) Training                                                   4                           4
B(2) Training                                                   3                           3
B(3) Training                                                   4                           4
B(4) Training                                                   3                           3
B(5) Training                                                   1                           3
C(1) Coaching                                                   4                           4
C(2) Coaching                                                   2                           3
D(1) Performance Assessment (Data-based Decision Making)        1                           1
D(2) Performance Assessment                                     3                           3
D(3) Performance Assessment                                     1                           1
D(4) Performance Assessment                                     1                           1
D(5) Performance Assessment                                     3                           3
E(1) Facilitative Administrative Support/Systems Intervention   1                           2
E(2) Facilitative Administrative Support/Systems Intervention   1                           2
Components to be focused on in Year 3:
1. Continue supporting participating school districts in the implementation of evidence-based practices.
2. Develop a guidance tool (rubric) for implementing quality IEPs.
3. Track targeted students (a new fidelity measure).
Help Requested from OSEP in Relation to Professional Development for this Initiative: Information about the STEPSS State Toolkit.
1.c. Assessment of professional development – Initiative 3 – Autism Early Intervention
The OSEP professional development worksheet was completed by the Emory Autism Center staff and independently verified by the GaDOE
Initiative 3 Coordinator. Following is a brief description of the progress being made in Initiative 3 with the professional development domains, components, and specifications in the OSEP Worksheet; see Appendix C for the completed Worksheet.
Components in Place:
At the end of Year 1, the GaDOE SPDG key staff successfully recruited, and fostered active participation from, the leadership of crucial statewide partners: Head Start (and Early Start), Babies Can’t Wait (Georgia's early intervention program), the Georgia Department of Early Care and Learning (pre-k lottery system), Parent-to-Parent, and the Georgia Parent Mentor Partnership [A(1) Selection]. Seventeen classrooms from three Georgia regions (two counties per region) participated actively throughout Year 2. Participants were selected based on their documented incidence of autism, their interest in participating, and shared goals of increasing inclusion for children with autism in general education by at least their kindergarten year. Site participants in the Autism Early Intervention Project are demonstration classroom regular and special education teachers, paraprofessionals, and associated specialists (e.g., speech-language therapists and occupational therapists). Parents of children in the Project are also benefiting from Project participation.
Agreements are in place with clear expectations for Initiative 3 participation [A(1) Selection] and [A(2) Selection]. To clarify expectations for teachers and administrators, two introductory webinars were aired statewide at the end of Year 1 (April 24, 2013 and August 20, 2013).
A formal agreement in Year 2 between Emory University and the GaDOE specified the collaboration in greater detail, including a third Round of
training, a second webinar, three educational evening sessions, and additional supplemental coaching for each classroom between Rounds of training
[A(2) Selection].
Emory trainers/consultants completed the development of materials and assessment methodology, as originally agreed upon. Three major training modules, delivered in sequence across the school year, focused on: 1) Social Communication, 2) Promoting Peer Interactions, and 3) Developing
Kindergarten Readiness Skills [A(2) Selection].
An additional experienced early childhood professional supplemented Emory training via attendance at training events (including parent meetings),
reviewing coaching procedures, assisting with project evaluation, contributing to system development issues needed to support the project goal of
preparing children for success in inclusive kindergartens, and providing coaching support for the participating classroom staff [A(2) Selection].
The GaDOE SPDG Coordinator of the Autism Early Intervention Project allocated a minimum of .25 FTE effort towards the following
responsibilities: (1) collaboration with Emory consultants on design of the training plan; (2) assurance that trainers met skill-level expectations; (3)
logistical planning and collaborative organization of all training events; (4) facilitation of communication among all project participants; (5) ongoing
assessment of fidelity of implementation and efficacy of the overall training plan, along with many related duties [B(1)-B(5) Training].
Trainings have incorporated strategies of adult learning principles [B(2) Training] by:
1. Developing rapport with participants: Awareness webinars and needs assessment activities were conducted near the end of Year 1 and again at the beginning of Year 2 to introduce trainers and trainees and to begin building positive relationships.
2. Providing regular constructive and specific feedback: Coaching sessions emphasized frequent positive behavior-specific feedback on
observed teacher performances, classroom design, and child progress.
3. Facilitating reflective learning opportunities: Workshop and coaching sessions encouraged questions and active trainee participation.
4. Providing meaningful learning experiences that are clearly linked to student outcomes and assessment: Each full-day teacher workshop
provided two to three group participation and demonstration activities (e.g., small group discussions on how certain toys may be used for
teaching; team role-plays on how to teach independent daily living skills to students; and small and large-group brainstorming on social
objectives that may be needed by a child with autism).
5. Linking learning to desired outcomes: Learning objectives specified for each module guided all stages of materials preparation, didactic
presentations, small group activities, coaching, and assessment.
6. Promoting active participation: Opportunities to rehearse new skills were provided in the context of entertaining workshop activities as well as during frequent coaching sessions.
7. Conducting coaching: At least two individualized coaching sessions were provided in every classroom following each workshop.
A variety of additional active learning activities were offered throughout each workshop for the purpose of maintaining interest and providing for
repeated practice by trainees. For example, large group discussions, along with frequent question/answer opportunities, were designed to secure
active participation by workshop participants. Two to three small group problem-solving exercises were also provided within each workshop to
ensure additional practice in applying new knowledge to common teaching challenges. [B(3) Training]
For each of the three teacher-training modules, pre- and post-quizzes assessed the trainees’ acquisition of new knowledge, while surveys captured trainee satisfaction with the training provided. “Hands-on” training was provided in each demonstration classroom on the day following each workshop, and at least one more time before the next training cycle began. Coaching sessions included modeling by one or more of the Emory trainers/coaches, opportunities for trainees to practice new techniques, and abundant positive behavior-specific feedback. Coaching notes were prepared based on coaches’ observations of how new skills were being implemented. [B(3) Training]
Parent meeting presentations were tailored specifically to meet the interests and needs of parents, and parents were also welcomed to attend teacher
training events. Parent meeting presentations emphasized practical ideas that may be used to promote the inclusion of their young child with autism
at home and in other community settings (e.g., what to do about eating in fast-food restaurants, how to teach language on the playground or while
watching TV, etc.). [B(3) Training]
Pre- and post-tests were used to assess knowledge acquired during each of the three Rounds of workshop training, with care taken to ensure a match
between designated learning objectives and the participants’ acquisition of key knowledge. When indicated, adjustments were made in the wording
on quizzes (e.g., when many trainees seemed to miss an item due to unclear or overly complex wording). In addition, when indicated by weak results
in a certain area, the content of workshops during Rounds 2 and 3 was adjusted to incorporate additional review of difficult topics (for example, in
areas of reward assessment, selection of social objectives, etc.). [B(4) Training]
The Early Autism Center (EAC) trainers were selected as collaborators due to their unique positions as researchers/developers of the Walden
Incidental Teaching Model. The DOE Autism Early Intervention Coordinator played the primary role in evaluation of training events and feedback
to Emory trainers, and she was assisted by a team of key GA DOE/SPDG leadership. [B(5) Training]
For example, the GaDOE Autism Coordinator and a DOE technical consultant conducted dress rehearsals of Webinar presentations by Emory trainers; 2 to 3 DOE staff members attended live broadcasts of each Webinar; and the GA DOE/SPDG leadership team reviewed and provided revision feedback on Webinar presentation materials originally prepared by Emory trainers. Training and assessment materials were carefully reviewed by the GA Autism Project Coordinator and other GaDOE/SPDG leaders prior to presentation, and specific feedback was offered to Emory trainers for revision prior to presentation and dissemination of all written products. [B(5) Training]
One or more members of the GA DOE/SPDG leadership (including the Autism Project Coordinator), often joined by partner administrators (Parent-to-Parent and/or regional parent mentors) and the external Early Childhood Consultant, were present at every teacher training and parent workshop, which enabled them to provide Emory trainers with specific, useful feedback following training events. [B(5) Training]
The DOE Autism Early Intervention Coordinator served as coaching evaluator. She observed coaching sessions provided by each Emory coach on
repeated occasions, offering constructive positive feedback and support to coaches. Three EAC Trainers served as coaches. All were experienced in
the use of “hands on” training using checklist-based performance appraisals. They also prepared consulting logs, consulting note formats, and site-specific notebooks to ensure both fidelity of implementation and consistent follow-up across coaching sessions. Emory trainers/coaches also
developed coaching assessment materials, including an Incidental Teaching Checklist and Environmental Design Survey. [C(1)-C(2) Coaching]
In order to ensure consistency and follow-up across coaching sessions, Emory coaches complete Coaching Notes that detail areas of success since the
last classroom visit, as well as specifying the modeling and practice provided to teachers as they implement targeted skills. Follow-up reminders are
also documented on coaching notes. (See Attachment E for a copy of the coaching notes form, along with an example of a completed form that
summarizes events of a coaching session). The EAC coaches keep notebooks individualized per classroom to organize contact information/site
directions, coaching logs, coaching session notes, and classroom composition by child verbal status. [C(1)-C(2) Coaching]
The GaDOE SPDG Director oversees all aspects of development and collection of formative and outcome targets, measures and processes, and she
regularly reviews data collected over the course of this project. For example, she adjusted incidental teaching checklists developed by Emory coaches
as performance outcome measures to better assess fidelity of implementation by teachers. [D(1) Performance Assessment]
EAC trainers/coaches completed preparation of Year 2 fidelity and outcome assessment procedures, assisted by regular input from the SPDG staff.
Fidelity measures include the number and demographics of training participants, collected from the sign-in sheets (e.g., demonstration classroom
teacher/educator vs. other public school educators vs. other childcare specialists vs. parents of students with disabilities). Also being analyzed are
data on classroom staffing and student population arrangements, the quantity of training time, and the quality of both training and coaching as
perceived by trainees and attendees at parent meetings. Fidelity data will be compared to outcome data at the end of Year 3 to examine relationships
that may impact site selection and to help in identification of improvements that may be needed in partnership relationships. [D(1) Performance
Assessment]
Ongoing input from pre/post quizzes and consumer satisfaction surveys (collected in different formats by Emory trainers and SPDG personnel, respectively) contributed to immediate refinements and improvements in subsequent presentations (e.g., key information was reviewed again, materials were adjusted, active learning opportunities were increased, and logistical arrangements of training sites were adjusted for the convenience of attendees). [D(1) Performance Assessment]
The primary goal of the Autism Early Intervention Project in Year 2 has been to develop and implement a training sequence that enhances the
knowledge and skills of teachers and other educational personnel in areas that enable them to better prepare young children with autism for success in
kindergarten. Achievement of targeted knowledge and skills by demonstration classroom trainees is the foundation of efforts to ensure continuous
academic and behavioral growth by their students with autism.
Pre/post teacher workshop quizzes. Teachers’ acquisition of knowledge associated with targeted learning objectives has been assessed in pre/post
workshop training quizzes. In areas in which post-quiz results indicated incomplete knowledge, additional review and assessment of content was
incorporated into subsequent teacher workshops.
Incidental Teaching Performance Appraisal Checklists. In order to track teachers’ development of incidental teaching skills that improve social
communication of young students with autism, a pre-post testing strategy is used covering the steps required to implement effective incidental
teaching episodes.
Environmental Design Checklists. Pre- and post-surveys are being used to measure the attractiveness of classroom arrangements and the display of
toys in a manner that creates learning opportunities. Most of the classrooms were well-equipped and arranged to attract children’s interests even prior
to training, albeit with some exceptions. Therefore, the emphasis of training in this area shifted more specifically to teachers’ abilities to select
teaching materials that attract frequent initiations and improve the engagement of children with autism and related disorders.
Observational data on teacher effectiveness. Detailed data collection procedures were adapted specifically for this project to measure overall levels
of classroom engagement, the amount of time that teachers are in close proximity to their students, and the level at which teachers interact with their
students. Two observational probes were collected before and/or at the beginning of the training year, and two final observations will be collected
and compared at the end of the training year (i.e., April and May, 2014).
Observational data on child behaviors. The impact of teachers’ skills on ongoing behavioral growth is of central importance to any personnel preparation effort. The Pla-chek system outlined above also included objective measures of the children’s levels of engagement; amount of time oriented towards teachers; verbalizations; and watching for and/or receiving a social bid from a typical peer. Data are analyzed by population group (children with autism/related disorders vs. typical peers) and as overall inclusive groups of students (where applicable). This permits comparison of how children with special needs spend their time relative to how typical peers spend theirs in similar circumstances. The overall classroom data address whether teachers are able to achieve good results from all students in an inclusive classroom. [D(1)-(3) Performance Assessment]
During Year 2, leadership from virtually all statewide agencies that focus on service delivery to young children was engaged by SPDG leadership as active partners in project planning [including Part C – Babies Can’t Wait, Head Start (and Early Start), the Department of Early Care and Learning (DECAL), Parent to Parent, and the Parent Mentor Partnership, in addition to LEAs in targeted regions]. Project information and data have been periodically shared with SPDG partners. From the outset of the Autism Early Intervention Project, the emphasis has been on establishing two-way communication systems that both inform key training personnel of stakeholder needs and use multiple channels for communicating from project trainers to various groups of stakeholders. Interactions with school principals/administrators have been ongoing, occurring when Emory consultants and/or the Autism Early Intervention Coordinator visited schools or through routine phone and email exchanges (e.g., an EAC coach offering a school principal specialized behavioral consultation for a child with autism who had a couple of sudden tantrum episodes). [D(2)-(3) Performance Assessment]
Data on child outcomes and kindergarten placement information on current and former students with autism (and related disabilities) in
demonstration classrooms were initially obtained during this project’s Needs Assessment during May 8-11, 2013 and later updated immediately prior
to the first training workshop offered in the fall of 2013. This needs assessment information has been updated on an ongoing basis to account for
children who moved placements mid-year (e.g., due to a family move). Additional methods for summarizing the social skills and kindergarten
readiness skills of classroom graduates are currently underway and will provide for careful analysis and refinement of other supplemental teacher and
child outcome measures used in this project. [D(4) Performance Assessment]
Throughout all training workshops (and at some parent meetings), EAC trainers/coaches have been able to incorporate examples of special
efforts/accomplishments by participating teachers/educators in each region. Teachers have often been asked in advance to give a brief example of
one of their strategies that has been especially successful (e.g., one teacher described her games that involve competition between classroom girls vs.
boys as being especially successful in generating enthusiastic participation by students with and without developmental delays). A number of
teachers have proudly volunteered reports of their children’s success, especially in the area of verbal language gains. Many teachers have asked to
visit Walden, and all have been assured of a welcome visit during the summer months. [D(4) Performance Assessment]
An informal appraisal has been made to identify two to three classrooms in each of the three regions that have excelled in terms of enthusiasm and effective implementation of the new knowledge and skills that have been the focus of this project. [D(4) Performance Assessment]
Collection and analysis of incoming data, both fidelity and outcome measures, are crucial to evaluating the success of the training effort associated
with Year 2 of this project. Also noted above is the need for some refinements in existing measures, and creation of new scales for rating children’s
outcomes and fidelity of implementation of teaching procedures. [D(5) Performance Assessment]
The GA SPDG leadership strives for significantly expanded impact and sustainability of results. Other states have often invested in large-scale autism “awareness-building” activities that unfortunately may yield minimal impact on the complex teaching skills needed to substantially benefit young children with autism. This collaboration with Emory “lab school” consultants is unique in that partners have striven for a balanced effort that yields maximum benefit for as many educators and students with disabilities as possible.
Experience to date has made evident that expanding administrative support and training roles may substantially enhance this project’s potential to achieve even larger goals. Specifically, administrators need assistance and information in how to best reward their most successful
educators (e.g., via provision of additional classroom materials, time for all team members to participate in training, and recognition with
opportunities for career development). Importantly, close coordination between this project’s SPDG leadership and local school administrators might
focus on ensuring placement of an optimum number of students with autism (i.e., not too few, not too many) within a given classroom. [E(1)-(2)
Facilitative Administrative Support/Systems Intervention]
Although immediate impact likely requires continued consulting collaborations, the sustainability of project impact will best be achieved by developing
statewide and regional systems that enable the GA DOE Coordinator of Early Autism Intervention to make use of regional demonstration sites that
permit ongoing specialized training for increased numbers of educators of young children with autism. [E(1)-(2) Facilitative Administrative
Support/Systems Intervention]
Based on coaching notes and the opinion of the lead EAC coach/trainer, more than half of the training sites achieved outstanding progress during Year 2, and all made some progress. Ratings were based on assessments of 1) children’s progress; 2) educators’ acquisition of incidental teaching skills; 3) improvements in classroom environments; 4) implementation of strategies specifically targeted in workshop and coaching sessions; and 5) children’s
overall levels of engagement. The most successful classrooms represented different sponsoring agencies (i.e., Head Start, and DECAL/DOE pre-k
classrooms). Similarly, the most challenged classrooms represented a range of administrative/staffing/inclusion arrangements. There is no question that teachers’ entry-level skills and motivation to learn how to better serve children with autism were important factors in success, yet advance prediction variables are not entirely clear. [E(1)-(2) Facilitative Administrative Support/Systems Intervention]
The most crucial predictive factors may come from consultants’ ratings of classroom success, with input from experienced observers including the
GA Coordinator of Autism Early Intervention. Findings should inform predictive factors including fidelity measures that are sufficiently sensitive to
forecast desired outcomes. Ideally, the results will be sufficient to select educators and classrooms that may be most responsive to training efforts,
thereby obtaining the desired outcomes. [E(1)-(2) Facilitative Administrative Support/Systems Intervention]
Rating of SPDG Evidence-Based Professional Development Components:
Of the 16 professional development domains within the OSEP Professional Development Worksheet for the Autism Early Intervention Project, 10 (62.5 percent) received a rating of 3 or 4 on a four-point rating scale. This percentage exceeded the Year 2 goal of 50 percent. Table 3 below compares the professional domain ratings in Year 1 and Year 2 of the SPDG.
Table 3. Comparison of Year 1 and Year 2 Professional Development Ratings – Autism Early Intervention

OSEP Professional Development Worksheet Item                     Year 1 Rating (2012-2013)   Year 2 Rating (2013-2014)
A(1) Selection                                                   3                           3
A(2) Selection                                                   2                           3
B(1) Training                                                    3                           3
B(2) Training                                                    4                           3
B(3) Training                                                    3                           3
B(4) Training                                                    2                           2
B(5) Training                                                    3                           3
C(1) Coaching                                                    3                           3
C(2) Coaching                                                    3                           3
D(1) Performance Assessment (Data-based Decision Making)         3                           2
D(2) Performance Assessment                                      2                           2
D(3) Performance Assessment                                      2                           2
D(4) Performance Assessment                                      3                           3
D(5) Performance Assessment                                      2                           2
E(1) Facilitative Administrative Support/Systems Intervention    2                           3
E(2) Facilitative Administrative Support/Systems Intervention    1                           2
Components to be focused on in Year 3:
1. There will be an additional focus on Universal Design for Learning – Developing communication systems and rich classroom environments.
2. Emory Autism Center will develop permanent products (i.e., teaching modules and videos).
3. Refinements will be considered in existing measures, and new scales will be created for rating children’s outcomes and fidelity of implementation of teaching procedures.
Help Requested from OSEP in Relation to Professional Development for this Initiative: None at this time.
II. Post Professional Development Assessment
1.d. Usefulness of Initiative 1 – GraduateFIRST Best Practices Forum and overall Forum ratings
Best Practice Forums
A Best Practices Forum was held for GraduateFIRST participating schools: Elementary Schools (98 participants), December 3, 2013; Middle Schools (120 participants), December 5, 2013; and High Schools (107 participants), December 4, 2013. Forum topics focused on evidence-based strategies to improve academic, social, and cognitive engagement in order to reduce dropouts, ensure academic success, and increase graduation. Table 4 below summarizes the participants who rated the Best Practices Forum as exceeding their expectations in having practical benefit to their schools, as well as overall Forum ratings (a rating of 3 or 4 on a 4-point scale, with 1 = Below Expectations, 2 = Met Expectations, 3 = Above Expectations, and 4 = Well Above Expectations).
Table 4. Assessment of Practical Benefit of Best Practices Forum and Overall Ratings.

Practical Benefit of Best Practices Forum to My School
Grade Level   # Rating Above Expectations (3)   # Rating Well Above Expectations (4)   Total Rating 3 and 4   Total Respondents   % Rating 3 and 4
Elementary    16                                22                                     38                     44                  86.4
Middle        17                                20                                     37                     45                  82.2
High          17                                26                                     43                     46                  93.5
Total - All   50                                68                                     118                    135                 87.4

Overall Best Practices Forum Rating
Grade Level   # Rating Above Expectations (3)   # Rating Well Above Expectations (4)   Total Rating 3 and 4   Total Respondents   % Rating 3 and 4
Elementary    19                                21                                     40                     45                  88.9
Middle        25                                18                                     43                     46                  93.5
High          21                                26                                     47                     49                  95.9
Total - All   65                                65                                     130                    140                 92.9
As shown in Table 4 above, 118 of 135 GraduateFIRST elementary, middle, and high school Best Practices Forum respondents (87.4 percent) reported that the Forum exceeded their expectations (above or well above) in having practical benefit. Of the 140 Elementary, Middle, and High School respondents providing an overall rating, 130 (92.9 percent) reported that, overall, the Forum exceeded their expectations (ratings of 3 or 4).
1.e. Usefulness of professional development at monthly GraduateFIRST School Team meetings
School Team and Team Leader Meetings
Each of the participating schools has formed a School Team with a Team Leader. Training and coaching were provided for School Team Leaders in
each of their monthly meetings with Collaboration Coaches and School Improvement Specialists. This professional development focuses on the
following GraduateFIRST core components:
1. Establish an effective GraduateFIRST team.
2. Collect and analyze data using the GraduateFIRST Data Probe and GraduateFIRST Data Probe Discussion Guide.
3. Use data to identify and prioritize areas of need for intervention including school climate, attendance, behavior, academic content and
instruction, family/community engagement, and student engagement and to identify a target group of students.
4. Use data to develop an Action Plan that includes the selection of evidence-based practices.
5. Select a group of target students to follow progress in attendance, course completion, and behavior (in- and out-of-school suspensions).
6. Implement and monitor the School Action Plan with fidelity including conducting baseline measures, collecting and analyzing progress
monitoring data using the GraduateFIRST Assessment Tool, and adjusting over time in accordance with progress monitoring data.
7. Evaluate the effectiveness of the School Action Plan.
In addition to the above GraduateFIRST core components, professional development is provided in a number of evidence-based strategies related to
the Primary Areas of Engagement—academic, social, and cognitive. Meeting evaluations are gathered at each of the monthly Team meetings.
Agendas and minutes for each School Team meeting are submitted to the SPDG third party evaluators who review information received and send
monthly summaries by Collaboration Coaches to the SPDG staff. These summaries are shared with the Collaboration Coaches in their monthly
meetings with GaDOE.
A review of the monthly Team meeting evaluations from September 2013 to February 2014 showed that 213 of 278 respondents (76.6 percent)
provided the highest overall rating (4 on a 4-point rating scale) for the School Team meetings. Of 276 participants responding (two respondents did
not provide a rating), 213 or 77.2 percent strongly agreed (4 on a 4-point rating scale) that the information from the School Team meetings will be
useful for improving student engagement/achievement.
Professional Development - GraduateFIRST Fundamentals
For many of the 84 schools participating in GraduateFIRST during Year 2, initial training was provided during the 2012 Focus School Institute (Year
1). The 18 schools new to GraduateFIRST in Year 2 accessed the online modules, which are archived at www.graduatefirstfundamentals.com. Many returning GraduateFIRST schools have also used this information to review the GraduateFIRST process and procedures.
Following are the six online modules:
Module 1 – Implementing GraduateFIRST
Module 2 – Supporting Student Engagement
Module 3 – Academic Engagement
Module 4 – Behavioral Engagement
Module 5 – Cognitive Engagement
Module 6 – Family and Community Engagement
A total of 90 participants received professional development during August-October 2013 using these GraduateFIRST on-line modules.
Professional Development - Influencer Training
A two-step Influencer training was held on May 29 and June 18, 2013, with a total of 19 participants (11 parent mentors, six stakeholders, and the two GaDOE Family Engagement Specialists). The purpose of this training was to gain the skills needed to develop an influence strategy for teams and organizations. Post-session evaluations found that 50 percent rated the Influencer Training as excellent, and 50 percent provided an overall rating of good.
Professional Development for Collaboration Coaches
Professional development and refresher training are provided for Collaboration Coaches by the GaDOE Division of Special Services and Supports and
the Statewide Lead Collaboration Coach in their monthly face-to-face meetings. Training is provided with planned follow-up to further the
application of knowledge and skills in the GraduateFIRST School-based Teams in the participating schools.
Examples of these professional development sessions held during Year 2 are found in Table 5 below for a total of 185 Collaboration Coach
participants:
Table 5. Professional development for Collaboration Coaches in Year 2.

Date       Type of Professional Development                                                                                # of Participants
5-16-13    Smart Goals and Locating Effective Strategies                                                                   18
8-15-13    Web-based Learning                                                                                              28
10-17-13   Managing the School Improvement Process (Indistar)                                                              23
11-14-13   Selecting Evidence-Based Practices                                                                              21
1-16-14    Using Implementation Scales as a Coaching Tool                                                                  21
2-20-14    Literacy Teaching and Learning: It’s All about Access – Evidence-Based Practices for Supporting Elementary Students   21
2-20-14    Using the GraduateFIRST Implementation Scales to Verify Implementation                                          18
3-20-14    Using Artifacts to Verify GraduateFIRST Implementation                                                          17
4-17-14    Adult Learning Principles                                                                                       18
Total                                                                                                                      185
Other GraduateFIRST Professional Development
Table 6 below provides a summary of other GraduateFIRST professional development provided during Year 2 for 338 participants.
Table 6. Other GraduateFIRST Professional Development – Year 2.

Date                            Type of Professional Development                                                 # of Participants
9-11-13                         PMP Kick-Off/Setting the Pace to Graduate                                        27
6-27, 7-29, 7-10, and 1-24-13   Implementing GraduateFIRST at Georgia’s School Improvement Leadership Academy    15
11-4-13                         NDPC Conference: GraduateFIRST – A Framework for Improving Graduation            45
2-6-14                          Utah DOE & Wested Conference Dropout Prevention – Real Work for Real Change      45
2-6-14                          Utah DOE & Wested Conference Dropout Prevention – Digging Deeper                 94
3-18-14                         Using Multiple Means of Engagement to Improve Student Performance                37
3-18-14                         Gainesville City Schools Framework to Address Learning Barriers                  75
Total                                                                                                            338
Family and community engagement activities in the GraduateFIRST Initiative 1 and the College and Career Readiness/Transition Initiative 2
Parent Mentor Partnership
The Parent Mentor Partnership (PMP) continues to be a strategic partner within the Georgia SPDG in Year 2. Nearly 100 Parent Mentors work with
special education directors in 92 Georgia school districts to help build a bridge joining administrators, teachers, staff, families and communities to
help students with disabilities succeed in school. Parent Mentors work with families by providing resources, tips, and ideas to help parents guide
their youth throughout their school careers and their transition from school into adult life. Parent Mentors also work with communities to help create
job and recreational opportunities for students with disabilities and special needs and to improve the quality of life for adults with disabilities. As part
of the criteria for participation in the GaPMP, parent mentors, along with their special education directors, develop an Annual Plan that identifies the indicator their work will focus on, along with the vital behaviors they will work on to impact family engagement for that indicator. At the end of the year, each mentor is required to submit a final report on the outcomes of their plan to the Family Engagement Specialist at the GaDOE. Of the 88 plans that have been submitted, 25% have focused on Indicators 1 and/or 2 and 45% on Indicators 13 and/or 14.
In September 2013, the SPDG GraduateFIRST Director participated in the Annual Parent Mentor Partnership Kick-off Conference to provide
information about the framework of GraduateFIRST and to encourage parent mentors to work with the GraduateFIRST School Teams.
The SPDG Parent Support Specialist has provided a checklist for parent mentors, which includes scheduling time with Team Leaders in the
participating GraduateFIRST schools to discuss family engagement and the involvement of families in efforts to increase graduation rates. A
PowerPoint presentation was developed to provide information about parent mentor activities to GraduateFIRST School Teams in January, 2014.
This PowerPoint presentation was used in the Clayton, Cobb, Douglas, and Atlanta Public Schools Graduate FIRST School Team meetings. A total
of 72 School Team members attended these presentations.
A small pilot of Engaging Families of Targeted Students in GraduateFIRST (GF) was implemented this year. Three schools were selected from Wayne, White, and Haralson Counties. In September 2013, parent mentors attended a pre-session at the GaPMP Kickoff Conference, “Setting the Pace to Graduate.”
The goal of the session was to provide mentors with a more comprehensive understanding of the GraduateFIRST initiative and to discuss some
possible vital behaviors that mentors could focus on with families. In January, the three mentors from the above targeted districts were asked to
work with the GF teams to identify a select group of parents that they could reach out to and track. The mentors called the families with a series
of questions that could be used in a survey format for the future to begin an ongoing conversation about their son/daughter. Parent Mentors then
ranked the parent responses as knowledgeable, somewhat knowledgeable and not knowledgeable and used this information to develop assistance
for the families.
During Year 2, in support of the CCaR Initiative, the Parent Mentors have collaborated with school districts and communities to develop and
implement transition fairs (Initiative 2). Students received information in sessions such as: Making a Working Budget, Making Good Choices
regarding Drugs and Alcohol, Learning to Drive, Getting My First Job, Independent Living, Self-Protection, and Self-Defense. Evening meetings are held for parents to provide information on resources to support their students in postsecondary transition. During Year 2, 21 transition fairs were
held including those in Catoosa County, Walker County, Monroe County, Oconee County, Columbia County, Richmond County, Gwinnett County,
and Douglas County. Vendors participated from education, independent living, Social Security/SSI/SSDI, recreation and job training. An example
of one of the transition fairs was the Marion County event held by the Gainesville Hall Interagency Transition Council in March 2014, where 500
students, parents, teachers, administrators, and community members participated.
The SPDG initiatives increase family engagement through the DOE’s Parent Mentor Partnership. The SPDG staff included MAPs (person-centered
planning tool) in the Goal Setting training of May 2013, which was led by the Director of Individual and Family Support, Center for Leadership in
Disability, Georgia State University. The MAPS training focused on four learning targets involving learning to explain the person-centered planning
philosophy, gaining skills to facilitate MAPS, and being able to explain how MAPS can be used in participating school districts as well as in the
Active Student Participation Inspires Real Engagement (ASPIRE) student-led IEP process. The six areas of MAPS were reviewed and discussed
along with the guiding questions and comments that a facilitator could use when facilitating a MAP. Eighteen MAPS were facilitated during the
eight months following the two-day training session. Participants were provided a copy of PATH and MAP Handbook, Person-Centered Ways to
Build Community. Participants included 26 parent mentors, three GLRS staff, and two GaDOE/SPDG Family Engagement Specialists. Participant
evaluations showed that 80.6 percent were highly satisfied that the presenter was knowledgeable about the topic and was engaging.
Participants (77.4 percent) reported that they were highly satisfied with learning new skills and that the information was current and useful.
Parent to Parent (P2P) – Georgia’s Parent Training and Information Center.
Parent to Parent (P2P), Georgia’s Parent Training and Information Center, has also continued to be a strategic SPDG partner during Year 2 of the
SPDG. Table 7 provides a summary of P2P activities and participants during the first three quarters of the Year 2 SPDG reporting period that support
both Initiative 1 – GraduateFIRST and Initiative 2 – College and Career Readiness/Transition. The fourth quarter data/information will be available
later this year and reported in the Year 3 Annual Performance Report.
Table 7. Selected P2P Activities During Year 2 to Support Georgia SPDG Initiatives
Type of Activity | Number Impacted
Families assisted by telephone or in person | 2,467
Families matched to a trained supporting parent | 239
Families who become trained as a Supporting Parent | 24
Parent On-Line Support Project | 112
Training to or in conjunction with parent mentors | 22
Trainings offered related to discipline, graduation, and academic achievement of students with disabilities | 55
Youth group discussions | 40
Supplemental Curriculum (Encore) training provided | 1 (15 Participants)
Project Video Clips distributed online | 2
Webinars provided with adults and youths co-presenting regarding transition | 2 (59 Participants)
An analysis of evaluations (5-point rating scale) of participants attending P2P trainings during Quarter 1 and Quarter 2 (July, 2013 to December,
2013) showed a rating of 4.47 – Sessions met expectations; 4.48 – Materials used in the session were of high quality; 3.73 – Received relevant
information needed to make decisions about my child’s education and/or health services; and 4.41 - Information received will be useful for helping
me advocate for and improve my child’s services.
During Year 2, the P2P continued to administer and moderate the Online Transition Support Project to ensure accurate and timely sharing of
information and to encourage appropriate and positive dialogue among participants. The Online Transition Support Project was developed and
implemented in Year 1 as a result of feedback from students who participated in ENCORE student empowerment sessions across the state.
ENCORE classes/sessions focused on student empowerment/advocacy. The Online Transition Support includes discussion boards, announcements,
photos, activities, and other resources such as P2P Encore trainings and Transition Council meetings.
A Facebook Youth Group page and a Georgia Transition – Parent Group page have also been initiated for the purpose of sharing of information,
resources, and concerns.
Circle of Adults Focusing on Education (C.A.F.E.)
A Circle of Adults Focusing on Education (C.A.F.E.) is a collaborative stakeholder team at the local school and/or school district level created to
address an identified need during the school year. C.A.F.E.s focus on ways to engage families with education issues in order to improve student
achievement, particularly for students with disabilities and/or students at risk of not graduating. This collaborative stakeholder team is comprised of
family, educator, and community members who combine their knowledge of real-life family experiences, educator know-how, and community
resources to develop solutions to identified problems. During the previous 5-year SPDG, three C.A.F.E.s were implemented: Manchester High
School in Meriwether County, Rutland High School in Bibb County, and Elbert County High School in Elbert County. Other C.A.F.E.s were started
in Wayne County High School, Haralson County Middle School, and Thomaston Upson Middle School. Successes and accomplishments of these
C.A.F.E. teams have been reported in previous Annual Performance Reports to OSEP.
C.A.F.E. activities in these schools have varied during Year 1 and 2 of the current SPDG, but they continue to build partnerships between the
schools, parents, and the community. The goal for Year 2 of the SPDG (2013-2014 school year) is to add two or three additional C.A.F.E. teams to
complement the work of the GraduateFIRST (Initiative 1) and CCaR (Initiative 2) teams by engaging the community and families in outcomes-based
action at the district level, to more effectively scale this concept up for sustainability. The earlier C.A.F.E.s have seen some school changes.
For example, the Thomaston Upson Middle School C.A.F.E. was moved to the high school because of a change in principals.
Although development during Year 2 has been slow, five new schools have indicated interest in starting a C.A.F.E. This work will be further
reported on in Year 3. A C.A.F.E. Dialogue Implementation Guide has been developed for use in selected schools starting a new C.A.F.E.
ASPIRE
The SPDG staff supported the Transition Fair in Gwinnett City during Year 1 and trained 25 parents on ASPIRE.
1.f. Application of learning - Engaging Families of Students with Disabilities for Student Achievement Forum,
University of North Georgia
In February 2014, an Engaging Families of Students with Disabilities for Student Achievement Forum was held at the University of North Georgia
and supported by the Georgia SPDG. The purpose of this Forum was for pre-service teachers to understand the importance of engaging families of
students with disabilities and to conceptualize families as experts and partners in student learning. Of the 86 total participants, 76 were pre-service
teachers from the University of North Georgia. Other participants in this half-day professional development included three University of North
Georgia professors, seven parent mentors, two parent liaisons, two parent panelists, staff from the North Georgia GLRS and the GaDOE SPDG. This
Forum included parent panels, a review of the Family Friendly Checklist, and concurrent sessions on topics such as the PTA National Standards for
Family and Community Partnerships, “Speaking Up for Every Child”, “Sharing Power,” and “Collaborating with Community”.
Table 8 below provides participant feedback from this Forum. As can be seen, 73 of the 86 participants (84.9 percent) completed post session
evaluations. Of the 73 respondents, 78.1 percent indicated that they could apply this learning effectively in their work with parents in the area of
promoting meaningful communication with parents of students with disabilities (Learning Target 1), compared to 83.6 percent for Learning Target 2
– attendance, behavior, and course performance for elementary students impacting graduation. Of the total Forum respondents, 53.5 percent rated the
content well above expectations, compared to 60.3 percent for Forum presenters, 42.5 percent for Forum interaction. Results for the overall Forum
rating indicated that 61 of the 73 respondents (83.6 percent) reported that the Forum was above or well above expectations.
The SPDG Parent Support Specialist provided follow-up information and resources to the Forum preservice participants including website links that
provide assistance in building adult capabilities to help children’s outcomes, family engagement in the transition to kindergarten, and other
topics/tools.
Table 8. Participant Feedback – Engaging Families of Students with Disabilities for Student Achievement Forum.
Rating Area | Below Expectations (1) # / % | Met Expectations (2) # / % | Above Expectations (3) # / % | Well Above Expectations (4) # / % | Total Responses
Target #1 | 0 / 0.0 | 1 / 1.4 | 15 / 20.5 | 57 / 78.1 | 73
Target #2 | 0 / 0.0 | 1 / 1.4 | 11 / 15.1 | 61 / 83.6 | 73
Content | 0 / 0.0 | 17 / 23.3 | 17 / 23.3 | 39 / 53.4 | 73
Presenters | 0 / 0.0 | 7 / 9.6 | 22 / 30.1 | 44 / 60.3 | 73
Interaction | 0 / 0.0 | 19 / 26.0 | 23 / 31.5 | 31 / 42.5 | 73
Overall Rating | 1 / 1.4 | 11 / 15.1 | 26 / 35.6 | 35 / 47.9 | 73
(Percentages for each row total 100.0.)
1.g. Usefulness of Initiative 2 – College and Career Readiness – Training for CCaRS
The designated supervisor (Core) and eight coaches (CCaRS), as well as one CCaRS substitute continued their work in Year 2 providing professional
development, coaching, and other support to the 15 school districts participating in the College and Career Readiness Initiative. The University of
Kansas Transition Coalition provides backup support to the Cores and the CCaRS. The NSTTAC provided support for three participating state
schools (i.e., Georgia Area School for the Deaf, Atlanta Area School for the Deaf, and the Georgia Area School for the Blind).
During Year 2, the CCaRS and Core Specialist participated in online training seminars focusing on specific topics related to transition planning and
services. These 3-week seminars were held over a 6-month period to ensure that all CCaRS and Cores acquired similar and consistent
information about effective transition practices and followed through with reinforcement activities. In addition, the online seminars were designed to
allow the CCaRS to work together to develop products and learn about materials and resources they could use to support the participating CCaR
school districts. Following are the topics of the online training seminars:
1. Introduction to IDEA and Secondary Transition – 3 Weeks – June, 2013
2. Implementing Transition Assessment – 3 Weeks – July 2013
3. Family Involvement and Student Involvement in Transition – 3 Weeks – August/September, 2013
33
4. Preparing for Employment and Postsecondary Education – 3 Weeks – September, 2013
5. Interagency Collaboration in Transition Planning – October/November, 2013
Each online seminar had specific learning targets and pre/post tests for the participants. Following is a summary of comparisons of pre and post test
scores, which showed significant differences in all four seminars.
Seminar 1 – Introduction to IDEA and Secondary Transition
Paired-samples t-test: N = 10, t = 3.021, p = .013
Seminar 2 – Implementing Transition Assessment
Paired-samples t-test: N = 9, t = 2.512, p = .033
Seminar 3 – Family Involvement and Student Involvement in Transition
Paired-samples t-test: N = 7, t = 5.814, p < .001
Seminar 4 – Preparing for Employment and Postsecondary Education
Paired-samples t-test: N = 13, t = 5.078, p < .01 for all four learning targets
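Comparisons of this kind are computed as a paired-samples t statistic: the mean of each participant's pre-to-post difference divided by the standard error of those differences. The sketch below illustrates the calculation with hypothetical scores; the values shown are not the project's data.

```python
import math
from statistics import mean, stdev

def paired_t(pre, post):
    """Paired-samples t statistic: mean pre-to-post difference over its standard error."""
    diffs = [b - a for a, b in zip(pre, post)]
    return mean(diffs) / (stdev(diffs) / math.sqrt(len(diffs)))

# Hypothetical pre/post seminar scores for N = 10 participants (illustrative only)
pre = [55, 60, 48, 72, 66, 58, 63, 70, 52, 61]
post = [78, 82, 70, 90, 85, 80, 84, 88, 75, 83]
print(round(paired_t(pre, post), 3))  # t statistic on N - 1 = 9 degrees of freedom
```

The p-value is then read from the t distribution with N - 1 degrees of freedom; a library routine such as scipy.stats.ttest_rel reports the statistic and p-value in one call.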
In each of the four online seminars, the CCaR participants were asked whether they could apply what they learned in their work with the participating
school districts. Table 9 below provides information regarding participant perception of the application of knowledge learned in the seminars. Note:
Data was not available for Seminar 2.
Table 9. Application of Knowledge/Skills Learned in Online Seminars.
Online Seminar | # of Respondents | Strongly Agree Knowledge Learned Can Be Applied (# / %) | Agree Knowledge Learned Can Be Applied (# / %) | Neutral (# / %) | Total Percent
Seminar 1 | 10 | 5 / 50 | 4 / 40 | 1 / 10 | 100
Seminar 3 | 7 | 3 / 43 | 4 / 57 | - | 100
Seminar 4 | 7 | 3 / 43 | 4 / 57 | - | 100
Average % | | 45.8 | 50.0 | 4.2 | 100
1.h. Application of learning - College and Career Readiness Institute (Transition Institute) and Hitting the Mark
Training
The GaDOE has entered into a partnership with University of Kansas, Transition Coalition, Center for Research on Learning, to provide training and
ongoing coaching and assistance to the participating district teams. A partnership has also been made between GaDOE and the National Secondary
Transition Technical Assistance Center (NSTTAC) to provide training and technical assistance for the Georgia State Schools to develop and
implement transition Action Plans as well as evidence-based transition practices.
In September 2013, a College and Career Readiness Institute (Transition Institute) was held for participating district teams. At this Institute, teams
from participating districts received information regarding evidence-based transition practices and developed an Action Plan to be implemented with
the support of the CCaRS. Teams from the Georgia State Schools also participated in the Institute. The overall Learning Targets for the
September Transition Institute were:
1. Listing and explaining three specific needs in each district in the area of College and Career Readiness that must be addressed in order to help
students with disabilities improve post-secondary outcomes.
2. Listing ways that district transition assessment can be used to improve transition planning, including procedures for transition assessment from
eighth grade through graduation.
The 15 participating school districts and the state schools also participated in regional Hitting the Mark trainings in October 2013, focused on writing
and identifying components of a compliant transition plan using a Transition Documentation Checklist Rubric.
Table 10 below provides information regarding participant perception of the application of knowledge learned in the Transition Institute and Hitting
the Mark Training. Of the total 257 participants attending both the Transition Institute and Hitting the Mark training, 236 or 91.8 percent reported
that they could apply the knowledge learned without or with some support.
Table 10. Application of Knowledge/Skills Learned in Work with Transition Plans in Transition Institute and Hitting the Mark Training
Professional Development/Training | # of Total Respondents | Can Apply Knowledge Learned with Significantly More Training or Follow-up (# / %) | Can Apply Knowledge Learned with Some Support (# / %) | Can Apply Knowledge Learned without Support (# / %) | Total Percent
Transition Institute | 79 | 9 / 11.4 | 29 / 36.7 | 41 / 51.9 | 100
Hitting the Mark | 178 | 12 / 6.8 | 91 / 51.1 | 75 / 42.1 | 100
Other Initiative 2 – College and Career Readiness/Transition Professional Development
1.i. Pre/post assessment gains for participants of Hitting the Mark Training
Table 11 below provides a summary of pre/post assessment scores for 15 Hitting the Mark training sessions for 203 of the 225 participants (90.2
percent). The average pre-training score was 68.0 and the average post-training score was 90.86, an increase of 22.9 points (roughly 33.6 percent)
from pre to post training.
Table 11. Pre-post Evaluations for Hitting the Mark Training Participants.
Date of Training | District | # Trained | # Completed | Pre Average | Post Average
10/2/2013 | Self-Identified | 34 | 31 | 68.03 | 91.71
10/8 & 10/16/2013 | Marion | 10 | 6 | 64.83 | 74.17
10/10/2013 | Wayne | 28 | 27 | 58.63 | 90.78
10/17/2013 | Webster | 4 | 4 | 67.50 | 91.25
10/18/2013 | State Schools | 8 | 8 | 76.00 | 96.63
10/21/2013 | Seminole | 12 | 12 | 63.58 | 89.17
10/21/2013 | Wilcox | 2 | 2 | 67.00 | 91.00
10/22/2013 | Dooly | 7 | 7 | 54.57 | 88.71
10/22/2013 | Vidalia City | 6 | 3 | 82.00 | 92.33
10/22/2013 | Upson | 24 | 18 | 57.83 | 99.00
10/23/2013 | Liberty | 9 | 8 | 80.63 | 95.63
10/30/2013 | Haralson | 30 | 29 | 73.10 | 92.59
10/30/2013 | State School - GAB | 16 | 14 | 58.00 | 80.43
11/5/2013 | Dekalb | 13 | 13 | 72.08 | 92.23
11/5/2013 | Bleckley | 14 | 14 | 70.21 | 96.14
11/18/2013 | Greene | 8 | 7 | 74.00 | 90.86
Total | | 225 | 203 | |
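The overall gain reported for the Hitting the Mark sessions follows directly from the two averages. A quick arithmetic check (the two averages are taken from the report; the computation itself is illustrative):

```python
pre_avg, post_avg = 68.0, 90.86  # average pre- and post-training scores reported above

gain_points = post_avg - pre_avg         # absolute gain on the assessment scale
gain_pct = 100 * gain_points / pre_avg   # relative increase over the pre-training baseline
print(round(gain_points, 2), round(gain_pct, 1))
```

The difference is 22.86 points on the score scale, which corresponds to a relative increase of about 33.6 percent over the pre-training average; "percentage points" would instead refer to the 22.86-point difference itself.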
CCaR On-Line Training and Coaching
In February 2014, the 15 CCaR Project school districts began participating in a 12-week on-line training and coaching (TRAN-Qual) series supported
by the University of Kansas Transition Coalition. The purpose of this training is to guide each district through a self-study of transition plan
compliance and implementation of effective transition practices. The 12-week on-line training includes online learning, group discussion, applied
learning activities, data-based reflection on current practices/programs, and action planning and implementation. The training includes three
Transition Coalition Self-Study units. Unit 1 has been completed and units 2 and 3 are to be completed:
1. IDEA and Secondary Transition – Overview of the transition requirements of IDEA and best practices in planning for the transition from
school to adult life.
2. Transition Assessment – The Big Picture – Various types and approaches to transition assessment and the continuous assessment process.
3. Student Involvement and Self-Determination – General knowledge of self-determination and a framework for providing self-determination
instruction for students, and student involvement in the transition/IEP planning process.
As of March 17, 2014, teams from the 15 participating CCaR school districts are at or around week seven of the 12 weeks, taking into account
adjustments for weather-related cancellations.
Monthly Team Meetings
Transition School Teams from each of the 15 participating school districts meet monthly for the purpose of ongoing professional development and
review of progress made on Action Plans. Table 12 below provides a summary of these Transition Team meetings involving 98 participants from the
Initiative 2 participating school districts.
Table 12. CCaR/Transition Meetings Held During Year 2.
Date(s) | District Meeting | # of Participants
12/4/13, 1/9/14 | Marietta City | 16
11/19/13, 12/3/13, 2/4/14 | Habersham | 18
11/18/13, 1/27/14, 2/26/14 | Liberty | 13
11/4/13, 1/28/14, 2/26/14 | Vidalia City | 19
11/21/13, 12/10/13, 1/24/14 | Seminole | 13
11/20/13, 1/21/14 | Dooly | 11
10/27/13 | Upson | 22
11/18/13, 11/19/13, 12/16/13 | Marion | 17
2/24/14 | Greene | 12
11/20/13, 1/9/14, 2/3/14 | Wilcox | 12
1/17/14, 2/25/14 | Wayne | 7
12/12/13, 2/25/14 | Dekalb | 6
10/1/13, 11/2/13, 12/13/13, 1/25/14, 2/25/14 | Haralson | 60
9/27/13, 11/2/13, 12/7/13, 2/20/14 | Webster | 16
Total | | 242
1.j. Usefulness of the ASPIRE Innovative Practices Forum.
During Year 4 of the previous SPDG, a student-led IEP project known as Active Student Participation Inspires Real Engagement (ASPIRE) was
initiated as a way of improving the quality of IEPs, including transition plans, and providing students with self-determination/self-advocacy skills.
The ASPIRE program was implemented in 12 schools in Year 4, involving 72 students. During Year 5 of the previous SPDG, 56 additional schools
implemented Project ASPIRE—for a total of 68 participating schools. During Year 6 (no-cost extension year), 21 school districts implemented
ASPIRE. Of the 21 districts, one district implemented ASPIRE as a district-wide initiative. Within that district, there were 33 schools implementing
the student-led IEP program. The 33 schools represent elementary/primary (100 percent), middle (55 percent), and high schools (71 percent).
During Years 1 and 2 of the current SPDG, ASPIRE has continued to be implemented throughout the state. Approximately 23 school districts and two
Georgia Network for Educational and Therapeutic Support (GNETS) programs are implementing ASPIRE student-led IEPs. Approximately 1,000 students
with disabilities are participating in student-led IEP meetings/processes. Wayne County Middle School, Wayne County High School, Habersham
Middle School, and Habersham High School are implementing ASPIRE as part of their Action Plan in the CCaR project. The Houston County
School District is also implementing ASPIRE district-wide.
Five GLRSs were each awarded a $5,000 mini-grant to implement ASPIRE in their assigned districts. The grant money could be used for materials,
training, follow-up, and data collection. The following is a list of the GLRS Centers and the number of entities they are supporting:
1. Southeast Georgia GLRS: 7 Schools and 1 GNETS
2. East Central GLRS: 12 Schools
3. North Georgia GLRS: 6 Schools and 2 GNETS serving 4 Schools
4. Northwest GA Center GLRS – Training
5. Northwest GA GLRS – Focus on Parent Trainings
In addition to training schools in the implementation of ASPIRE, additional requirements for mini-grants included conducting training sessions with
the parents of students participating in the program. The GLRSs are responsible for monitoring the fidelity of the program. All GLRSs report that
districts provided the opportunity for parents to participate in the ASPIRE training, either through a face-to-face training or by viewing the parent
training videos provided on the DOE website.
On March 6, 2014, an ASPIRE Innovative Practices Forum was held with 45 participants. The Forum focused on seven transition learning targets.
In addition to a general session, “iPad and the IEP,” and a lunch session, “District-wide Implementation of ASPIRE,” nine carousel presentations
were provided. Participants were encouraged to share notes and reflections from the conference, as well as complete a Forum evaluation. An
ASPIRE Toolkit was provided for participants at the Forum. Of the 45 participants attending the Forum, 35 submitted evaluations (77.8 percent). Of
those participants reporting, an average of 32.3 (92.3 percent) reported that, across the seven learning targets, they could apply the Forum learning
effectively, or with some support, in their work.
1.k. Increased knowledge and skills of incidental teaching Round 1 and 2 Training by the Initiative 3 teachers and other
project participants
Three major training modules, delivered in sequence (three Rounds of training) across Year 2 focused on: Social Communication (Round 1),
Promoting Peer Interactions (Round 2) and Developing Kindergarten Readiness Skills (Round 3). The content of training emphasized an Incidental
Teaching approach to improving early intervention strategies for young children with autism. The three workshop training modules were organized
as follows:
1. Verbal communication – Training content focused on how to teach verbal language to children with autism who are nonverbal, language
delayed, or delayed in the social aspects of conversational speech. Also highlighted were strategies for identifying reinforcers that improve
the motivation of children with autism to speak.
2. Promoting Peer Interactions – Information on how to specify social objectives was supplemented with research-based methods for
improving interactions with teachers/adults. Also emphasized were games and teaching strategies that may be used to improve skills in
interacting with same-aged peers.
3. Kindergarten Readiness – Specific ways to teach and promote children’s independent daily living skills were identified, along with reviews
of procedures for promoting language and social skills that contribute to successful inclusion outcomes. Strategies for blending individualized
academic objectives into activities that encourage engagement by children with autism were also discussed.
“Hands-on” training was provided in each demonstration classroom on the day following each workshop, and at least one more time before the next
training cycle began. Coaching sessions included modeling by one or more of the Emory trainers/coaches, opportunities for trainees to practice new
techniques, along with abundant positive behavior-specific feedback.
Over 100 teachers and partners participated in the three trainings. To reach all participants, Rounds One and Two of incidental teaching
training were presented at three different times and in three regions, for a total of six sessions. Pre and post data were collected from each
participant, making it possible to test whether significant learning occurred on the presentation learning targets. Table 13 below provides the learning
targets for the Round One and Two training sessions along with the statistical results that test for the occurrence of significant learning. As can be
observed, statistically significant gains (p < .001) were achieved on all 10 of the learning targets.
Table 13. Learning Targets for Round One and Two Training Sessions and Statistical Paired t Test Results.
Learning Target | Number Providing Feedback | Pre Mean | Post Mean | p Level of Significance
Explain why inclusion is helpful | 64 | 2.0 | 3.6 | p = .000
List and define or explain major Steps of Incidental Learning | 64 | 1.4 | 3.4 | p = .000
How to create "teachable moments" | 64 | 2.4 | 3.6 | p = .000
How to conduct 1:1 Teaching Sessions | 64 | 2.3 | 3.5 | p = .000
Create and Implement a Reinforcement Assessment | 64 | 1.9 | 3.3 | p = .000
Blend incidental teaching of language into everyday classroom activities | 64 | 2.2 | 3.5 | p = .000
Define and write appropriate social objectives | 44 | 2.7 | 3.5 | p = .000
Describe 2 procedures used to promote peer interactions | 44 | 2.6 | 3.6 | p = .000
Better consider sensory preferences | 44 | 2.4 | 3.5 | p = .000
Name 3 factors to consider when helping a parent blend incidental teaching into their home | 44 | 2.3 | 3.5 | p = .000
Total for all learning targets | 10 | 121.5 | 196.5 | p = .000
Round 3 trainings were conducted during February and March 2014 and covered the area of Kindergarten Readiness. The post training instrument
captured the participants’ responses to various aspects of the training, using a 1-6 point scale, with 6 being the highest rating. In all cases, the 13
items on the post evaluation instrument received an average rating of 5 or higher.
1.l. Usefulness of the incidental teaching Round 1 and 3 Training by parents
Round one of the Autism Early Intervention Project provided a workshop concentrating on Social Communication and the third Round training
focused on Kindergarten Readiness. Parents attended both Rounds of training and at the conclusion of the training session, the parents provided post
training feedback using an instrument designed by the Emory Autism Center. The item scaling was from 1 to 6 with 6 being “extremely good or
helpful”. Table 14 below provides the average response given by the parents who provided session feedback. As can be observed in Table 14, all
items received an average rating of 5.0 or greater, with the exception of one item for the Round One session at Forsyth/Gainesville/Hall.
Table 14. Feedback Item Averages of Parent Attendees in Round One and Three Session Trainings and the Number of Attendees.
Feedback Instrument Item | Round One Chatham/Effingham | Round One Forsyth/Gainesville/Hall | Round Three Chatham/Effingham | Round Three Bibb/Houston
The subjects were well chosen. | 5.6 | 5.5 | 5.7 | 6.0
The instructors were knowledgeable. | 5.7 | 5.8 | 5.9 | 6.0
The methods of instruction were appropriate. | 5.3 | 5.7 | 5.9 | 6.0
The presenters were engaging and kept my attention. | 5.7 | 5.3 | 5.9 | 6.0
The instructional materials were useful. | 5.5 | 5.6 | 5.2 | 5.3
I gained new knowledge and insights. | 5.6 | 5.3 | 5.6 | 5.9
The training provided me with a thorough understanding of incidental teaching. | 5.3 | 5.9 | 6.0 | 5.8
I feel comfortable using incidental teaching with my child. | 5.3 | 5.4 | 5.4 | 5.7
The information was relevant to my child. | 5.0 | 5.7 | 5.3 | 5.4
I am satisfied with the opportunity I had to participate. | 5.6 | 6.0 | 5.9 | 6.0
Informal conversations with other participants were beneficial. | 5.3 | 5.2 | 5.9 | 5.9
I would recommend this workshop to others. | 5.5 | 5.4 | 5.5 | 5.6
The workshop was well organized. | 5.3 | 4.5 | 5.7 | 5.9
The time of the workshop (month, day, hour) was convenient. | NA | 6.0 | 5.9 | 6.0
The length of the workshop was appropriate. | 5.3 | 5.3 | NA | 6.0
Number of Parents Attending | 7 | 9 | 7 | 9
1.m. Video on-line professional development modules and accompanying guides, using evidence-based practices to
improve teacher effectiveness on the TKES performance standards
Georgia’s Competitive Preference Priority will develop targeted professional development to meet the needs of teachers identified via the new
Georgia Teacher Keys Evaluation System (TKES), which includes a growth measure. During the 2013 session that ended on March 28, 2013, the
Georgia Legislature passed House Bill 244, authorizing TKES and its student achievement components. Georgia has completed the first post-pilot
implementation year of this new teacher evaluation system. The Competitive Preference Priority will not begin until Year 3 (fall 2014).
OMB No. 1894-0003
Exp. 04/30/2014
U.S. Department of Education
Grant Performance Report (ED 524B)
Project Status Chart
PR/Award # (11 characters): ______________________
SECTION A - Performance Objectives Information and Related Performance Measures Data (See Instructions. Use as many pages as necessary.)
2. Project Objective
[ ] Check if this is a status update for the previous budget period.
Program Measure 2: Participants in SPDG professional development demonstrate improvement in implementation of SPDG-supported
practices over time.
2.a. Performance Measure: Average percentage of participating GraduateFIRST schools scoring at least a 2 on a four-point scale in the
GraduateFIRST Process and the Primary Areas of Engagement fidelity assessments in Year 2.
Measure Type: Project
Target: Raw Number 128.8; Ratio 128.8/161; % 80
Actual Performance Data: Raw Number 143; Ratio 143/161; % 88.9

2.b. Performance Measure: Average percentage increase in GraduateFIRST Process and Primary Areas of Engagement fidelity assessments from
fall to spring – Year 2.
Measure Type: Project
Target: Raw Number 14; Ratio 14/80; % 18.0
Actual Performance Data: Raw Number 14.61; Ratio 14.61/65.40; % 22.3

2.c. Performance Measure: Average percentage of students assessed across grade levels and across Intervals 1 and 2 in GraduateFIRST by the
Student Assessment Tool having no days absent across data Intervals 1 and 2.
Measure Type: Project
Target: Raw Number 2,591; Ratio 2,591/3,239; % 80
Actual Performance Data: Raw Number 2,436; Ratio 2,436/3,239; % 75.2
2.d. Performance Measure: Average percentage of students assessed across grade levels and across Intervals 1 and 2 in GraduateFIRST by the
Student Assessment Tool having no course failures across data Intervals 1 and 2.
Measure Type: Project
Target: Raw Number 2,267; Ratio 2,267/3,239; % 70
Actual Performance Data: Raw Number 2,436; Ratio 2,436/3,239; % 75.2

2.e. Performance Measure: Average percentage of students assessed across grade levels and across Intervals 1 and 2 in GraduateFIRST by the
Student Assessment Tool having total in- and out-of-school suspensions across data Intervals 1 and 2.
Measure Type: Project
Target: Raw Number 2,915; Ratio 2,915/3,239; % 90
Actual Performance Data: Raw Number 1,336; Ratio 1,336/3,239; % 41.2

2.f. Performance Measure: Percentage of participating high schools in the Initiative 2 CCaR project with effective and compliant transition
portions of the IEP plans, as measured by state general supervision monitoring standards.
Measure Type: Project
Target: Raw Number 15; Ratio 15/15; % 100
Actual Performance Data: Raw Number 15; Ratio 15/15; % 100

2.g. Performance Measure: Percentage in the CCaR Project of the quality indicators within the Transition Planning domain that are at least at the
Mostly Achieved level, as measured by growth on the Transition Quality Indicators.
Measure Type: Project
Target: Raw Number 8; Ratio 8/8; % 100
Actual Performance Data: Raw Number 7; Ratio 7/8; % 87.5

2.h. Performance Measure: Percentage of the Initiative 3 Autism Early Intervention Project implementing, with fidelity, evidence-based incidental
teaching pre-readiness skills for children with autism, as measured by the PLA-Check fidelity measure.
Measure Type: Project
Target: Raw Number 1,486; Ratio 1,486/1,748; % 85
Actual Performance Data: Raw Number 832; Ratio 832/1,748; % 47.6
2.i. Performance Measure
Percentage of the Initiative 3 Autism Early Intervention
Project participants implementing, with fidelity, evidence-based
incidental teaching pre-readiness skills for children with
autism, as measured by the Incidental Teaching Rating Scale
fidelity measure.
2.j. Performance Measure
Average percentage of Initiative 1 GraduateFIRST
participants reporting that the knowledge and skills gained in
the Best Practices Forums were helpful or very helpful in
implementing the school plan and implementing engagement
practices, as evidenced by three-month post training
assessments.
2.k. Performance Measure
Percentage of Initiative 2, CCaR Project participants attending
the Transition Institute reporting that knowledge and skills
obtained in professional development provided a lot or some
help in implementing evidence-based practices, as evidenced
by three-month post training assessments.
2.l. Performance Measure
Percentage of Initiative 2, CCaR Project participants
attending Hitting the Mark Training reporting that knowledge
and skills obtained in the training somewhat or significantly
helped them assess whether the required components are in
the transition plans, as evidenced by three-month post training
assessments.
2.m. Performance Measure
Percentage of the Initiative 3 Autism Early Intervention
Project training participants in the participating schools
reporting that the incidental teaching training was either
effective or very effective in working with their students, as
evidenced by three-month post training assessments.
Quantitative Data (Measure Type: Project; each entry reported as Raw Number, Ratio, and %)
Targets: 17.0 (17.0/20.0; 85.0%); 144.9 (144.9/161; 90%); 9 (9/18; 50%); 29.5 (29.5/59; 50%)
Actual Performance Data: 148 (148/161; 92.0%); 10 (10/18; 55.6%); 34 (34/59; 57.6%); 10.1 (10.1/20.0; 50.5%)
2.n. Performance Measure
Percentage of parents rating the Autism incidental teaching
training as helpful or very helpful in working with their child
at home, as evidenced by three-month post training
assessments.
2.o. Performance Measure
Percentage of participating School Team Leaders within the
Initiative 1 GraduateFIRST schools reporting that coaching
and assistance received on the vital behavior of
“Implementing and monitoring our School Action Plan” was
good or outstanding, as evidenced by ratings of 3 or 4 on a 4-point scale.
2.p. Performance Measure
Percentage of participating high schools and feeder middle
schools where CCaR project leaders report that coaching and
assistance by the CCaRS has been very helpful in developing
compliant transition plans and/or effective transition
strategies, as evidenced by a rating of 4 on a 4-point rating
scale.
2.q. Performance Measure
Percentage of the schools participating in Initiative 3 Autism
Early Intervention Project reporting effective and timely
“hands on” and/or electronic coaching and assistance by the
EAC and other consultants, as evidenced by ratings of 4 or
higher on a 5-point rating scale.
Quantitative Data (Measure Type: Project; each entry reported as Raw Number, Ratio, and %)
Targets: 24 (24/24; 100%); 5 (5/5; 100%); 35.7 (35.7/42; 85.0%); 29.5 (29.5/59; 50%)
Actual Performance Data: 5 (5/5; 100%); 34 (34/42; 81.0%); 28 (28/59; 47.5%); 24 (24/24; 100%); 19 (19/24; 79.2%); 20 (20/24; 83.3%)
2.r. Performance Measure
Percentage of participating teachers identified as needing
assistance through the TKES or district teacher evaluation
system reporting effective and timely assistance and coaching,
as evidenced by ratings of 4 or higher on a 5-point rating
scale.
2.s. Performance Measure
Reduction of the graduation gap between students with and
without disabilities in the GraduateFIRST high schools as
measured by the GOSA.
2.t. Performance Measure
Reduction of the dropout rate for students with disabilities in
the GraduateFIRST high schools and middle schools as
measured by the GOSA.
2.u. Performance Measure
Reduction in absenteeism for students with disabilities in the
GraduateFIRST schools to the level of their peers without
disabilities, as measured by the GOSA.
2.v. Performance Measure
Reduction of the achievement gap between students with and
without disabilities in the participating GraduateFIRST
schools, as measured by the GOSA.
2.w. Performance Measure
Percentage of participating CCaRS schools in Initiative 2
showing improvements in graduation rates, as measured by
Data Probes.
2.x. Performance Measure
Percentage of participating schools/districts in Initiative 2
CCaR Project showing reductions in dropouts among students
with disabilities.
2.y. Performance Measure
Percentage of students with disabilities exiting from high
school who are enrolled in higher education or in some other
postsecondary education or training program; or
competitively employed or in some other employment within
one year of leaving high school.
2.z. Performance Measure
Percentage of participating children with autism receiving
supports and services in least restrictive settings.
2.z.1. Performance Measure
Percentage of participating special education teachers
identified on the TKES or district teacher evaluation system
who are participating in on-line training and receiving
coaching support who are showing improvement in the TKES
or district teacher evaluation system, as reported by the
building principal.
Quantitative Data for Measures 2.r-2.z.1 (Measure Type: Project; each entry reported as Raw Number, Ratio, and %)
2.r: no data reported
2.s: Target 77.9 (77.9/77.9; 100%); Actual 38.4 (38.4/77.9; 49.3%)
2.t: Target 12.5 (12.5/12.5; 100%); Actual 18.28 (18.28/12.51; 146.0%)
2.u: NA for Year 2
2.v: Target 69.4 (69.4/69.4; 100%); Actual 44.6 (44.6/69.4; 64.3%)
2.w: Baseline Only
2.x: Baseline Only
2.y: Target 6,999 (6,999/8,749; 80.0%); Actual 6,769 (6,769/8,749; 77.4%)
2.z: no data reported
2.z.1: NA for Year 1
Explanation of Progress (Include Qualitative Data and Data Collection Information)
I. Fidelity Assessment
Initiative 1 - GraduateFIRST
Selection of Schools and GraduateFIRST Structure
During Year 2, 84 schools (28 high schools, 31 middle schools, 23 elementary schools and 2 Academies) have been participating in GraduateFIRST.
Each participating school has an assigned GraduateFIRST Collaboration Coach from the GLRS in their region.
Participating schools have designated a School Team that provides the leadership within the school for completing the GraduateFIRST process. Team
members in each participating school have developed a self-directed, continuous improvement school Action Plan with an emphasis on increasing the
graduation rate and/or reducing the achievement gap for the lowest performing subgroups. School Teams are responsible for ensuring that the
GraduateFIRST components are implemented so that continuous improvement of student outcomes drives policy decisions. Each School Team is
comprised differently based on the identified needs, but School Team members usually include administrators, School Improvement Specialists,
special educators, and general educators.
Each GraduateFIRST school has a designated school-based Team Leader who coordinates GraduateFIRST activities. Working directly with the
regional Collaboration Coach and the School Improvement Specialist, the Team Leader schedules and conducts school level Team meetings each
month, ensures appropriate time for project activities, collects and analyzes data, assists with the implementation of the school Action Plan, and
progress monitors the implementation. Regional Collaboration Coaches and School Improvement Specialists attend the Team meetings where School
Teams share successes and challenges relative to ongoing implementation interventions in their selected primary areas of engagement within their
Action Plans.
Selection of Primary Areas of Engagement
Table 15 below provides a summary of the percentage of elementary, middle, and secondary schools implementing evidence-based practices within
the four GraduateFIRST primary areas of engagement—academic, behavior, cognitive, and student engagement. As can be seen, the majority of
participating GraduateFIRST Schools have selected academic engagement (64.7 percent of the total focus area selections). Of the total academic
engagement selections, 28.1 percent were at the elementary school level (compared to 35.6 percent in Year 1), 40.6 percent were at the middle school level (compared to 55.5 percent in Year 1), and 31.3 percent were at the high school level (compared to 19.3 percent in Year 1). Note that some
participating schools had more than one area of engagement. Of the total 99 primary areas of engagement selections, 22 (22.2 percent) were at the
elementary level (compared to 37 percent in Year 1); 40 (40.4 percent) were at the middle school level (compared to 42.0 percent in Year 1); and 37
(37.4 percent) were at the high school level (compared to 19.3 percent in Year 1).
Table 15. Percentage of Primary Area of Engagement Areas by School Level.
Focus Area Selections by School Level

Focus Area             Elementary Schools   Middle Schools   High Schools   All Schools
                       #       %            #      %         #      %       Total Selections   % of Total Selections
Academic Engagement    18      28.1         26     40.6      20     31.3    64                 64.7
Behavior Engagement    1       4.5          4      10.0      2      5.4     7                  7.0
Cognitive Engagement   0       -            1      2.5       0      -       1                  1.0
Student Engagement     3       13.6         9      22.5      15     40.5    27                 27.3
Total                  22      22.2         40     40.4      37     37.4    99                 100.0
Implementation of Evidence-Based Practices in the Four Areas of Engagement
Each participating GraduateFIRST school has developed a Short Term Action Plan detailing evidence-based interventions to be implemented within
their selection areas of engagement(s). Table 16 below summarizes examples of evidenced-based practices being implemented within the
GraduateFIRST Schools in the four primary areas of engagement. The most frequent academic areas of engagement being implemented are math
strategies/programs, and co-teaching. In the student engagement area, the most frequent intervention is mentoring. Behavior areas of engagement
include interventions such as the Check in/Check Out Program and Positive Behavioral Interventions and Supports (PBIS). Student led conferences
and post secondary planning interventions are being implemented within the cognitive area of engagement.
Table 16. Types of Evidence-Based Practices Being Implemented by Primary Area of Engagement
Academic Engagement: Math (e.g., Math Dynamics, videos on math practices, math tutoring, math labs, and math interventions); co-teaching; before- and after-school tutoring; mentoring; extended planning and learning time; study skills; vocabulary instruction/strategies; professional learning strategies; instructional strategies; Thinking Maps; model classroom; schoolwide literacy initiative; reading instructional strategies; academic improvement strategies; flexible learning program; literacy skills instruction across all subjects; academic opportunity time; data teams; ILL Rewards; Success for All Reading Program; individual learning planning; weekly monitoring; Hattie’s High Impact Strategies; and computer labs

Behavior Engagement: Positive Behavioral Interventions and Supports (PBIS) and the Check in/Check Out Program

Cognitive Engagement: Post-school planning and student-led conferences

Student Engagement: Student engagement strategies; affective engagement strategies; mentoring; individual Graduation Coach mentoring; the Check in/Check Out Program; morning and after-school tutoring; senior mentors; case management/advisement; student-led conferences (ASPIRE); student attendance policy changes; and attendance strategies
Two fidelity assessments are being reported in this Annual Performance Report for the GraduateFIRST Initiative 1: the GraduateFIRST Process and
Primary Area of Engagement Implementation Scales, and student assessment in attendance, course completion, and discipline.
2.a.-b. Fidelity assessment – GraduateFIRST Implementation Scale: Process and Student Engagement
A fidelity tool has been developed to determine ongoing implementation of the GraduateFIRST structure (i.e., Section l – GraduateFIRST School
Implementation Scale) and implementation of evidence-based practices within the Student Engagement Scale (i.e., Section 2 – GraduateFIRST
Implementation Scale: Student Engagement). The fidelity tool contains proficiency levels of Not Yet Established, 1 – Starting Out, 2 – Developing, 3
– Deepening, and 4 – Sustaining. The GraduateFIRST fidelity assessment tool for GraduateFIRST Schools is found in Appendix D.
The fidelity tool was administered in Fall of 2013 and again in Spring of 2014. This was done to determine ongoing improvement in implementation
of the GraduateFIRST structure (i.e., Section l – GraduateFIRST Implementation Scale: Process) and improvement of student engagement efforts
(i.e., Section 2 – GraduateFIRST Implementation Scale: Student Engagement).
In order to confirm with a degree of certainty that the spring self-reported scores on the Implementation Scales were reliable across schools, districts,
and coaches, Collaboration Coaches used a peer review process to verify 20 percent of the GraduateFIRST Implementation Scales including scoring
and artifacts submitted.
Table 17 below shows the percent of GraduateFIRST schools reaching Assessment Level 2 or higher on the six items included in the
Implementation instrument. It can be observed from Table 17 that high schools have more difficulty developing, implementing, and monitoring the
GraduateFIRST School Action Plan. Overall, 70 of the 84 schools were rated 2 or higher on effectiveness of implementing their School
Action Plan. Both the middle and high schools have improvements to make in evaluating the effectiveness of their School Action Plans (i.e., the final
item in Table 17).
Table 17. Percentage of Schools Performing at the Assessment Level 2 or Higher on the GraduateFIRST Implementation Scale Items
During the 2013-2014 School Year (Year 2).
Implementation Scale Item                                            Elementary Schools   Middle Schools   High Schools
Establish an effective GraduateFIRST School Team                     97.7%                98.5%            89.7%
Collect and Analyze Data                                             93.2%                96.9%            87.9%
Identify and Prioritize Needs and Targeted Students                  95.5%                93.8%            87.9%
Develop the GraduateFIRST School Action Plan                         88.6%                80.0%            70.7%
Implement and Monitor the GraduateFIRST School Action Plan           88.6%                78.5%            67.2%
Evaluate the Effectiveness of the GraduateFIRST School Action Plan   86.4%                76.9%            56.9%
The same procedure was used to assess the progress on primary areas of engagement—see Appendix D. Table 18 below shows the percent of
GraduateFIRST schools reaching Assessment Level 2 or higher on the four items included in the Primary Area of Engagement instrument. It
can be seen that high schools have a more difficult time implementing practices in their Primary Area of Engagement; they are lagging behind the
middle and elementary schools in implementing evidence-based practices within their primary area(s) of engagement. Overall, 73 of the 77
responding schools were rated 2 or higher (on a 4-point rating scale) on incorporating primary area of engagement efforts.
Table 18. Percentage of Schools Performing at the Assessment Level 2 or higher on Primary Area of Engagement Items During the 2013-2014 School Year (Year 2).

Primary Area of Engagement Item              Elementary Schools   Middle Schools   High Schools
Knowledge of Primary Area of Engagement      77.5%                83.1%            64.8%
Optimizing Resources and Structure for PAE   97.5%                84.7%            66.7%
Professional Development for PAE             85.0%                78.0%            48.1%
Incorporating Practices from PAE             97.5%                89.8%            72.2%
Table 19 below shows the rating improvement of GraduateFIRST schools from fall to spring during Year 2. As can be observed in Table 19, the
GraduateFIRST schools showed gains (delta) for all school levels in all processes of implementation. The high schools have yet to attain an overall
average rating of 3 (on a four-point scale) for any of the GraduateFIRST Implementation items.
Table 19. Increases in average ratings within GraduateFIRST Process from Fall to Spring of the 2013-2014 School Year (Year 2).
Implementation Scale Item                                                  Elementary           Middle               High
                                                                           Fall/Spring/Delta    Fall/Spring/Delta    Fall/Spring/Delta
Establish an effective GraduateFIRST School Team                           3.09 / 3.36 / 0.27   2.71 / 3.09 / 0.39   2.59 / 2.97 / 0.38
Collect and Analyze Data                                                   2.77 / 3.14 / 0.36   2.71 / 2.97 / 0.26   2.28 / 2.83 / 0.55
Identify and Prioritize Needs and Targeted Students                        3.08 / 3.18 / 0.10   2.63 / 3.03 / 0.40   2.19 / 2.62 / 0.43
Develop the GraduateFIRST School Action Plan                               2.53 / 2.64 / 0.10   2.06 / 2.66 / 0.60   1.69 / 2.21 / 0.52
Implement and Monitor the GraduateFIRST School Action Plan with Fidelity   2.26 / 2.57 / 0.31   1.73 / 2.25 / 0.52   1.34 / 2.14 / 0.79
Evaluate the Effectiveness of the GraduateFIRST School Action Plan         2.45 / 2.77 / 0.32   1.79 / 2.59 / 0.80   1.38 / 2.07 / 0.69
Table 20 below shows the improvement in rating of primary areas of engagement of GraduateFIRST schools from Fall to Spring during Year 2. As
can be observed in Table 20, the GraduateFIRST schools showed average gains (delta) for all school levels by all Primary Areas of Engagement.
Both the high schools and middle schools have yet to attain an overall average rating of 3 for any of the Primary Areas of Engagement.
Table 20. Increases in Primary Area of Engagement from Fall to Spring of the 2013-2014 School Year (Year 2).
Primary Area of Engagement Item              Elementary           Middle               High
                                             Fall/Spring/Delta    Fall/Spring/Delta    Fall/Spring/Delta
Knowledge of Primary Area of Engagement      2.31 / 2.68 / 0.36   1.83 / 2.62 / 0.79   1.52 / 2.11 / 0.59
Optimizing Resources and Structure for PAE   2.66 / 3.00 / 0.34   2.23 / 2.97 / 0.73   1.83 / 2.26 / 0.43
Professional Development for PAE             2.40 / 2.85 / 0.45   1.97 / 2.66 / 0.69   1.15 / 1.93 / 0.78
Incorporating Practices from PAE             2.69 / 3.00 / 0.31   1.97 / 2.69 / 0.72   1.56 / 2.19 / 0.63
The data and information gathered from the GraduateFIRST Process and Primary Areas of Engagement fidelity scales are aggregated and
disaggregated by the Collaboration Coaches. The data guides improvements in training, coaching, and educators’ practices in academic, behavior,
cognitive, and/or student engagement areas. These progress monitoring data are also used to help guide the school leadership team and to identify if
there are regional coaching needs. The results of progress monitoring are shared at GaDOE Design Team meetings, meetings with School
Improvement Specialists, monthly Collaboration Coach meetings with GaDOE, and monthly School Team meetings.
The SEA Design Team, including the Lead Collaboration Coach, and the Collaboration Coaches review school and student outcomes to determine
the need for program modifications, additional training, and/or changes in procedures or practices. Based on this student outcome data, some schools
have made changes in procedures, practices, or building/student schedules. Collaboration Coaches use the student outcome data as the basis for
coaching sessions. Student outcome data and the post training surveys are used to plan future trainings and identify coaching needs.
2.c. – 2.e. Fidelity Assessment - Student performance in attendance, course completion and discipline (in-school and out-of-school suspensions) by grade level - GraduateFIRST
The GraduateFIRST Assessment Tool was developed in the previous Georgia SPDG, as part of GraduateFIRST, to measure fidelity of
implementation as indicated by student progress in three areas: course completion, student attendance, and student discipline. Each participating
school has selected a list of target students who are at high risk of dropout and academic failure. Over 3,300 students throughout the state are being
assessed. The target student list in the GraduateFIRST schools varies from 19 – 92 (average of 45) students per school.
The Team Leader submits student progress monitoring data for the targeted students in four intervals throughout the school year to the Collaboration
Coach. The Collaboration Coaches enter these data into a statewide database so data can be aggregated for analysis by the SPDG third party
evaluators. Based on these data, school Action Plans are reviewed and adjustments made. This progress monitoring data is used to help identify
whether target students need additional interventions or accommodations. Progress monitoring data is also used to help guide the School Team and
to identify regional coaching needs. The results of the student outcome data are shared at SEA Design Team meetings, monthly Collaboration Coach
meetings with GaDOE, meetings with School Improvement Specialists, and monthly School Team meetings.
Data for the first two intervals for targeted students is available for this Year 2 SPDG Annual Performance Report. Interval 1 ended in early October
2013 and Interval 2 ended in late December. Interval 3 data will be available and analyzed during April 2014. Following is a discussion of improved
attendance, course failure, and discipline (in-school and out-of-school suspensions) by grade level for the first two data Intervals in Year 2 of the
SPDG.
2.c. Improved performance in attendance by participating school, as measured by the GraduateFIRST Student
Assessment Tool
Of the 84 schools in Year 2 participating in GraduateFIRST during the 2013-14 school year, 18 also participated last year and had assessment data
available for the first two intervals of both last year and this year. Table 21 below provides the average number of days absent per student for both
the first two intervals of last year and the first two intervals of this year for these 18 GraduateFIRST schools. As can be observed, the average
number of days absent declined for both middle school and high schools from last year to this year.
Of these 18 schools, all were Cohort 1, 2 or 3 schools from the first five years of the GraduateFIRST program in the previous SPDG and, as a result,
have already been working to reduce absenteeism for a few years. Of the 3,239 target students followed in GraduateFIRST, 2,436 had no days of
absenteeism during the first two intervals in this Year 2.
As can be observed in Table 21, the decline in days absent per student was very small for high school students from the 10 high schools involved. A
test for statistically significant difference did not find any differences at the p<.10 level, largely because of the small progress in high schools.
Table 21. Average Number of Days Absent per Student for the 2012-13 and 2013-14 School Year by School (N=18).
Student Placement        Average Number of Days Absent per Student
                         School Year 2012-13   School Year 2013-14   Difference
Middle School Students   6.03                  4.33                  -1.70
High School Students     5.92                  5.79                  -.13
Of the total Year 2 schools participating in GraduateFIRST, 80 had interval one and two data available for baseline calculations as of March 2014.
Table 22 below provides the Year 2 baseline for these GraduateFIRST schools.
Table 22. Average Days Absent by Student Placement.
Student Placement    Average Number of Days Absent per Student
Elementary Schools   3.18
Middle Schools       4.35
High Schools         6.02
2.d. Reducing failures by participating schools, as measured by the GraduateFIRST Student Assessment Tool
As indicated earlier, 18 schools also participated last year and had assessment data available for the first two intervals in both Year 1 and the current
Year 2 of the SPDG. Table 23 below provides the average number of courses failed per student for both the first two intervals of last year (Year 1)
and the first two intervals of this year (Year 2). As can be observed, the average number of courses failed declined for both middle school and high
schools from last year to this year.
As noted earlier, these schools were Cohort 1, 2 or 3 schools from the first five years of the GraduateFIRST program and, as a consequence, have
already been working to reduce course failures for a few years. As can be observed in Table 23 the decline in course failures per student was very
small again for high school students from the 10 high schools involved. A test for statistically significant difference did not find any differences at
the p<.10 level, largely because of the small progress in high schools.
Table 23. Average Number of Courses Failed per Student for the 2012-13 and 2013-14 School Year by School (N=18).
Student Placement        Average Number of Courses Failed per Student
                         School Year 2012-13   School Year 2013-14   Difference
Middle School Students   2.19                  1.31                  -.88
High School Students     3.12                  3.06                  -.06
Of the total schools participating in GraduateFIRST, 80 had interval one and two data available for baseline calculations as of March of 2014. Across
the participating schools, 1,336 GraduateFIRST target students (41.2%) had not failed a course during the first two data intervals in Year 2. Table 24
below provides the Year 2 baseline for these GraduateFIRST schools.
Table 24. Average Courses Failed by Student Placement.
Student Placement    Average Courses Failed per Student
Elementary Schools   1.55
Middle Schools       1.61
High Schools         3.24
2.e. Improved performance by reducing in-school (ISS) and out-of-school (OSS) suspensions, as measured by the GraduateFIRST Student Assessment Tool
An analysis was conducted using the 18 schools that also participated last year and had assessment data available for the first two intervals of both
last year (Year 1) and this year (Year 2). Tables 25 and 26 below provide the average number of days suspended per student for both the first two
intervals of last year and the first two intervals of this year. As can be observed, the average number of days suspended for both middle schools and
high schools from last year to this year has declined.
Because these schools were from the first five years of the GraduateFIRST program, they have already been working to reduce suspensions for a few
years. As can be observed in Table 25, the decline in suspensions per student was not large for either the high schools or the middle schools
participating in the GraduateFIRST program. As a consequence, a test for statistically significant difference did not find any differences at the p<.10
level for the participating schools.
Table 25. Average Number of Suspensions per Student for the 2012-13 and 2013-14 School Year by School (N=18).
Student Placement        Average Number of Suspensions per Student
                         School Year 2012-13   School Year 2013-14   Difference
Middle School Students   1.81                  1.61                  -.20
High School Students     1.57                  1.16                  -.41
Of the total schools participating in GraduateFIRST, 80 had interval one and two data available for baseline calculations as of March of 2014. Of the
target students being followed by the GraduateFIRST schools, 2,436 (75.2%) did not have any suspensions during the first two intervals of Year 2.
Table 26 below provides the Year 2 baseline for these GraduateFIRST schools.
Table 26. Average Suspensions by Student Placement.
Student Placement    Average Suspensions per Student
Elementary Schools   .36
Middle Schools       1.33
High Schools         1.30
Initiative 2 - College and Career Readiness/Transition Project
Baseline Data Collection
A Year 1 emphasis of Initiative 2 was compliant practice in writing transition IEPs. Transition plans were collected from each school district using a
pre-populated random sample. From January-March 2013, school districts verified the extent to which the transition plans within the student IEPs
were compliant. From March-April, school districts submitted a 10 percent random sample of transition plans to the GaDOE for review. GaDOE
and the CCaR Project personnel reviewed 20 percent of these plans for compliance by May 1, 2013. Based on this baseline data, 15 school districts
and three state schools were selected for participation in Year 2. The state schools received technical assistance from NSTTAC.
Initiative 2 Partners
In-state partners in Initiative 2 include the GaDOE Career, Technical and Agricultural Education Division (CTAE); the Georgia Vocational
Rehabilitation Agency; the Georgia Council for Developmental Disabilities, and Parent to Parent (P2P). The GaDOE has also entered into a
partnership with University of Kansas, Transition Coalition, Center for Research on Learning, to provide training and ongoing coaching and
assistance to the GaDOE CCaRS and the participating 15 school/district teams in the development and implementation of compliant transition plans
and evidence-based transition practices. A partnership has also been made between GaDOE and the National Secondary Transition Technical
Assistance Center (NSTTAC) to provide technical assistance to the three participating Georgia State schools.
2.f.-g. Fidelity assessment – CCaR Initiative 2
Two fidelity assessments are being reported in this Annual Performance Report for CCaR Initiative 2—improvements in developing effective and
compliant transition plans and fidelity of implementation of the school district teams participating in on-line self-study professional development.
2.f. Percentage of participating schools in Initiative 2 with effective and compliant transition plans for students with
disabilities age 16 and up – Fidelity measurement
Monitoring compliant transition plans is one fidelity measure for the CCaR Initiative 2 of the SPDG. This review is also required by Indicator 13 of
the State Performance Plan (SPP). During Year 1 of the SPDG, transition plans were randomly selected from school districts across the state. From
January-March 2013, school districts verified that the transition plans within the student IEPs were compliant. During March 22 – April 8, 2013, the
school districts submitted a 10 percent random sample of the transition plans to the GaDOE for review, with 100% compliance. Of these, the CCaRS
verified 100 percent of the transition plans through their independent review. All participating CCaR districts and the three state schools have
completed Prong 2 for the 2012-2013 school year (Year 1 and beginning of Year 2 of the SPDG) with 100% compliance.
Prong 1 data for the 2013-2014 school year (Year 2 of the SPDG) have been gathered and are being evaluated for the participating CCaR school
districts and the three state schools. Districts will receive their status for Prong 1 upon completion of review by the CCaRS and DOE staff. This
review is due to the GaDOE in June 2014.
Table 27. Percentage of Transition Plans that are Compliant.
District                Fall 2013             Spring-Fall 2013      January-April 2014
                        Prong 1 % Compliant   Prong 2 % Compliant   Prong 1 % Compliant
Haralson County         91%                   100%                  Currently being evaluated by the district
Bleckley                77%                   100%                  Currently being evaluated by the district
Dekalb County           95%                   100%                  Currently being evaluated by the district
Dooly County            0%                    100%                  Currently being evaluated by the district
Greene County           77%                   100%                  Currently being evaluated by the district
Habersham County        95%                   100%                  Currently being evaluated by the district
Liberty County          94%                   100%                  Currently being evaluated by the district
Marietta City           83%                   100%                  Currently being evaluated by the district
Marion County           25%                   100%                  Currently being evaluated by the district
Seminole County         66%                   100%                  Currently being evaluated by the district
Thomaston-Upson Lee     94%                   100%                  Currently being evaluated by the district
Vidalia City            0%                    100%                  Currently being evaluated by the district
Wayne County            83%                   100%                  Currently being evaluated by the district
Webster County          0%                    100%                  Currently being evaluated by the district
Wilcox County           50%                   100%                  Currently being evaluated by the district

State Schools           Prong 1 % Compliant   Prong 2 % Compliant   Prong 1 % Compliant
Georgia Area School
for the Deaf            66%                   100%                  Currently being evaluated by the State School
Atlanta Area School
for the Deaf            80%                   100%                  Currently being evaluated by the State School
Georgia Area School
for the Blind           75%                   100%                  Currently being evaluated by the State School
2.g. Fidelity of implementation – Quality Indicators of Exemplary Transition Programs
As stated earlier, the 15 participating Initiative 2 school districts are taking part in a 12-week on-line Transition Coalition Self-Study training and
coaching (TRAN-Qual) series, supported by the University of Kansas Transition Coalition. The purpose of this training is to guide each district
through a self-study of transition plan compliance and implementation of effective transition practices.
Several fidelity measures and procedures are built into this self-study training. An Action Planning Process Tool is being used for district planning
of the evidence-based programs and practices to be implemented. Each participating district has developed a Learning Target against which pre- and post-testing
provides feedback on progress toward meeting the target. The pre-test has been administered, and the post-test will be available no later than the
end of June 2014. Within the Action Plan, the participating district teams are using a Goal Attainment Scale to measure progress in the
implementation of new programs and practices. These data are also not available for this Annual Performance Report but will be available in about six
weeks, for reporting in the Year 3 SPDG Annual Performance Report.
One of the fidelity measures being used is the Quality Indicators of Exemplary Transition Programs (see Appendix E), which was developed by the
University of Kansas Transition Coalition as a self-assessment to allow programs, schools, and districts to determine and prioritize the most critical
needs within a transition program, as well as to monitor the progress of interventions being implemented. The Quality Indicators of Exemplary
Transition Programs Needs Assessment (QI) is comprised of seven domain areas. In 2013, the QI was revised with items that reflected recently
identified evidence-based practices and predictors of postsecondary school success. The revised QI was tested for validity and reliability using a
Confirmatory Factor Analysis along with internal reliability analysis. The model fit statistics from the CFA showed an excellent fit. The chi-square
value was significant (Χ2 (1006, n = 468) = 2548.957, p < .001). The RMSEA was 0.057 with a 90% confidence interval of 0.054 to 0.060. Likewise,
the CFI was 0.90. The standard factor loadings of each indicator to the domain were above .50 and significant at .001 level, indicating good
convergent validity of the QI-2. Internal reliability analyses using Cronbach’s alpha produced an overall estimate of reliability for all 47 indicators
with a .97 coefficient alpha. Each of the domains also reported high reliability using Cronbach’s alpha with a range from .88-.92.
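For reference, the Cronbach's alpha statistic cited above can be computed directly from item-level ratings. The sketch below is illustrative only: the helper function implements the standard formula, alpha = k/(k-1) × (1 - sum of item variances / variance of total scores), and the three items and five respondents are invented, not QI data.

```python
# Illustrative computation of Cronbach's alpha for a set of survey items.
# The ratings below are invented for demonstration; they are not QI data.

def cronbach_alpha(item_scores):
    """item_scores: one list of respondent ratings per item."""
    k = len(item_scores)
    n = len(item_scores[0])

    def variance(values):
        mean = sum(values) / len(values)
        return sum((v - mean) ** 2 for v in values) / (len(values) - 1)

    # Total score for each respondent (sum across items).
    totals = [sum(item[i] for item in item_scores) for i in range(n)]

    item_var_sum = sum(variance(item) for item in item_scores)
    return (k / (k - 1)) * (1 - item_var_sum / variance(totals))

# Three hypothetical items rated 0-3 by five respondents.
items = [
    [3, 2, 3, 1, 2],
    [3, 2, 2, 1, 2],
    [2, 3, 3, 1, 1],
]
print(round(cronbach_alpha(items), 2))  # 0.8
```

Higher alpha indicates that the items move together across respondents, which is what the reported .97 coefficient conveys for the 47 QI indicators.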
Fifteen Georgia Self-Study teams completed one domain from the QI, Transition Planning, as part of the Self-Study intervention. Self-Study team
members completed the QI during the second week of January, 2014 before they began the Transition Coalition Self-Study: IDEA and Secondary
Transition Unit. The QI results were aggregated for each team, and a report of each team’s scores for each of the items in the Transition Planning
Domain was provided to the facilitator, in preparation for the Needs Prioritization and Action Planning phase of the Self-Study, in which teams
analyze several points of data, including the Team QI Transition Planning Domain data, to prioritize a transition area in which they will work over
the subsequent six-week period. During the Needs Prioritization and Action Planning session, the facilitator is instructed to provide copies of
the report to each of the team members for their review. The information from the report is placed on the Transition Planning Prioritization Form and
is utilized along with other data collected from the IEP Review Activity, IEP Review Reflection and professional wisdom to help outline a prioritized
improvement area. The recommendation is then made to the team to develop a short-term, 6-week goal and Action Plan steps to achieve that goal that
align with the prioritized improvement area.
Table 28 below summarizes baseline data gathered for the 15 participating CCaR districts in the Transition Planning domain of the Quality Indicators
of Exemplary Transition Programs (QI), which is rated on a four-point scale (3 – Completely Achieved, 2 – Mostly Achieved, 1 – Partially Achieved, and
0 – Not Achieved). As can be seen in Table 28, the 15 participating districts received ratings above 2.00 on all but one of the
eight quality indicators of Transition Planning (87.5 percent).
Table 28. Summary of Ratings on the Transition Planning Domain of the Quality Indicators of Exemplary Transition Programs – Baseline Assessment.

Quality Indicator – Transition Planning                                                                              Rating Across the Participating Schools
Transition planning begins early in a student’s educational experience (but no later than 16 years old).            2.54
Progress toward student’s postsecondary goals are reviewed on an ongoing basis.                                     2.18
Transition planning incorporates student-centered approaches.                                                       1.99
Postsecondary goals are based upon student strengths, interests, and preferences.                                   2.44
Postsecondary goals target postsecondary education/training, employment, and when appropriate, independent living.  2.42
Transition services and a course of study are identified to assist the student to reach postsecondary goals.        2.38
Annual IEP goals addressing both academics and transition needs are identified.                                     2.45
Approaches are used during transition planning to identify outcomes supporting student and family cultures.         2.17
Although the districts gave themselves low ratings on the last five sections of the Quality Indicators for transition planning (Table 28 above), the
transition IEP reviews indicate that districts are at 100 percent compliance. A Self-Study pre-test was sent out during the second week of January 2014,
before the districts began the Transition Coalition Self-Study: IDEA and Secondary Transition. The pre-test was administered through Qualtrics,
a web-based survey application, and included an assessment of all Self-Study learning targets, facilitator readiness, demographic information, and the
Transition Planning domain of the QI.
Table 29 below shows pre-test participant ratings regarding the Transition Planning Domain. As can be seen, an average of 83.3 percent of the 80
participants strongly agreed or agreed across all items. The percentage of participants strongly agreeing or agreeing with the individual Self-Assessment
Items ranged from 72.6 percent (“I can scale the goal to determine the level of attainment needed for our team”) to 95.0 percent (“I can reach
consensus with my team members to ensure accurate compliance review of transition IEP compliance”). Post-test assessment results will be available in June 2014.
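The percent-agree figures and item means of the kind reported in Table 29 can be reproduced from raw response counts. The sketch below is illustrative: the counts are reconstructed from the first item's reported percentages (N = 80), and the 3-to-0 scoring of the response options is an assumption that happens to reproduce that item's reported mean of 1.9.

```python
# Illustrative aggregation of Likert responses into Table 29 style
# statistics. Counts are reconstructed from the first item's reported
# percentages (N = 80); the 3-to-0 scoring (Strongly Agree = 3 ...
# Strongly Disagree = 0) is an assumption, not documented methodology.

weights = {"Strongly Agree": 3, "Agree": 2, "Disagree": 1, "Strongly Disagree": 0}
counts = {"Strongly Agree": 13, "Agree": 49, "Disagree": 16, "Strongly Disagree": 2}

total = sum(counts.values())  # 80 respondents
# Percent who strongly agreed or agreed, as summarized in the text.
agree_pct = 100 * (counts["Strongly Agree"] + counts["Agree"]) / total
# Mean rating on the assumed 0-3 scale.
mean = sum(weights[k] * counts[k] for k in counts) / total

print(f"{agree_pct:.1f}% agreed or strongly agreed; mean rating {mean:.1f}")
```

The 77.5 percent produced here differs from the table's 77.6 (16.3 + 61.3) only because the table rounds each response category before summing.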
Table 29. Self-Assessment of Transition Knowledge and Skills.

Self-Assessment Item                                                              Strongly Agree   Agree    Disagree   Strongly Disagree   Total Respondents   Mean
I can state the definition of transition as required under IDEA.                  16.3%            61.3%    20.0%      2.5%                80                  1.9
I can apply the key concepts of transition (results-oriented, student-centered,
and coordinated effort) to the transition process.                                17.5%            62.5%    18.8%      1.3%                80                  2.0
I can reach consensus with my team members to ensure accurate compliance
review of transition IEP compliance.                                              32.5%            62.5%    5.0%       0.0%                80                  2.3
I can identify the major components required for transition IEPs.                 32.5%            56.3%    10.0%      1.3%                80                  2.2
I can independently review an IEP for transition compliance.                      31.3%            46.3%    21.3%      1.3%                80                  2.1
I can identify strengths and weaknesses of an IEP related to quality
transition planning.                                                              27.5%            58.8%    13.8%      0.0%                80                  2.1
I can identify an area for improvement using building-level transition data.      16.3%            67.5%    16.3%      0.0%                80                  2.0
I can identify all the transition requirements under IDEA.                        21.3%            53.8%    21.3%      3.8%                80                  1.9
I can implement effective transition planning procedures.                         23.8%            66.3%    10.0%      0.0%                80                  2.1
I can develop a SMART goal for improving transition planning.                     23.8%            58.8%    17.5%      0.0%                80                  2.1
I can scale the goal to determine the level of attainment needed for our team.    13.8%            58.8%    26.3%      1.3%                80                  1.9
I can develop an Action Plan that will promote positive change in our team’s
practices related to transition planning and compliance.                          16.3%            73.8%    10.0%      0.0%                80                  2.1
Average Across Self-Assessment Items                                              22.74%           60.56%   15.86%     0.96%               80                  2.06
The Transition Coalition is providing support to the CCaRS through monthly web meetings using Adobe Connect (for video) and conference call
(for audio). These meetings are recorded and posted online for the CCaRS to view at any time. The hour-long monthly web meetings consist of
presented content and discussion. During Year 2, the Transition Coalition staff provided three web meetings to the CCaRS: practice sessions
using Adobe Connect audio and video features and a review of the Self-Study process and materials.
In addition to the monthly meetings, the Transition Coalition has implemented an online discussion forum specifically for the CCaRS to post
questions and share resources, which helps staff ensure consistency in the support the CCaRS provide to the participating districts. It also
provides feedback to the Transition Coalition regarding the consistency of its own support to the CCaRS. Monthly meetings between the
Transition Coalition staff and the SPDG/GaDOE Transition Coordinator serve as another way to monitor fidelity of project implementation.
Another fidelity assessment built into the on-line self-study professional development is a spreadsheet tracking the progress of each
participating school district team. As part of participation in the Transition Coalition Self-Study, team facilitators are required to complete specific
steps and submit materials confirming that they completed specific activities in the Self-Study. This confirms that teams are adhering to the
intervention model, an essential element of implementing the intervention with fidelity. The Self-Study Team Process
spreadsheet is used by Transition Coalition staff to document and track team progress through the Self-Study, including completion of assigned
materials, attendance sign-in sheets for meetings, completion of the required online training module (for knowledge acquisition), goal planning
materials, and participation in the online Self-Study Facilitator Community. This level of data collection also
allows the Transition Coalition to collect implementation fidelity information. The spreadsheet ensures that all evaluation data are submitted
accurately to the Transition Coalition for analysis. It also provides a mechanism to identify when a team may need additional
coaching support from their CCaRS and/or the Transition Coalition in implementing the Self-Study. All materials for which team members are
responsible are clearly outlined in the printed Self-Study materials. The team facilitator is responsible for ensuring that team members complete the
weekly materials, and for collecting, uploading, and submitting all materials to the online Self-Study Facilitator Community. Following the
submission of materials, Transition Coalition staff are able to download materials, analyze the quality of content, and document whether the team
was able to complete major elements of the Self-Study intervention. The primary method of contact with CCaRS and facilitators is through the
discussion forum on the online Self-Study Facilitator Community. Transition Coalition staff also follow up with individual teams through email and
phone calls with facilitators who need support to guide their teams through specific activities. These contacts are also recorded on the Self-Study
Team Process spreadsheet.
Data from the spreadsheet show the following district progress in the on-line training series as of March 2014. On average, the district teams are at
or around week 7 of the 12-week training. The following is a summary of the progress being made by the participating teams, as reported in the fidelity of
implementation spreadsheet:
1. 53 percent of teams have submitted up to Week 6 materials and begun implementing their Action Plans.
2. 13 percent of teams have only submitted up to Weeks 4 & 5 materials.
3. 20 percent of teams have only submitted up to Week 3 materials.
4. 13 percent of teams have only completed up to a portion of Week 2 materials.
Among the teams that have submitted and have begun implementing their Transition Action Plans:
1. Four district teams are working on improving the quality of their transition assessment toolkit.
2. Two district teams are working on improving their goal writing.
3. Two district teams are working on improving student involvement in transition planning.
2.h. Fidelity of implementation – PLA-Check – Autism Early Intervention Project
In order for inclusive classrooms to meet the needs of all students, including those with autism or autism-like behavior, classroom professionals must
interact with these children. The PLA-Check observational system, which measures levels of engagement using a time-sampling process, was adapted
specifically for the Georgia Autism Early Intervention Program. The PLA-Check was used to collect objective information on the percentage
of observational intervals scored for the occurrence of:
1. Close proximity to student (within three feet of a given student).
2. Interacting directly with a student.
3. Active engagement of classroom students (indicator of overall effectiveness).
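The interval-based scoring that underlies a time-sampling measure of this kind reduces to a percent-of-intervals calculation. The sketch below is illustrative only; the interval records are invented, and the PLA-Check's actual observation protocol is not reproduced here.

```python
# Sketch of percent-of-intervals scoring, the basis of time-sampling
# measures such as the adapted PLA-Check. Each entry records whether the
# target behavior (e.g., proximity, interaction, engagement) was observed
# during that interval. The records below are invented for illustration.

intervals = [True, False, True, True, False, True, False, True, True, False]

def percent_positive(records):
    """Percent of observation intervals scored positive."""
    return 100 * sum(records) / len(records)

print(percent_positive(intervals))  # 60.0
```

The percentages in Table 30 are values of exactly this kind, computed per classroom and per observation category.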
During Year 2, baseline data were gathered by the Emory Autism Center using the PLA-Check to determine the percentage of total observations of
interaction between adults in the classroom and children with autism/special needs. Table 30 below provides a summary of these baseline data for 16 of the 17
classrooms participating in Initiative 3, the Autism Incidental Teaching Project. As can be observed in Table 30, teaching interactions were quite low,
with less than 25 percent of observations scored positive for interactions with autism/special needs children and also with all children. Child engagement ranked much
higher, with over 70 percent of observations scored positive. Across the three categories of observations, autism/special needs teachers received
positive ratings on 47.6 percent of the observations, less than half.
Table 30. Summary of Baseline Levels of Teacher Performance Including the Percent of Total Observations Scored Positive for Children
with Autism/Special Needs and Percent of Observations Scored Positive for all Children.

Percent of observations scoring positive (A/SN = children with autism/special needs; All = all children).

Type of Classroom         Proximity A/SN   Proximity All   Teaching A/SN   Teaching All   Engagement A/SN   Engagement All
1 - Regular               0                9               17              7              100               91
2 - Regular               100              65              0               3              90                92
3 - Support Inclusive     50               92              100             43             44                39
4 - Regular               17               38              100             17             14                61
5 - Regular               58               92              100             57             74                93
6-7 - Support Inclusive   80               51              27              17             13                51
9 - Self-Contained        43               NA              2               NA             33                NA
10 - Support Inclusive    52               43              24              11             60                76
11 - Self-Contained       27               NA              15              NA             60                NA
12 - Support Inclusive    29               13              21              19             93                86
13 - Self-Contained       42               NA              8               NA             58                NA
15 - Support Inclusive    42               36              0               2              50                54
16 - Support Inclusive    6                20              23              12             92                85
17 - Regular              8                10              8               5              44                53
Average                   50.7             35.6            20.7            13.9           71.3              75.1
Range                     0 - 100          9 - 65          0 - 100         2 - 61         13 - 100          51 - 93
2.i. Fidelity of implementation –Incidental Teaching Rating Scale – Autism Early Intervention Project
The Emory Autism Center staff utilized an Incidental Teaching Checklist to evaluate classroom arrangement and teacher implementation of
incidental learning. The measurements used include arranging the environment to maximize incidental teaching opportunities, skills in identifying
and capturing the “teachable moment” effectively, skills in delivering teaching prompts in a manner to increase a child’s probability of success, and
skills in rewarding correct responses. These measures reveal how well teachers are implementing components of incidental teaching in their
classrooms.
Each incidental teaching (I.T.) component is rated on a scale of 1 to 5, based on an overall average of the teacher’s observed performance. The room
design score is a percentage assigned to the organization of the room for the delivery of incidental teaching. Table 31 below provides an overview of the ratings
received by the participating teachers. The room design item was rated above 50 percent for all but two classrooms, with the majority of the
classrooms receiving a rating of 60 percent or higher. The incidental teaching environment received an average rating of 3.1, while the incidental
teaching reward item received the lowest average rating of 1.7, indicating that reward provision can be substantially improved. Together, these averages yield
a combined rating of 10.1 of a possible 20 points (50.5 percent).
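The combined rating cited above is simple arithmetic over the four component averages, each on a 1-to-5 scale:

```python
# Arithmetic behind the combined incidental teaching rating cited above.
# Each of the four I.T. components is rated on a 1-to-5 scale, so 20
# points are possible across the four components.

component_averages = {
    "I.T. Environment": 3.1,
    "I.T. Timing": 2.6,
    "I.T. Prompt": 2.7,
    "I.T. Reward": 1.7,
}

combined = round(sum(component_averages.values()), 1)          # 10.1
max_possible = 5 * len(component_averages)                     # 20
percent_of_possible = round(100 * combined / max_possible, 1)  # 50.5

print(combined, percent_of_possible)
```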
Table 31. Fidelity of implementation of Incidental Learning Components by Participating Teachers.

Scores on Incidental Learning Components

Type of Classroom        Room Design   I.T. Environment   I.T. Timing   I.T. Prompt   I.T. Reward
1 - Regular              69            2                  3             3             1
2 - Regular              61            X                  X             X             X
3 - Support Inclusive    64            X                  X             X             X
4 - Regular              58            1                  1             1             1
5 - Regular              65            4                  1             4             2
6 - Support Inclusive    54            4                  3             4             3
7 - Support Inclusive    54            3                  3             3             1
8 - Support Inclusive    68            X                  X             X             X
9 - Support Inclusive    67            5                  3             1             1
10 - Support Inclusive   60            X                  X             X             X
11 - Self-Contained      32            3                  2             4             0
12 - Support Inclusive   60            1                  1             1             1
13 - Self-Contained      31            2                  3             3             1
14 - Self-Contained      58            X                  X             X             X
15 - Support Inclusive   56            5                  5             1             2
16 - Support Inclusive   69            4                  2             3             4
17 - Regular             63            3                  4             4             3
Average                  58.2          3.1                2.6           2.7           1.7
II. Three-Month Professional Development Follow-up Assessment
2.j. Use of knowledge and skills by GraduateFIRST training participants in implementing evidence-based practices, as
measured by three-month follow-up evaluations.
As stated in Performance Measure 1.c, several GraduateFIRST professional training opportunities were provided during Year 2: GraduateFIRST
Fundamentals Module: Professional Development, Influencer Training, Best Practices Forum, School Team and Team Leader monthly meetings, and
monthly Coach professional development meetings.
As previously stated, during December 2013, Elementary, Middle, and High School Best Practices Forums were held for 325 administrators,
teachers, and other school staff (98 – Elementary Forum, 120 – Middle School Forum, and 107 – High School Forum). The agenda focused on
topics such as implementing a mentoring program, accommodations for better access to the standards, family engagement, school attendance, drilling
data for school improvement, target behaviors functional-based toolbox, closing the hurdles to career and college, evidence-based practices to
improve student outcomes, co-teaching, implementing a successful GraduateFIRST program, using schoolwide PBIS to improve graduation rate,
and mapping a positive future.
An electronic follow-up survey was administered in March 2014 to all of the attendees with email addresses to determine if the training received in
December 2013 was effective in helping facilitate GraduateFIRST efforts. SurveyMonkey was utilized to conduct the survey by administering three
waves of the survey. A 45 percent response rate was obtained.
The March electronic survey gathered follow-up information regarding the knowledge gained, the implementation of that knowledge, and the impact
(usefulness) of the new knowledge. Tables 32-33 below summarize the percentage of participants who rated the professional development based
upon the new knowledge it provided, their follow-up implementation of the knowledge, and the success of that implementation.
Table 32. Participants Indicating They Gained Some or a lot of Knowledge at the Best Practice Forum Session.

Name of General Session Trainings                                                              Gained Some or a lot of Knowledge (Percent)
Energizing for Elementary Intervention                                                         78.3
Transforming Potential Dropouts into Graduates: A Case for Intervening in Elementary Grades    77.0
I’m Trying! It Just Doesn’t Look Like it!                                                      73.0
Table 33. Participants Reporting Using Some or a lot of the New Knowledge in their School from the Best Practice Forum Session, as well as
Those Who Found the Implementation to be Successful.

Name of General Session Trainings                                                              Used Some or a lot of        Successful or Very
                                                                                               Knowledge (Percent)          Successful (Percent)
Energizing for Elementary Intervention                                                         79.1
Transforming Potential Dropouts into Graduates: A Case for Intervening in Elementary Grades    77.8
I’m Trying! It Just Doesn’t Look Like it!                                                      63.0
All general sessions were rated in the high 70 percent range, with the exception of “I’m Trying! It Just Doesn’t Look Like it!”, for which more than a
third of the attendees reported that it was of little or no help.
The Elementary, Middle, and High School Forums had 33 concurrent sessions during the three days (some of the 33 included repeat sessions given at
a prior time) following the general sessions. Participants selected the concurrent sessions that they wanted to attend. These concurrent sessions
were given varying ratings from the participants. Table 34 below provides an overview of the session ratings.
Table 34. The Percent of Ratings that are Favorable for Gaining Knowledge, Using the Knowledge, and the Successful Use of the
Knowledge.

Question                                                                              Low End of     Average   High End of
                                                                                      Rating Range             Rating Range
To what extent did you gain knowledge from attending this session?                    66.7           75.0      88.0
To what extent were you able to use the knowledge obtained?                           56.7           68.4      84.6
To what extent was the knowledge that you used successful with the target students?   50.0           65.9      84.6
Not included in the table above is the fact that 73 of 77 participants reported a rating of 2 or higher on incorporating engagement practices in their
school activities, while 75 of 84 reported a rating of 2 or higher on implementing and monitoring the school plan. Thus, a total of 92.0 percent of the
Forum participants reported that the knowledge and skills gained in professional development were helpful or very helpful and that they had
incorporated them into their practices.
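One way to obtain the roughly 92 percent figure above is to pool the favorable counts from the two questions before dividing. The sketch below shows this pooled calculation; the exact quotient is 91.9 percent, which the report rounds to 92.0.

```python
# Sketch of the pooled favorable-rating percentage described above:
# 73 of 77 ratings on one question and 75 of 84 on the other combine
# into a single overall rate.

groups = [(73, 77), (75, 84)]  # (favorable ratings, respondents) per question

favorable = sum(f for f, _ in groups)    # 148
respondents = sum(n for _, n in groups)  # 161
overall = round(100 * favorable / respondents, 1)

print(overall)  # 91.9
```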
2.k. Use of knowledge and skills by College and Career Readiness training participants in implementing evidence-based
practices, as measured by three-month follow-up evaluations from the Transition Institute.
As stated earlier in this Report, GaDOE held a Transition Institute in September 2013 which covered the topic areas of strategies, resources, and tools
for student centered transition. The two-day Transition Institute consisted of two keynote addresses and three concurrent sessions covering four topic
areas each day. In addition, participants created draft Action Plans for their district.
A three-month follow-up survey was conducted of the Institute attendees to determine if the training they received was effective in supporting
enhanced transition efforts at their schools. The electronic follow-up survey was administered in March 2014 to all of the attendees with email
addresses. SurveyMonkey was utilized to conduct the survey by administering three waves of the survey. A 40.0 percent response rate was obtained.
Ten of the 18 respondents indicated that the knowledge and skills they obtained in the training were of some or a lot of help in implementing
evidence-based practices.
A different keynote session was presented at the beginning of each day of the two-day Institute, before the participants went to concurrent sessions.
The two keynote addresses covered strategies, resources, and tools for student-centered transition. Table 35
below provides the feedback from participants for each keynote and for both combined.
Table 35. The Percent of Ratings that are Favorable for Gaining Knowledge, Using the Knowledge, and the Successful Use of the
Knowledge Gained from the Keynote Addresses.

Response Item                                                                          What’s in your Toolkit:   Strategies and Resources    Average for
                                                                                       Student Centered          for Extending Research      both sessions
                                                                                       Transition?               to Practice?
To what extent did you gain knowledge from attending this session?                     70.0                      75.0                        72.5
To what extent were you able to use the knowledge obtained?                            70.0                      65.0                        67.5
To what extent was the knowledge that you used successful with the target students?    65.0                      70.0                        67.5
The Institute had three concurrent sessions following the keynote sessions. Participants selected the concurrent session that they wanted to attend.
The following is a summary of the presentations in the three concurrent sessions.
Presentations in Concurrent Session 1:
What Do You Want To Be When You Grow Up? Career Guidance and Counseling for SWD
General Supervision: A Brave New World
Improving What We Do: Using the On-line Team Planning Tool
Assessment – A Fluid Process: One District’s Plan
Presentations in Concurrent Session 2:
Transition Coalition Self Study: Moving from Compliance to Quality Transition Planning
Family Engagement and Transition Assessment
Transition: School-to-Work, the Georgia Vocational Rehabilitation Agency
Career Pathways – A Student’s Pathway to Life
Presentations in Concurrent Session 3:
Innovative Practices for Effective Transition Planning for High School Students with Disabilities
Building Self-Determination Skills using Student-Led IEP Initiatives
Planning and Evaluating Program Improvements
Transition Assessment in Planning for Students with Most Significant Cognitive Disabilities
Table 36 below provides an overview of the session ratings. About half of the participants in the first concurrent
session gave a favorable rating, while the other half were not favorably disposed toward the concurrent session offerings. About two-thirds of those
participating attended either the “What Do You Want To Be When You Grow Up? Career Guidance and Counseling for SWD” or the “Improving What We
Do: Using the On-line Team Planning Tool” concurrent session. The second concurrent session attained ratings of about 70 percent, while in the
third concurrent session about two-thirds of the participants reported making gains and the remaining third reported little or no gain
from attending.
Table 36. The Percent of Ratings that are Favorable for Gaining Knowledge, Using the Knowledge, and the Successful Use of the
Knowledge by Concurrent Session Attendees.

Response Item                                                                          First Concurrent   Second Concurrent   Third Concurrent
                                                                                       Session            Session             Session
To what extent did you gain knowledge from attending this session?                     52.6               78.9                68.4
To what extent were you able to use the knowledge obtained?                            52.6               63.2                55.6
To what extent was the knowledge that you used successful with the target students?    52.6               68.4                63.2
2.l. Use of knowledge and skills by College and Career Readiness training participants in implementing evidence-based
practices, as measured by three-month follow-up evaluations from the Hitting the Mark training
As discussed earlier, the Georgia Department of Education (GaDOE), Division for Special Education State Personnel Development Grant (SPDG)
conducted Hitting the Mark professional development during the fall of 2013. Participants were asked to provide feedback regarding the usefulness
of this professional development through a SurveyMonkey feedback form.
The follow-up feedback form was administered to the professional development attendees to determine if the training they received was effective in
providing enhanced transition efforts at their schools. The electronic follow-up survey was administered in March 2014 to all of the attendees with
email addresses. SurveyMonkey was utilized by administering three waves of the survey. A 42.4 percent response rate was obtained (N=59) with
57.6 percent of the respondents indicating that the training provided some or significant help in assessing whether the required components are in the
transition plan.
Table 37. The Percentage Response Rate to Five Response Items Regarding the Structuring of Transition Plans Covered in the Hitting the
Mark Professional Development.

Response ratings: 4 – Before attending the training, we had this knowledge and were using it when developing our transition plans;
3 – This training provided us a little additional knowledge that helped improve our transition plans;
2 – This training provided us some additional knowledge that helped improve our transition plans;
1 – This training provided significantly more knowledge that helped improve our transition plans.

Response Item                                                              4      3      2      1
Reliably assess whether the required components are located in the
“preferences” section of the transition plan.                             25.4   16.9   37.3   20.3
Reliably assess whether or not appropriately written post-secondary
goals are on a transition plan.                                           20.3   22.0   28.8   28.8
Reliably assess whether or not appropriately written annual measurable
transition goals are in the plan, that they can be reached within one
year, and support the post-secondary goals throughout the plan.           16.9   20.3   28.8   33.9
List and explain what constitutes valid evidence that the student was
invited to the IEP transition meeting, that, if appropriate, an agency
representative was invited, and that the parent gave prior consent to
allow the agency representative to attend the meeting.                    44.1   10.2   27.1   18.6
Find submitted documents on the portal. (40 percent of the respondents
indicated that this did not apply to their work; percentages in the
cells are based upon the remaining 60 percent, who were treated as
the total population.)                                                    20.3   11.9   13.6   15.3
In addition to the five items above, participants were asked how compliant their current transition plans are. About 83 percent of the respondents indicated
that their plans were all fully compliant, while about 17 percent reported that some of their plans were compliant. Another question asked about the
implementation of effective transition strategies. Again, the majority (55.9 percent) reported that they were using fully effective transition strategies,
while 42.2 percent reported using some effective transition strategies and the remaining 1.7 percent reported using a few effective transition
strategies. However, all participating districts received less than 100 percent compliance as a result of a DOE transition plan review; after participating in
the “Hitting the Mark” training, all districts were at 100 percent compliance. In addition, no districts have met the target for Indicator #14 in the SPP/APR.
2.m. Effectiveness of the Emory Autism incidental teaching training, as measured by three-month follow-up evaluations
A three-month follow-up survey of the participants in the Round 1 (Social Communication) and Round 2 (Promoting Social Skills)
trainings was conducted to determine whether the training they received was effective in enhancing teaching efforts at their schools. The electronic follow-up
survey was administered in March 2014, in three waves through SurveyMonkey, to all of the attendees with email addresses. A 48.0 percent
response rate was obtained (24 of 50 attendees responded). All of the respondents reported that incidental teaching is effective (37.5%) or very
effective (62.5%); none reported that it had little or no effect.
2.n. Use of knowledge and skills by parents participating in Emory Autism Early Intervention sessions, as measured by
three-month follow-up incidental teaching evaluations
A three-month follow-up survey of the parent participants in the Autism Early Intervention Project was conducted to determine whether the training they
received was effective in enhancing child training efforts at home. The electronic follow-up survey was administered in March 2014, in three waves
through SurveyMonkey, to all of the parents with email addresses. A 25.0 percent response rate was obtained (5 of 20 attendees responded). All of the
respondents reported that incidental teaching is effective (40.0%) or very effective (60.0%). As with the teachers and providers, none of the parents
reported that it had little or no effect.
III. Assessment of Coaching Effectiveness
2.o. Effective and timely coaching – GraduateFIRST
During Year 2, the participating GraduateFIRST schools continued to receive coaching from both GraduateFIRST Collaboration Coaches and
School Improvement Specialists. This coaching is based on the core components of GraduateFIRST. The 15 regional Collaboration Coaches
submitted monthly logs of their coaching activities within their assigned schools. Table 38 below shows the types of support provided by the
Collaboration Coaches from September 2013 through February 2014. As can be seen, the coaches spent the largest shares of their time on
Team Leader meetings/support (20.2 percent of all supports), School Team meetings (19.1 percent), Collaboration Coach
meetings (15.8 percent), and participating school visits (14.0 percent).
Table 38. Type of Support Provided by Collaboration Coaches from September 2013 through February 2014 (monthly log entries aggregated across the six months).

Type of Support | Total | % of Total
Parent and Community Engagement | 6 | 1.1
Collaboration Coach Meeting | 87 | 15.8
Conference Call | 7 | 1.3
School Team Meeting | 105 | 19.1
School Visit | 77 | 14.0
System/School Meeting | 39 | 7.1
Team Leader Meeting/Support | 111 | 20.2
Face to Face Professional Development | 67 | 12.2
Web-based Professional Development | 1 | 0.2
DOE Meeting | 20 | 3.6
RESA Meeting | 15 | 2.7
School Improvement Specialist Meeting | 14 | 2.6
Total Responses | 549 | 100.0
Table 39 shows the number of coaching contacts by type from September 2013 through February 2014. Of the total coaching contacts, 227 (37.8
percent) were with School Team Leaders, 205 (34.1 percent) were with school administrators, 152 (25.3 percent) were with Collaboration Coaches,
125 (20.8 percent) were with School Teams, and 118 (19.6 percent) were with central office personnel. The coaching contacts with school
administrators and Team Leaders are critical to reaching the goal of implementation sustainability within the participating schools. In
addition, the time spent with other Collaboration Coaches helped to ensure consistent implementation of GraduateFIRST within the participating
schools throughout the state.
Table 39. Types of Coaching Contacts from September 2013 through February 2014 (a single contact may involve more than one type, so percentages sum to more than 100).

Type of Coaching Contacts | Total | % of Total Contacts
District wide Audience | 12 | 2.0
National Audience | 7 | 1.2
School Team | 125 | 20.8
Central Office Personnel | 118 | 19.6
School Administrator | 205 | 34.1
Team Leader | 227 | 37.8
Teacher | 113 | 18.8
Graduation Coach | 34 | 5.7
DOE | 92 | 15.3
GLRS | 95 | 15.8
Statewide Audience | 45 | 7.5
Collaboration Coaches | 152 | 25.3
RESA | 46 | 7.7
School Improvement Specialist | 67 | 11.1
Other (please specify) | 162 | 27.0
Total Responses | 601 | 100.0
February
January
December November October
1
1
21
3
0
20
2
0
15
4
6
20
12
24
39
14
8
17
23
1
22
23
26
14
5
16
17
3
18
34
31
23
6
28
20
32
23
13
28
8
36
7
240
28
9
60
During March 2013 and March 2014, an electronic feedback form (survey) was administered to gather information from School Team Leaders regarding
how helpful coaching was during Years 1 and 2. Table 40 below provides the percentage of Team Leader respondents who rated the coaching
received as Good or Outstanding. An overall response rate of 56 percent was obtained for the two-wave feedback effort during March 2013, and a
53.0 percent rate was obtained during March 2014 when the second-year feedback form was administered. Ratings on the vital behaviors varied in Year 2,
with implementing and monitoring the school Action Plan increasing from 65.3 percent to 81.0 percent.
As can be seen in Table 40, during Year 1 between 60 and 70 percent of the respondents indicated that the coaching received was Good to Outstanding,
an average of 65.5 percent. During Year 2, the ratings ranged from about 60 percent to 85 percent, an average of 76.2 percent. One can
observe from Table 40 that the percentages improved in the second year for all vital behaviors with the exception of evaluating the effectiveness of the
school Action Plan, which remained about the same at near 60 percent. It may be that once the plan is developed and
implemented, its effectiveness is somewhat self-evident as indicators are gathered and reviewed, but more formal evaluation tools may not have
been developed and/or implemented.
Table 40. Percentage of Team Leaders Rating Coaching as Good (Rating of 3) or Outstanding (Rating of 4).

Coaching Vital Behavior | Year 1 | Year 2
Establishing an effective School Team | 70.1 | 85.3
Collecting and analyzing data | 64.5 | 79.8
Using data to identify and prioritize areas of need for intervention | 70.2 | 84.8
Using data to develop a school Action Plan and select target students | 67.6 | 78.1
Implementing and monitoring the school Action Plan with fidelity | 65.3 | 81.0
Evaluating the effectiveness of the school Action Plan | 61.5 | 60.4
As can be seen in Table 41 below, about 85 percent of the Team Leaders rated School Team meetings and school visits by the Collaboration Coach as
very to extremely helpful. More than 60 percent also rated the remaining three methods as very to extremely helpful.
Table 41. Percent of Participants Rating the Methods of Coaching, Providing Information, and Support as Very or Extremely Helpful.

Method of Providing Coaching, Information, and Support | Percent Rating the Method Very or Extremely Helpful
School Team meetings | 85.7
School visits by the Collaboration Coach | 84.6
District School Team Leader Meetings | 68.6
December Best Practices Forum | 69.0
GraduateFIRST website | 63.2
2.p. Effective and timely coaching by CCaRS and/or Cores in the College and Career Readiness Project
During Year 1, two Cores and eight part-time CCaRS were selected. The Cores and CCaRS began coaching, using a variety of onsite and electronic
trainings, during the winter and spring of the 2012-2013 school year. This support by the Cores and CCaRS continued in Year 2 of the SPDG for the 15
schools participating in the Initiative 2 CCaR/Transition project.
The eight CCaRS submitted monthly logs of their coaching activities within their assigned schools. Table 42 below shows the type of support
provided by the CCaRS from September 2013 through February 2014. As can be seen, the coaches spent 18.1 percent of their time with conference
calls, 16.6 percent in school meetings/visits, 13.5 percent in training sessions, and 10.4 percent in district office meetings.
Table 42. Types of Support Provided by CCaRS from September 2013 through February 2014 (monthly log entries aggregated across the six months).

Type of Coaching Activity | Total | % of Total
Conference Call | 35 | 18.1%
School Meeting/Visit | 32 | 16.6%
District Office Meeting | 20 | 10.4%
Team Leader Meeting/Support | 19 | 9.8%
Training Session | 26 | 13.5%
GaDOE Meeting | 13 | 6.7%
Meeting with NSTTAC | 1 | 0.5%
Meeting with the University of Kansas | 17 | 8.8%
Other (please specify) | 30 | 15.5%
Total Responses | 193 | 100.0%
Table 43 shows the number and type of coaching contacts from September 2013 through February 2014. Of the total coaching contacts, 64 (23.5
percent) were with teachers, 36 (13.2 percent) with superintendents/district personnel, 33 (12.1 percent) with transition specialists, and 31 (11.4
percent) with school principals. The coaching contacts with superintendents/district personnel and principals are critical to reaching
the goal of implementation sustainability within the participating schools.
Table 43. Types of Coaching Contacts from September 2013 through February 2014.

Type of Coaching Contacts | Total | % of Total
Superintendent/District Office | 36 | 13.2%
School Principal | 31 | 11.4%
Teachers | 64 | 23.5%
Transition Specialist | 33 | 12.1%
Counselors | 20 | 7.4%
Parents | 9 | 3.3%
Other (please specify) | 79 | 29.0%
Total Responses | 272 | 100.0%
To assess the effectiveness of coaching during Year 2, a follow-up feedback survey was administered during March 2014 to School Team
members in the 15 participating schools. This feedback effort gathered information covering the type of coaching provided and the
effectiveness of the coaching. Using a three-wave approach, a 42.4 percent response rate was obtained. Table 44 below shows that 78.0 percent of
the respondents indicated that they had received some (39.0 percent) or considerable (39.0 percent) coaching and/or other follow-up support to help in the development of
compliant transition plans and/or effective transition strategies.
Table 44. Type of Coaching Received.

Type of Coaching Received | Percent
I have not received any coaching and/or other follow-up support. | 16.9
I received some materials or indirect support, but no personal contact from coaches. | 5.1
I have received some coaching and/or other follow-up support to help me in developing compliant transition plans and/or effective transition strategies. | 39.0
I have received considerable coaching and/or other follow-up support to help me in developing compliant transition plans and/or effective transition strategies. | 39.0
A second question asked how useful the coaching follow-up and support were in helping develop compliant transition plans and/or
effective transition strategies. Table 45 below provides the responses received from School Team members within the 15 participating
schools. Almost half (47.5 percent) of the respondents indicated that the coaching or other follow-up support had been very helpful in their
transition work. Of the total respondents, 22 percent had not received any coaching and/or follow-up support.
Table 45. Usefulness of Coaching and Other Support Received.

Usefulness of Coaching Received | Percent
I have not received any coaching and/or other follow-up support. | 22.0
The follow-up coaching or other follow-up support provided was slightly helpful in developing compliant transition plans and/or effective transition strategies. | 3.4
Coaching or other follow-up support has been helpful in developing compliant transition plans and/or effective transition strategies. | 27.1
Coaching or other follow-up support has been very helpful in developing compliant transition plans and/or effective transition strategies. | 47.5
From the two tables above, it is apparent that when coaching was provided, it was successful in helping to develop compliant transition plans
and/or effective transition strategies. The fact that over 20 percent of the respondents did not receive any coaching and/or other follow-up
support may reflect changes made in the leadership teams after the Institute.
2.q. Effective and timely coaching by the Emory Autism Center and Initiative 3 Coach in the Autism Early
Intervention Project
During Year 2, coaching was provided by the Emory Autism Center staff as well as by an Initiative 3 Autism Coach. "Hands-on" training was
provided in each demonstration classroom on the day following each of the three Rounds of incidental teaching trainings, and at least one more
time before the next training cycle began. Coaching sessions included modeling by one or more of the Emory trainers/coaches and opportunities
for trainees to practice new techniques, along with abundant positive, behavior-specific feedback. Coaching notes were prepared based on the
coaches' observations of how new skills were being implemented. An electronic survey of teachers and providers was conducted concerning
the effectiveness of the coaching they received.
The electronic follow-up survey was administered in March 2014, in three waves through SurveyMonkey, to all of the teachers and providers
with email addresses. A 48.0 percent response rate was obtained (24 of 50 attendees responded). Results of the survey found that 83.3 percent
of the respondents reported the coaching was effective (50%) or very effective (33.3%). Only one respondent reported that the follow-up was
not helpful, and three reported that it was slightly helpful.
2.r. Participating teachers identified through the TKES reporting effective and timely assistance and coaching
This performance measure relates to the federal competitive SPDG priority and is not scheduled for implementation until Year 3 of the SPDG—see
Performance Measure 1.h.
Note: Performance Measures 2.s. – 2.v. below report aggregate baseline data for Initiative 1 GraduateFIRST schools in
graduation rates, dropout rates, attendance, and achievement. Data are available for individual GraduateFIRST schools and will
be used to compare progress in Years 3, 4, and 5 of the SPDG.
2.s. Improvement of students within Initiative 1 GraduateFIRST schools in graduation rates
Georgia’s Governor’s Office of Student Achievement (GOSA) gathers numeric data on each district and school in the state. Among the indicators
gathered is data on graduation. The 2012-13 data is available for use, and Table 46 below provides the graduation rates for students with and
without disabilities in the participating high schools. Of the 28 participating high schools, all 28 had data for students without disabilities (SWOD),
but only 18 had data for students with disabilities (SWD). As can be observed from the Table below, SWD are graduating at a rate a little below 40
percent, while SWOD are graduating at a rate slightly above 75 percent in the participating GraduateFIRST high schools.
Table 46. Rates of Graduation for Students with and without Disabilities in Participating GraduateFIRST High Schools.

Groups | SWD | SWOD
Only schools with data for both SWD and SWOD (N=18) | 38.39 | 77.87
All schools participating with graduation data | 38.39 | 76.18
2.t. Improvement of students within Initiative 1 GraduateFIRST schools in reducing the dropout rate
The state gathers dropout data and uses a specific formula and methodology for calculating it. This information is captured in the GOSA system and
can provide longitudinal data for trend assessment. Longitudinal data will be available next year for trends statewide and for participating
GraduateFIRST schools.
2.u. Improvement of students within Initiative 1 GraduateFIRST schools in attendance
Among the indicators gathered by the GOSA is data on absenteeism. The 2012-13 data was available for use as a baseline for the GraduateFIRST
participating schools. Table 47 below provides the levels of absenteeism for students with and without disabilities, by categories based upon the
number of days absent, for participating GraduateFIRST schools. As can be observed from the Table percentages, elementary schools experience
less absenteeism than middle and high schools. Table 47 compares students with disabilities (SWD) against those without disabilities (SWOD).
Combining high school and middle school data finds that 18.28 percent of SWD are absent more than 15 days, while 12.51 percent of SWOD are
absent more than 15 days. This means that SWD are absent more than 15 days at 1.46 times the rate of SWOD, or 46 percent greater absenteeism.
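The combined-rate comparison above can be checked directly; a minimal sketch, using the middle and high school rates from Table 47 and the simple two-level average the report uses:

```python
# Cross-check of the "more than 15 days absent" comparison.
# Middle and high school percentages are taken from Table 47.
swd_middle, swd_high = 14.98, 21.58      # students with disabilities
swod_middle, swod_high = 9.93, 15.09     # students without disabilities

swd_combined = (swd_middle + swd_high) / 2    # combined SWD rate
swod_combined = (swod_middle + swod_high) / 2  # combined SWOD rate
ratio = swd_combined / swod_combined           # SWD-to-SWOD ratio

print(round(swd_combined, 2))   # 18.28
print(round(swod_combined, 2))  # 12.51
print(round(ratio, 2))          # 1.46
```

The ratio of 1.46 corresponds to 46 percent greater absenteeism for SWD, not 146 percent.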
Table 47. Overview of Student Level of Absenteeism for Students with and without Disabilities in Participating Schools.

Group | School Level | Average Student Population per School | Percent of Students with 5 Days or Fewer Absent | Percent of Students with 6 to 15 Days Absent | Percent of Students with More Than 15 Days Absent
SWD | Elementary School | 96.70 | 51.85 | 37.39 | 10.75
SWD | Middle School | 113.13 | 46.43 | 38.59 | 14.98
SWD | High School | 184.18 | 45.38 | 33.05 | 21.58
SWOD | Elementary School | 713.48 | 57.88 | 36.12 | 6.00
SWOD | Middle School | 788.28 | 55.08 | 34.98 | 9.93
SWOD | High School | 1314.39 | 49.81 | 35.10 | 15.09
A review of the information in Table 48 below finds that SWD are more likely to be absent for more than 15 days than their peers without disabilities
(SWOD) in the participating GraduateFIRST schools. This holds true for elementary, middle, and high school. They, however, are less likely to
have absences of five days or less. This appears to hold true of elementary school through high school level students with disabilities.
Table 48. Differences Between Students with and without Disabilities on Absentee Levels (each cell is the SWOD percentage minus the SWD percentage from Table 47).

School Level | % Difference for Students with 5 Days or Fewer Absent | % Difference for Students with 6 to 15 Days Absent | % Difference for Students with More Than 15 Days Absent
Elementary School | 6.0 | -1.3 | -4.8
Middle School | 8.7 | -3.6 | -5.0
High School | 4.47 | 2.1 | -6.5
2.v. Improvement of students within Initiative 1 GraduateFIRST schools in achievement
The state’s assessments (CRCT, CRCT-M, Alternate Assessment and EOCTs) are designed to measure how well elementary and middle school
students acquire the skills and knowledge described in the state-mandated content standards in reading, English/language arts, mathematics, science
and social studies. The assessments yield information on academic achievement at the student, class, school, system, and state levels. This
information is used to diagnose individual student strengths and weaknesses as related to the instruction of the state standards, and to gauge the
quality of education throughout Georgia.
GraduateFIRST high schools participate in the above-mentioned assessments, and their data are available by course of study. Table
49 below provides a comparison of students with disabilities to all students. The right-hand column of the Table enables the reader to compare the
Meets or Exceeds Criterion percentage of the two groups of students. One can observe from the Table that in all courses, students with disabilities did not meet
the criterion as well as all students in the GraduateFIRST participating high schools. In "Economics/Business/Free Enterprise" testing, for example,
44.6 percent of students with disabilities met or exceeded the criterion, compared with 69.4 percent of all students. Other academic achievement
areas are available for comparison.
Table 49. End-of-Course Testing Results (2013) for the Participating High Schools Comparing the Percentage of all Students to those with
Disabilities.

Student Group | Subject Area | Average Number Tested | Does Not Meet Criterion | Meets Criterion | Exceeds Criterion | Meets or Exceeds Criterion
All Students | 9th Grade Literature and Composition | 94.21 | 25.18 | 52.22 | 25.95 | 78.17
Students with Disabilities | 9th Grade Literature and Composition | 10.25 | 57.32 | 38.84 | 5.76 | 44.60
All Students | Algebra I | 2.50 | 36.00 | 64.00 | 0.00 | 64.00
Students with Disabilities | Algebra I | 0.50 | -- | -- | -- | --
All Students | American Literature and Composition | 79.72 | 19.27 | 61.22 | 20.62 | 81.84
Students with Disabilities | American Literature and Composition | 7.19 | 44.19 | 52.58 | 7.77 | 60.35
All Students | Biology | 86.35 | 38.86 | 39.32 | 25.06 | 64.38
Students with Disabilities | Biology | 9.32 | 66.42 | 25.92 | 10.33 | 36.25
All Students | CCGPS Coordinate Algebra | 140.66 | 76.67 | 23.55 | 2.56 | 26.11
Students with Disabilities | CCGPS Coordinate Algebra | 15.40 | 95.22 | 7.00 | 4.00 | 11.00
All Students | Economics/Business/Free Enterprise | 88.86 | 32.14 | 38.07 | 31.32 | 69.39
Students with Disabilities | Economics/Business/Free Enterprise | 6.79 | 60.69 | 29.20 | 15.37 | 44.57
All Students | Geometry | 70.00 | 32.29 | 61.00 | 9.40 | 70.40
Students with Disabilities | Geometry | 7.33 | 65.00 | 33.67 | 4.00 | 37.67
All Students | Mathematics-1 | 9.46 | 67.36 | 32.07 | 5.33 | 37.40
Students with Disabilities | Mathematics-1 | 1.61 | 94.33 | 8.50 | -- | 8.50
All Students | Mathematics-2 | 84.54 | 59.23 | 37.53 | 8.31 | 45.85
Students with Disabilities | Mathematics-2 | 8.67 | 79.59 | 21.83 | 4.00 | 25.83
All Students | Physical Science | 63.22 | 30.54 | 37.39 | 35.83 | 73.22
Students with Disabilities | Physical Science | 7.33 | 56.52 | 33.09 | 16.29 | 49.39
All Students | US History | 80.76 | 50.76 | 30.31 | 21.11 | 51.42
Students with Disabilities | US History | 7.44 | 64.78 | 23.43 | 17.08 | 40.51
Table 50 below provides a comparison of students with disabilities to all students in the participating GraduateFIRST middle schools. The right-hand
column of Table 50 enables the reader to compare the Meets or Exceeds Criterion percentage of the two groups of students. One can observe
that in all subjects, students with disabilities again did not meet the criterion as well as all students in the GraduateFIRST participating middle
schools.
Table 50. Criterion Test Results for Participating School 8th Grade Students Comparing all Students to Those with Disabilities.

Student Group | Subject Area | Average Number Tested | Does Not Meet | Meets | Exceeds | Meets or Exceeds
All Students | English Language Arts | 233.61 | 10.48 | 58.79 | 30.79 | 89.58
Students with Disabilities | English Language Arts | 18.03 | 30.44 | 64.48 | 7.88 | 72.36
All Students | Mathematics | 233.36 | 19.67 | 59.45 | 22.26 | 81.71
Students with Disabilities | Mathematics | 17.39 | 46.64 | 49.52 | 8.82 | 58.34
All Students | Reading | 232.24 | 4.74 | 62.48 | 33.15 | 95.64
Students with Disabilities | Reading | 18.06 | 16.96 | 76.24 | 10.84 | 87.08
All Students | Science | 241.15 | 36.21 | 49.97 | 14.47 | 64.44
Students with Disabilities | Science | 25.21 | 75.37 | 24.10 | 4.30 | 28.40
All Students | Social Studies | 240.06 | 32.09 | 44.67 | 23.97 | 68.64
Students with Disabilities | Social Studies | 25.15 | 68.43 | 28.11 | 7.67 | 35.77
Table 51 below provides a comparison of students with disabilities to all students in the participating GraduateFIRST elementary schools. The right-hand
column of Table 51 enables the reader to compare the Meets or Exceeds Criterion percentage of the two groups of students. One can observe
from the Table that in all subjects, students with disabilities did not meet the criterion as well as all students in the GraduateFIRST participating elementary
schools.
Table 51. Criterion Test Results for Participating School 5th Grade Students Comparing all Students to those with Disabilities.

Student Group | Subject Area | Average Number Tested | Does Not Meet | Meets | Exceeds | Meets or Exceeds
All Students | English Language Arts | 127.71 | 7.00 | 63.52 | 29.65 | 93.17
Students with Disabilities | English Language Arts | 9.83 | 36.00 | 60.00 | 11.33 | 71.33
All Students | Mathematics | 126.88 | 7.91 | 54.52 | 37.57 | 92.09
Students with Disabilities | Mathematics | 8.58 | 49.00 | 37.17 | 16.20 | 53.37
All Students | Reading | 128.42 | 4.00 | 70.57 | 25.52 | 96.09
Students with Disabilities | Reading | 10.38 | 20.86 | 75.22 | 11.14 | 86.37
All Students | Science | 135.08 | 26.96 | 46.30 | 26.70 | 73.00
Students with Disabilities | Science | 17.13 | 63.96 | 31.05 | 11.15 | 42.20
All Students | Social Studies | 134.67 | 23.83 | 61.35 | 14.83 | 76.17
Students with Disabilities | Social Studies | 17.00 | 59.43 | 40.18 | 8.00 | 48.18
Student Group
Subject Area
All Students
Students with
Disabilities
2.w. Improved Graduation rates for participating CCaR districts
2.x. Reduced dropout rate for participating CCaR districts
As one outcome indicator of success of the CCaR Initiative, graduation and dropout rates are being gathered. Table 52 shows baseline data for the 15
participating CCaR districts. The average 2012-2013 graduation rate for students with disabilities in the six CCaR districts reporting SWD rates was
30.7 percent, compared to an average of 68.2 percent for all students across the 15 participating CCaR districts. The average 2012-2013 dropout rate
for all students in the 12 reporting CCaR districts was 4.6 percent, compared to 8.7 percent for students with disabilities in the seven reporting CCaR districts.
This baseline for graduation and dropout rates will be used in subsequent years to provide an indicator of changes occurring in the participating CCaR
districts. Beginning with Year 3, a sample of students within participating CCaR districts will also be followed to determine their in-school status and their high school exiting status.
Table 52. 2012-2013 Baseline Graduation and Dropout Rate for Participating CCaR Districts.

District | All Grad. Rate % | SWD Grad. Rate % | All Dropout Rate % | SWD Dropout Rate %
Haralson County | 59.2 | 23.3 | 6.4 | 11.1
Bleckley | 78.7 | NA | 3.1 | NA
Dekalb County | 58.9 | 22.5 | 5.6 | 8.1
Dooly County | 73.6 | NA | 4.7 | NA
Greene County | 60.5 | NA | 5.0 | NA
Habersham County | 77.8 | 43.1 | 2.3 | 5.7
Liberty County | 72.3 | 39.8 | 5.0 | 4.1
Marietta City | 66.2 | 33.3 | 6.4 | 11.7
Marion County | 68.4 | NA | NA | NA
Seminole County | 77.9 | NA | NA | NA
Thomaston-Upson Lee | 65.1 | 22.2 | 4.9 | 9.0
Vidalia City | 74.4 | NA | 2.4 | NA
Wayne County | 69.6 | NA | 5.4 | 11.5
Webster County | 54.8 | NA | NA | NA
Wilcox County | 65.3 | NA | 3.5 | NA
State Schools:
Georgia Area School for the Deaf | NA | NA | NA | NA
Atlanta Area School for the Deaf | NA | NA | NA | NA
Georgia Area School for the Blind | NA | NA | NA | NA

NA = Too few students (<10)
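The district averages cited in the narrative can be verified from the rates in Table 52; a quick cross-check (values copied from the table):

```python
# Cross-check of the average graduation rates for the CCaR districts.
# Six districts reported 2012-13 SWD graduation rates; all 15 districts
# reported all-student rates.
swd_grad = [23.3, 22.5, 43.1, 39.8, 33.3, 22.2]
all_grad = [59.2, 78.7, 58.9, 73.6, 60.5, 77.8, 72.3, 66.2,
            68.4, 77.9, 65.1, 74.4, 69.6, 54.8, 65.3]

print(round(sum(swd_grad) / len(swd_grad), 1))   # 30.7 (SWD average)
print(round(sum(all_grad) / len(all_grad), 1))   # 68.2 (all-student average)
```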
2.y Students with disabilities exiting high school and enrolled in a postsecondary education or training program
Data/information related to this performance measure were not available at the time of this Report. Indicator 14C of Georgia’s State Performance
Plan (SPP) measures the percentage of youth who are no longer in secondary school, had IEPs in effect at the time they left school, and were enrolled
in higher education or in some other postsecondary education or training program, or competitively employed or in some other employment, within one year
of leaving high school. Table 53 below provides statewide baseline data gathered in 2011-2012 and 2012-2013 regarding the post-school status of
students with disabilities graduating and exiting high school. Within one year of leaving high school, a total of 77.4 percent were enrolled in a higher
education program, competitively employed, enrolled in some other postsecondary education or training program, or otherwise employed, compared
to 76.3 percent in 2011-2012. A total of 22.6 percent appeared to be unengaged, compared to 23.7 percent during 2011-2012. When reviewing the
percentage of graduates against the percentage of responders by disability group, gender, ethnicity, and Limited English Proficiency (LEP) status, the data
were relatively equal for all groups.
Table 53. Status of Students with Disabilities Within One Year of Leaving High School – Outcome Measure.

Post Secondary Options | Percent 2011-2012 | Percent 2012-2013
Enrolled in higher education program. | 24.7 | 24.8
Enrolled in higher education or competitively employed. | 52.5 | 50.9
Enrolled in higher education, or in some other postsecondary education or training program; or competitively employed or in some other employment. | 76.3 | 77.4
2.z. Children with autism receiving supports and services in least restrictive settings
The teachers and staff within the 17 participating classrooms are currently holding spring IEP meetings at which time LRE placements are being
considered. LRE data, consequently, is not available for reporting in this Annual Report, but will be available for the Year 3 Annual SPDG
Performance Report.
2.z.1 Participating teachers identified through the TKES showing improvement in the teacher evaluation system
This performance measure relates to the federal competitive SPDG priority and is not scheduled for implementation until Year 3 of the SPDG.
OMB No. 1894-0003
Exp. 04/30/2014
U.S. Department of Education
Grant Performance Report (ED 524B)
Project Status Chart
PR/Award # (11 characters): ______________________
SECTION A - Performance Objectives Information and Related Performance Measures Data (See Instructions. Use as many pages as necessary.)
3. Project Objective
[ ] Check if this is a status update for the previous budget period.
Program Measure 3: Georgia Projects use SPDG professional development funds to provide follow-up activities designed to sustain the use
of SPDG-supported practices.
3.a. Performance Measure: Percentage of SPDG funds for Initiative 1 – GraduateFIRST used for follow-up activities designed to sustain
GraduateFIRST (i.e., implementation of evidence-based academic, behavior, and dropout prevention strategies based on district/school data).
Measure Type: Project
Target: $835,974.32 ($835,974.32/$1,044,967.93 = 80.0%)
Actual Performance Data: $878,902.52 ($878,902.52/$1,044,967.93 = 84.1%)

3.b. Performance Measure: Percentage of SPDG funds for Initiative 2 – CCaR Project used for follow-up activities designed to sustain CCaR
Project quality and effective transition support practices.
Measure Type: Project
Target: $278,190.85 ($278,190.85/$463,651.42 = 60.0%)
Actual Performance Data: $296,807.50 ($296,807.50/$463,651.42 = 64.0%)

3.c. Performance Measure: Percentage of SPDG funds for Initiative 3 – Autism Early Intervention Project used for follow-up activities designed
to sustain quality and effective transition support practices.
Measure Type: Project
Target: $108,083.68 ($108,083.68/$180,139.48 = 60.0%)
Actual Performance Data: $116,117.55 ($116,117.55/$180,139.48 = 64.5%)
Explanation of Progress (Include Qualitative Data and Data Collection Information)
3.a. - 3.c. Use of SPDG professional development funds to provide follow-up activities designed to sustain the use of
SPDG-supported practices
Dr. Julia Causey, Georgia SPDG Director, has the responsibility for tracking the SPDG funds budgeted and used within Georgia’s SPDG Initiatives. This
responsibility is carried out with the support of the GaDOE Budget Office.
Table 55 below provides a summary of SPDG funds used in professional development follow-up/sustainability activities for each SPDG Initiative
during Year 2 compared to all professional development costs. This summary includes GaDOE in-kind funds that have been devoted to the SPDG.
All GaDOE SPDG personnel costs are paid for by the GaDOE using other funds. This commitment was made so that the maximum amount of
SPDG funds could be devoted to training, coaching, and other follow-up support efforts. All estimated budget expenses are calculated based on
expenditures for the first reporting period of Year 2 (ending March 1, 2014). The reported expenses include obligated funds that we anticipate will
be expended within the next few weeks.
Table 55. Comparison of Follow-Up/Sustainability Funds to All Professional Development (PD) Funds by Georgia SPDG Initiative.

Initiative 1 – GraduateFIRST – Year 2 | Total PD Expenses | Total PD Follow-up Expenses, Year 2 Reporting Period
GaDOE In-Kind Funded Leadership and Key Staff | $122,836.67 | $98,269.34
Director of GraduateFIRST | $35,570.41 | $32,013.37
Statewide GF CC Coordinator | $51,000.00 | $45,900.00
Coaching Contracts | $642,450.18 | $578,205.16
Contracts (P2P) | $41,375.00 | $24,825.00
Travel | $28,827.50 | $25,944.75
Other Expenses (e.g., Indirect Costs, Web-Based Resource Development, Data Management, Web-Based Resources, and Salary Benefits) | $122,908.17 | $73,744.90
Initiative 1 – GraduateFIRST Total Budget and Estimated PD Costs | $1,044,967.93 | $878,902.52
Follow-up Costs (% of the Total): 84.10%
Target for Year 2: 80%
Actual vs. Target: Exceeded Target

Initiative 2 – College and Career Readiness (CCaR) – Year 2 | Total PD Expenses | Total PD Follow-up Expenses, Year 2 Reporting Period
GaDOE In-Kind Funded Leadership and Key Staff | $96,960.00 | $77,568.00
CCaR Transition Specialists & Parent Support Specialist | $194,263.59 | $126,271.33
Contracts (University of Kansas, P2P, GCDD, ASPIRE Mini Grants) | $67,538.75 | $45,508.33
Travel | $33,769.38 | $29,508.41
Other Expenses (e.g., Indirect Costs, Web-Based Resource Development, Data Management, Web-Based Resources, and Salary Benefits) | $59,380.75 | $29,690.38
Initiative 2 – CCaR Total Budget and Estimated PD Costs | $463,651.42 | $296,807.50
Follow-up Costs (% of the Total): 64%
Target for Year 2: 60%
Actual vs. Target: Exceeded

Initiative 3 – Autism Early Intervention – Year 2 | Total PD Expenses | Total PD Follow-up Expenses, Year 2 Reporting Period
GaDOE In-Kind Funded Leadership and Key Staff | $47,520.00 | $38,016.00
Coaching | $10,862.98 | $7,060.94
Contracts (Emory University Autism Center) | $94,416.67 | $56,650.00
Travel | $4,804.58 | $3,122.98
Other Expenses (e.g., Indirect Costs, Web-Based Resource Development, Data Management, Web-Based Resources, and Salary Benefits) | $22,535.25 | $11,267.63
Initiative 3 – Autism Early Intervention Total Budget and Estimated PD Costs | $180,139.48 | $116,117.55
Follow-up Costs (% of the Total): 64.50%
Target for Year 2: 60%
Actual vs. Target: Exceeded
Year 2 SPDG funds used for follow-up/sustainability activities, as required by Program Measure 3, are described below:
3.a. Initiative 1 - GraduateFIRST
The Collaboration Coaches, in collaboration with the School Improvement Specialists, have the primary responsibility to provide and/or arrange for
follow-up coaching, training, and technical assistance for the schools participating in the GraduateFIRST Goal 1 Initiative. The GraduateFIRST
initiative has 1.0 FTE dedicated to overseeing coaching activities related to the implementation of GraduateFIRST. The Director and the Statewide
Lead Collaboration Coach, in collaboration with the SPDG Project Director, are responsible for job descriptions, setting up training for the coaches,
and using fidelity and outcome data to determine further training needs of the coaches.
The major responsibility of the Collaboration Coaches is to provide professional development follow-up in supporting Team Leaders and School
Teams in the implementation of School Action Plans. Within these Action Plans, evidence-based programs and strategies are implemented within
selected areas of engagement (academic, behavioral, cognitive, and/or student engagement). Collaboration Coaches receive multiple sources of
feedback about their coaching, including the GraduateFIRST Team Leader Coaching Evaluation, GraduateFIRST Implementation Scales, records of
application of knowledge and skills, satisfaction survey results, as well as student outcome data. These data are used collectively to provide
Collaboration Coaches feedback about performance and implementation outcomes. School Teams and Team Leaders provide information about the
support and coaching they have received through the Coaching Effectiveness Survey, which was administered in February 2014.
The Coaches report monthly on their activities to the Director and Coach Coordinator, who report to Dr. Julia Causey, Georgia SPDG Director, regarding
their follow-up activities, including the amount of time spent for each major follow-up activity. This reporting is done using a web-based tool with
monthly coaching activities reviewed by the SPDG third party evaluators and reported to Dr. Causey and the Lead Collaboration Coach. In addition,
the GaDOE SPDG staff members provide support for follow-up activities.
Our goal this year was to spend 80 percent of the SPDG GraduateFIRST Goal 1 funds on activities designed to sustain the use of GraduateFIRST. A
total of $1,044,967.93 was spent on all Initiative 1 professional development. An estimated $878,902.52 of that total was spent on coaching
activities designed to sustain evidence-based practices, which is 84.1 percent of the total funds used. The Year 2 goal of 80 percent was exceeded.
3.b. Initiative 2 - College and Career Readiness (CCaR) Project
The CCaRS provide ongoing coaching and support for the 15 participating school districts. One Core Specialist provides ongoing follow-up
coaching, support, and supervision for the CCaRS.
Partners in this initiative include the GaDOE Career, Technical and Agricultural Education Division (CTAE); the Georgia Vocational Rehabilitation
Agency; the Georgia Council for Developmental Disabilities; and Parent to Parent (P2P).
The GaDOE is also partnering with the University of Kansas Transition Coalition and the National Secondary Transition Technical Assistance
Center (NSTTAC) to provide professional development intended to improve transition intervention by focusing on research- and evidence-based
practices. In addition to supporting the fall Transition Institute, NSTTAC provides ongoing follow-up support to the State Schools participating
in the CCaR project.
The University of Kansas Transition Coalition is providing support to the CCaRS through monthly web meetings using Adobe Connect (with video)
and conference call (for audio). These meetings are recorded and posted online for the CCaRS to access at any time. The hour-long, monthly web
meetings consist of presented content and discussion needed to support districts participating in the CCaR Project. During Year 2, the Transition
Coalition staff has provided three web meetings to the CCaRS, including a practice session using the Adobe Connect audio and video features and a
review of the Self-Study process and materials.
In addition to monthly meetings, the Transition Coalition has implemented an online discussion forum so that CCaRS can post questions and
resources to help ensure consistency of the support being provided by the CCaRS to the participating districts, as well as to provide feedback to the
Transition Coalition regarding the consistency of its support to the CCaRS. Monthly meetings are also held by the Transition Coalition staff and
the SPDG/GaDOE Transition Coordinator as another way to monitor fidelity of project implementation.
Our goal this year was to spend 60 percent of the SPDG CCaR funds on activities designed for professional development follow-up. A total of
$463,651.42 was spent on all Initiative 2 professional development; an estimated $296,807.50 of that total (64 percent) was spent on follow-up
activities. The Year 2 goal of 60 percent was exceeded.
3.c. Initiative 3 – Autism Early Intervention
The Emory Autism Center (EAC) staff have provided follow-up coaching and support for the participating teachers in the implementation of
incidental teaching. In addition, an Initiative 3 Coach has been hired and has provided ongoing coaching and support. Our goal this year was to
spend 60% of the SPDG Autism Initiative 3 funds on activities designed for professional development follow-up. An estimated $180,139.48 was
spent on all professional development expenses for Initiative 3. Of this total, approximately $116,117.55 was spent on the above planning and
follow-up activities so that coaching and other support will be in place during the remainder of Year 2 and in Years 3-5. This estimate results
in a 64.5 percent use of the total professional development funds for follow-up. The Year 2 goal of 60 percent was exceeded.
PR/Award # (11 characters): H323A120020
SECTION B - Budget Information (See Instructions. Use as many pages as necessary.)
The Georgia DOE requires that all grant awards be approved through the State Board of Education. Only after Board approval in September can fiscal agents be
notified, allocations made, and budgets created and approved. Because, once again, that process could not be completed until late November 2013, the fiscal agents
for grant initiatives for Year 2 of the SPDG have not drawn down all the funds that they have been allocated.
The Georgia DOE requires expenditures prior to drawdowns to ensure internal control of the grant. By the time their budgets had been approved, it was the end of
the first quarter, which resulted in fiscal agents not technically having funds to spend before December 2013. Approximately a third of the funds had been
requested as of March 31, 2014, but not drawn down, and therefore are not reflected on the Performance Report Cover Sheet (which covers Oct. 1, 2013-March 1,
2014).
Listed below is budget information:
*State Board Approved Allocations for FY 2014 State Personnel Improvement Grant (SPDG) in Special Education

1. Graduate First Program
*Fiscal Agents: Richmond County, Heart of Ga. RESA, DeKalb County, Griffin RESA, Metro RESA, Middle Ga. RESA, Pioneer RESA, Pickens County, Northeast Ga. RESA, Northwest Ga. RESA, Okefenokee RESA, First District, Dougherty County, Muscogee County, West Ga. RESA
FY 2014 SPDG allocations: $53,700; $27,500; $50,000; $75,500; $55,000; $510,984; $49,500; $30,000; $55,000; $17,000; $5,500; $50,000; $3,500
Sub-total (FY 2014): $983,184

GLRS Center / FY 2013
EAST GEORGIA / $49,733
EAST CENTRAL / $48,698
METRO EAST / $49,511
METRO SOUTH / $49,921
METRO WEST / $49,719
MIDDLE GEORGIA / $50,284
NORTH GEORGIA / $646,956
NORTH CENTRAL / $49,188
NORTHEAST / $49,478
NORTHWEST / $49,779
SOUTH CENTRAL / $51,305
SOUTHEAST / $49,645
SOUTHWEST / $49,571
WEST GEORGIA / $50,512
WEST CENTRAL / $48,500
Total (FY 2013): $1,342,800

2. Transition Program
Pioneer RESA (NORTH GEORGIA GLRS): FY 2014 $12,000; FY 2013 $282,000
Sub-total: FY 2014 $12,000; FY 2013 $282,000

3. Early Intervention-Young Children/Autism
a. Metro RESA: FY 2014 $22,800; FY 2013 $123,700
b. Parent Training Information Center (P2P): FY 2013 $22,800
Sub-total: FY 2014 $22,800; FY 2013 $146,500

4. Project Evaluation/Administration
a. LEA Team Leaders Monthly Meeting: $25,000
b. Annual SPDG Project Directors Meeting: $2,000
c. University of Oregon Program Website: $4,000
Sub-total: $31,000 (FY 2014 and FY 2013)

GRAND TOTAL: FY 2014 $1,048,984; FY 2013 $1,802,300
South Central GLRS coach supports South GA GLRS.
East Central, Middle GA, North GA, and SE GA GLRS received $5,000 mini-grants for student-led IEPs.
North GA GLRS provides project coordinator, collaboration coach, coach coordinator, materials and
dissemination, project evaluation, and data collection and analysis.
*Fiscal agents were able to spend two-thirds of their allocation in 2013, and one-third was carryover.
*For 2014, GLRS fiscal agents have currently expended approximately 34% of their allocations, as
indicated in the most recent expenditure reporting (as of March 31, 2014) below:
Fiscal Agent / % of Allocation expended
DeKalb County/30.65
Dougherty County/23.37
Muscogee County/40.24
Pickens County/53.39
Richmond County/25.34
NW Georgia RESA/61.29
Pioneer RESA/28.13
Metro RESA/41.3
NE Georgia RESA/51.35
Griffin RESA/46.59
Middle Georgia RESA/41.84
Heart of Georgia RESA/41.82
First District RESA/33.46
Okefenokee RESA/36.25
SECTION C - Additional Information (See Instructions. Use as many pages as necessary.)
SPDG Year 2 Partners include:
1. National Dropout Prevention Center for Students with Disabilities
2. University of Kansas Transition Coalition
3. Emory University Autism Center
4. Babies Can’t Wait
5. Bright from the Start/Department of Early Care and Learning
6. Headstart
7. National Secondary Transition Technical Assistance Center (NSTTAC)
8. Parent to Parent of Georgia (P2P)
9. GaDOE School Improvement Division
10. Vocational Rehabilitation
11. Georgia Council for Developmental Disabilities
We do not anticipate any changes in partners this next budget period. We will be adjusting coach regional assignments for the GraduateFIRST
Initiative as we determine the schools that will be participating next year and which districts will be moving toward districtwide
implementation. We will also be evaluating the coach support utilized in our CCaR Project to determine possible adjustments based on school
district needs.