Exhibit 2.4.g.22
Educational Leadership MS & Organizational Change CAS
Comprehensive Data Analysis Report (DAR) Summary, 2011-2013

AY11:

1. Have the change(s) in response to data that you documented last year had the desired effect on your program? Please provide specifics referencing prior changes that you submitted in AY 2009-2010.

In 2009-2010, the department responded to conditions noted by ELCC regarding the department's assessment system: namely, that data should be reported for individual indicators on the standards rather than as scaled scores not aligned to the ELCC standards. The department therefore created reporting templates for the assessments that were aligned to the individual ELCC standards, and scaled scores were no longer used for reporting. The assessments themselves remained essentially the same, as they were not in question. The reporting templates included data for each indicator from the rubrics for each assessment and for each student in the class where the assessment was given. The program's Graduate Program Director worked with the Graduate Program Assistant to create Excel files of the data, and tables were subsequently derived from them. Systematic use of the reporting templates has improved both the efficiency of reporting the data and the usefulness of the data for making program improvements.

2. What significant findings emerge from your examination of these data?

The most significant finding for the department this year was the decline in the pass rate on the national SLLA exam. Previously, we had a 100% pass rate but have now slipped to a 76% pass rate. This is somewhat lower than the 83% pass rate for the State of Maryland, which also declined from its former pass rate of 98% (MSDE, 2011). One possible reason is that the format of the SLLA has changed: the exam now includes multiple-choice questions as well as essays that are much more focused and aligned with specifics from the ISLLC Standards. These findings have led ILPD to begin developing a new Departmental Comprehensive Exam for Spring 2012.

3. How have you involved faculty in your identification of the implications of these data?

Faculty discussed the implications of these data during the department's start-of-year retreat in August 2011. An ad hoc committee was formed to create a new Departmental Comprehensive Exam that will contain a variety of question formats, including selected-response, constructed-response, and brief constructed-response questions. A new assistant professor chairs the ad hoc committee, and four senior faculty serve on it, including the former Acting Graduate Program Director, the former Department Chair, and the current Department Chair. To date, a bank of questions has been developed; once the exam's design is determined, it will be piloted in January 2012.

4. What specific actions will you take in response to these data? (REQUIRED response to NCATE AFI)

In addition to developing a new Departmental Comprehensive Exam that reflects some of the changes made to the national SLLA exam, the department will establish teams and team leaders for each of the required courses for Administrator I. Team leaders will be responsible for ensuring that the content, experiences, and assessments for each of the required courses are aligned with ELCC/ISLLC and MILF.
As the level of program accountability has increased since the program's inception six years ago, it is important that responsibility and decision making for the program be distributed among the program's faculty. The addition of two new tenure-track faculty in 2011-2012 has provided new insights about the program and its future directions.

AY12:

1. Have the change(s) in response to data that you documented last year had the desired effect on your program? Please provide specifics referencing prior changes that you submitted in AY 2009-2010.

In 2009-2010, the department responded to conditions noted by ELCC regarding the department's assessment system: namely, that data should be reported for individual indicators on the standards rather than as scaled scores not aligned to the ELCC standards. The department therefore created reporting templates for the assessments that were aligned to the individual ELCC standards, and scaled scores were no longer used for reporting. The assessments themselves remained essentially the same, as they were not in question.

Beginning last year and continuing at present, however, the department is making the transition to the 2011 ELCC standards. We have carefully reviewed the new standards and are now revising syllabi and assessments for our required courses and for our internship experiences and portfolios. For instance, the department revised the comprehensive examination and accompanying rubric in fall 2011 (see attached revised comprehensive examination with accompanying rubric aligned with the ELCC 2011 standards). Although the November exam was the old version, we piloted the new exam in April 2012. Because of a disastrous technology failure, the department decided to give students the benefit of the doubt and passed all of them. The technological problems were resolved, and the June comprehensive exams proceeded smoothly.

We have not been satisfied, however, with our students' relative performance on the comprehensive examination and have begun taking the following steps to improve it:
- Embed in ILPD required courses opportunities for in-class and on-demand writing
- Explicitly expect and scaffold text-based analyses and specific references to scholarly research in course assignments
- Develop a study guide for the comprehensive examination and hold study sessions for students who request them

Because ILPD's Chair and Graduate Program Director continue to work with the Graduate Program Assistant to create Excel files of the data, as well as a variety of assessment tables, the department has in place a systematic process for documenting student work and progress and has improved its reporting templates. This has helped the faculty analyze and learn from the reported data. Faculty analyses, for instance, pointed clearly to the need to revise the comprehensive examination last year so that it focused more tightly on individual ELCC standards; they also pointed to the need to better prepare students for the examination throughout their required course work.

2. What significant findings emerge from your examination of these data?

The most significant finding for the department last year was the need to examine both the comprehensive examination and the internship experience in order to continue to address declining pass rates on the national SLLA exam. Previously, we had a 100% pass rate but, as of last year's report, had slipped to a 76% pass rate.
This is somewhat lower than the 83% pass rate for the State of Maryland, which also declined from its former pass rate of 98% (MSDE, 2011). One possible reason is that the format of the SLLA has changed: the exam now includes multiple-choice questions as well as essays that are much more focused and aligned with specifics from the ISLLC Standards. These changes led ILPD to develop more focused essay questions for our new Departmental Comprehensive Exam for Spring 2012.

Data for this year's SLLA show improvement: of the twenty-one candidates who took the exam, only one did not pass (a pass rate of approximately 95%). Careful review of the data, however, has led the faculty to recognize that we need to work more strenuously on ELCC Standards 1.1, 1.2, 1.3, and 1.4, as our students scored lower on vision development, articulation, implementation, and stewardship. We plan to take this challenge seriously in our course and internship revision processes.

Survey results indicate that only 60% of students are confident they are prepared by our program for a leadership position; another 31% report being "somewhat prepared." As a faculty, we are not satisfied with a 60% confidence rate and have met with district officials and talked with practitioners to obtain insights toward improvement. With the new Common Core standards, PARCC assessments, and teacher/principal evaluations being adopted in our region, we have undertaken major course revisions in order to equip our students with the knowledge and skills to implement these initiatives. Survey questions 22 and 23 indicate that students are not as prepared as they need to be to work with parents and community members. Our course and internship revision processes therefore focus on educating our students to fully understand the significance of parent and community involvement, to develop the skills to promote that involvement, and to develop the dispositions and skills necessary to create respectful dialogue capable of promoting critique, support, and mutual learning.

3. How have you involved faculty in your identification of the implications of these data?

An ILPD ad hoc committee created a new Departmental Comprehensive Examination, which was piloted in April 2012. After much discussion, the committee decided against multiple-choice questions because of the difficulty of creating a large bank of thoughtful questions for an examination given four times per year. Moreover, faculty members were not convinced that the multiple-choice format lent itself to assessing the kinds of reflective, analytic, and action-oriented habits of mind that administrators need. The committee did, however, tighten the alignment to specific criteria of the ELCC 2011 standards; all of the questions and rubrics reference specific criteria (see attached examination with accompanying rubrics). Students must construct credible responses that draw on specialized knowledge learned in their courses and readings while applying that knowledge to scenarios capturing the typical problems, dilemmas, and situations that administrators face in contemporary K-12 schools.

During an early August 2012 retreat, the faculty met and reviewed the assessment system and assessment data with Dr. Neapolitan. Dr. Jeffrey Kenton also came to our meeting to discuss NCATE accreditation and required assessment systems. After that introduction, the faculty created a "crosswalk" (see attached document), assigning specific ELCC 2011 standards and criteria to specific courses in order to guide our revision efforts.
Now complete, the crosswalk is helping us ensure that each new standard and its accompanying criteria are addressed and assessed several times. This document continues to guide our required course revisions. Survey questions 22 and 23, as well as a close analysis of our current syllabi and past performance on the comprehensive examination, have also compelled us to renew our efforts to ensure that ILPD candidates are prepared to work with diverse student, parent, and stakeholder populations. On the second day of our retreat, we viewed a TED Talk on YouTube entitled "The Danger of a Single Story," which sparked intense dialogue about the importance of integrating knowledge and skills related to culturally responsive and differentiated teaching in order to open access to learning for all P-12 students and to help them see themselves, or people like them, reflected in the curriculum. Again, the crosswalk, combined with these conversations, has provided a framework for our review and critique of revised syllabi.

Having course leaders shepherd "signature assessments" of content knowledge (Assessment 6) has involved more faculty in the assessment process beyond the specific sections they teach. It has also provided opportunities to compare and contrast scoring practices. We have established course revision teams to thoroughly assess and revise courses in order to align them with the ELCC 2011 Standards and to ensure they prepare our students to exercise leadership in integrating the Common Core and implementing PARCC assessments.

4. What specific actions will you take in response to these data? (REQUIRED response to NCATE AFI)

ILPD faculty members remain somewhat dissatisfied with our students' performance on the Departmental Comprehensive Examination, although some students' performance is exemplary. Two areas in particular trouble us:
1. A tendency to draw on "common sense" and everyday experience rather than on application of professional knowledge, analyses based on principles and theory, and references to course readings.
2. A lack of understanding that administrators face a multitude of uncertainties endemic to constantly shifting conditions and populations, and that they must therefore develop habits and practices of systematic inquiry.

Thus, in our revision process, we have begun to incorporate into our courses more writing assignments that refer specifically to course readings, more formative feedback, and more review sessions. The internship is also receiving careful scrutiny this year, and that syllabus, too, is under revision. As each course is revised, it is presented to the entire faculty for critique. So far, two courses have undergone thorough review; one has been accepted, and the other is being revised. This process has helped faculty members gain a thorough sense of the entire curriculum so that we can reinforce and build on one another's efforts. It is our hope that these efforts will better prepare students for the SLLA exam.

To summarize, our revision of ILPD's six required courses follows a systematic process. Faculty members work in pairs to revise one course; an exception is Sam Della Vecchia, who, because of his particular expertise, is revising School Law alone. Each month, during our department meetings, a revised course is presented and critiqued. So far, ILPD 781 and ILPD 667 have been reviewed and both are under revision. Revisions are undertaken with the following in mind:
1. The new ELCC standards
2. Integration of the Common Core, not only in terms of teaching students ABOUT the CC but also in terms of incorporating pedagogical strategies aligned with CC standards regarding close readings, argumentation supported with evidence, and so forth
3. The TSSA standards
4. Culturally responsive teaching, Universal Design for Learning, and differentiation
5. A clear and consistent assessment process, with rubrics that guide student learning and ensure fair and rationalized scoring