Undergraduate Program Data Analysis Report

Program Name: SEDU Science
Date: 4/15/2014
Contact Person: Gwen Price

Directions:
1. Review the program assessment data located in D2L.
2. List the 6 to 8 assessments for each program in the box provided for Program Assessments. Examine the data collection for each program. Be sure to review both the fall and spring data collections. Answer the following questions for each program assessment, placing the information in the appropriate column:
   o What does the data indicate for your program?
   o What areas of concern, if any, do you have regarding this assessment?
   o What recommendations do you have regarding any revisions for this assessment?
   o What program changes, if any, does this data suggest?
3. Save the template as a Word document and submit it to the NCATE Assessment Committee via the D2L dropbox provided in the Accreditation-NCATE link by April 9th.

Undergraduate Unit Data

Program Assessment: Praxis II
The State of Pennsylvania requires that candidates pass these licensure exams to obtain certification. Passing scores: Biology: Content Knowledge, 147; Chemistry: Content Knowledge, 154; Earth and Space Sciences: Content Knowledge, 157; General Science: Content Knowledge, 146; Physics: Content Knowledge, 140.

Data Analysis: While it is evident from the data that not all test takers obtained passing scores, all program completers did reach a passing score prior to program completion. Looking at the most recent data available, testers taking the biology exam averaged scores at or above both state and national averages. In prior years, scores in the Molecular and Cellular and the Diversity of Life, Plants & Animals subcategories had been below state and national averages, leading to analysis of program courses and discussions with biology faculty teaching courses covering these topics. While chemistry testers have received average scores slightly below state and national averages across subscore categories, the only area of specific concern has been Solutions & Solubility, Acid/Base Chemistry, which resulted in discussion and re-evaluation of program coursework. Matter, Energy, & Thermodynamics and Atomic & Nuclear Structure were also topics of discussion for focus in students' chemistry coursework. General science testers scored at or above state and national averages across subcategories with the exception of the life sciences, which was slightly below state and national averages. Testers scored particularly well in the Scientific Methodology, Techniques, and History and the Science, Tech., & Society subcategories. The most recent data available for Earth and Space Sciences indicate that testers scored above both state and national averages across content subcategories. Prior testing cycles show that scores were comparable to state and national averages, but slightly weaker in Earth's Atmosphere and Hydrosphere. Strengths and weaknesses in physics subcategories varied greatly between years. Given the small number of testers in this area, this is likely more reflective of personal strengths and weaknesses among testers than of strengths and weaknesses of the program. The majority of physics subcategory scores were comparable to state and national averages in each year for which data was available, with no subcategory being consistently weak or consistently strong.

Recommendations: Numbers of students in the science education areas continue to be very low. It is essential that these numbers increase. Further, there are several areas of weakness for candidates indicated by exam subscores in the data analysis. In addressing both of these issues, it is recommended that science education faculty work closely with university science faculty, forming a committee to discuss data and to shift the focus in science teacher preparation toward the points of weakness indicated by the data. This committee would also be tasked with the development of outreach programs aimed at high school science students to foster interest in the field of science education.

Implementation Date: Fall 2014

Program Assessment: Grades / Content Analysis

Data Analysis: All candidates earn a C or above in all required courses. Courses align with all state and national standards.

Recommendations: N/A

Implementation Date: N/A

Program Assessment: Instructional Techniques Unit Plan

Data Analysis: Students completing the secondary science Unit Plan Assignment for the last two cycles scored either "Target" or "Acceptable" across all components of the rubric for this assignment. The rubric for this assignment was adjusted to reflect the 2012 NSTA standards, with each NSTA standard element assessed through a separate rubric item. Element 1c received a "target" score for all candidates assessed. Elements of NSTA standard 2 were all scored "target," with the exception of one "acceptable" score for element 2b in Fall 2012. Similarly, all candidates received "target" for all elements of NSTA standards 3 and 4 when the elements were scored individually, and for element 5c.

Recommendations: The rubric for the instructional techniques unit plan was adjusted to reflect the new NSTA standards and to assess individual standard elements separately. It is recommended that with each cycle of data this rubric be reviewed for validity and reliability to ensure that the appropriate elements are being assessed.

Implementation Date: Fall 2014

Program Assessment: Report of Supervision

Data Analysis: Although limited by the number of completers in secondary science, the data available show that the student teachers observed received target scores for all elements of NSTA standard 4. Post-baccalaureate students in Biology, Chemistry, and Earth and Space Sciences and undergraduate students in Biology and Chemistry were assessed using the new assessment instrument reflective of the 2012 NSTA standards. 100% of the candidates assessed received "target" for each element of standard 4.

Recommendations: This assessment tool was adapted to reflect the 2012 NSTA standards and thus needs to be evaluated to ensure that it is in fact assessing the desired elements. Also, the limited amount of data serves to further indicate the need to increase the number of students in science education majors.

Implementation Date: Fall 2014

Program Assessment: Instructional Assessment Plan (IAP)

Data Analysis: The IAP assessment involved scoring candidate submissions on 26 different items within the assessment rubric, 15 of which address NSTA standard elements. Submissions were scored as target, acceptable, developing, or unacceptable for each item within the rubric. No items assessing NSTA standard elements received an unacceptable rating. Three items received developing scores:
1. Significance, Challenge and Variety (no NSTA element)
2. Alignment with Learning Objectives and Instruction (NSTA 3c)
3. Clarity of Criteria and Standards for Performance (NSTA 3c)
All submissions received scores of target or acceptable on all of the remaining rubric items.

Recommendations: Areas of weakness, indicated by "developing" scores, will be focused on in the education block courses, particularly the instructional techniques course. Supervisors and cooperating teachers should also be informed of these areas, and appropriate guidance should be given.

Implementation Date: Fall 2014

Program Assessment: PDE 430A

Data Analysis: All completers assessed through assessment 6 received "target" or "acceptable" for both measures of each element of standard 6. Since initiating this assessment to measure the elements of NSTA standard 6, eight candidates have been assessed on their professional development. Two undergraduate candidates, one in biology and one in chemistry, each received "target" scores during the fall semester of 2012 for both attendance at and reflection on content and educational professional development. Data for post-baccalaureate students were gathered over two semesters: four completers were assessed in fall 2012 (3 biology and 1 chemistry) and two more in spring 2013 (1 biology and 1 Earth/space). Each element of standard 6 is measured with two rubric items from the 430A assessment, one to assess attendance and a second to assess thoughtful reflection on the professional development experience. Regarding content area professional development, all completers in fall 2012 received target scores for professional development attendance; one completer (biology) received "acceptable" for their reflection, with the rest receiving "target." Similarly, regarding educational professional development, all completers in fall 2012 received target scores for professional development attendance; one completer (biology) received "acceptable" for their reflection, with the rest receiving "target." During the spring 2013 semester, the Earth and Space Sciences completer received "acceptable" scores on all four rubric items and the biology completer received "target" on all scores.

Recommendations: Once again, adaptations of assessments to reflect the new standards require continued analysis to ensure the effectiveness of each assessment. It is recommended that science student supervisors meet at the conclusion of each semester to discuss the results of the adapted assessments and the effectiveness of each for assessing the desired elements.

Implementation Date: Fall 2014

Program Assessment: Portfolio Showcase / Interview

Data Analysis: During the past two data cycles, all secondary science candidates have scored "target" or "acceptable" on all elements of both the portfolio showcase presentation and the portfolio interview, with the exception of questions 8 and 9, for which one candidate (post-baccalaureate) received a "developing" score during the spring 2013 semester. No candidates received an "unacceptable" score for any element in either the interview or the showcase. No one element was particularly strong across both cycles of data, with each element receiving both "target" and "acceptable" ratings for different candidates assessed. Again, only one candidate received less than an "acceptable" rating, receiving a "developing" score on questions 8 and 9. Four out of five candidates assessed over two semesters received "target" scores on each element of the showcase, with the final candidate receiving an "acceptable" score for each element. No students assessed during this time were rated as "developing" or "unacceptable" for any element of their showcase.

Recommendations: While the scores for this assessment are promising, it is important that results are clearly communicated to student teaching supervisors. Student teaching generally follows students' field experience, and the results from these assessments can provide valuable insight into the needs of student teachers. It is recommended that individual assessment results be provided to student teaching supervisors along with the student teacher's resume and placement information at the beginning of the student teaching semester.

Implementation Date: Fall 2014